Error publishing events (retrying): EOF

Dear All,

Filebeat is not sending events to Logstash after I start it with this command:

sudo ./filebeat -e -c filebeat-logstash.yml -d "publish"

It throws these error messages:

single.go:77: INFO Error publishing events (retrying): EOF
2016/11/08 16:09:04.700334 single.go:154: INFO send fail

I'm using Filebeat -> Logstash -> Elasticsearch -> Kibana to get an overview of my GlassFish log file.
Logstash + Elasticsearch + Kibana run in an ELK Docker container.

Here is my Logstash output:

http://pastebin.com/FP10pdq2

My Filebeat configuration:
http://pastebin.com/kcHf2h1H

My Logstash configuration files in the ELK Docker container:

root@4569cf1f66ab:/etc/logstash/conf.d#
01-lumberjack-input.conf.backup
02-beats-input.conf
10-syslog.conf
11-nginx.backup
30-output.conf

02-beats-input.conf

input { beats 
{ port => 5044
ssl => false
#ssl_certificate => "/etc/pki/tls/certs/logstash-beats.crt"
#ssl_key => "/etc/pki/tls/private/logstash-beats.key"}}

10-syslog.conf

filter {
 if [type] == "log" {
  grok {
   match => { "message", "(?m)\[\#\|%{TIMESTAMP_ISO8601:timestamp}\|%{LOGLEVEL:Log Level}\|%{DATA:server_version}\|%{JAVACLASS:Class}\|%{DATA:thread}\|%{DATA:message_detail}\|\#\]" }
  add_field => [ "Log level", "%{LOGLEVEL:Log Level}" ]
}
}
syslog_pri { }
date {
  match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
}

Any idea where I'm going wrong?

Thank you for your help.

Best Regards,
Thomas

{ port => 5044
ssl => false
#ssl_certificate => "/etc/pki/tls/certs/logstash-beats.crt"
#ssl_key => "/etc/pki/tls/private/logstash-beats.key"}}

Is your Logstash even running? The input config is clearly invalid (as the Logstash logs suggest), because you accidentally commented out the closing } when you commented out ssl_key. Better try

input {
  beats {
    port => 5044
    ssl => false
    #ssl_certificate => "/etc/pki/tls/certs/logstash-beats.crt"
    #ssl_key => "/etc/pki/tls/private/logstash-beats.key"
  }
}

While it takes a bit more space, it's always good practice to write clean configs.
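Also, a quick way to check whether anything is actually listening on the Beats port (5044 here) before blaming the config is a plain TCP connect. A minimal sketch in Python, assuming Logstash runs on localhost (adjust host/port to your setup):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Check the Beats input of a local Logstash instance.
    print("Logstash beats input reachable:", port_open("localhost", 5044))
```

If this prints False, the EOF from Filebeat is expected: nothing is accepting connections on that port.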

Hi @steffens,

my input config now looks like this:

input {
  beats {
   port => 5044
   ssl => false
   #ssl_certificate => "/etc/pki/tls/certs/logstash-beats.crt"
   #ssl_key => "/etc/pki/tls/private/logstash-beats.key"
  }
}

and

10-syslog.conf

filter {
 if [type] == "log" {
  grok {
   match => { "message", "(?m)\[\#\|%{TIMESTAMP_ISO8601:timestamp}\|%{LOGLEVEL:Log Level}\|%{DATA:server_version}\|%{JAVACLASS:Class}\|%{DATA:thread}\|%{DATA:message_detail}\|\#\]" }
  add_field => [ "Log level", "%{LOGLEVEL:Log Level}" ]
}
}
syslog_pri { }
date {
  match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
}

I still get the same error:

2016/11/11 10:57:52.972269 output.go:87: DBG  output worker: publish 2048 events
2016/11/11 10:57:52.973354 single.go:77: INFO Error publishing events (retrying): EOF
2016/11/11 10:57:52.973364 single.go:154: INFO send fail

After switching to the Filebeat 5.0 version:

2016/11/11 13:44:45.911161 single.go:91: INFO Error publishing events (retrying): EOF

2016/11/11 13:45:00.752791 logp.go:230: INFO Non-zero metrics in the last 30s: filebeat.harvester.started=1    libbeat.logstash.publish.read_errors=5 libbeat.logstash.publish.write_bytes=5547 libbeat.logstash.call_count.PublishEvents=5 libbeat.logstash.published_but_not_acked_events=10235 libbeat.publisher.published_events=2047 filebeat.harvester.open_files=1 filebeat.harvester.running=1

2016/11/11 13:45:01.913866 sync.go:85: ERR Failed to publish events caused by: EOF

Yes, my Logstash is running, but I still get the same error messages:

http://pastebin.com/Hh9ECFjd

  ==> /var/log/logstash/logstash.log <==
elk_1  | {:timestamp=>"2016-11-11T10:56:43.166000+0000", :message=>"fetched an invalid config", :config=>"input     {\n  lumberjack {\n    port => 5000\n    type => \"log\"\n    ssl => false\n    #ssl_certificate => \"/etc/pki/tls/certs/logstash-forwarder.crt\"\n    #ssl_key => \"/etc/pki/tls/private/logstash-forwarder.key\"\n  }\n}\n\ninput {\n  beats {\n   port => 5044\n   ssl => false\n   #ssl_certificate => \"/etc/pki/tls/certs/logstash-beats.crt\"\n   #ssl_key => \"/etc/pki/tls/private/logstash-beats.key\"\n  }\n}\n\nfilter {\n  if [type] == \"log\" {\n    grok {\n      match => { \"message\", \"(?m)\\[\\#\\|%{TIMESTAMP_ISO8601:timestamp}\\|%{LOGLEVEL:Log Level}\\|%{DATA:server_version}\\|%{JAVACLASS:Class}\\|%{DATA:thread}\\|%{DATA:message_detail}\\|\\#\\]\"}\n      add_field => [ \"Log level\", \"%{LOGLEVEL:Log Level}\" ]\n    }\n   }\n    syslog_pri { }\n    date {\n      match => [ \"timestamp\", \"MMM  d HH:mm:ss\", \"MMM dd HH:mm:ss\" ]\n   }\n}\n\n\nfilter {\n  if [type] == \"nginx-access\" {\n    grok {\n      match => { \"message\" => \"%{NGINXACCESS}\" }\n    }\n  }\n}\n\noutput {\n  elasticsearch {\n    hosts => [\"http://localhost:9200\"]\n    sniffing => true\n    manage_template => false\n    index => \"%{[@metadata][beat]}-%{+YYYY.MM.dd}\"\n    document_type => \"%{[@metadata][type]}\"\n  }\n}\n\n", :reason=>"Expected one of #, => at line 23, column 27 (byte 459) after filter {\n  if [type] == \"log\" {\n    grok {\n      match => { \"message\"",     :level=>:error} 

I have also checked my "10-syslog.conf" filter file; it seems to be OK?

Cheers,
ThomasK

I think a closing } is missing in your filter config, but I didn't validate it. As the logs show, there seems to be something wrong with the Logstash config.

It works now, after changing the filter from

match => { "message",

to

match => { "message" => 

and removing the curly brackets in the date match.

filter {
if [type] == "log" {
    grok {
  match => { "message" =>  "(?m)\[\#\|%{TIMESTAMP_ISO8601:timestamp}\|%{LOGLEVEL:log_level}\|%{DATA:server_version}\|%{JAVACLASS:Class}\|%{DATA:thread}\|%{DATA:message_detail}\|\#\]" }
    }
  }
syslog_pri { }
date {
  match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}
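For reference, the grok pattern above corresponds roughly to the regular expression below. This is only a sketch: the character classes are approximations of the grok patterns, and the sample line is a made-up GlassFish-style entry, not one of the original log lines:

```python
import re

# Rough Python equivalent of the grok pattern in 10-syslog.conf.
GLASSFISH_RE = re.compile(
    r"\[#\|"
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}[^|]*)\|"  # ~ TIMESTAMP_ISO8601
    r"(?P<log_level>[A-Z]+)\|"                                    # ~ LOGLEVEL
    r"(?P<server_version>[^|]*)\|"                                # ~ DATA
    r"(?P<java_class>[\w.$]+)\|"                                  # ~ JAVACLASS
    r"(?P<thread>[^|]*)\|"                                        # ~ DATA
    r"(?P<message_detail>[^|]*)"                                  # ~ DATA
    r"\|#\]"
)

# Made-up sample entry in the GlassFish server.log format (an assumption).
sample = ("[#|2016-11-11T10:57:52.972+0000|INFO|glassfish 4.1|"
          "javax.enterprise.system.core|_ThreadID=1;_ThreadName=main|"
          "server started|#]")

m = GLASSFISH_RE.search(sample)
if m:
    print(m.group("log_level"))   # INFO
    print(m.group("java_class"))  # javax.enterprise.system.core
```

Testing the pattern against a sample line like this outside of Logstash makes it easier to spot which part of the expression fails to match.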