Hi @steffens,
my input config look like now
input {
beats {
port => 5044
ssl => false
#ssl_certificate => "/etc/pki/tls/certs/logstash-beats.crt"
#ssl_key => "/etc/pki/tls/private/logstash-beats.key"
}
}
and
10-syslog.conf
filter {
if [type] == "log" {
grok {
match => { "message", "(?m)\[\#\|%{TIMESTAMP_ISO8601:timestamp}\|%{LOGLEVEL:Log Level}\|% {DATA:server_version}\|%{JAVACLASS:Class}\|%{DATA:thread}\|%{DATA:message_detail}\|\#\]" }
add_field => [ "Log level", "%{LOGLEVEL:Log Level}" ]
}
}
syslog_pri { }
date {
match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
}
I still have the same error..
2016/11/11 10:57:52.972269 output.go:87: DBG output worker: publish 2048 events
2016/11/11 10:57:52.973354 single.go:77: INFO Error publishing events (retrying): EOF
2016/11/11 10:57:52.973364 single.go:154: INFO send fail
After using with filebeat-5.0 version...
2016/11/11 13:44:45.911161 single.go:91: INFO Error publishing events (retrying): EOF
2016/11/11 13:45:00.752791 logp.go:230: INFO Non-zero metrics in the last 30s: filebeat.harvester.started=1 libbeat.logstash.publish.read_errors=5 libbeat.logstash.publish.write_bytes=5547 libbeat.logstash.call_count.PublishEvents=5 libbeat.logstash.published_but_not_acked_events=10235 libbeat.publisher.published_events=2047 filebeat.harvester.open_files=1 filebeat.harvester.running=1
2016/11/11 13:45:01.913866 sync.go:85: ERR Failed to publish events caused by: EOF
Yes, my logstash is running. but still the same error messages
http://pastebin.com/Hh9ECFjd
==> /var/log/logstash/logstash.log <==
elk_1 | {:timestamp=>"2016-11-11T10:56:43.166000+0000", :message=>"fetched an invalid config", :config=>"input {\n lumberjack {\n port => 5000\n type => \"log\"\n ssl => false\n #ssl_certificate => \"/etc/pki/tls/certs/logstash-forwarder.crt\"\n #ssl_key => \"/etc/pki/tls/private/logstash-forwarder.key\"\n }\n}\n\ninput {\n beats {\n port => 5044\n ssl => false\n #ssl_certificate => \"/etc/pki/tls/certs/logstash-beats.crt\"\n #ssl_key => \"/etc/pki/tls/private/logstash-beats.key\"\n }\n}\n\nfilter {\n if [type] == \"log\" {\n grok {\n match => { \"message\", \"(?m)\\[\\#\\|%{TIMESTAMP_ISO8601:timestamp}\\|%{LOGLEVEL:Log Level}\\|%{DATA:server_version}\\|%{JAVACLASS:Class}\\|%{DATA:thread}\\|%{DATA:message_detail}\\|\\#\\]\"}\n add_field => [ \"Log level\", \"%{LOGLEVEL:Log Level}\" ]\n }\n }\n syslog_pri { }\n date {\n match => [ \"timestamp\", \"MMM d HH:mm:ss\", \"MMM dd HH:mm:ss\" ]\n }\n}\n\n\nfilter {\n if [type] == \"nginx-access\" {\n grok {\n match => { \"message\" => \"%{NGINXACCESS}\" }\n }\n }\n}\n\noutput {\n elasticsearch {\n hosts => [\"http://localhost:9200\"]\n sniffing => true\n manage_template => false\n index => \"%{[@metadata][beat]}-%{+YYYY.MM.dd}\"\n document_type => \"%{[@metadata][type]}\"\n }\n}\n\n", :reason=>"Expected one of #, => at line 23, column 27 (byte 459) after filter {\n if [type] == \"log\" {\n grok {\n match => { \"message\"", :level=>:error}
I have check also my "10-syslog.conf" filter file, seem to be ok?
Cheers,
ThomasK