On my ELK server (Ubuntu) and source host (Ubuntu) I've upgraded Filebeat to 5.4.0. I send my logs with Filebeat to the Elasticsearch service (not to Logstash). The entries from /var/log/auth.log are inserted into Elasticsearch correctly, but they aren't parsed; each event just stays as a single JSON document with the whole line in the message field.
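For context: when Filebeat 5.4 ships straight to Elasticsearch, parsing of /var/log/auth.log normally comes from Filebeat's system module, which loads ingest pipelines into Elasticsearch; a plain log prospector only forwards the raw line. A minimal sketch of that setup, assuming the stock 5.4 system module and default log paths:

filebeat.modules:
- module: system
  syslog:
    enabled: true
  auth:
    enabled: true

output.elasticsearch:
  hosts: ["localhost:9200"]

Run it once with the module flag so the pipelines and index template get loaded, e.g. filebeat -e -modules=system -setup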
Can you share your config file and the command you used to start filebeat?
I have the same issue @ruflin
@Me_Cloud Could you share your config file and the command you used to start filebeat? Also an excerpt of your auth log would be great.
Here is my config file, @ruflin:
filebeat.yml
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/messages
    #- c:\programdata\elasticsearch\logs*
  document_type: syslog

- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/secure
    #- c:\programdata\elasticsearch\logs*
  document_type: sshlog
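The output section of filebeat.yml isn't shown above. For the beats input below to receive these events over SSL, it would need something along these lines (the hostname is a placeholder, and the CA path assumes the Logstash certificate was copied to the client as in the usual ELK setup guides):

output.logstash:
  hosts: ["elk-server.example.com:5044"]
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]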
logstash input
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
filter
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [type] == "sshlog" {
    grok {
      match => { "message" => [
        "%{SYSLOGTIMESTAMP:syslog_date} %{SYSLOGHOST:syslog_host} %{DATA:syslog_program}(?:\[%{POSINT}\])?: %{WORD:login} password for %{USERNAME:username} from %{IP:ip} %{GREEDYDATA}",
        "%{SYSLOGTIMESTAMP:syslog_date} %{SYSLOGHOST:syslog_host} %{DATA:syslog_program}(?:\[%{POSINT}\])?: message repeated 2 times: \[ %{WORD:login} password for %{USERNAME:username} from %{IP:ip} %{GREEDYDATA}",
        "%{SYSLOGTIMESTAMP:syslog_date} %{SYSLOGHOST:syslog_host} %{DATA:syslog_program}(?:\[%{POSINT}\])?: %{WORD:login} password for invalid user %{USERNAME:username} from %{IP:ip} %{GREEDYDATA}",
        "%{SYSLOGTIMESTAMP:syslog_date} %{SYSLOGHOST:syslog_host} %{DATA:syslog_program}(?:\[%{POSINT}\])?: %{WORD:login} %{WORD:auth_method} for %{USERNAME:username} from %{IP:ip} %{GREEDYDATA}"
      ] }
    }
    date {
      match => [ "syslog_date", "MMM d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601" ]
      timezone => "Asia/Jakarta"
    }
    geoip {
      source => "ip"
    }
  }
}
output
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
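For debugging, the sshlog grok patterns can be exercised on their own with a throwaway pipeline and a pasted line from /var/log/secure. This is only a sketch; the file name test.conf is made up and the pattern is the first one from the filter above:

input { stdin { type => "sshlog" } }
filter {
  if [type] == "sshlog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_date} %{SYSLOGHOST:syslog_host} %{DATA:syslog_program}(?:\[%{POSINT}\])?: %{WORD:login} password for %{USERNAME:username} from %{IP:ip} %{GREEDYDATA}" }
    }
  }
}
output { stdout { codec => rubydebug } }

Running bin/logstash -f test.conf and pasting a failed-login line shows either the extracted fields or a _grokparsefailure tag in the rubydebug output.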
Is there anything wrong in my filebeat.yml? If I don't set document_type in filebeat.yml the logs are parsed and I can see them in Kibana, but if I do set document_type they aren't parsed and I can't see them in Kibana.
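With Filebeat 5.x, document_type sets the event's type field, so if [type] == "sshlog" should only match events from the /var/log/secure prospector. One way to see which type and [@metadata][type] values actually arrive in Logstash is a temporary debug output added inside the existing output block (a sketch; remove it once things look right):

stdout { codec => rubydebug { metadata => true } }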
But in the Filebeat log there are only entries like this:
INFO Non-zero metrics in the last 30s: libbeat.logstash.call_count.PublishEvents=3 libbeat.logstash.publish.read_bytes=105 libbeat.logstash.
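Those are only the periodic metrics lines. Running Filebeat in the foreground with the publish debug selector prints each event as it is sent, which shows whether the type field is set the way the Logstash conditionals expect (standard Filebeat flags; adjust the config path if needed):

filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"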
@Me_Cloud please create another topic. Your issue/configuration is not related to the original discussion.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.