I'm trying to parse my error.log file through Logstash into Elasticsearch, but the message shows up in Kibana as one unparsed field instead of being split into separate fields. I am not sure if my Logstash conf file is wrong or if the error is elsewhere.
Logstash .conf file:
The messages are currently going into the filebeat-* index instead of the Logstash one. I had previously connected Filebeat directly to Elasticsearch.
Message example:
That sounds like your events are still going directly from Filebeat to Elasticsearch.
The grok works if the message goes through that pipeline. I would suggest adding overwrite => [ "message" ]. If you do not, then [message] will be an array, with one entry being the original log line and the other being whatever matches the trailing GREEDYDATA in the grok pattern. Having an array like that is unlikely to be useful.
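As a minimal sketch, that looks like the following. The match pattern here is just a placeholder, since I don't know your actual pattern; the point is the trailing GREEDYDATA capturing back into [message]:

filter {
    grok {
        # Placeholder pattern -- substitute your own. What matters is
        # that the pattern captures into a field named [message].
        match => { "message" => "\[%{LOGLEVEL:level}\] %{GREEDYDATA:message}" }
        # Replace [message] with the capture instead of appending
        # to it, which is what turns the field into an array.
        overwrite => [ "message" ]
    }
}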
If you want to parse out more of [message] then I would suggest using ruby:
ruby {
    code => '
        # Collect every [key "value"] pair embedded in the message.
        # Default to [] so a missing [message] field does not raise
        # when we iterate below.
        matches = event.get("message")&.scan(/ \[(\w+) "([^"]+)"\]/) || []
        matches.each { |k, v|
            newK = "[stuff][#{k}]"
            if event.include?(newK)
                # Key seen before: accumulate the values in an array.
                a = Array(event.get(newK))
                a << v
                event.set(newK, a)
            else
                event.set(newK, v)
            end
        }
    '
}
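For example, a message fragment like [msg "Access denied"] [severity "CRITICAL"] would produce [stuff][msg] => "Access denied" and [stuff][severity] => "CRITICAL". The [stuff] prefix is just a placeholder; rename it to whatever fits your schema.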
No, Filebeat will send logs to either Logstash or Elasticsearch, not both. If it is sending them to Elasticsearch, it is because the configuration tells it to do so. You may not be running the Filebeat configuration you think you are.
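As a sketch, the relevant part of filebeat.yml would look something like this (the host and port are assumptions; adjust them to your setup):

# filebeat.yml -- only one output may be enabled at a time, so make
# sure any elasticsearch output is commented out or removed.
#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]  # the port your Logstash beats input listens on

On the Logstash side this pairs with a beats input, e.g. input { beats { port => 5044 } }.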