Hi all,
I have single-line log files (newline-delimited) in which each entry is a JSON object, like:
{"f1":"data1","timestamp":"2018-09-10T12:33:15.878+0000","f2":"data2","f3":1234,"f4":"data4","server":"192.168.0.1","f5":"data5","f6":"1","f7":"data7","f8":"data8","f9":"data9","f10":false}
I want to publish these to Kafka, fetch them with Logstash, and then insert them into Elasticsearch, keeping the same attribute names in Elasticsearch. How can I do this in Filebeat?
Thank you.
pierhugues
(Pier-Hugues Pellerin)
September 11, 2018, 2:48pm
@elasticheart Are you decoding the JSON event in Filebeat? I don't see your configuration, but the Filebeat log input has options to parse JSON events; see the documentation at https://www.elastic.co/guide/en/beats/filebeat/master/filebeat-input-log.html#filebeat-input-log-config-json
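As a minimal sketch of those options, the `json.*` settings sit directly under the log input. The path below is a placeholder; the option names (`json.keys_under_root`, `json.overwrite_keys`, `json.add_error_key`) are the ones from the linked documentation:

```yaml
filebeat.inputs:
- type: log
  paths:
    - /logs/sample1.log        # placeholder path
  json.keys_under_root: true   # lift the parsed JSON fields to the top level of the event
  json.overwrite_keys: true    # let parsed fields win over Filebeat-added fields on conflict
  json.add_error_key: true     # add an error key when a line is not valid JSON
```

With `json.keys_under_root: true`, fields like `f1`, `timestamp`, and `server` land at the top level of the event rather than under a `json` sub-object.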
hi @pierhugues my current filebeat configuration is as below;
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /logs/sample1.log

output.kafka:
  enabled: true
  hosts: ["192.168.0.1:9092", "192.168.0.1:9093", "192.168.0.1:9094"]
  topic: testLogs1
and I am using the JSON filter in Logstash like this:
filter {
  json {
    source => "message"
  }
}
This works just fine, but is there a simpler way to achieve this, so that I can avoid the filter in Logstash?
pierhugues
(Pier-Hugues Pellerin)
September 12, 2018, 12:10pm
Yes, you can do all the parsing in Filebeat. I've linked the documentation in my previous reply, but you can also take a look at this blog post, which gives you a bit more detail: https://www.elastic.co/blog/structured-logging-filebeat
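Putting it together as a sketch, assuming the hosts and topic from your earlier post: the JSON decoding moves into the Filebeat input, so the Logstash `json` filter block can be dropped.

```yaml
# filebeat.yml — decode each JSON log line in Filebeat itself
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /logs/sample1.log
  json.keys_under_root: true   # f1, timestamp, server, ... become top-level fields
  json.add_error_key: true     # flag lines that fail to parse

output.kafka:
  enabled: true
  hosts: ["192.168.0.1:9092", "192.168.0.1:9093", "192.168.0.1:9094"]
  topic: testLogs1
```

Note that the events still travel through Kafka serialized as JSON, so on the Logstash side the kafka input still needs to decode them; the usual way is the `json` codec on the input rather than a separate filter (bootstrap server and topic below are assumed from your config):

```
input {
  kafka {
    bootstrap_servers => "192.168.0.1:9092"
    topics => ["testLogs1"]
    codec => "json"
  }
}
```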
system
(system)
Closed
October 10, 2018, 12:10pm
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.