How to read JSON data using Filebeat and preserve attribute names in Logstash

Hi all,

I have single-line log files (newline-delimited) in which each entry is written in JSON format.


I want to publish this to kafka, and fetch using logstash, and then insert into elasticsearch. I want same attribute names in elasticsearch. How can I do this in filebeat?

Thank you.

@elasticheart Are you decoding the JSON event in Filebeat? I don't see your configuration, but Filebeat's log input has options to parse JSON events; see the documentation.

Hi @pierhugues, my current Filebeat configuration is as below:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /logs/sample1.log

output.kafka:
  enabled: true
  hosts: ["","",""]
  topic: testLogs1

and I am using the json filter in Logstash like this:

filter {
	json {
		source => "message"
	}
}

This works just fine, but is there any other easy/simpler way to achieve this, so that I can avoid filter in Logstash?

Yes, you can do all the parsing in Filebeat. I've linked the documentation in my previous reply, but you can also take a look at this blog post, which gives a bit more detail.
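As a sketch of that approach (assuming the `json.*` options available on Filebeat's log input), decoding the JSON in Filebeat itself might look like this, building on the configuration shared above:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /logs/sample1.log
  # Decode each line as a JSON object and place the parsed keys
  # at the top level of the event, preserving the original
  # attribute names end to end.
  json.keys_under_root: true
  # On a key conflict, let the decoded values overwrite
  # Filebeat's own fields.
  json.overwrite_keys: true
  # Add an error key to the event if a line fails to parse,
  # so malformed lines are visible rather than silently dropped.
  json.add_error_key: true

output.kafka:
  enabled: true
  hosts: ["","",""]
  topic: testLogs1
```

With the JSON already decoded in Filebeat, the Logstash json filter can be dropped. Alternatively, the Logstash kafka input supports `codec => "json"`, which would decode the message on the Logstash side without a separate filter block.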

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.