Regarding reading Filebeat output written to a file as input to Logstash

When we send Filebeat data to Logstash, apply filters, output the result to a file, and then read that file back into Logstash and on to Elasticsearch, all the fields end up merged into a single field.

How can we correctly read the Filebeat output written to a file from Logstash? There is a firewall issue due to which we cannot write data directly from Filebeat to Logstash and then to Elasticsearch.

So your setup is: FB -> LS -> File -> LS -> ES ?

It sounds like Filebeat is writing lines with JSON to a file and you're not using the json codec for the file input that reads that file.
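A minimal sketch of the reading side, assuming the file contains one JSON object per line (the path is taken from the example event below; adjust to your environment):

```
# Second Logstash instance: read the intermediate file and parse each line as JSON
input {
  file {
    path => "/var/log/broncos_logs/dvt_1.txt"
    codec => json                    # decode each line instead of leaving it as a raw string
    start_position => "beginning"    # read existing content, not just new lines
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]      # assumed Elasticsearch address
  }
}
```

Without `codec => json`, the file input treats each line as plain text and places the entire serialized event into a single `message` field, which matches the symptom described above.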

Yes, correct. The setup is FB -> LS -> File -> LS -> ES.
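For the middle hop to work, the first Logstash instance has to write one JSON document per line so the second instance can parse the file back. A sketch of that writing side, assuming a Beats input on the default port (both port and path are placeholders):

```
# First Logstash instance: receive events from Filebeat and write them as JSON lines
input {
  beats {
    port => 5044                          # assumed Filebeat -> Logstash port
  }
}
output {
  file {
    path => "/var/log/broncos_logs/dvt_1.txt"
    codec => json_lines                   # one JSON document per line, newline-delimited
  }
}
```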

We tried using the json codec in the file input, but the fields are still not parsed correctly.

There are now two threads discussing exactly the same problem (this one and Regarding Filebeat Ouput to a File in Remote Server or NFS). Please pick one thread and stick to it.


We have added codec => json to the file input plugin and run Logstash, but the output still appears in the following format:

```
{"message": "{"message":"root 19712 19709 0 02:08 ttyS0 00:00:00 grep dtpd","@timestamp":"2016-05-02T13:36:30.654Z","offset":366,"source":"/var/log/broncos_logs/dvt_1.txt","chs":"test","date":"02-05-2016"}"}
```
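In that output the original event is still a serialized JSON string inside the `message` field, which suggests the file was written with a codec that wrapped the event rather than emitting it as a bare JSON line. One way to unpack it on the reading instance is a json filter on `message`; a sketch, not a confirmed fix:

```
# Parse the JSON string embedded in the message field into top-level fields
filter {
  json {
    source => "message"
    # no target set, so the parsed fields are placed at the root of the event
  }
}
```

Alternatively, fixing the writing side to emit newline-delimited JSON (e.g. the json_lines codec on the file output) avoids the double wrapping in the first place.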

Okay, I'll stick to this thread; maybe I'll attach the other one to this.

Please answer all the questions I asked in the other thread.

I mean I'll post the update on the thread Regarding Filebeat Ouput to a File in Remote Server or NFS. I'll check how to attach this thread to the other one.