How does filebeat (or logstash) convert a single line of log to JSON data?

We already have an ELK cluster running. I want to understand how a single log line gets converted into a JSON document in Elasticsearch. Looking for pointers or an explanation.
Thanks!
Mohan

The exported fields are documented here: https://www.elastic.co/guide/en/beats/filebeat/current/exported-fields.html

Filebeat keeps track of a file's metadata, such as its path and inode, and includes this information in the final document. The offset and message fields are added by the file readers. The complete line event, with the file metadata and the message, is finally serialized to JSON.
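To illustrate, here is a sketch of what such a serialized event might look like. The field names follow the exported-fields documentation linked above; the concrete values (path, hostname, log line) are hypothetical:

```json
{
  "@timestamp": "2017-03-01T12:34:56.789Z",
  "message": "2017-03-01 12:34:56 INFO  Starting service",
  "source": "/var/log/myapp/app.log",
  "offset": 4096,
  "input_type": "log",
  "type": "log",
  "beat": {
    "name": "web-01",
    "hostname": "web-01",
    "version": "5.2.0"
  }
}
```

Here `message` and `offset` come from the file reader, `source` and `input_type` from the prospector's file metadata, and the `beat.*` fields from the shipper itself.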

Thank you @steffens for the information.

I was curious how filebeat selects which category of exported fields to apply, so I checked the documented exported fields against my Elasticsearch document and filebeat.yml. It looks like it depends on filebeat.yml?
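For example, a minimal prospector configuration in filebeat.yml can set some of those per-event fields. The paths and values below are hypothetical, just to show which settings end up in the document:

```yaml
filebeat.prospectors:
- input_type: log          # becomes the input_type field on each event
  paths:
    - /var/log/myapp/*.log # matched file becomes the source field
  document_type: myapp     # becomes the type field
  fields:                  # arbitrary custom fields added to each event
    env: production
```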

I also have a "timestamp" field added to the Elasticsearch document and am wondering where it comes from.

This topic was automatically closed after 21 days. New replies are no longer allowed.