Logstash creating two events for each line with the json codec

I want to stream each JSON line written to a log file into Logstash. I am using the json codec with the file input plugin.

If I want to load an entire new file at once, I can do that with start_position => beginning.
But I want to stream each line as soon as it is written, not the entire file at once.

I tried a lot of combinations:
start_position => end and sincedb_path => /dev/null
Only start_position => end
Only start_position => beginning

In each case, whenever a line is written to the file and I save the file, the previously written line gets written again as well. So this is creating duplicate events. How can I avoid this and create one event per line streamed into Logstash?
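For reference, this is the kind of input configuration I have been trying (the path is made up):

```
input {
  file {
    path => "/var/log/app/events.log"   # hypothetical path
    start_position => "end"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}
```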

In each case, whenever a line is written to the file and I save the file, the previously written line gets written again as well.

When you save a file with a text editor, it typically writes a new file and renames it over the old one. That creates a brand-new file, which Logstash will parse from the beginning if you set start_position => beginning. When testing, make sure you append to the file with e.g. echo something >> filename.log.
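You can see the difference yourself by watching the file's inode: appending with >> keeps the same file, while an editor-style save-and-rename produces a new one (paths here are throwaway ones created for the demo):

```shell
# Appending keeps the same inode; an editor-style save replaces the file.
tmpdir=$(mktemp -d)
log="$tmpdir/filename.log"

echo "first line" >> "$log"          # append: creates the file
inode_before=$(ls -i "$log" | awk '{print $1}')

echo "second line" >> "$log"         # append: same inode, one new line for Logstash
inode_after=$(ls -i "$log" | awk '{print $1}')

# Editor-style save: write a new file, then rename it over the old one
printf 'first line\nsecond line\n' > "$tmpdir/filename.log.tmp"
mv "$tmpdir/filename.log.tmp" "$log"
inode_after_save=$(ls -i "$log" | awk '{print $1}')

[ "$inode_before" = "$inode_after" ] && echo "append kept the inode"
[ "$inode_before" != "$inode_after_save" ] && echo "save created a new inode"
```

Because the file input tracks files by inode (via the sincedb), the renamed file looks new, and with start_position => beginning it is read from the top again.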

Thanks Magnus. That worked for me. How do we append to the input file in production? Can we try with echo?

How do we append to the input file in production?

In production I assume you have a program that appends to the log file for you? Or what's the scenario?

Can we try with echo?

You can pick any method that appends to the file instead of creating a new one.
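For instance, any process that opens the log in append mode works, which is exactly what the shell's >> redirection does. A minimal sketch with a made-up path and made-up JSON fields:

```shell
# A production logger typically opens the file in append mode (O_APPEND);
# >> gives you the same behavior from the shell.
log=$(mktemp)   # hypothetical log file for this sketch

for i in 1 2 3; do
  printf '{"seq":%d,"msg":"event %d"}\n' "$i" "$i" >> "$log"
done

wc -l < "$log"   # each iteration appended exactly one line
```

The same applies to application logging libraries: as long as they append to the existing file rather than rewriting it, Logstash will pick up each new line once.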

Thanks, Magnus.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.