I want to stream each JSON line written to a log file into Logstash. I am using the json codec in the file input plugin.
If I want to load an entire new file at once, I can do that with start_position => beginning.
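For reference, my current input looks roughly like this (the path is just a hypothetical placeholder):

input {
  file {
    path => "/var/log/myapp/app.log"   # hypothetical path
    start_position => "beginning"      # read the whole file when it is first discovered
    codec => "json"                    # each line is a JSON object -> one event per line
  }
}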
But I want to stream each line as soon as it is written, not the entire file at once.
I tried a lot of combinations (the first one is sketched below):
- start_position => end together with sincedb_path => /dev/null
- only start_position => end
- only start_position => beginning
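The first combination looks roughly like this (again, the path is a hypothetical placeholder):

input {
  file {
    path => "/var/log/myapp/app.log"   # hypothetical path
    start_position => "end"            # only pick up lines written after startup
    sincedb_path => "/dev/null"        # don't persist read offsets between runs
    codec => "json"
  }
}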
In each case, whenever a line is written to the file and I save the file, the previously written line gets ingested again, so I end up with duplicate events. How can I avoid this and get exactly one event per line streamed into Logstash?
In each case, whenever a line is written to the file and I save the file, the previously written line gets ingested again.
When you save a file with a text editor, it typically writes a new file and renames it over the old one. Logstash sees that as a brand-new file and, with start_position => beginning, reads it from the start, which is why earlier lines show up again as duplicates. When testing, append to the file instead, e.g. echo something >> filename.log.
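A minimal sketch of a tailing config, assuming a hypothetical path and sincedb location; with a persistent sincedb Logstash remembers how far it has read, so each appended line produces exactly one event:

input {
  file {
    path => "/var/log/myapp/app.log"              # hypothetical path
    start_position => "end"                       # only new lines written after startup
    sincedb_path => "/var/lib/logstash/sincedb"   # hypothetical location; persists offsets across restarts
    codec => "json"                               # one JSON object per line -> one event
  }
}

Then add lines by appending (for example echo '{"msg":"test"}' >> /var/log/myapp/app.log) rather than editing and re-saving the file, so the inode stays the same and nothing is re-read.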