Changes to config file not updating the data in Kibana

Hello All,

I am using the latest version of Logstash (7.6.2). I uploaded some sample data and was able to get it into Elasticsearch successfully using Logstash (with auto-reload enabled), and I could see the index in the Kibana interface.

But when I make changes to the config file below, the updated data does not appear in Kibana. I tried removing the mutate filter plugin; the Logstash pipeline reloaded, but the data in Kibana was not updated. Interestingly, it didn't throw any errors.


    input {
    	file {
    		path => "/usr/local/Cellar/sample.log"
    		start_position => "beginning"
    	}
    }
    filter {
    	grok { match => ["message", "%{TIMESTAMP_ISO8601:timestamp_string}%{SPACE}%{GREEDYDATA:line}"] }
    	date { match => ["timestamp_string", "ISO8601"] }
    	mutate { remove_field => ["message", "timestamp_string"] }
    }
    output {
    	elasticsearch {
    		hosts => ["localhost:9200"]
    		index => "sample"
    		codec => rubydebug
    	}
    }

Any help here is appreciated. TIA

P.S. - I am new to Elasticsearch :smiley:

By default a file input runs in tail mode, so it only reads data that is appended to your input file. start_position only has an effect the first time the input sees a file. You may find

sincedb_path => "/dev/null"

useful, to prevent the input from persisting the in-memory sincedb across reloads.
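To show where the setting goes, here is a sketch of the file input block from your config with sincedb_path added (the path is the one from your own config):

    input {
    	file {
    		path => "/usr/local/Cellar/sample.log"
    		start_position => "beginning"
    		# With no persisted sincedb, the file is re-read from the
    		# beginning on every pipeline restart/reload.
    		sincedb_path => "/dev/null"
    	}
    }

Note that this means the whole file is re-ingested each time, which is usually what you want while testing but not in production.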


Thanks a ton! This works, but the index in Elasticsearch now has duplicated records, i.e. rows from both the old and new Logstash uploads. I know that deleting or renaming the index before every upload would work, but is there any way the index can be replaced completely without deletion/renaming?

Use a fingerprint filter to combine appropriate fields from the event and then use the result to set the document_id option on the elasticsearch output. This will result in the documents being overwritten.
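A minimal sketch of that approach. The choice of source fields, the SHA256 method, and the [@metadata][fingerprint] target are assumptions here; pick fields that uniquely identify a row in your data:

    filter {
    	fingerprint {
    		# Fields assumed to uniquely identify an event in your data
    		source => ["timestamp_string", "line"]
    		concatenate_sources => true
    		method => "SHA256"
    		# Stored in @metadata so it is not indexed as a document field
    		target => "[@metadata][fingerprint]"
    	}
    }
    output {
    	elasticsearch {
    		hosts => ["localhost:9200"]
    		index => "sample"
    		document_id => "%{[@metadata][fingerprint]}"
    	}
    }

Because the document_id is derived from the event's content, re-running the upload indexes the same ids again and the existing documents are overwritten instead of duplicated. Documents from a previous run that no longer exist in the new data will still remain, though.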


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.