Hi All,
I am new to Logstash and trying to work out a config.
My scenario is that I have a utility which runs periodically and outputs a report as a multiline JSON object in a file. The filename remains the same across runs but the content is replaced with every run.
I want to take the content of the file and pass it into Elasticsearch as a single document, so there is one document per run of the utility.
I have a config which does this, BUT it only does it once, at Logstash startup. When the content of the file is updated, a re-read of the file is not triggered.
I suspect the file is being 'unwatched' after the first run. Can someone suggest a better config, please?
Thanks++
Here is my config:
input { 
    file {
        codec => json
        mode => read
        delimiter => "EOF"
        path => "/journals/journal.json"
    } 
} 
output { 
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        index => "host-journals"
        document_type => "default"
        hosts => ["http://127.0.0.1:9200"]
    } 
}
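For reference, one workaround I have been considering (a sketch only, assuming the exec input plugin is available and that an interval of 300 seconds roughly matches how often the utility runs) is to poll the file on a schedule instead of watching it, so sincedb state is not involved at all:

input {
    exec {
        # Re-read the whole file on every interval; each run of the
        # command produces one event, parsed as a single JSON document.
        command => "cat /journals/journal.json"
        interval => 300
        codec => json
    }
}

One caveat with this approach: it emits a document on every interval even if the file content has not changed since the last run, so duplicates would need to be handled downstream.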