Read JSON from a static file

I have a JSON file named "output.json" which has only one entry:

CI ID : 213

I am trying to read from the JSON file with the Logstash configuration below, using the file input plugin. My aim is to take the content of the JSON file and append it to every document that I receive from Filebeat. The content of the JSON file is dynamic and changes once per day.

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/security/logstash.crt"
    ssl_key => "/security/logstash.key"
  }
  file {
    path => ["/etc/logstash/output.json"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}
filter {
  grok {
    match => { "message" => "^The build ID of the run is : %{NUMBER:filteredValues}" }
  }
}
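(With that grok pattern, an incoming Filebeat message such as the line below would populate filteredValues with the build ID, assuming the shipped log lines have this shape:)

The build ID of the run is : 213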

Note: I have already deleted the .sincedb file and tried to execute the above configuration, but no luck.

My question: any idea how to read from a static JSON file and append the content of that JSON to every document entering Logstash?

You could do it using

ruby {
    init => "
        file = File.read('/tmp/foo.json')
        h = JSON.parse(file)
        @CIID = h['CIID']
    "
    code => '
        event.set("CIID", @CIID)
    '
}

Error handling is left as an exercise for the reader; one possible shape is sketched below.
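(A minimal sketch of that error handling, keeping the /tmp/foo.json path and CIID key from the snippet above; the failure tag name is just an arbitrary choice:)

ruby {
    init => '
        begin
            # read and parse the file once at startup, as above
            @CIID = JSON.parse(File.read("/tmp/foo.json"))["CIID"]
        rescue
            @CIID = nil
        end
    '
    code => '
        if @CIID
            event.set("CIID", @CIID)
        else
            # tag name is arbitrary; adjust to fit the pipeline
            event.tag("_ciid_lookup_failure")
        end
    '
}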

@Badger Thank you for responding; your input has been very helpful. However, I ran into some unusual issues.

First, my "output.json" was corrupt. Here is how I am feeding data to output.json:

output {
  if [filteredValues] {
    file {
      path => "/etc/logstash/output.json"
      codec => json { format => "CI-ID : %{filteredValues}" }
      write_behavior => overwrite
    }
  }

I get the following error if I use the JSON codec:

"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.",

If I change the contents of output.json to the below, your ruby code works:

{
"CIID": "232"
}

ISSUE: The file output plugin is not able to write JSON data to the output.json file.
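(The json codec does not have a format option, which is probably what triggered the configuration error above. If the goal is a file that the ruby filter's JSON.parse can read, one sketch, reusing the CIID key from the earlier example, is to build the JSON by hand with a line codec:)

output {
  if [filteredValues] {
    file {
      path => "/etc/logstash/output.json"
      # line codec with a hand-built JSON template; the json codec takes no format option
      codec => line { format => '{"CIID": "%{filteredValues}"}' }
      write_behavior => overwrite
    }
  }
}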

What I have done:

I have changed the codec to line, and here is my configuration:

output {
  if [filteredValues] {
    file {
      path => "/etc/logstash/output.txt"
      codec => line { format => "CI-ID : %{filteredValues}" }
      write_behavior => overwrite
    }
  }

and here is the ruby filter to copy the value from output.txt and write it to a new field:

filter {
  ruby {
    init => "file = '/etc/logstash/output.txt'
      @CIID = File.readlines(file).each do |line|
        puts line
      end"

    code => 'event.set("CI-ID", @CIID)'
  }
}

The above configuration works perfectly. However, the problem is that the ruby code is not able to fetch the latest value from the "output.txt" file (even though output.txt holds an updated value).
I have to restart ELK for the ruby code to fetch the latest value from "output.txt". If I don't restart ELK, the ruby code keeps fetching the first value (i.e. something like "CI-ID : 237") that "output.txt" was holding after the first ELK restart.

Any idea what I am doing wrong? Am I not closing the file somewhere?

@magnusbaeck, any opinion that you would like to add here?

Thanks in advance.

There should be another error message output after this. The configuration you show is missing a }.

The whole point of "init =>" is to do something once when Logstash starts. If you want to read the file every time an event is processed, you could move that code to the "code =>" block.
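(A sketch of what that could look like, re-reading the file on every event so the value never goes stale; the path and field name are taken from the configuration above:)

filter {
  ruby {
    code => '
      # read the file for every event so updates are picked up without a restart
      value = File.read("/etc/logstash/output.txt").strip
      event.set("CI-ID", value)
    '
  }
}

This does one small file read per event, which is usually negligible for a single short file, but worth keeping in mind at high event rates.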

@Badger

Thank you for responding. I have moved the code to the "code =>" block and everything is working fine.
