Logstash: parse JSON input from http poller failing

I have a Logstash configuration with an http_poller input and an Elasticsearch output, but I am struggling to store the JSON input from the poller in the index as documents. It stores the entire JSON in a single field, but I need each value stored as its own field.

Example input from the http poller:
    {
      "results": [
        {
          "tables": [
            {
              "rows": [
                {
                  "Tree Details[id]": "1984",
                  "Tree Details[year]": "2018",
                  "Tree Details[quarter_1]": null
                },
                {
                  "KPI Tree Details[id]": "1984",
                  "KPI Tree Details[year]": "2018",
                  "KPI Tree Details[quarter_1]": null
                }
              ]
            }
          ]
        }
      ]
    }

Without any filter, it stores the entire JSON in a single results field, but I would like each element of rows to become its own document in the index. I have tried the split, json_encode, and json filters, and I end up with either an invalid field reference or a typecast error. Any suggestion is appreciated.
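To illustrate, from the sample above I would expect two documents along these lines:

    { "Tree Details[id]": "1984", "Tree Details[year]": "2018", "Tree Details[quarter_1]": null }
    { "KPI Tree Details[id]": "1984", "KPI Tree Details[year]": "2018", "KPI Tree Details[quarter_1]": null }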

You could try

    json { source => "message" remove_field => [ "message" ] }
    split { field => "results" }
    split { field => "[results][tables]" }
    split { field => "[results][tables][rows]" }

and then use ruby to move the fields in [results][tables][rows] to the top level, as shown here.
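A minimal sketch of such a ruby filter, assuming the three split filters above have already run so that [results][tables][rows] holds a single row hash, might look like this:

    ruby {
        code => '
            # Copy each key/value pair of the row hash to the top level of
            # the event, then drop the original nested structure.
            # Note: event.set treats square brackets in the key as a field
            # reference, which matters for keys like "Tree Details[id]".
            event.get("[results][tables][rows]").each { |k, v|
                event.set(k, v)
            }
            event.remove("results")
        '
    }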

Thank you for your prompt reply. However, it looks like I ran into another problem:

    Ruby exception occurred: Invalid FieldReference: KPI Tree Details[gender_en] {:class=>"RuntimeError", :backtrace=>["(ruby filter code):4:in `block in register'", "org/jruby/RubyHash.java:1601:in `each'", "(ruby filter code):3:in `block in register'", "C:/logstash-8.12.0/vendor/bundle/jruby/3.1.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:96:in `inline_script'", "C:/logstash-8.12.0/vendor/bundle/jruby/3.1.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:89:in `filter'", "C:/logstash-8.12.0/logstash-core/lib/logstash/filters/base.rb:158:in `do_filter'", "C:/logstash-8.12.0/logstash-core/lib/logstash/filters/base.rb:176:in `block in multi_filter'", "org/jruby/RubyArray.java:1989:in `each'", "C:/logstash-8.12.0/logstash-core/lib/logstash/filters/base.rb:173:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:133:in `multi_filter'", "C:/logstash-8.12.0/logstash-core/lib/logstash/java_pipeline.rb:304:in `block in start_workers'"]}

I made a slight change to the filter you suggested in order to apply the ruby filter:

    filter {
        json {
            source => "message"
            remove_field => [ "message" ]
        }
        split {
            field => "results"
        }
        split {
            field => "[results][tables]"
        }
        split {
            field => "[results][tables][rows]"
            target => "doc"
        }
        ruby {
            code => '
                event.get("doc").each { |k, v|
                    event.set(k,v)
                }
                event.remove("doc")
            '
        }
    }

Can you please help me with this?

Remove the target option on the last split filter and use the following ruby

    ruby {
        code => '
            event.remove("[results][tables][rows]").each { |k, v|
                event.set(k,v)
            }
        '
    }
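Note that event.remove returns the value that was removed, so the row hash can be iterated over directly; that is also why the target option on the split filter is no longer needed.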

I updated the filter accordingly:

    filter {
        json {
            source => "message"
            remove_field => [ "message" ]
        }
        split {
            field => "results"
        }
        split {
            field => "[results][tables]"
        }
        split {
            field => "[results][tables][rows]"
        }
        ruby {
            code => '
                event.remove("[results][tables][rows]").each { |k, v|
                    event.set(k,v)
                }
            '
        }
    }

    Ruby exception occurred: Invalid FieldReference: KPI Tree Details[quarter_en] {:class=>"RuntimeError"

I get the same kind of exception. Am I missing anything here? Thanks again.

That's going to be a problem. What do you want that field to be called? Its name cannot contain square brackets, since those are taken as a field reference. We could make it a nested field, [KPI Tree Details][id] etc.
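For example, setting [KPI Tree Details][id] would produce a document shaped like:

    { "KPI Tree Details": { "id": "1984" } }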

KPI Tree Details[id] - if we can replace KPI Tree Details[id] with just id, that works for me.
My rows data is huge: 200+ fields are present in each row.

Here is code to do it either way...

    ruby {
        code => '
            event.remove("[results][tables][rows]").each { |k, v|
                # Keep Tree Details as outer object
                newk = k.gsub(/\[/, "][").gsub(/^/, "[")
                # Just keep inner field names
                #newk = k.gsub(/.*\[/, "").gsub(/\]/, "")
                event.set(newk,v)
            }
        '
    }
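As a quick sanity check, both rewrites can be tried in plain Ruby (e.g. in irb) against the key from the error message:

    k = "KPI Tree Details[quarter_en]"

    # Keep "KPI Tree Details" as an outer object
    k.gsub(/\[/, "][").gsub(/^/, "[")   # => "[KPI Tree Details][quarter_en]"

    # Just keep the inner field name
    k.gsub(/.*\[/, "").gsub(/\]/, "")   # => "quarter_en"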

Thanks, it works!
