Nested JSON Parsing

Need help parsing nested JSON. Below is a sample of the log output (one JSON document per line):

{"debug_level":"ERROR","debug_timestamp":"2018-12-21 05:15:57,559","debug_thread":"ScalaTest","debug_file":"Audit1.scala", "debug_line":"27","debug_message":{"JobEndTime":"2018-12-22"}}
{"debug_level":"ERROR","debug_timestamp":"2018-12-21 05:50:57,559","debug_thread":"ScalaTest","debug_file":"Audit1.scala", "debug_line":"27","debug_message":{"JobStartTime":"2018-12-21"}}

Logstash conf:
input {
  azureblob {
    storage_account_name => "XXXX"
    storage_access_key => "XXX"
    container => "cleanloggingtest"
    codec => "line"
    type => "azureblob"
  }
}

filter {
  json {
    source => "message"
    target => "message"
  }
  json {
    source => "[message][debug_message]"
    target => "[message][debug_message]"
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => "localhost:9200"
    index => "audittest-logs"
  }
}

Error:
[2018-12-24T10:13:54,469][WARN ][logstash.filters.json ] Error parsing json {:source=>"[message][debug_message]", :raw=>{"JobEndTime"=>"2018-12-22"}, :exception=>java.lang.ClassCastException: org.jruby.RubyHash cannot be cast to org.jruby.RubyIO}
[2018-12-24T10:13:54,469][WARN ][logstash.filters.json ] Error parsing json {:source=>"[message][debug_message]", :raw=>{"User"=>"admin"}, :exception=>java.lang.ClassCastException: org.jruby.RubyHash cannot be cast to org.jruby.RubyIO}
[2018-12-24T10:13:54,469][WARN ][logstash.filters.json ] Error parsing json {:source=>"[message][debug_message]", :raw=>{"JobStartTime"=>"2018-12-21"}, :exception=>java.lang.ClassCastException: org.jruby.RubyHash cannot be cast to org.jruby.RubyIO}
[2018-12-24T10:13:56,013][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"audittest-logs", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x104e5fd3], :response=>{"index"=>{"_index"=>"audittest-logs", "_type"=>"doc", "_id"=>"6F6032cBrpO0OARc7kiA", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [message]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:115"}}}}}
[2018-12-24T10:13:56,023][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"audittest-logs", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x6217eab3], :response=>{"index"=>{"_index"=>"audittest-logs", "_type"=>"doc", "_id"=>"516032cBrpO0OARc7kh-", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [message]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:115"}}}}}
[2018-12-24T10:13:56,026][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"audittest-logs", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x78109d6a], :response=>{"index"=>{"_index"=>"audittest-logs", "_type"=>"doc", "_id"=>"6V6032cBrpO0OARc7kiB", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [message]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:115"}}}}}

When using the JSON filter, either move the parsed JSON to the root of the event or define a new root-level field; overwriting the source field with the parsed object is what seems to create issues. That should take care of your "Can't get text on a START_OBJECT at 1:115" error. The ClassCastException happens because the second json filter runs on a field that is already a parsed hash rather than a string, so to resolve it only execute the filter once on the field.
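Concretely, the filter block in the config above can be reduced to a single json filter that parses into the event root, along these lines (a sketch; the input and output sections stay as they are):

filter {
  json {
    source => "message"
    # No target set, so the parsed fields are placed at the root of the
    # event and debug_message stays a nested object; no second json
    # filter is needed.
  }
}

A quick stdin test shows the nested object parsed in a single pass: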

/usr/share/logstash/bin/logstash -e 'input {stdin{}} filter{json {source => "message" }} output{stdout {codec => rubydebug}}'

{
           "@version" => "1",
       "debug_thread" => "ScalaTest",
         "debug_file" => "Audit1.scala",
         "debug_line" => "27",
               "host" => "cernccwes15.cernerasp.com",
    "debug_timestamp" => "2018-12-21 05:15:57,559",
        "debug_level" => "ERROR",
      "debug_message" => {
        "JobStartTime" => "2018-12-22",
          "JobEndTime" => "2018-12-22"
    },
            "message" => "{\"debug_level\":\"ERROR\",\"debug_timestamp\":\"2018-12-21 05:15:57,559\",\"debug_thread\":\"ScalaTest\",\"debug_file\":\"Audit1.scala\", \"debug_line\":\"27\",\"debug_message\":{\"JobEndTime\":\"2018-12-22\",\"JobStartTime\":\"2018-12-22\"}}",
         "@timestamp" => 2018-12-24T17:34:13.803Z
}
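If you would rather keep the parsed document under its own root-level field instead of merging it into the root, point target at a new field name rather than at the source field (the name "parsed" here is only an example):

filter {
  json {
    source => "message"
    target => "parsed"   # example name for a new root-level field; not the source field itself
  }
}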

The view in Kibana (screenshot):

Thanks, this solved my issue.
