Hello!
I have this input JSON:
{
  "fields": {
    "created_date": 1614158456696703782,
    "value": "\u001B[35m-\u001B[0m | \u001B[32m2021-02-20T12:48:10.367337+0300\u001B[0m | \u001B[1mINFO \u001B[0m | application.app:<module>:98 - \u001B[1mtest info"
  },
  "@version": "1",
  "@timestamp": "2021-02-24T09:31:50.208Z",
  "tags": {
    "data_tags": "[\"tag1\",\"tag2\"]",
    "collection_platform": "telegraf",
    "index_name": "test_alias",
    "source_type": "file",
    "data_type": "raw",
    "path": "/opt/map/hub/logs/trace.log",
    "host": "hub"
  },
  "timestamp": 1614158456,
  "name": "logparser"
}
How can I extract the nested fields (created_date, value) if I don't know their names in advance?
If I use the json filter plugin:
json {
  skip_on_invalid_json => false
  source => "fields"
}
or
mutate {
  convert => { "fields" => "string" }
}
json {
  skip_on_invalid_json => false
  source => "fields"
}
I get this error:
Feb 24 12:21:24 elk logstash[2868]: [2021-02-24T12:21:24,951][WARN ][logstash.filters.json ][main] Exception caught in json filter {:exception=>"class java.util.HashMap cannot be cast to class java.lang.String (java.util.HashMap and java.lang.String are in module java.base of loader 'bootstrap')", :source=>"fields", :raw=>{"created_date"=>1614158456696703782, "value"=>"\e[35m-\e[0m | \e[32m2021-02-20T12:48:10.367337+0300\e[0m | \e[1mINFO \e[0m | application.app:<module>:98 - \e[1mtest info"}}
Feb 24 12:21:24 elk logstash[2868]: [2021-02-24T12:21:24,977][ERROR][org.logstash.execution.WorkerLoop][main] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
Feb 24 12:21:24 elk logstash[2868]: java.lang.ClassCastException: class java.util.HashMap cannot be cast to class java.lang.String (java.util.HashMap and java.lang.String are in module java.base of loader 'bootstrap')
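The ClassCastException suggests that by the time the event reaches the filter, `fields` is already a parsed hash (HashMap), not a JSON string, so the json filter, which expects a string source, cannot cast it (and `mutate convert => "string"` does not serialize a hash back to JSON). If that diagnosis is right, one possible workaround is a ruby filter that copies whatever keys the hash contains up to the top level, without knowing their names in advance. This is only a sketch using the Logstash Event API (`event.get`/`event.set`/`event.remove`); field names other than `fields` are taken from the example event above:

```
filter {
  ruby {
    code => '
      fields = event.get("fields")
      if fields.is_a?(Hash)
        # Promote each unknown key (e.g. created_date, value) to the event root
        fields.each { |key, value| event.set(key, value) }
        event.remove("fields")
      end
    '
  }
}
```

After this filter, `created_date` and `value` should appear as top-level event fields, and the json filter is no longer needed for that part of the event.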