Hello there,
I'm trying to copy the value of a nested field into a new top-level field with the help of Logstash filters. I thought the mutate filter would be suitable for that.
So here's what I'm trying to do:
Log4j writes each log line as JSON, which arrives in Logstash like this:
{"@timestamp":"2018-03-15T07:55:09.493Z","fieldA":"someText","mdc":{"LoggingContext":{"nestedField1":"Value1","nestedField2":"Value2"}}}
When I look up the document in Elasticsearch, it looks like this:
"_index": "application_log-16.03.2018",
"_type": "doc",
"_id": "1sQ4PWIBDfzRTwMRiiAQ",
"_score": 0.039220713,
"_source": {
"mdc": {
"LoggingContext": {
"nestedField1": "Value1",
"nestedField2": "Value2"
}
},
"tags": [
"FNKT",
"beats_input_codec_plain_applied"
],
"type": "application_log",
"session_id": "1752B3D812159649067F27A236A8E874.be01",
"source_host": "be01",
"beat": {
"hostname": "LO01",
"version": "6.1.0",
"name": "LO01"
},
}
For easier handling in Kibana and in Elasticsearch queries, I want to put the values of nestedField1 and nestedField2 into new top-level fields, just like "type" or "session_id".
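In other words, the document should end up looking roughly like this (only the relevant part of _source shown; whether the nested copy under mdc stays around doesn't matter much to me):

"_source": {
  "nestedField1": "Value1",
  "nestedField2": "Value2",
  "mdc": {
    "LoggingContext": {
      "nestedField1": "Value1",
      "nestedField2": "Value2"
    }
  },
  ...
}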
I tried the following mutate filters, but none of them seem to work:
mutate {
  add_field => { "nestedField1" => "%{[mdc][LoggingContext][nestedField1]}" }
}
mutate {
  add_field => { "nestedField1" => "%{[mdc][0][LoggingContext][nestedField1]}" }
}
mutate {
  add_field => { "nestedField1" => "%{mdc.LoggingContext.nestedField1}" }
}
add_field does create the field, but the value is wrong: in Kibana, nestedField1 just contains the literal string "%{[mdc][0][LoggingContext][nestedField1]}" instead of "Value1".
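To narrow it down, my next step is to add a temporary rubydebug stdout output (sketch below) and check whether [mdc][LoggingContext][nestedField1] really exists as a nested field at the time the mutate filter runs, or whether mdc is still a plain string at that point:

output {
  # temporary debug output: prints the full event structure,
  # including how mdc / LoggingContext is actually nested
  stdout {
    codec => rubydebug
  }
}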