Hello,
I have the following JSON structure that I want to parse with Logstash:
{
  "messages" : [ {
    "remoteReferenceId" : "133883",
    "sender" : {
      "name" : "User1"
    }
  }, {
    "remoteReferenceId" : "133894",
    "sender" : {
      "name" : "User2"
    }
  } ],
  "timestamp" : "2021-03-20T15:36:01.868+02:00"
}
My config is the following:
input {
  file {
    mode => "read"
    path => "/path-to-data/*.json"
    file_completed_action => "delete"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "json"
    codec => multiline {
      pattern => "remoteReferenceId"
      negate => true
      what => previous
      auto_flush_interval => 1
      multiline_tag => ""
    }
  }
}
filter {
  json {
    source => "messages"
    target => "parsedJson"
  }
  mutate {
    add_field => {
      "remoteReferenceId" => "%{[remoteReferenceId]}"
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "index_messages"
  }
  stdout { codec => rubydebug }
}
The JSON file is imported into Elasticsearch. My problem is that I want to add a new field "remoteReferenceId" containing the value of that node (e.g. 133883), but in Elasticsearch the field holds the literal string "%{[remoteReferenceId]}" instead of the real value.
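To illustrate (a simplified sketch, trimmed to the one field I care about, other fields omitted), this is what I expect in the indexed document versus what I actually get:

Expected:
  "remoteReferenceId" : "133883"

Actually indexed:
  "remoteReferenceId" : "%{[remoteReferenceId]}"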
What could the problem be?
Many thanks!