I have a JSON document in the following format, which I am reading from Kafka in Logstash:
{"id":"8eba168be707e85d","kind":"PRODUCER","name":"scheduler","timestamp":1560526848185066,"duration":23645,"tags":{"class":"TestJob","method":"execute"}}
If I use the mutate filter plugin, the "tags" field is converted from an object to an array in the data inserted into Elasticsearch.
This is my current config:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics            => ["log-topic-new"]
    codec             => json
  }
}

filter {
  # Copy the microsecond epoch timestamp into a new field
  mutate {
    add_field => { "timestamp_millis" => "%{timestamp}" }
  }
  # Strip the last three digits (microseconds -> milliseconds)
  mutate {
    gsub => ["timestamp_millis", ".{3}$", ""]
  }
  # Store the millisecond value as an integer
  mutate {
    convert => { "timestamp_millis" => "integer" }
  }
}

output {
  #-------- Debugging in console --------
  stdout { codec => rubydebug }

  #-------- Sending logs to the ES index --------
  elasticsearch {
    hosts           => ["192.168.45.43:9200"]
    manage_template => false
    document_type   => "span"
    index           => "span-%{+YYYY-MM-dd}"
  }
}
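As an aside, I believe the same microseconds-to-milliseconds conversion could be done arithmetically in a single step with the bundled ruby filter (a minimal sketch; I have not verified whether it avoids the problem):

filter {
  ruby {
    # Divide the microsecond epoch by 1000 to get milliseconds
    code => "event.set('timestamp_millis', event.get('timestamp').to_i / 1000)"
  }
}

Either way, the timestamp conversion itself works; the problem is what happens to "tags".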
The data in ES:

"_source": {
  "timestamp_millis": 1560526848185,
  "kind": "PRODUCER",
  "tags": [
    [
      "class",
      "TestJob"
    ],
    [
      "method",
      "execute"
    ]
  ],
  "duration": 23645,
  "name": "scheduler",
  "id": "477811021df4dadc",
  "timestamp": 1560526848185066
}
But if the mutate filter plugin is removed, the tags data is inserted as an object, as expected.
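For comparison, the tags field should then look like this in _source, matching the input JSON:

"tags": {
  "class": "TestJob",
  "method": "execute"
}

Why do the mutate filters change how "tags" is indexed, and how can I keep it as an object while still adding timestamp_millis?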