ck1
May 23, 2019, 2:14pm
1
Hello all,
I am having trouble parsing the following JSON input over tcp:
{
  "color": {
    "pixel": "152",
    "shade": "grey"
  },
  "color": {
    "pixel": "157",
    "shade": "dark"
  }
}
Result I want: 2 documents in Elasticsearch, one document for each color, with the fields pixel and shade.
Things I've attempted:
A tcp input with the json codec: doesn't work, it treats the event as a single line.
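Something like this (the port is just an example):

input {
  tcp {
    port => 5000
    codec => json
  }
}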
Using the multiline codec below doesn't work either; I am getting a JSON parse failure.
codec => multiline {
  pattern => "^\{$"
  negate => true
  what => "previous"
}
Using a plain input with a split in the filter:

filter {
  json {
    source => "message"
  }
  split {
    field => "colors"
  }
}
Any hints or help?
A Google search suggests that JSON parsers generally overwrite the value of a duplicate key. That is certainly the case for the parser that logstash uses. I think you will need a custom parser in a ruby filter.
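For the flat structure in your example, a sketch might look something like this (untested; it assumes [message] still contains the raw text, e.g. from a plain or multiline codec, and the regex only copes with objects that have no nested braces):

filter {
  ruby {
    init => 'require "json"'
    code => '
      # collect every "color": { ... } fragment from the raw text
      fragments = event.get("message").scan(/"color"\s*:\s*\{[^{}]*\}/)
      # parse each fragment on its own so the duplicates are not lost
      colors = fragments.map { |f| JSON.parse("{" + f + "}")["color"] }
      event.set("colors", colors)
    '
  }
  split { field => "colors" }
}

The split should then give you one event per color, with [colors][pixel] and [colors][shade] on each.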
ck1
May 23, 2019, 3:18pm
3
Yes, it overwrites the field with the duplicated key, but using a split should create new events.
You cannot split a field that only has a single value.
ck1
May 23, 2019, 3:54pm
5
I managed to split it with a multiline pattern => '"color"'
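That is, a codec along these lines (keeping negate/what from the earlier attempt):

codec => multiline {
  pattern => '"color"'
  negate => true
  what => "previous"
}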
Now I am getting this in the message field:
"color": {
"pixel": "152",
"shade": "grey"
}
Trying to split it so that pixel and shade each become a key.
If you transform that into JSON using
mutate { gsub => [ "message", "\A", "{", "message", "\Z", "}" ] }
then you can use a json filter to parse it.
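Put together, the filter section might look like:

filter {
  mutate {
    # wrap the message in braces so it is a complete JSON object
    gsub => [ "message", "\A", "{", "message", "\Z", "}" ]
  }
  json {
    source => "message"
  }
}

Each event should then have a [color] field containing pixel and shade.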
system
Closed
June 20, 2019, 5:45pm
7
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.