pkaramol
(Pantelis Karamolegkos)
July 2, 2018, 3:23pm
1
I have several json entries in a huge file, in the form of:
{"preview":false,"offset":0,"result":{"@timestamp":"2016-11-30T21:59:43.000Z","@version":"1","_raw":"..}}
What I want is to use this file as input, but keep only what is under (not including) result, i.e. I want my input to be
{"@timestamp":"2016-11-30T21:59:43.000Z","@version":"1","_raw":"..}
It goes without saying that many more fields follow after _raw...
Any suggestions?
Do I need to work at the .json file level with some parsing, or is there a logstash filter appropriate for this use case?
So basically you want to remove all fields that are not result?
Start with a json filter
filter { json { source => "message" } }
Then you can remove fields using
ruby {
  code => '
    event.to_hash.each { |k, v|
      event.remove(k) if k != "result"
    }
  '
}
and then move the fields inside result to the top level. @timestamp needs to be parsed by a date filter, which I chose not to do in ruby.
ruby {
  code => '
    event.get("result").each { |k, v|
      if k == "@timestamp"
        event.set("timestamp", v)
      else
        event.set(k, v)
      end
    }
    event.remove("result")
  '
}
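Outside Logstash, the same two steps can be sketched on a plain Ruby hash (a minimal illustration only; the real ruby filter works on the Logstash Event API with get/set/remove rather than plain hashes):

```ruby
require 'json'

# One sample line from the file (truncated fields omitted).
line = '{"preview":false,"offset":0,"result":{"@timestamp":"2016-11-30T21:59:43.000Z","@version":"1"}}'
event = JSON.parse(line)

# Step 1: drop every top-level field except "result".
event.keys.each { |k| event.delete(k) unless k == "result" }

# Step 2: promote the fields inside "result" to the top level,
# renaming @timestamp so a date filter can parse it afterwards.
event.delete("result").each do |k, v|
  if k == "@timestamp"
    event["timestamp"] = v
  else
    event[k] = v
  end
end
```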
and
date { match => [ "timestamp", "ISO8601" ] }
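Putting it together, the whole filter section might look like this (a sketch, untested; note that ISO8601 must be quoted):

```
filter {
  json { source => "message" }
  ruby {
    code => '
      event.to_hash.each { |k, v| event.remove(k) if k != "result" }
      event.get("result").each { |k, v|
        k == "@timestamp" ? event.set("timestamp", v) : event.set(k, v)
      }
      event.remove("result")
    '
  }
  date { match => [ "timestamp", "ISO8601" ] }
}
```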
pkaramol
(Pantelis Karamolegkos)
July 3, 2018, 7:10am
3
The above seems to produce no events; are we sure about this?
filter { json { source => "message" } }
I do not have a message field anywhere.
Also, should my input codec be plain or json?
Oh, if you used a json codec on the input then indeed you will not have a message field and do not need a json filter.
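For example, an input like this (a sketch; the path is hypothetical) already parses each line as JSON, so no separate json filter is needed:

```
input {
  file {
    path => "/path/to/huge-file.json"   # hypothetical path
    codec => json
  }
}
```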
system
(system)
Closed
July 31, 2018, 2:25pm
5
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.