Hello Community, I have been facing a parsing issue for the past 24 hours.
I can't work out whether Logstash allows JSON parsing and adding new fields without grok, kv, or flattening.
The problem is most likely that your log file is actually pretty-printed JSON, i.e. formatted the way you showed it, and not single-line NDJSON. Logstash is line-oriented, so NDJSON is what it expects. Can you convert your log file to NDJSON?
You could test it pretty easily by passing your log through
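If it helps, here is a minimal Python sketch (my own illustration, not an official tool) that converts a pretty-printed JSON document, whether a single object or an array of records, into NDJSON:

```python
import json

def to_ndjson(pretty_text: str) -> str:
    """Convert a pretty-printed JSON document (object or array) to NDJSON."""
    doc = json.loads(pretty_text)
    # If the document is an array, emit one line per element;
    # otherwise emit the single object on one line.
    records = doc if isinstance(doc, list) else [doc]
    return "".join(json.dumps(record) + "\n" for record in records)

print(to_ndjson('[{\n "a": 1\n},\n{\n "b": 2\n}]'))
# → {"a": 1}
#   {"b": 2}
```

Each output line is then a complete JSON document that Logstash can read with a plain `json` codec.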
By default a file input consumes a file one line at a time, which makes sense for line-oriented logs. If you have multi-line JSON objects in the file then reformatting (as @stephenb suggested) is a good option. If that's not possible then you may be able to use a multiline codec on the file input to read each object as an event. It depends how the objects are formatted.
we can see that every object ends with a line that starts with }. So we append each line to the next event until we see a line starting with }. That can be done using
codec => multiline {
  pattern => "^}"
  negate => true
  what => "next"
  auto_flush_interval => 5
}
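For context, a minimal sketch of where that codec sits in a pipeline (the path is hypothetical), with a `json` filter parsing the assembled `message` field afterwards:

```
input {
  file {
    path => "/var/log/app/pretty.json"   # hypothetical path
    codec => multiline {
      pattern => "^}"
      negate => true
      what => "next"
      auto_flush_interval => 5
    }
  }
}
filter {
  json {
    source => "message"
  }
}
```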
If you have a non-indented format like
{ "foo":
{ "bar": 1
}
}
{ "a": 2 }
then a multiline codec will not help you. You would need a program that can maintain state (a count of the { and } seen so far) in order to decide whether it has reached the end of an object. There is no logstash codec or input that can do that.
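Such a stateful splitter is easy to write outside Logstash. As one sketch (my own illustration): Python's `json.JSONDecoder.raw_decode` tracks brace nesting for you, so it can peel concatenated objects off a stream regardless of indentation:

```python
import json

def split_objects(text: str):
    """Yield each top-level JSON value from a stream of concatenated
    documents; raw_decode keeps the { } nesting state for us."""
    decoder = json.JSONDecoder()
    pos = 0
    while pos < len(text):
        # Skip whitespace between documents.
        while pos < len(text) and text[pos].isspace():
            pos += 1
        if pos >= len(text):
            break
        obj, pos = decoder.raw_decode(text, pos)
        yield obj

stream = '{ "foo":\n{ "bar": 1\n}\n}\n{ "a": 2 }'
for obj in split_objects(stream):
    print(json.dumps(obj))
# → {"foo": {"bar": 1}}
#   {"a": 2}
```

Printing each object with a compact `json.dumps` also converts the stream to NDJSON in one pass.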
After I converted it, it started working. But my streaming logs don't arrive formatted, so do we have to convert each log line properly before Logstash parses it and adds fields?
Normally this is solved at the source: pretty-printed JSON is bad for automation, so if the logs are going to be consumed by machines, you just don't use it.
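For example, if your application emits its own JSON logs, writing each record with a compact `json.dumps` (no `indent=` argument) keeps one event per line at the source. A minimal sketch, with hypothetical field names:

```python
import json
import sys
import time

def format_event(message: str, **fields) -> str:
    """Render one log record as a single compact JSON line (NDJSON)."""
    record = {"ts": time.time(), "message": message, **fields}
    # No indent= argument, so the whole record stays on one line.
    return json.dumps(record)

def log_event(message: str, **fields) -> None:
    sys.stdout.write(format_event(message, **fields) + "\n")

log_event("user login", user="alice", status="ok")
```

Logs produced this way can be read with a plain `json` codec and need no multiline handling at all.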