You may see some differences in the timestamps due to timezones, but I can confirm they are correct.
The error rate is pretty high, about 1 in 5.
In this system, log content is written to a .log file, and Logstash reads from that same file continuously.
What could be the problem?
I'm sorry, but I cannot see what the issue is. Can you give more context?
What is the source of those screenshots?
Can you share a plain-text example of a document that works and one that doesn't? Use the Preformatted text option when posting, and share your Logstash pipeline as well.
Without plain-text examples and your pipeline, it is not possible to try to simulate the problem and see what the error could be.
Thanks for your reply; I'll try to explain in as much detail as I can.
I'm using Logstash to read content from a .log file and save it into a DB. The input config has been shared earlier in the thread.
The above screenshots are literally plain-text examples.
The screenshots with a timestamp in each row are the logs from Logstash itself.
They also include the Logstash error; it is just a JSON parse error, caused by missing data when reading.
Please be more specific about which part you still don't understand.
Thank you so much
It was not clear from the screenshot whether it was some tool reading the log file from Logstash.
Your error is that the message Logstash tried to parse was not valid JSON. How is that .log file being written? It needs to have one JSON document per line; if for some reason your system prints JSON in a pretty-printed format, it will not work.
Can you share a sample of the documents in the source file, including some that are giving you the parser error? Also, share the full Logstash pipeline, not just the input. Share them as plain text, not images, as with images it is not possible to copy anything to try to replicate the problem; use the Preformatted text button in the forum.
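To illustrate the one-document-per-line requirement mentioned above, here is a small sketch (not from the thread, just an illustration) showing that the same document parses when it sits on one line but fails when a line-oriented reader sees only the first line of a pretty-printed version:

```javascript
// Both strings hold the same JSON document; only the first survives
// line-by-line reading, which is how a line-oriented json codec consumes a file.
const ndjson = '{"level":"info","msg":"ok"}';                 // one document per line
const pretty = '{\n  "level": "info",\n  "msg": "ok"\n}';     // pretty-printed

console.log(JSON.parse(ndjson).msg); // prints "ok"

// A line-oriented reader sees only the first physical line of the pretty form:
const firstLine = pretty.split('\n')[0]; // just "{"
try {
  JSON.parse(firstLine);
} catch (e) {
  console.log('parse error on partial line:', e.message); // same class of failure as in the thread
}
```

This is the same class of error shown in the Logstash output: the codec hands a fragment of a document to the JSON parser, and the parser rejects it.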
It was not clear from the screenshot whether it was some tool reading the log file from Logstash.
I'm not sure I understand what you mean, but that screenshot is the output log of Logstash.
My Logstash is running on ECS Fargate, and the following screenshot was just taken from the CloudWatch logs.
How is that .log file being written? It needs to have one JSON document per line; if for some reason your system prints JSON in a pretty-printed format, it will not work.
Can you share a sample of the documents in the source file, including some that are giving you the parser error?
The .log file is written by a simple Node.js program, which definitely writes only one JSON document per line.
The following screenshot shows the .log file with 2 lines.
The problem is that those 2 lines are exactly the same as each other, but the JSON parse error only happens sometimes, due to missing content when reading the file (based on the Logstash output log).
This file is stored on EFS; I'm not sure if that is a clue.
I also tried to replicate it locally with no luck; the only difference I could point out is EFS.