I'm trying to index JSON logs with a Fleet-managed agent using a custom log configuration. I successfully downloaded the agent to my computer and specified the path where the log files live.
However, when I generated a few log files, I noticed that the agent wouldn't pick up the files that were JSON. I suspect I may need to edit the agent.logging.json and agent.logging.ecs settings in the agent's YAML file, but I don't know how!
I've tried adding them via the 'Custom configurations' setting when editing the Custom Logs integration in Kibana/Fleet, but the changes don't seem to be applied the way I expected.
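For reference, this is roughly what I put in the Custom configurations box (I'm honestly not sure these are even the right settings, or the right place for them):

```yaml
# What I added under 'Custom configurations' in the Custom Logs integration
# (just my guess at the settings mentioned above)
agent.logging.json: true
agent.logging.ecs: true
```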
Ohhh @Tim_Estes
I did not notice that the JSON log files need to be a single JSON object per line (ndjson), not "pretty"/expanded JSON...
That is the expected format and is normal for JSON logs, I believe.
From the docs:
"These options make it possible for Filebeat to decode logs structured as JSON messages. Filebeat processes the logs line by line, so the JSON decoding only works if there is one JSON object per line."
Yes, this seems to be the issue. Once I started saving the log files with a single JSON object per line (and putting a newline character at the end of the file), the messages were ingested into the data stream. Thank you for your help.
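In case anyone else runs into this, the files that now work look roughly like this: one complete JSON object per line, plus a trailing newline at the end of the file (the fields are just placeholders):

```json
{"timestamp": "2023-05-01T12:00:00Z", "level": "info", "message": "something happened"}
{"timestamp": "2023-05-01T12:00:05Z", "level": "error", "message": "something else happened"}
```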