Does anyone in the community know how I can create separate log entries using an ingest pipeline for JSON results pulled via the HTTPJSON integration?
We are pulling events via an API and the results come in JSON format with each event as an item in an array. I would just like to have each event as its own document/log in Elastic. Is this possible?
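To illustrate, the response body looks roughly like this (the field names here are placeholders, not the actual API schema):

```json
{
  "results": [
    { "timestamp": "2023-01-01T00:00:00Z", "action": "login" },
    { "timestamp": "2023-01-01T00:05:00Z", "action": "logout" }
  ]
}
```

Each object in the `results` array should end up as its own document.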
I came across a video (here) that seems to show this is possible, but I have only managed to reach the point where I am ingesting the logs and processing the JSON target field I want. However, for each entry that has multiple results in the target field, they just show up as comma-separated values in the parsed fields. See the image below for an example:
I just need to have each entry as its own log entry in Elastic (with searchable fields) and I'll be set. I would appreciate any help with this. Thanks!
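In case it helps, I came across the `response.split` option in the `httpjson` input docs, which I believe emits one event per array element. Something like the sketch below is what I think is needed — the URL and the target path are placeholders, since I'm not sure how this maps onto the integration settings:

```yaml
# Sketch of an httpjson input that splits the response array
# into one event per element. "body.results" is a placeholder
# for the actual array field in the API response.
- type: httpjson
  request.url: https://api.example.com/events   # placeholder URL
  response.split:
    target: body.results
```

Is this the right direction, or does the splitting need to happen somewhere else in the pipeline?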
Good day Tomo! Thanks for responding! Currently the logs in question are being ingested via the HTTPJSON integration on a server running the Elastic Agent. We currently don't use Logstash since we use the Agent for all ingest. I do see a split processor as an option when creating custom ingest pipelines, though. Does that serve the same function?
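For reference, this is the sort of split processor I see when building a custom ingest pipeline. From what I can tell, it splits a string field into an array *within the same document*, rather than producing separate documents, so I'm not sure it applies here — the field name and separator below are just placeholders:

```json
{
  "description": "Sketch: split a comma-separated string field into an array (same document)",
  "processors": [
    {
      "split": {
        "field": "message",
        "separator": ","
      }
    }
  ]
}
```

If that's right, is there any way to fan one document out into many from within an ingest pipeline, or does it have to happen at the input?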