I have a Logstash config file that takes its input from SQS. I then apply a filter to parse the JSON message from SQS into a data structure (this is working fine).
I then apply another filter with jdbc_streaming to enrich the SQS message with additional information from an external source. I want to take the results of that (which appear to be returned as JSON) and add them to the data structure. However, it appears to be throwing an exception:
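For reference, the relevant part of my config looks roughly like this. The connection string, statement, and field names are placeholders, not my real values:

```
filter {
  # Parse the raw SQS payload into fields (this part works)
  json {
    source => "message"
  }
  # Enrich the event from an external database; the rows returned by
  # the statement are written to the "enrichment" field
  jdbc_streaming {
    jdbc_driver_library    => "/path/to/driver.jar"
    jdbc_driver_class      => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user              => "user"
    statement              => "SELECT * FROM lookup WHERE id = :id"
    parameters             => { "id" => "some_field" }
    target                 => "enrichment"
  }
}
```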
That's not valid JSON, so I'm confused. Please comment out your json and mutate filters and show an example event from ES. Copy/paste from the JSON tab in Kibana's Discover panel, or use a stdout { codec => rubydebug } output.
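In other words, something like this in the output section, which prints each event to the console in a readable form so you can see exactly what fields the filters produced:

```
output {
  stdout { codec => rubydebug }
}
```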
Hi magnusbaeck, yes that's the field. Sorry for any confusion
I want to take that field and split it into separate fields, just like I did for the SQS message, so that each field becomes part of the document and I can filter, search, and visualize it later.
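One sketch of how that could work, assuming the jdbc_streaming target field is called "enrichment" (jdbc_streaming writes its results there as an array of row hashes, so the columns of the first row can be copied to top-level fields):

```
filter {
  # Promote the columns of the first enrichment row to top-level fields
  ruby {
    code => "
      rows = event.get('enrichment')
      if rows.is_a?(Array) && rows.first.is_a?(Hash)
        rows.first.each { |k, v| event.set(k, v) }
      end
    "
  }
  # Drop the now-redundant nested field
  mutate {
    remove_field => ["enrichment"]
  }
}
```

If the statement can return more than one row, you would need to decide how to handle the extra rows (keep them nested, or use a split filter) rather than copying only the first.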