Please assist with this Filebeat -> Elasticsearch config. My JSON is not being decomposed. The other fields are populating correctly; I want the message field to be decomposed as well.
filebeat.prospectors:
- input_type: log
  json.add_error_key: true
  json.keys_under_root: true
  paths:
    - message.log

output.elasticsearch:
  hosts:
    - "http://localhost:9200"
The log is obviously already being parsed into a JSON document, and the message field comes from the original document. The contents of message are not valid JSON, so Filebeat's json settings cannot decompose that field any further. You will need Logstash or an Elasticsearch ingest node for additional processing of the message field.
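If you go the ingest-node route, a minimal sketch could look like the following. Treat it as an assumption, not a confirmed fix: the pipeline name parse-message is made up, and the kv separators are guesses, since the actual format of message is unknown.

PUT _ingest/pipeline/parse-message
{
  "description": "Hypothetical pipeline: split the message field into key/value pairs",
  "processors": [
    {
      "kv": {
        "field": "message",
        "field_split": " ",
        "value_split": "="
      }
    }
  ]
}

Then point Filebeat's output at that pipeline:

output.elasticsearch:
  hosts:
    - "http://localhost:9200"
  pipeline: parse-message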
The format is quite funny. Plus, from a single message I can't tell whether the order of fields is always the same, or whether Data is always at the end, but it looks like grok alone is not enough here. If Data is always at the end, you can try to split the document using grok into the part before and the part after Data. Everything before Data looks CSV-parseable (look at the csv or kv filter in Logstash). I can't really tell what format the contents of Data are in, due to special characters like \d or \r; maybe you want to replace those with \t as well before applying the csv or kv filter. A rough sketch follows. Good luck.
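Under those assumptions, a Logstash filter could be sketched like this. The grok pattern, the gsub replacement, and the field names header and data are all guesses based on the single sample, not a verified recipe:

filter {
  # Guess: normalize the stray \r characters before parsing, per the suggestion above
  mutate {
    gsub => [ "message", "\r", "\t" ]
  }
  # Guess: Data always comes last, so capture the text before and after it
  grok {
    match => { "message" => "%{DATA:header}Data%{GREEDYDATA:data}" }
  }
  # The part before Data looks CSV-parseable
  csv {
    source => "header"
  }
  # If the Data part turns out to be key=value pairs, kv may work on it
  kv {
    source => "data"
  }
}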