The issue is that logs in JSON format, sent from Filebeat to Logstash, get wrapped in the "message" field and are not parsed into event fields.
I have read a lot of posts on similar issues over the weekend, and I followed the online documentation on configuring Logstash.
The following is my configuration:
input {
  beats {
    port => 5044
    type => "mylog"
  }
}
But I found that the log data is still just the value of "message" and did not get parsed.
I would appreciate it very much if anyone could shed some light on this.
Thank you!
In Kibana, I can see the JSON data in "message", but I cannot find the keys/values from the JSON data in the list of fields.
I validated the JSON data by running /logstash -f mytest.conf manually, and the JSON data got parsed correctly. Is there a way for the JSON data to be parsed without doing it manually? I have read many posts but cannot figure out a solution to this specific issue. I believe I must have missed something, and I need help finding out what.
Elasticsearch result when processed directly by Logstash:
"_source":{"message":"{\"test1\":\"testvalue1\",\"mymessage\":{\"mtest1\":\"test\"}}","@version":"1","@timestamp":"2016-02-03T04:42:08.507Z","host":"…","test1":"testvalue1","mymessage":{"mtest1":"test"}}
** The JSON data got parsed.
The result in Elasticsearch when shipped from Filebeat to Logstash:
"_source":{"@metadata":{"beat":"filebeat","type":"my-log"},"@timestamp":"2016-02-03T04:46:30.234Z","beat":{"hostname":"…","name":"…"},"count":1,"message":"{\"test1\":\"testvalue1\",\"mymessage\":{\"mtest1\":\"test\"}}","offset":0,"type":"my-log"}
** The JSON data did not get parsed.
I guess my question is: why does the JSON data not get parsed by Logstash when it is sent by Filebeat?
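
In case it is relevant: from what I have read, the beats input does not decode JSON on its own, so a json filter may be needed in the pipeline. A minimal sketch of what I think that would look like, assuming the JSON string arrives in the "message" field (I have not confirmed this is the right fix, which is partly why I am asking):

  filter {
    json {
      source => "message"
    }
  }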