I am trying to push some data into Elasticsearch from an application via Logstash. Is there any way to search/filter the JSON message in Elasticsearch for a particular field in the JSON message?
I am using the Logstash configuration below:
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  tcp {
    port => 4560
    codec => json
  }
}
filter {
  date {
    match => [ "timeMillis", "UNIX_MS" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mule-logs"
  }
}
and the sample message in Elasticsearch is attached.
For example, I want to search for messageId in the JSON data using message.messageId, or, if I want to search for the code, can I search it using message.code? In simple terms, I want to search with the keys in the message section. Is it possible?
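For illustration, this is the kind of query I am hoping to run against the index (the messageId value here is just a placeholder, not from my real data):

curl -X GET "localhost:9200/mule-logs/_search" -H 'Content-Type: application/json' -d'
{
  "query": {
    "match": { "message.messageId": "some-id-value" }
  }
}'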
It seems your data is raw; you need to parse those fields first.
I'm not so good at grok, and it can be a pain to learn, but you can check the basics there.
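As a minimal sketch, a grok filter in your Logstash pipeline would look something like this, assuming the raw line starts with an ISO8601 timestamp and a log level (the pattern and field names here are placeholders; adjust them to your actual message format):

filter {
  grok {
    # parse "2024-01-01T12:00:00 INFO some text" style lines
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:body}" }
  }
}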
Alternatively, are you developing an app? Because it's easier with Kafka.
Just configure a Kafka connector in your app, with mapped fields.
Kafka will transfer all formats and fields to Logstash -> Elasticsearch. This will also be more CPU-friendly, since grok takes a lot of CPU/RAM if not configured properly.
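On the Logstash side, consuming from Kafka is a small change to the input block. A sketch, assuming a local broker and a topic name of my own invention:

input {
  kafka {
    bootstrap_servers => "localhost:9092"  # assumed broker address
    topics => ["app-logs"]                 # hypothetical topic name
    codec => "json"                        # parse each record as JSON, as with the tcp input
  }
}

With the json codec here, the fields your app publishes arrive in Elasticsearch already structured, so no grok is needed.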