Searching message content in Elasticsearch after parsing with Logstash

Hello,

I am pushing data into Elasticsearch from an application via Logstash. Is there any way to search/filter the JSON message in Elasticsearch on a particular field inside that JSON message?

I am using the Logstash configuration below:


# Logstash pipeline: receive JSON events over TCP and ship them
# to Elasticsearch.

input {
  tcp {
    port => 4560
    codec => json   # decode each incoming event as JSON
  }
}
filter {
  date {
    # use the epoch-milliseconds "timeMillis" field as the event @timestamp
    match => [ "timeMillis", "UNIX_MS" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mule-logs"
  }
}

and a sample message as it is stored in Elasticsearch is attached.


For example, I want to search for the messageId in the JSON data using message.messageId, or search for the code using message.code. In simple terms, I want to search on the keys inside the message section. Is that possible?
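In other words, I would like to be able to run a query like this (the messageId key comes from my sample message, and the value here is just a placeholder; this assumes the nested keys end up as searchable fields):

GET mule-logs/_search
{
  "query": {
    "match": {
      "message.messageId": "some-message-id"
    }
  }
}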

It seems your data is raw; you need to parse those fields first.
I'm not so good at grok, it can be a pain to learn, but you can check the basics there.

Thanks @FALEN for your answer. Can you post some code?

Maybe some other community member can help with grok.
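The rough shape would be something like this, though. It's untested, and the pattern is made up; you would have to adapt it to your actual message format:

filter {
  grok {
    # assumes the raw text is in the "message" field; this pattern is
    # only an illustration, not your real log format
    match => { "message" => "%{TIMESTAMP_ISO8601:logTime} %{LOGLEVEL:level} %{GREEDYDATA:logMessage}" }
  }
}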

Alternatively, are you developing an app? Because it's easier with Kafka.
Just configure a Kafka connector in your app, with mapped fields.
Kafka will transfer all formats and fields to Logstash -> Elasticsearch. This is also more CPU-friendly, since grok takes a lot of CPU/RAM if not configured properly.
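On the Logstash side that would just mean swapping the tcp input for a kafka input, something like this (broker address and topic name are placeholders you would adjust):

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topics => ["mule-logs"]                 # placeholder topic name
    codec => json
  }
}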

Hi @Luca_Belluccini, can you help on this topic?

A better approach is to parse the messages with a json filter. Check these posts:
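As a minimal sketch, assuming the nested JSON arrives as a string in the message field (adjust source/target to your actual event):

filter {
  json {
    # parse the JSON string held in "message" and put the parsed object
    # back under "message", so its keys become message.messageId, etc.
    source => "message"
    target => "message"
  }
}

Once that filter runs, the keys are indexed as separate fields and you can search on message.messageId or message.code directly.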

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.