Deserializing a JSON string and turning its entries into fields in an ES index

Hello People,

I'm relatively new to the ELK stack and have made progress in getting data into ES, however I would really appreciate some pointers on the following:

My setup sends data to Kafka and then on to ES through a Logstash input/output configuration.

Please see below:

logstash.conf:

input {
  kafka {
    topics => ["x"]
    decorate_events => true
    bootstrap_servers => "localhost:9092"
    key_deserializer_class => "org.apache.kafka.common.serialization.StringDeserializer"
    value_deserializer_class => "org.apache.kafka.common.serialization.StringDeserializer"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => "es:9200"
    index => "latest"
    manage_template => false
    action => "update"
  }
}

The data is fully visible in Kibana, however there is one field called message which is a string of JSON containing several interesting entries. I would like these entries to also become fields in the index so they can be used later for visualization (they hold numerical values). I was playing with "key_deserializer_class", thinking that it might deconstruct the string and automatically create aggregatable fields in the index, however no joy.

Any general pointers would be greatly appreciated.

Thanks

I got it - I had to add codec => "json" to the kafka input section in logstash.conf - trust this helps others also :slight_smile: .
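For reference, here is roughly what my working input section looks like now (same broker/topic settings as above; adjust for your environment). With the json codec, the default string deserializers are fine, since the codec parses the JSON message into top-level event fields:

input {
  kafka {
    topics => ["x"]
    decorate_events => true
    bootstrap_servers => "localhost:9092"
    codec => "json"
  }
}

If only one field (e.g. message) contains the JSON string rather than the whole event, I believe the json filter would be the alternative, something like filter { json { source => "message" } } - but the codec was enough in my case.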

