Reading beats fields from Kafka input

Hi,
in my infrastructure I have something like:
Filebeat -> Kafka -> Logstash -> Elasticsearch

I'm having trouble reading the custom fields set up in Filebeat when filtering in Logstash.

I've configured the kafka input in Logstash, and I'd like to read the content of "my_custom_field", but I'm unable to. The event looks like this:

{
	"message" : {
		"@timestamp": "2017-02-22T17:13:22.346Z",
		"beat": {
			"hostname": "host-01",
			"name": "name-01",
			"version": "5.2.1"
		},
		"fields": {
			"my_custom_field": "XXXXX"
		},
		"input_type": "log",
		"message": "....................",
		"offset": 6894303,
		"source": "...",
		"type": "log"
	}
}

How can I access this field in order to create my index name (output) using it?

I'm confused as to why all fields are nested under message, but if this is indeed what your event looks like, you can access my_custom_field with [message][fields][my_custom_field] (note that in your event, fields is a sibling of beat, not nested inside it).

https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references
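To then build your index name from it, you can use the same sprintf-style field reference in the elasticsearch output. A sketch, assuming the field resolves at [message][fields][my_custom_field] as in the event above (the host and the "logs-" prefix are placeholders):

```
output {
	elasticsearch {
		hosts => ["localhost:9200"]
		# The %{...} reference interpolates the custom field into the index name
		index => "logs-%{[message][fields][my_custom_field]}-%{+YYYY.MM.dd}"
	}
}
```

Keep in mind this only works once the field actually exists on the event, i.e. after the JSON has been parsed in the filter stage.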

Thanks!
Actually, what I'm doing now to get this field is:

json {
	# Parse the JSON string in the message field into a separate object
	source => "message"
	target => "beat_details"
}
mutate {
	# Copy the custom field into a top-level "type" field for easier filtering
	add_field => { "type" => "%{[beat_details][fields][my_custom_field]}" }
}

Then I can use this "type" in my filters.
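As an aside, you can skip the json filter entirely by letting the kafka input decode the messages itself with a json codec. A sketch, with placeholder broker address and topic name:

```
input {
	kafka {
		bootstrap_servers => "localhost:9092"  # placeholder broker address
		topics => ["filebeat"]                 # placeholder topic name
		codec => json                          # decode each message as JSON into top-level event fields
	}
}
```

With that in place the Filebeat fields land at the top level of the event, so the custom field would be reachable directly as [fields][my_custom_field], with no mutate needed.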
