Dynamically change the index for Elasticsearch in Logstash output

Hi All,
I'm trying to change the logstash.config file to match a log file. The log file contains three different types of log records, generated by my Java application.
logstash.config file is as follows:

# logstash configuration

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][messagetype] == "customer_request" {
    grok {
      match => {
        "message" => [ "%{WORD:apptime}::%{WORD:messagetype}::%{WORD:correlationId}::%{WORD:user_id}::%{WORD:user_gender}::%{WORD:user_type}::%{WORD:message}" ]
      }
    }
    mutate {
      # add_field takes a hash, not an array
      add_field => { "index_key" => "customer-request" }
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[index_key]}-%{+YYYY.MM.dd}"
  }
}

But the issue is that the index_key value, which I added via add_field in the mutate section, is not being assigned. The index shows up as %{[index_key]}-2020-08-10. I'm new to the ELK stack, and if anyone can help me, that would be great.
Thanks...!


In Kibana, look at some of the records in that index. Do they have an index_key field?
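If the documents turn out not to have an index_key field, a common cause is that the [fields][messagetype] conditional never matches, so the mutate filter never runs. As a sketch (the "unmatched" default value here is just an illustrative choice, not anything from the original config), a fallback guard keeps events out of a literal %{index_key} index:

```
filter {
  # If no earlier conditional assigned index_key, give it a default
  # so the elasticsearch output never writes to a literal
  # "%{index_key}-..." index name.
  if ![index_key] {
    mutate {
      add_field => { "index_key" => "unmatched" }
    }
  }
}
```

Events landing in the fallback index are then easy to inspect in Kibana to see what [fields][messagetype] actually contains.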

You need to reference index_key without the square brackets.
The [ ] field-reference syntax is only required in your filter block conditionals.

Use this:

index => "%{index_key}-%{+YYYY.MM.dd}"

I changed it to index => "%{index_key}-%{+YYYY.MM.dd}", but it still does not work.

What exactly do you mean by this? "Docs count" refers to the number of individual events in your index. Health "yellow" means one or more replica shards are unassigned (common on a single-node cluster). Verify your input block, as the output block seems fine.

Again, look at the documents in the index called "%{index_key}-2020.08.13". Do they have a field called index_key? If they do not, then this is exactly what you should expect.
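One way to check this from the Kibana Dev Tools console, sketched here against the literal index name from the error (the query simply counts documents that lack the field):

```
GET /%{index_key}-2020.08.13/_search
{
  "size": 1,
  "query": {
    "bool": {
      "must_not": { "exists": { "field": "index_key" } }
    }
  }
}
```

If this returns hits, those events reached the output without index_key ever being set, which is why the sprintf reference in the index option was written out literally.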
