Could not index event to Elasticsearch error

Hello, I am getting the following error from my pipeline and I have no clue why it is happening. I have included the details below.

Filter

filter {
	if "cost_management" in [tags] {
		fingerprint {
			source => "message"
			target => "[@metadata][fingerprint]"
			method => "MURMUR3"
		}
		# Parse JSON events.
		json {
			source => "message"
			tag_on_failure => [ "_grok_error_log_nomatch" ]
		}
		# If the line doesn't parse, drop it.
		if "_grok_error_log_nomatch" in [tags] {
			drop { }
		}
		# Set timestamp as per input log events.
		date {
			# date format : 2019-12-23 23:28:42
			match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ", " yyyy-MM-dd HH:mm:ss", "ISO8601" ]
			target => "@timestamp"
		}
		# Set index name as per given tags.
		mutate {
			add_field => [ "index_name", "cost_management" ]
		}
		mutate {
			convert => {
				"monthly_total_bill" => "float"
				"last_day_bill" => "float"
				"monthly_service_bill" => "float"
				"last_day_service_bill" => "float"
			}
		}
		# Remove unwanted tags from processed logs.
		mutate {
			remove_tag => ["beats_input_codec_plain_applied", "_grokparsefailure", "_geoip_lookup_failure", "multiline", "_jsonparsefailure"]
		}
		}
	}
}

Error

[2021-03-16T18:34:12,627][ERROR][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"3618653390", :_index=>"cost_management,cost_management-2021.03.16", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0xa8069bb>], :response=>{"index"=>{"_index"=>"cost_management,cost_management-2021.03.16", "_type"=>"_doc", "_id"=>"3618653390", "status"=>400, "error"=>{"type"=>"invalid_index_name_exception", "reason"=>"Invalid index name [cost_management,cost_management-2021.03.16], must not contain the following characters [ , \", *, \\, <, |, ,, >, /, ?]", "index_uuid"=>"_na_", "index"=>"cost_management,cost_management-2021.03.16"}}}}

My Output Configuration

output {
	file {
		create_if_deleted => true
		path => "/var/log/logstash/logstash_log"
		codec => "rubydebug"
	}
	elasticsearch {
		hosts => ["localhost:9200"]
		user => "admin"
		password => "passwd"
		manage_template => false
		index => "%{[index_name]}-%{+YYYY.MM.dd}"
		document_id => "%{[@metadata][fingerprint]}"
		http_compression => true
	}
}

What I understand from the error is that Logstash is building an index name from two values joined by a comma, and Elasticsearch rejects the indexing request because a comma is not allowed in an index name. The final index name should be cost_management-2021.03.16 only, but I am unable to figure out where the second value comes from.
Please help me out. Thank you.

mutate { add_field => [ "index_name", "cost_management" ] }

"reason"=>"Invalid index name [cost_management,cost_management-2021.03.16], must not contain the following characters [ , \", *, \\, <, |, ,, >, /, ?]", "index_uuid"=>"_na_", "index"=>"cost_management,cost_management-2021.03.16"}}}}

If that mutate gets executed twice, then [index_name] will be an array, which the sprintf reference in your output will render as "cost_management,cost_management".
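For illustration (assuming the duplicate execution described above), the rubydebug file written by your file output would show the field as an array rather than a string, something like:

```
"index_name" => [
    [0] "cost_management",
    [1] "cost_management"
]
```

The `%{[index_name]}` reference in the elasticsearch output then joins the array elements with a comma, which is exactly the invalid index name shown in the error.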

You could avoid this by using

mutate { replace => { "index_name" => "cost_management" } }

but you should really try to find out why it is getting executed twice, since that may mean other things are being done twice as well. For example, if you point path.config at a directory, Logstash concatenates all the files in that directory into a single configuration. If the directory contains both my.conf and my.conf.bak, the contents of both become part of the configuration.
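A quick way to check for this is to grep the config directory for the field. The snippet below is a sketch that reproduces the situation in a temporary directory (the my.conf / my.conf.bak names are just the example from above; on a real system grep your actual path.config directory, e.g. /etc/logstash/conf.d):

```shell
# Demonstrate how a leftover backup file duplicates a filter when
# path.config points at a directory: Logstash loads every file it finds.
dir=$(mktemp -d)
echo 'filter { mutate { add_field => [ "index_name", "cost_management" ] } }' > "$dir/my.conf"
cp "$dir/my.conf" "$dir/my.conf.bak"

# Both files are loaded, so the add_field is now defined twice:
grep -rh 'add_field' "$dir" | wc -l   # prints 2
```

If the count is higher than you expect, one of the matching files is a duplicate and should be moved out of the directory.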


My bad. My conf.d/ contained the same filter under a different file name. Removing that file worked for me. Thank you very much.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.