Hello, I am getting the following error from my filter and I have no clue why it is happening. Details are included below.
Filter
filter {
  if "cost_management" in [tags] {
    fingerprint {
      source => "message"
      target => "[@metadata][fingerprint]"
      method => "MURMUR3"
    }
    # Parse JSON events.
    json {
      source => "message"
      tag_on_failure => [ "_grok_error_log_nomatch" ]
    }
    # If the line did not parse, drop it.
    if "_grok_error_log_nomatch" in [tags] {
      drop { }
    }
    # Set @timestamp from the input log event.
    date {
      # date format: 2019-12-23 23:28:42
      match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "ISO8601" ]
      target => "@timestamp"
    }
    # Set the index name for this tag.
    mutate {
      add_field => [ "index_name", "cost_management" ]
    }
    mutate {
      convert => {
        "monthly_total_bill" => "float"
        "last_day_bill" => "float"
        "monthly_service_bill" => "float"
        "last_day_service_bill" => "float"
      }
    }
    # Remove unwanted tags from matched events.
    mutate {
      remove_tag => ["beats_input_codec_plain_applied", "_grokparsefailure", "_geoip_lookup_failure", "multiline", "_jsonparsefailure"]
    }
  }
}
Error
[2021-03-16T18:34:12,627][ERROR][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"3618653390", :_index=>"cost_management,cost_management-2021.03.16", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0xa8069bb>], :response=>{"index"=>{"_index"=>"cost_management,cost_management-2021.03.16", "_type"=>"_doc", "_id"=>"3618653390", "status"=>400, "error"=>{"type"=>"invalid_index_name_exception", "reason"=>"Invalid index name [cost_management,cost_management-2021.03.16], must not contain the following characters [ , \", *, \\, <, |, ,, >, /, ?]", "index_uuid"=>"_na_", "index"=>"cost_management,cost_management-2021.03.16"}}}}
Output
output {
  file {
    create_if_deleted => true
    path => "/var/log/logstash/logstash_log"
    codec => "rubydebug"
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "admin"
    password => "passwd"
    manage_template => false
    index => "%{[index_name]}-%{+YYYY.MM.dd}"
    document_id => "%{[@metadata][fingerprint]}"
    http_compression => true
  }
}
What I have understood is that, because of my output configuration, Logstash is building an index name containing two values separated by a comma, and Elasticsearch is rejecting the indexing request. But I am unable to figure out where the second value comes from. My final index name should be cost_management-2021.03.16 only.
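One thing I suspect but have not confirmed: if the incoming JSON event already contains an index_name field, add_field would append a second value and turn the field into an array, which the %{[index_name]} sprintf in the output would join with a comma. A sketch of what I could try instead, assuming that is the cause:

```
filter {
  if "cost_management" in [tags] {
    # replace overwrites any index_name that arrived in the JSON
    # payload, instead of appending a second value to it.
    mutate {
      replace => { "index_name" => "cost_management" }
    }
  }
}
```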
Please help me out. Thank you.