I am trying to parse my logs and build a dynamic index name, but Logstash is producing an index name that contains a comma, so indexing fails. Can you please help me remove the comma?
filter {
  grok {
    match => [
      "message", '^%{TIMESTAMP_ISO8601:betimestamp} %{DATA:tenantId} %{EMAILADDRESS:userId} (\[%{DATA:threadName}\])'
    ]
  }
  mutate {
    lowercase => [ "tenantId" ]
  }
}
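If the comma is part of the captured tenantId value itself, one option (a sketch, not necessarily the right fix for your data) is to sanitize the field before it reaches the output. Note that operations inside a single mutate block run in a fixed internal order, not the order written, so two separate blocks are used here:

```
filter {
  # lowercase first, in its own mutate block, so the gsub whitelist
  # below only needs to cover lowercase letters
  mutate { lowercase => [ "tenantId" ] }
  mutate {
    # replace anything Elasticsearch rejects in index names with a hyphen
    # (assumption: a hyphen-separated name is acceptable for your indices)
    gsub => [ "tenantId", "[^a-z0-9._-]", "-" ]
  }
}
```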
output {
  if [tenantId] {
    if [type] =~ /UI$/ {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "atest-%{tenantId}-%{+YYYY.MM.dd}"
      }
    } else {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "atest-%{tenantId}-%{+YYYY.MM.dd}"
      }
    }
  }
}
This is the error in the Logstash log:

[WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"atest-Wipro,Wipro-be-log-2018.08.27", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x57f204da>], :response=>{"index"=>{"_index"=>"atest-Wipro,Wipro-be-log-2018.08.27", "_type"=>"doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"invalid_index_name_exception", "reason"=>"Invalid index name [atest-Wipro,Wipro-be-log-2018.08.27], must not contain the following characters [ , \", *, \\, <, |, ,, >, /, ?]", "index_uuid"=>"_na_", "index"=>"atest-Wipro,Wipro-be-log-2018.08.27"}}}}
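A comma appearing in a sprintf'd index name is often a sign that the field is an array: when a field holds multiple values, a `%{tenantId}` reference renders them comma-joined. A minimal sketch to guard against that, assuming tenantId may have been captured more than once:

```
filter {
  ruby {
    # if tenantId is an array, keep only the first value
    # (assumption: the first captured value is the canonical tenant)
    code => "
      t = event.get('tenantId')
      event.set('tenantId', t.first) if t.is_a?(Array)
    "
  }
}
```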
I thought the problem was the uppercase letters, so I added the mutate lowercase filter shown above, but tenantId is still not being lowercased.