We are using the ELK 7.6.2 stack.
I am trying to configure Logstash to parse inputs using the tcp input plugin. My config looks like this:
input {
  tcp {
    port  => 6789
    codec => json_lines
    tags  => ["urlShtner"]
  }
  tcp {
    port  => 6790
    codec => json_lines
    tags  => ["devDF"]
  }
}
filter {
  if "devDF" in [tags] {
    mutate { add_field => { "[@metadata][indexPrefix]" => "uat_tv-dev-datafeed" } }
  } else if "urlShtner" in [tags] {
    mutate { add_field => { "[@metadata][indexPrefix]" => "uat_tv-shortener" } }
  }
}
output {
  if [@metadata][indexPrefix] {
    file {
      path  => "/opt/elasticsearch/logs/multi_debug.txt"
      codec => rubydebug
    }
    elasticsearch {
      hosts    => [ "xx-xxx-xxx-xx:12345" ]
      user     => "elastic"
      password => "xxxxxxxxxxxxxxxxx"
      index    => "%{[@metadata][indexPrefix]}-%{+YYYY.MM.dd}"
      action   => "index"
    }
  }
}
As a result of running this, I do see the index uat_tv-shortener getting created and populated correctly. The index uat_tv-dev-datafeed, however, gets created but does not populate. I see the following error in the Elasticsearch log:
[2022-07-13T13:25:17,222][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"uat_tv-dev-datafeed-2022.07.13", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x296ea012>], :response=>{"index"=>{"_index"=>"uat_tv-dev-datafeed-2022.07.13", "_type"=>"_doc", "_id"=>"DFqX-IEB8Zh2MUoN9ZUD", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Can't merge a non object mapping [DbUtils.deleteWrapperSQL.begin] with an object mapping [DbUtils.deleteWrapperSQL.begin]"}}}}
DbUtils.deleteWrapperSQL.begin does exist as a field in the incoming message.
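For illustration, my understanding is that a conflict like this can occur when different events present the same mapping path with incompatible shapes — for example one event carrying the field as a dotted key with a plain value, and another carrying it as a nested object. In Elasticsearch, a dotted field name and the equivalent nested object resolve to the same mapping path, so the two cannot be merged. Only the field name below comes from my actual log; the values are made up:

```json
{"DbUtils.deleteWrapperSQL.begin": "start"}
{"DbUtils": {"deleteWrapperSQL": {"begin": {"status": "start"}}}}
```

If the first event is indexed first, the path is mapped as a non-object field, and the second event is then rejected with exactly this kind of 400 error.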
Can someone please guide me on how to fix this?
I took my lead from "How to create multiple indexs with multiple input in logstash".