Using the tcp plugin to parse logs from multiple sources

We are using ELK 7.6.2 stack.

I am trying to configure Logstash to parse inputs using the tcp plugin. My config looks like this:

input {
    tcp {
        port => 6789
        codec => json_lines
        tags => ["urlShtner"]
    }
    tcp {
        port => 6790
        codec => json_lines
        tags => ["devDF"]
    }
}

filter {
    if "devDF" in [tags] {
        mutate { add_field => { "[@metadata][indexPrefix]" => "uat_tv-dev-datafeed" } }
    } else if "urlShtner" in [tags] {
        mutate { add_field => { "[@metadata][indexPrefix]" => "uat_tv-shortener" } }
    }
}

output {
    if [@metadata][indexPrefix] {
        file {
            path => "/opt/elasticsearch/logs/multi_debug.txt"
            codec => rubydebug
        }
        elasticsearch {
            hosts => [ "xx-xxx-xxx-xx:12345" ]
            hosts => [ "xx-xxx-xxx-xx:12345" ]
            user => "elastic"
            password => "xxxxxxxxxxxxxxxxx"
            index => "%{[@metadata][indexPrefix]}-%{+YYYY.MM.dd}"
            action => "index"
        }
    }
}

When I run this, the index uat_tv-shortener gets created and populated correctly.

uat_tv-dev-datafeed, however, gets created but does not populate. I see the following error in the elasticsearch log:

[2022-07-13T13:25:17,222][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"uat_tv-dev-datafeed-2022.07.13", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x296ea012>], :response=>{"index"=>{"_index"=>"uat_tv-dev-datafeed-2022.07.13", "_type"=>"_doc", "_id"=>"DFqX-IEB8Zh2MUoN9ZUD", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Can't merge a non object mapping [DbUtils.deleteWrapperSQL.begin] with an object mapping [DbUtils.deleteWrapperSQL.begin]"}}}}

DbUtils.deleteWrapperSQL.begin does exist as a field in the incoming message.

Can you please guide me on how to fix this?

I took my lead from How to create multiple indexs with multiple input in logstash

As I understand it ... your event, as you say, has a [DbUtils][deleteWrapperSQL][begin] field. The index has a mapping for that field which says it should be an object. That is, there would be a field such as [DbUtils][deleteWrapperSQL][begin][someField] within it. However, in your event [DbUtils][deleteWrapperSQL][begin] is not an object, but a concrete value, such as "someValue".
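
For example (using made-up values to illustrate), the two shapes would look like this in the source JSON:

{ "DbUtils": { "deleteWrapperSQL": { "begin": { "someField": "someValue" } } } }

versus

{ "DbUtils": { "deleteWrapperSQL": { "begin": "someValue" } } }

Once the index has mapped the field one way, documents that use the other shape are rejected with a mapping error like the one you posted.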

Use the Get Mapping API to check what the mapping is.

Thanks @Badger

When I run it in Dev Tools, the mappings come back empty.

GET /uat_tv-dev-datafeed-2022.07.13/_mapping
{
  "uat_tv-dev-datafeed-2022.07.13" : {
    "mappings" : { }
  }
}

In the rubydebug log I see the field appear as:

"DbUtils.deleteWrapperSQL.begin" => "DECLARE v_last_updated_by varchar2(100); BEGIN  DBMS_SESSION.SET_IDENTIFIER('USERHERE'); select sys_context('USERENV','CLIENT_IDENTIFIER') into v_last_updated_by FROM DUAL;",
 

There is also:

"DbUtils.deleteWrapperSQL.begin.end" => ";END;",

I do not have an elasticsearch instance to test with. At one time elasticsearch stopped allowing periods in field names, because they caused ambiguity about whether a field was a value or an object. Then they improved the disambiguation rules and started allowing them again. I am wondering if having two field names containing periods, where one is a prefix of the other, is a corner case where it is broken. Hopefully someone with an elasticsearch instance will see this and test it.
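
If someone with a cluster handy wants to try it, something along these lines in Dev Tools might reproduce it (the index name dot-repro is just a throwaway and the field values are made up; only the two field names are taken from the rubydebug output above):

POST /dot-repro/_doc
{
  "DbUtils.deleteWrapperSQL.begin": "someValue",
  "DbUtils.deleteWrapperSQL.begin.end": ";END;"
}

If the hunch is right, this should be rejected with a similar "Can't merge a non object mapping ... with an object mapping" error, because expanding the dots makes [DbUtils][deleteWrapperSQL][begin] both a concrete value and an object within the same document.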
