Elastic Cloud Setup: Pipeline Worker Stops in Logstash

I have a setup with ES and Kibana on the cloud, plus LS and Filebeat (2 nodes). It was all working fine, but today I suddenly see that the pipelines stop in LS with the following error:

    [2019-01-21T12:51:07,188][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {:pipeline_id=>"main", "exception"=>"Could not set field 'city_name' on object '49.35.23.207' to value 'Mumbai'. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string",
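The last sentence of the message describes the general failure mode: something is trying to write a sub-field underneath a field that already holds a plain value (here the string "49.35.23.207"). A minimal, hypothetical filter that triggers the same class of error would look roughly like this (field names are made up for illustration):

    filter {
      # [ip] is created as a plain string value
      mutate { add_field => { "ip" => "49.35.23.207" } }
      # this tries to nest [city_name] under that string, which fails
      mutate { add_field => { "[ip][city_name]" => "Mumbai" } }
    }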

Here is my Logstash configuration:
    filter {
      if [type] == "api" {
        grok {
          match => { "message" => "%{IP:s2sip} %{USER:user} %{USER:auth} \[%{HTTPDATE:timestamp}\] %{GREEDYDATA:botmanpath}" }
        }
        grok {
          match => { "botmanpath" => "%{URIPARAM:botman_query}" }
        }

        kv {
          source => "botmanpath"
          field_split => "&?"
          prefix => "bq_"
        }

        geoip {
          source => "bq_ip"
          add_field => [ "[bq_ip][coordinates]", "%{[bq_ip][longitude]}" ]
          add_field => [ "[bq_ip][coordinates]", "%{[bq_ip][latitude]}" ]
        }
        mutate {
          convert => [ "[bq_ip][coordinates]", "float" ]
        }

        useragent {
          source => "bq_s6"
        }
      }
    }

    output {
      if [type] == "tag" {
        elasticsearch {
          hosts => [ "--" ]
          user => "--"
          password => "--"
          index => "%{[parsed][bizid]}-%{+YYYY.MM.dd}"
        }
      }
      if [type] == "api" {
        elasticsearch {
          hosts => [ "--" ]
          user => "--"
          password => "--"
          index => "%{[parsed][bizid]}-%{+YYYY.MM.dd}"
        }
      }
    }

bq_ip is probably a string, so it does not have latitude/longitude sub-fields, and you cannot add a city_name to it. Did you mean:

    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]

Yes, that's right.

In fact, I added those add_field lines after the error first appeared. Even if I remove them, the error persists.

Can you show us your full configuration?
