Logstash filter JSON parse error

Hi,

I keep getting a JSON parse error and I've been playing with the filter for hours.

[logstash.codecs.json ][main][68389d] JSON parse error, original data now in message field {:message=>"Could not set field 'ip' on object 'gitlab.domain.si' to value '10.0.1.22'. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string"}

I am getting my logs from Logspout.

input {
  udp {
    port  => 5044
    codec => json
  }
}

Any idea how to adjust my filter to avoid this or do something about it?

Thanks!

Welcome to the community!

Try setting ecs_compatibility => disabled; the default is pipeline.ecs_compatibility: v8 in logstash.yml. It's enough to set it at the plugin level.

Without ECS you can review your data to find what causes the error. Of course, you can also use the rubydebug codec in the output.

udp {
  port  => 5044
  ecs_compatibility => disabled
  codec => json
}
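To see exactly what Logstash produces from each event, a minimal stdout output with the rubydebug codec can be added alongside your existing outputs (a sketch):

output {
  stdout {
    codec => rubydebug   # prints the full parsed event structure for inspection
  }
}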

Hello and welcome,

Can you share a sample of your message?


I know it can be frustrating.

This answer might help, but it doesn’t go into details, so you will have to work on it.

Providing a single reproducible example will allow me, Leandro, Rios, David, or others to provide much more customised advice.

Please show us the problem, not just the error message.


Thank you all. This is my ELK (Elasticsearch, Logstash, Kibana) and Logspout stack on my Docker Swarm developer stack. The GitLab service is giving me these messages. :slight_smile:

My Logstash config:

input {
  udp {
    port  => 5044
    codec => json
  }
}

filter {
  # Drop internal Logstash logs
  if [docker][image] =~ /^logstash/ {
    drop { }
  }
}

And an example error:

elk_logstash.0.xxxxx@node | [2026-02-05T07:46:38,710][ERROR][logstash.codecs.json][main][<event_id>]
JSON parse error, original data now in message field {
  :message => "Could not set field 'ip' on object 'example.host' to value 'X.X.X.X'.
    This is probably due to trying to set a field like [foo][bar] = someValue
    when [foo] is not either a map or a string",
  :exception => Java::OrgLogstash::Accessors::InvalidFieldSetException,
  :data => "{
    "backend_id":"rails",
    "body_limit":104857600,
    "correlation_id":"<correlation_id>",
    "docker":{
      "name":"/stack_service.1.xxxxx",
      "id":"<container_id>",
      "image":"gitlab/gitlab-ee:",
      "hostname":"<container_hostname>",
      "labels":{
        "com_docker_stack_namespace":"",
        "com_docker_swarm_service_name":""
      }
    },
    "duration_ms":0,
    "host":"example.host",
    "level":"info",
    "method":"POST",
    "msg":"access",
    "proto":"HTTP/1.1",
    "read_bytes":1659,
    "remote_addr":"X.X.X.X:0",
    "remote_ip":"X.X.X.X",
    "route":"^/api/v4/jobs/request\\z",
    "status":204,
    "stream":"stdout",
    "system":"http",
    "time":"2026-02-05T07:46:38Z",
    "uri":"/api/v4/jobs/request",
    "user_agent":"gitlab-runner ",
    "written_bytes":0
  }"
}

"Could not set field 'ip' on object 'example.host'
to value 'X.X.X.X'.

You have the hostname in the IP address field. Please disable ECS and try again.


With ecs_compatibility => disabled:

input {
  udp {
    port  => 5044
    codec => json
    ecs_compatibility => disabled
  }
}

my logs are not going to Elasticsearch and Kibana:

Could not index event to Elasticsearch. {:status=>400, :action=>["create", {:_id=>nil, :_index=>"logs-audit-default", :routing=>nil}, {"message"=>"23.88.100.84 - - [06/Feb/2026:08:58:11 +0000] "POST /api/v4/jobs/request HTTP/1.1" 204 0 "" "gitlab-runner 18.3.0~pre.154.g3ecc71b2 (main; go1.24.4 X:cacheprog; linux/amd64)" -", "stream"=>"stdout", "tags"=>, "@version"=>"1", "@timestamp"=>2026-02-06T08:58:11.058178336Z,……

"reason"=>"[1:2937] object mapping for [host] tried to parse field [host] as object, but found a concrete value"}

You have an issue with data mapping, because the ECS and non-ECS data structures are not the same.

Can you use another index, e.g. testindex, or delete the existing one along with its data mapping/view?
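For instance, pointing the output at a fresh index sidesteps the existing conflicting mapping (the host and index name below are assumptions, adjust to your setup):

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # assumed host, adjust to yours
    index => "testindex"                 # fresh index with no conflicting [host] mapping
  }
}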

As Badger said, please provide us with:

  • data sample
  • full input/output sections; it's not the same if you send data to Filebeat or Logstash indices, because the data mappings differ and are not possible to change, except in some cases.
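If you need to keep the existing index, one possible workaround (a sketch, not tested against your mapping) is to rename the plain host string into the object shape an ECS-style mapping expects:

filter {
  mutate {
    # hypothetical rename: turn the string [host] into [host][name]
    # so it matches an index mapping that expects [host] as an object
    rename => { "host" => "[host][name]" }
  }
}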