I keep getting a JSON parse error and I've been playing with the filter for hours.
[logstash.codecs.json ][main][68389d] JSON parse error, original data now in message field {:message=>"Could not set field 'ip' on object 'gitlab.domain.si' to value '10.0.1.22'. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string"}
I am getting my logs from logspout:
input {
  udp {
    port => 5044
    codec => json
  }
}
Any idea how to adjust my filter to avoid this or do something about it?
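One guess I have not been able to verify: in ECS mode the udp input adds a [host][ip] field, and the GitLab JSON already carries host as a plain string, so the two clash. A sketch of what I mean by turning ECS compatibility off on the input (the ecs_compatibility line is the only change from my config):

input {
  udp {
    port => 5044
    codec => json
    # assumption: the clash comes from the ECS-style [host][ip] the input tries to add;
    # with ECS compatibility disabled it writes the sender address to a flat host field instead
    ecs_compatibility => disabled
  }
}

If that guess about the cause is wrong this will not help, so treat it as a sketch rather than a fix.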
Thank you all. This is my ELK stack (Elasticsearch, Logstash, Kibana and logspout) on my Docker Swarm developer stack. The GitLab service is giving me these messages.
My Logstash config:
input {
  udp {
    port => 5044
    codec => json
  }
}

filter {
  # Drop internal Logstash logs
  if [docker][image] =~ /^logstash/ {
    drop { }
  }
}
And an example error:
elk_logstash.0.xxxxx@node | [2026-02-05T07:46:38,710][ERROR][logstash.codecs.json][main][<event_id>] JSON parse error, original data now in message field { :message => "Could not set field 'ip' on object 'example.host' to value 'X.X.X.X'. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string", :exception => Java::OrgLogstash::Accessors::InvalidFieldSetException, :data => "{ "backend_id":"rails", "body_limit":104857600, "correlation_id":"<correlation_id>", "docker":{ "name":"/stack_service.1.xxxxx", "id":"<container_id>", "image":"gitlab/gitlab-ee:", "hostname":"<container_hostname>", "labels":{ "com_docker_stack_namespace":"", "com_docker_swarm_service_name":"" } }, "duration_ms":0, "host":"example.host", "level":"info", "method":"POST", "msg":"access", "proto":"HTTP/1.1", "read_bytes":1659, "remote_addr":"X.X.X.X:0", "remote_ip":"X.X.X.X", "route":"^/api/v4/jobs/request\\z", "status":204, "stream":"stdout", "system":"http", "time":"2026-02-05T07:46:38Z", "uri":"/api/v4/jobs/request", "user_agent":"gitlab-runner ", "written_bytes":0 }" }
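Another idea I have been toying with, untested: read the packets with a plain codec and do the JSON decoding in a json filter with a target, so the decoded fields land under one key and cannot collide with anything the input sets. The target name parsed below is just my own placeholder:

input {
  udp {
    port => 5044
    codec => plain
  }
}

filter {
  json {
    source => "message"
    # put all decoded GitLab fields under [parsed] to avoid field clashes
    target => "parsed"
    # keep the raw line instead of tagging a failure when decoding does not work
    skip_on_invalid_json => true
  }

  # Drop internal Logstash logs (note the [parsed] prefix now)
  if [parsed][docker][image] =~ /^logstash/ {
    drop { }
  }
}

The downside is that every later filter and dashboard has to reference fields under [parsed].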
You have an issue with data mapping, because the ECS and non-ECS data structures are not the same.
Can you use another index, e.g. testindex, or delete the existing one together with its data mapping/view?
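Something like this in the output, just a sketch, the host below is a placeholder for whatever you use now:

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    # a fresh index gets a fresh mapping, which rules out a conflict
    # with the mapping of the existing index
    index => "testindex"
  }
}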
As Badger said, please provide us with:
a data sample
the full input/output section. It matters whether you send data to filebeat or logstash indices, because their data mappings are not the same and cannot be changed, except in some cases.