Logstash not inserting data into Elasticsearch

Hello Team,

I had Elasticsearch, Logstash, and Kibana v7.16.2 with X-Pack security-based login enabled. Yesterday I upgraded my ELK stack to 8.6.1 using my docker-compose file.

Current problem: Logstash fetches the data but does not insert it into Elasticsearch, yet no errors appear in my ELK containers.

My logstash.conf is as follows.

input {
  beats {
    port => 5044
  }

  tcp {
    port => 5000
  }

  file {
    path => "/usr/share/logstash/pipeline/*.csv"
    start_position => "beginning"
    sincedb_path => "/usr/share/logstash/pipeline/sincedb.txt"
  }
}

filter {
  csv {
    separator => ","
    columns => ["build_date", "build_start_time", "build_end_time", "build_duration", "build_requester", "fullname", "build_id", "build_conf", "build_status", "build_site"]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "http://elasticsearch:9200"
    index => "employee-data"
    user => "elastic"
    password => "changeme"
    document_id => "%{build_id}"
  }
  stdout {}
}

It was working fine with my previous ELK version, 7.16.2, so what could be the issue with my current version and configuration?

Any help on this would be appreciated.

AFAIK nothing has been reported on the latest LS version.
Maybe Logstash has already read the file. Use the rubydebug codec: stdout { codec => rubydebug {} }
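
That is, replace the bare stdout {} in your output block with:

stdout {
  codec => rubydebug {}
}

so every event Logstash processes is printed in full to the container logs, letting you confirm whether events are being read at all.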

@Rios - I have added stdout { codec => rubydebug {} }, then removed all the entries from the sincedb file (sincedb_path => "/usr/share/logstash/pipeline/sincedb.txt") and tried again, but I still can't see the data I loaded through Logstash in Elasticsearch.

I can now see the warnings and errors below in my Logstash container.

[WARN ][logstash.outputs.elasticsearch][main][13ed313a5675abd23c078edd33a0a3c4be86d627339fd3bf53181d4183411c94] Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>"61975163", :_index=>"employee-data",

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id '61975163'. Preview of field's value: '{name=8cb532a93e4e}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:416"}}}}

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id '61959101'. Preview of field's value: '{name=8cb532a93e4e}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:419"}}}}

Are you using the field host somewhere?
Can you try the following setting in logstash.yml, in pipelines.yml if you are using pipelines, or directly as a plugin setting?
pipeline.ecs_compatibility: disabled

The default value is v8; see the Logstash documentation on pipeline.ecs_compatibility for more details.
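
For example, any of the following should work (pipeline.id main and the config path here match your setup above):

# logstash.yml - applies to every pipeline
pipeline.ecs_compatibility: disabled

# pipelines.yml - applies to a single pipeline
- pipeline.id: main
  path.config: "/usr/share/logstash/pipeline"
  pipeline.ecs_compatibility: disabled

# or directly on the input plugin
file {
  path => "/usr/share/logstash/pipeline/*.csv"
  ecs_compatibility => disabled
}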

This is probably the issue; the file input adds the host field.

On version 7 the hostname was added as a string into the host field; on version 8 ECS compatibility is enabled by default and the value of the hostname is added into the host.name field, which leads to mapping issues.
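
In other words, on 7.X the file input produced a plain string:

"host" => "8cb532a93e4e"

while on 8.X, with ECS compatibility enabled, it produces an object:

"host" => {
  "name" => "8cb532a93e4e"
}

Your employee-data index still has host mapped as text from 7.X, so indexing the object version fails with exactly the mapper_parsing_exception you posted.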

This is a breaking change from 7.X to 8.X.

Just add the setting suggested and it should work.

Hi @Rios and @leandrojmp - I have added the pipeline.ecs_compatibility: disabled setting and I can now see the data in Elasticsearch.

Thanks a lot for the support.

Long live the king and the Elastic team.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.