Can't get text on a START_OBJECT at 1:1043

Hey guys,

I'm facing a problem with my elasticsearch output in Logstash. What I'm trying to achieve is indexing my logs, sent by Filebeat, via Logstash into Elasticsearch. My logs are JSON-encoded and I'm using version 6.2.3 for Filebeat, Logstash and Elasticsearch. When Logstash tries to index my log lines, I get this error:

[2019-08-08T11:05:03,798][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"application_log-08.08.2019", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x6c6a3dae>], :response=>{"index"=>{"_index"=>"application_log-08.08.2019", "_type"=>"doc", "_id"=>"bA15cGwBBfwccSsXdQAy", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [vae_message.ergebnis]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:1043"}}}}}

Here's the original logline:

{"@timestamp":"2019-08-08T11:04:27.677","level":"INFO","modul":"__main__","process":"12291","vae_message":{"action": "VAE-NN abfragen", "parameter": {"corrid": "MyCorrelationId", "uuid": "d19bcd68-e389-4414-989b-2ba5a85e274a", "volltext_len": 83}, "ergebnis": {"vorgangsart": "Rechnung", "points": 0.7539469585414332, "error_text": ""}}}

Here's the logline Logstash receives from Filebeat:

{"message":"{\"@timestamp\":\"2019-08-08T11:04:28.390\",\"level\":\"INFO\",\"modul\":\"__main__\",\"process\":\"12291\",\"vae_message\":{\"action\": \"VAE-NN abfragen\", \"parameter\": {\"corrid\": \"MyCorrelationId\", \"uuid\": \"d19bcd68-e389-4414-989b-2ba5a85e274a\", \"volltext_len\": 83}, \"result\": {\"vorgangsart\": \"Rechnung\", \"points\": 0.7539469585414332, \"error_text\": \"\"}}}","@version":"1","type":"application_log","process":"12291","source":"/vae-service/predict-main-log.txt","offset":48673,"tags":["HDLT","MS-VAE","beats_input_codec_plain_applied"],"level":"INFO","vae_message":{"result":{"error_text":"","points":0.7539469585414332,"vorgangsart":"Rechnung"},"parameter":{"corrid":"MyCorrelationId","volltext_len":83,"uuid":"d19bcd68-e389-4414-989b-2ba5a85e274a"},"action":"VAE-NN abfragen"},"modul":"__main__","@timestamp":"2019-08-08T09:04:28.390Z","beat":{"version":"6.2.3","hostname":"MM01","name":"MM01"},"host":"localhost"}

My Logstash configuration looks like this:

input {
    beats {
      host => "192.168.xxx.xxx"
      port => "5044"
      type => "application_log"
    }
}

filter {
    json {
        source => "message"
    }
}

output {
    file {
        path => "/var/log/logstash.output.log"    
    }

    elasticsearch {
        hosts => ["192.168.xxx.xxx:9200"]
        index => ["application_log-%{+dd.MM.YYYY}"]
    } 
}

Can anybody help me please?

What does the mapping of your ES index look like? It seems like it is expecting “ergebnis” to be text, but it is an object. So you'd either have to change it to a string in Logstash or recreate your index with the field ergebnis mapped as an object (or nested) datatype.
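If you go the mapping route, here's a rough sketch of an index template that maps vae_message.ergebnis as an object with explicit sub-fields. The template name and the field types are my assumptions based on the sample logline above, so adjust them to your data; you could also set "type": "nested" on ergebnis if you need nested queries:

curl -XPUT 'http://192.168.xxx.xxx:9200/_template/application_log' -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["application_log-*"],
  "mappings": {
    "doc": {
      "properties": {
        "vae_message": {
          "properties": {
            "ergebnis": {
              "properties": {
                "vorgangsart": { "type": "keyword" },
                "points": { "type": "float" },
                "error_text": { "type": "text" }
              }
            }
          }
        }
      }
    }
  }
}'

Keep in mind a template only applies to newly created indices, so you'd have to reindex or wait for the next daily index for it to take effect.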

Right! I hadn't thought about the mapping. I'll check it and let you know if that was the problem.
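For reference, checking how ergebnis is currently mapped in the failing index could look like this (index and type name taken from the error message above):

curl -XGET 'http://192.168.xxx.xxx:9200/application_log-08.08.2019/_mapping/doc?pretty'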

Thank you so much! I worked on that the whole morning. The mapping in Elasticsearch was the problem.

I'm glad that I could help. Good luck :wink:

