Filebeat crashes with "When using the JSON decoder and multiline together, you need to specify a message_key value" error

Filebeat Version: 6.5.4 and Logstash Version: 6.5.4

Hello, here is the Filebeat configuration I'm using:

- type: log
  paths:
    - /var/lib/docker/containers/*/*.log
  symlinks: true
  multiline.pattern: ^[[:space:]]
  multiline.negate: false
  multiline.match: after
  ignore_older: 1h
  clean_inactive: 65m
  close_inactive: 5m
  scan_frequency: 20s
  json.message_key: log
  json.keys_under_root: true
  tail_files: true
  fields:
    clustername: ${Clustername}
  processors:
    - add_kubernetes_metadata:
        in_cluster: true
    - drop_event:
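For context on what this config is decoding: Docker's json-file logging driver writes one JSON object per line, and with `json.keys_under_root: true` the decoded keys are lifted to the root of the Filebeat event. A minimal sketch of that decoding step (the sample log line below is made up for illustration):

```python
import json

# Made-up example of one line from /var/lib/docker/containers/<id>/<id>-json.log
# (Docker's json-file logging driver writes one JSON object per line).
raw_line = '{"log":"WARN something happened\\n","stream":"stderr","time":"2019-04-10T12:00:00.0Z"}'

event = {}                      # the Filebeat event being built
decoded = json.loads(raw_line)  # what the JSON decoder produces

# With json.keys_under_root: true, the decoded keys are merged into
# the root of the event instead of being nested under a "json" object.
event.update(decoded)

print(sorted(event))   # ['log', 'stream', 'time']
print(event["log"])    # the original application log line
```

So with `json.message_key: log`, multiline grouping is applied to the `log` value that the decoder produced.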

Setting the JSON message key to `log` leads to an indexing failure on the Logstash side with this error:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.04.10", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x20018e3e>], :response=>{"index"=>{"_index"=>"logstash-2019.04.10", "_type"=>"doc", "_id"=>"uKwYBWoBV9VbumyOAOsx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [log] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:527"}}}}}
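What that mapper error says is that the `logstash-2019.04.10` index already has `log` mapped as `text` (a string field), but at least one incoming event carries `log` as a JSON object, and Elasticsearch cannot read text out of a `START_OBJECT`. A toy model of that conflict, with made-up events:

```python
# Toy model of the Elasticsearch mapping conflict behind
# "failed to parse field [log] of type [text]". The events are made up.
mapping = {"log": "text"}  # an earlier string value fixed the field type as text

ok_event = {"log": "plain string line"}             # indexes fine
bad_event = {"log": {"level": "warn", "msg": "x"}}  # object -> type conflict

def index(event, mapping):
    value = event["log"]
    if mapping["log"] == "text" and isinstance(value, dict):
        # ES raises mapper_parsing_exception: "Can't get text on a START_OBJECT"
        return "mapper_parsing_exception"
    return "indexed"

print(index(ok_event, mapping))   # indexed
print(index(bad_event, mapping))  # mapper_parsing_exception
```

Once a field's type is established in an index's mapping, later documents with an incompatible shape for that field are rejected with status 400.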

So, to overcome this I tried commenting out `json.message_key: log`, but that crashed Filebeat with:

Exiting: When using the JSON decoder and multiline together, you need to specify a message_key value accessing '0' (source:'/usr/share/filebeat/inputs.d/kubernetes.yml')

As a workaround I put a random name in the message key, which stopped Filebeat from crashing. However, every log line is now inserted with that message-key field present but empty, while in Kibana the default `log` field is the one actually showing the log lines. What is causing this, and how can I overcome it? Am I doing anything wrong?
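One plausible reading of that symptom (an assumption, not confirmed against the Filebeat source): multiline operates on the field named by `json.message_key`, so pointing it at a key the decoded JSON does not contain gives multiline an empty string to work with, while the real `log` key is still decoded and shipped. A sketch of that reading, using a hypothetical placeholder key `dummy`:

```python
import json

# Assumed-behavior sketch: multiline reads event[message_key].
# "dummy" is a hypothetical placeholder key, not present in the data.
message_key = "dummy"
raw_line = '{"log":"real log line\\n","stream":"stdout","time":"2019-04-10T12:00:00.0Z"}'

event = json.loads(raw_line)                     # keys_under_root-style merge
text_for_multiline = event.get(message_key, "")  # key missing -> nothing to join

event[message_key] = text_for_multiline  # empty field appears on every event
print(event["log"])    # the real content still lands in "log"
print(event["dummy"])  # empty string, matching the symptom described above
```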

