How to fix Filebeat mapper_parsing_exception: "tried to parse field [x] as object, but found a concrete value"?

I am using Filebeat 6.2.1 to pick up logs, parse them as JSON, and send them to Elasticsearch 6.5.4, and I am getting the following error:

WARN    elasticsearch/client.go:520
Cannot index event publisher.Event{Content:beat.Event{Timestamp: 
[...]
"mapper_parsing_exception","reason":
"object mapping for [event] tried to parse field [event] as object, but found a concrete value

I think this is a field mapping issue, probably related to nested JSON parsing, but I am not sure how to approach solving it.

filebeat.yml:

# ======================================
# Filebeat prospectors

filebeat.prospectors:

- type: log

  # Change to true to enable this prospector configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/bridge/*.json
    #- c:\programdata\elasticsearch\logs\*

  ignore_older: 24h
  scan_frequency: ${FilebeatScanFrequency}

  json:
    message_key: event
    keys_under_root: true

# ======================================
# Elasticsearch template settings

setup.template.settings:
  index.number_of_shards: 1
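Roughly, the json settings above tell Filebeat to decode each line as JSON and, with keys_under_root, lift the decoded keys to the top level of the event. A toy Python sketch of that behaviour (an illustration only, not Filebeat's actual implementation):

```python
import json

def decode_log_line(line, keys_under_root=True):
    """Toy stand-in for Filebeat's json.* settings: decode the line as
    JSON and optionally merge the decoded keys into the event root.
    (message_key, which Filebeat uses for multiline/filtering, is not
    modeled here.)"""
    decoded = json.loads(line)
    event = {"@timestamp": "2018-12-28T16:15:48.012Z"}  # beat-added metadata
    if keys_under_root:
        event.update(decoded)    # decoded keys land at the root of the event
    else:
        event["json"] = decoded  # otherwise they nest under a "json" key
    return event

doc = decode_log_line('{"event": "", "Level": "Information"}')
# "event" ends up at the event root as a *string*, which is what later
# collides with an index mapping that expects "event" to be an object.
```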

Example filebeat output:

{
  "_index": "filebeat-6.2.1-date",
  "_type": "doc",
  "_id": "nh6Z9WcBwAmR4kjdfofdk-H",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-12-28T16:15:48.012Z",
    "event": "",
    "Timestamp": "2018-12-28T21:45:42.7876369+05:30",
    "Level": "Information",
    "MessageTemplate": "{AuthenticationScheme} was not authenticated. 
Failure message: {FailureMessage}",
    "beat": {
      "name": "DB-01",
      "hostname": "DB-01",
      "version": "6.2.1"
    },
    "Properties": {
      "EventId": {
        "Id": 7
      },
      "Country": "Canada",
      "Format": "json",
      "MachineName": "DB-01",
      "ThreadId": 57,
      "Source": "Example API",
      "FailureMessage": "No authorization header.",
      "SourceContext": 
      "Odachi.AspNetCore.Authentication.Basic.BasicMiddleware",
      "RequestPath": "/status",
      "AuthenticationScheme": "Basic",
      "ProcessId": 4627,
      "EnvironmentUserName": "EXAMPLEDNS\\example.api",
      "RequestId": "0NKDFUJDFKDL",
      "Environment": "Production"
    },
    "source": "D:\\Logs\\Company Example-api.json",
    "offset": 9575044,
    "RenderedMessage": "\"Basic\" was not authenticated. Failure message: 
\"No authorization header.\"",
    "prospector": {
      "type": "log"
    }
  },
  "fields": {
    "@timestamp": [
      "2018-12-28T16:15:48.012Z"
    ]
  },
  "sort": [
    154601321324346
  ]
}

I am looking for a way to fix the mapping error correctly and successfully send parsed JSON logs to Elasticsearch. Thanks in advance for any help or suggestions.

Do you have other filebeat versions running as well?

What is the exact index name you are indexing into?

In Elasticsearch each index has an index mapping, and we normally provide a template with known fields. The mapping is a kind of schema: it configures the type for each field. If a field is not known (not in the initial template and not seen yet), then Elasticsearch dynamically tries to derive its type from the first document that contains it.

In your current index the field event is said to be a JSON object, but your actual document says event is a string ("event": ""). This is a mapping conflict, and the event cannot be indexed.
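To illustrate the conflict, here is a toy sketch (not Elasticsearch's real mapper) of how dynamic mapping locks in the first derived type and then rejects a later document whose field has a different JSON type:

```python
def derive_type(value):
    """Very rough stand-in for dynamic mapping type detection."""
    if isinstance(value, dict):
        return "object"
    if isinstance(value, str):
        return "text"
    raise ValueError("type not handled in this sketch")

def index_docs(docs):
    """Index documents into a fresh toy 'index', deriving the mapping
    dynamically; conflicting types raise, like mapper_parsing_exception."""
    mapping = {}  # field -> first derived type
    for doc in docs:
        for field, value in doc.items():
            derived = derive_type(value)
            if mapping.setdefault(field, derived) != derived:
                raise TypeError(
                    f"object mapping for [{field}]: expected "
                    f"{mapping[field]}, got {derived}"
                )

index_docs([{"event": {"id": 7}}])  # fine on its own: event is an object
# A later string value for the same field is the "concrete value" case:
# index_docs([{"event": {"id": 7}}, {"event": ""}]) raises TypeError
```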

Filebeat 6.5 introduces 'event' in its mapping, as it is part of ECS, but 6.2.1 should not reserve the event namespace yet.

The error might occur due to (one of):

  • you are actually using filebeat 6.5.x
  • you might index filebeat 6.2.x data into the same index used by 6.5.x
  • you might index filebeat 6.2.x data into an index that matches a template configuration from a former/other 6.5.x installation
  • the type of the event field in your JSON files is not always a string
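One way to check the last bullet is to scan the log files themselves and record which JSON types each top-level field takes (a hypothetical helper; point it at your own lines):

```python
import json
from collections import defaultdict

def field_types(lines):
    """Map each top-level field to the set of JSON types it appears with."""
    types = defaultdict(set)
    for line in lines:
        try:
            doc = json.loads(line)
        except ValueError:
            continue  # skip lines that are not valid JSON
        for field, value in doc.items():
            types[field].add(type(value).__name__)
    return types

lines = ['{"event": ""}', '{"event": {"id": 7}}']
conflicts = {f: t for f, t in field_types(lines).items() if len(t) > 1}
# Fields appearing with more than one type are the ones that will trip
# the mapper once the index has locked in a single type for them.
```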

Thank you @steffens, very helpful. Yes, the conflict was between versions, and what fixed it for me (for now) was simply downgrading back to 6.2.x, which corrected the event field mapping error (object as opposed to string). Thanks again for taking a look!

I'm a bit confused about the backward compatibility. I thought there was backward/forward compatibility within the same major version number.

So Elasticsearch 6.3 cannot safely work with a 6.5 Beat?

It seems from your answer that the major and minor versions have to match and only the patch versions can differ?

In a production system I definitely anticipate that Beat versions and the Elasticsearch version become out of sync. It would be nice if Elasticsearch could give some details about the event that is causing the issue; it might be an old running Beat that was not updated yet. Logging the hostname and Beat version would help us identify where the culprit is.


I can't say I completely understand it myself, and while the release docs help, there always seem to be some fiddly bits. The releases often alter/add/delete fields in the index patterns, which then causes issues. I didn't have enough time to dig deep, but rolling back versions corrected the index mapping error and logs started posting again.

I believe I am also seeing this issue with out-of-sync Filebeat versions. The events are being sent by Logstash to version-specific indices in the elasticsearch output plugin (index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"), but I am seeing the following errors in the Logstash logs for multiple bulk index events associated with the filebeat-6.5.4 indices:

Feb 05 16:43:01 sys-logstash logstash[3875]: [2019-02-05T16:43:01,545][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.5.4-2019.01.28", :_type=>"doc", :routing=>nil}, #LogStash::Event:0x71fe052d], :response=>{"index"=>{"_index"=>"filebeat-6.5.4-2019.01.28", "_type"=>"doc", "_id"=>"Y0mdv2gBHfYDK-4_RPpM", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [system.auth.user] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:85"}}}}}
Feb 05 17:09:26 sys-logstash logstash[3875]: [2019-02-05T17:09:26,785][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.5.4-2019.02.01", :_type=>"doc", :routing=>nil}, #LogStash::Event:0x50e6c551], :response=>{"index"=>{"_index"=>"filebeat-6.5.4-2019.02.01", "_type"=>"doc", "_id"=>"WXa1v2gBiK2CpjzRdPrF", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [system.auth.user] tried to parse field [user] as object, but found a concrete value"}}}}

We are running Filebeat 6.2.4, 6.3.2, and 6.5.4, Logstash 6.5.4, Elasticsearch 6.5.4, and Kibana 6.5.4. The Puppet module for Filebeat does not differentiate between minor versions or patch levels, so it installs whatever latest 6.x is available at the time of installation. We are using separate grok filters for syslog_auth events and syslog events.
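For what it's worth, the index pattern in that output plugin interpolates beat metadata per event, which is why each Filebeat version lands in its own index. A toy sketch of that interpolation (an illustration only, not Logstash's actual sprintf implementation; the timestamp handling is stubbed out):

```python
import re

def interpolate_index(pattern, event, timestamp="2019.01.28"):
    """Toy stand-in for Logstash %{[@metadata][...]} interpolation."""
    def lookup(match):
        path = match.group(1)
        if path.startswith("+"):
            return timestamp  # date-format codes stubbed in this sketch
        node = event
        for key in re.findall(r"\[([^\]]+)\]", path):
            node = node[key]  # walk the nested field reference
        return str(node)
    return re.sub(r"%\{([^}]+)\}", lookup, pattern)

event = {"@metadata": {"beat": "filebeat", "version": "6.2.4"}}
name = interpolate_index(
    "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}", event)
# A 6.2.4 shipper writes into a filebeat-6.2.4-* index, kept separate
# from the 6.5.4 indices, so their mappings cannot conflict.
```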


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.