Unable to push nested json from filebeat to elasticsearch

So my requirement is fairly simple. I have logs in JSON format, where each line corresponds to one document. The JSON is nested. When I use a simple JSON document, i.e. one without any nesting, it works fine and is pushed to Elasticsearch as expected. However, the nested JSON throws an error.
My stack details:
Elastic Stack: 7.2.0 (Docker container)
Filebeat: 7.3.0 (local setup, non-Docker)

My Filebeat config file is:

filebeat.inputs:
- type: log
  paths:
    - /Users/quiqua/Downloads/test.json
  json.keys_under_root: true
  json.add_error_key: true

processors:
- decode_json_fields:
    fields: ["unittests"]
    process_array: true

output.elasticsearch:
  hosts: ["localhost:9200"]

My log file :

{"test":"this works"}
{"test":{"nested":"this doesn't work"}}

Error trace :

2019-08-13T02:23:15.082+0530 WARN elasticsearch/client.go:535 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbf4c924e8416e7d0, ext:190176101272, loc:(*time.Location)(0x62bc980)}, Meta:common.MapStr(nil), Fields:common.MapStr{"agent":common.MapStr{"ephemeral_id":"78cf1fd6-d809-4bb2-8f91-c5020a3baa67", "hostname":"quiQUAs-MacBook-Pro.local", "id":"17690335-33e8-4690-b8e0-308062e40507", "type":"filebeat", "version":"7.3.0"}, "ecs":common.MapStr{"version":"1.0.1"}, "host":common.MapStr{"name":"quiQUAs-MacBook-Pro.local"}, "input":common.MapStr{"type":"log"}, "log":common.MapStr{"file":common.MapStr{"path":"/Users/quiqua/Downloads/test.json"}, "offset":17}, "test":common.MapStr{"nested":"this doesn't work"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc0003241a0), Source:"/Users/quiqua/Downloads/test.json", Offset:57, Timestamp:time.Time{wall:0xbf4c924e840b2418, ext:190175330707, loc:(*time.Location)(0x62bc980)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0xa18955, Device:0x1000007}}, TimeSeries:false}, Flags:0x1} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [test] of type [keyword] in document with id 'Ajqbh2wBbkh2UY59QjrF'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:49"}}

I have tried all possible solutions and yet the issue persists. Any help would be great. Thank you.

Even though Elasticsearch is sometimes (incorrectly) called schemaless, it does in fact have a data schema, called "mapping". If you try to index a document that contains fields that have not been mapped, Elasticsearch will dynamically map those fields.

In your case, Filebeat ingests the first line, whose test field contains a string value, into Elasticsearch. Elasticsearch will dynamically map that field as a "keyword" string.

Then, Filebeat tries to ingest the second line. This line also contains a test field, but that field contains a JSON object instead of a string. This now leads to a mapping conflict: Elasticsearch cannot index an object into a field that has already been mapped as a keyword string.
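The two incompatible value types are easy to see by parsing the lines directly (a minimal sketch; the first log line is an assumption based on the error trace, since only the nested line triggers the failure):

```python
import json

# One JSON document per line, as Filebeat reads them with
# json.keys_under_root enabled. The first line is an assumed example
# of the earlier, non-nested log entry.
lines = [
    '{"test": "this works"}',
    '{"test": {"nested": "this doesn\'t work"}}',
]

for line in lines:
    value = json.loads(line)["test"]
    # str vs. dict: the same field carries two incompatible JSON types,
    # which is exactly the conflict the Elasticsearch mapper rejects.
    print(type(value).__name__)  # prints "str", then "dict"
```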

What's the solution? Give your fields a different name, depending on the type of value that they contain. The following file should work without any problems:

{"test":"this works"}
{"test2":{"nested":"this does also work"}}
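Alternatively, if you need to keep the test field name, you can create the index with an explicit mapping before any data is ingested, so that test is mapped as an object from the start. A sketch of such a mapping body (this assumes every document then uses the object form, because a plain string value in test would now be rejected instead):

```json
{
  "mappings": {
    "properties": {
      "test": {
        "properties": {
          "nested": { "type": "keyword" }
        }
      }
    }
  }
}
```

Send this body with a PUT request to your index (for example via Kibana Dev Tools or curl) before Filebeat writes its first event.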
