Filebeat sending JSON data to Elasticsearch issue

I am using Filebeat to send log data to Elasticsearch and view it in Kibana.

Log example: error_log.log

{"type":"ERROR","errorType":"ERROR","message":"Error in add","others":{"error": "something went wrong","source":"/public/new/user"}}
{"type":"ERROR","errorType":"ERROR","message":"Error in update","others":{"error": "something went wrong","source":"/public/users/edit"}}


    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - error_log.log
      json.keys_under_root: true

    processors:
      - decode_json_fields:
          fields: ["others"]

    filebeat.config.modules:
      path: ${path.config}/modules.d/*.yml

    output.elasticsearch:
      hosts: [""]

With this configuration, logging in Kibana does not work. If I remove json.keys_under_root: true and the processors section, it works, but the messages are indexed only as plain text, not as JSON.

Hi @Partha_Biswas,

the decode_json_fields processor decodes JSON values from string fields. Looking at the example log lines you gave, the value assigned to the others key is not a string but already an object. That means the initial JSON parsing via json.keys_under_root should have parsed it already, and the processor likely fails.

Have you tried removing the processor but leaving json.keys_under_root in place?
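
In case it helps, here is a minimal input sketch without the processor (the path is taken from your snippet and is an assumption about where the file lives); json.add_error_key is optional, but it surfaces JSON parse failures as an error field in the indexed documents, which makes debugging easier:

    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - error_log.log
      json.keys_under_root: true
      json.add_error_key: true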

:information_source: Unrelated to the indexing, but important for querying later: in order to properly search the resulting documents, you will likely want to declare the others field as a nested-type field in your mapping.
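
For example, a sketch of such a mapping (the index name error-logs is just a placeholder, use your own):

    PUT error-logs
    {
      "mappings": {
        "properties": {
          "others": { "type": "nested" }
        }
      }
    }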

Yes, I tried with json.keys_under_root and removed the processors, but Kibana logging still does not work.

Could there be a problem with my error_log.log? Its lines are in JSON format, but the file extension is not .json, and the file as a whole is not wrapped in { }.

I don't expect the file name extension to be of any significance. Could you check if there is anything in the Filebeat logs or Elasticsearch logs that correlates with the parsing failures?

I found my error, and now it's working.

That's good to hear. Is it something you can share with the community so we can all learn from it?