Ingest JSON data into Elasticsearch using Filebeat

I attempted to import data from a JSON file into Elasticsearch, but ran into a problem: either my filebeat.yml has a configuration issue, or the input itself is the issue, because my file contains regular JSON while Filebeat only supports NDJSON.


    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - Metrics.json
      multiline.pattern: '^{'
      multiline.negate: true
      multiline.match: after

    processors:
      - decode_json_fields:
          fields: ["message"]
          process_array: false
          max_depth: 2
          target: "Json"
          overwrite_keys: true
          add_error_key: false

    output.elasticsearch:
      hosts: ["localhost:9200"]

JSON Input

      {
        "Sourcedata": [
          { "Language": " C", "Code": 106026 },
          { "Language": " C++", "Code": 52166 }
        ]
      }
  1. How do I convert regular JSON to NDJSON?
  2. What is the problem with my Filebeat configuration?
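One way to answer the first question is a small conversion script: NDJSON is simply one compact JSON object per line, so you can load the regular JSON file and emit each array entry on its own line. This is a minimal sketch assuming the structure shown above (a top-level `Sourcedata` array); the input is inlined here in place of reading `Metrics.json`.

```python
import json

# Assumed shape of the poster's Metrics.json (regular JSON).
doc = {
    "Sourcedata": [
        {"Language": " C", "Code": 106026},
        {"Language": " C++", "Code": 52166},
    ]
}

# NDJSON: one compact JSON object per line. Filebeat can then
# read the file line by line without any multiline settings.
ndjson = "\n".join(json.dumps(entry) for entry in doc["Sourcedata"])
print(ndjson)
```

In practice you would replace the inlined `doc` with `json.load(open("Metrics.json"))` and write the result to a new file that Filebeat watches.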

You could try using the Gsub processor | Elasticsearch Guide [7.16] | Elastic prior to decoding to remove the newline characters.
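As a sketch of that suggestion: a gsub processor in an Elasticsearch ingest pipeline can strip the newlines before a json processor decodes the event. The pipeline name `strip-newlines` and the target field `json` are assumptions, not from the thread.

```json
PUT _ingest/pipeline/strip-newlines
{
  "processors": [
    { "gsub": { "field": "message", "pattern": "\n", "replacement": "" } },
    { "json": { "field": "message", "target_field": "json" } }
  ]
}
```

Filebeat can then be pointed at the pipeline via `output.elasticsearch.pipeline: strip-newlines`.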

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.