Using Filebeat, cannot read JSON log into a custom index

The index template is already created. Filebeat runs without reporting any error, but I can't see the new index, which should be called "fb-message-7.12.0-yyyy.mm.dd".

I noticed in other issues that I was missing the parameter to indicate this is a custom JSON file being read, so I added the Processors part.

Can anybody help? Filebeat still isn't telling me what my issue is. I'm trying to load this custom JSON file into ES using a custom index template.

test-es.json sample:
{"communication_medium_id":1,"category_name":"Appointment reminders","processed_date":"2020/01/14 00:00:01","message_fk":1182176}

Elasticsearch version 7.12.0

filebeat.yml below:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - /home/test-es.json

# ======================= Elasticsearch template setting =======================
setup.template.overwrite: true
setup.ilm.enabled: false
setup.template.enabled: true
setup.template.name: "fb-message-%{[agent.version]}"
setup.template.pattern: "fb-message-%{[agent.version]}-*"


# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["ip:9200"]
  index: "fb-message-%{[agent.version]}-%{+yyyy.MM.dd}"
  # Protocol - either `http` (default) or `https`.
  #protocol: "https"
  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  username: "xxx"
  password: "xxx"


# ================================= Processors =================================
processors:
  - decode_json_fields:
      fields: ["communication_medium_id","category_name","Appointment","processed_date","message_fk"]
      process_array: false
      max_depth: 1
      target: ""
      overwrite_keys: false
      add_error_key: true

There are two things that could be the issue here:

  1. When data is read by a Filebeat log input, the log line is put in the field message, so first try changing your processor to this (there is a before/after sketch of the effect below this list):

     processors:
       - decode_json_fields:
           fields: ["message"]
           process_array: false
           target: ""
           overwrite_keys: false
           add_error_key: true
    
  2. Is the JSON in your log file one object per line (ndjson), or is a single JSON object spread over multiple lines? If it is the latter, you need to add multiline support like so:

    - type: log
      enabled: true
      paths:
        - /home/test-es.json
      multiline.pattern: '^{'
      multiline.negate: true
      multiline.match: after
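
For reference, here is a rough before/after sketch of what the first suggestion does, using the single line from your test-es.json sample. Before decoding, the event looks roughly like this (Filebeat metadata fields such as @timestamp, agent and log.file.path are omitted for readability):

    {"message": "{\"communication_medium_id\":1,\"category_name\":\"Appointment reminders\",\"processed_date\":\"2020/01/14 00:00:01\",\"message_fk\":1182176}"}

After decode_json_fields runs with fields: ["message"] and target: "", the decoded keys are lifted to the top level of the event, where your index template can map them:

    {"message": "...", "communication_medium_id": 1, "category_name": "Appointment reminders", "processed_date": "2020/01/14 00:00:01", "message_fk": 1182176}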

Thanks for replying... actually, my JSON is supposed to look like the example below, which works when I test it directly against ES, but even after adding your suggestion it doesn't work:

{"data": [{"communication_medium_id":1,"category_name":"Appointment reminders","processed_date":"2020/01/14 00:00:01","message_fk":1182176},{"communication_medium_id":221,"category_name":"Appointment reminders","processed_date":"2020/01/14 00:00:01","message_fk":1182176}]}

Not sure what I'm not getting, but if I test with a single-record JSON without the "data" wrapper it works... but my file has many records.
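
One thing worth noting about that wrapped format: as far as I know, Filebeat processors such as decode_json_fields won't split one event into multiple documents, so a file containing a single {"data": [...]} object becomes at best one event, not one per record. A common workaround is to flatten the file into ndjson (one record per line) before Filebeat reads it, for example with jq. A minimal sketch, assuming the layout from your sample (the output filename test-es.ndjson is just illustrative):

    # Emit each element of the top-level "data" array as one compact JSON line
    jq -c '.data[]' /home/test-es.json > /home/test-es.ndjson

Then point the input's paths at /home/test-es.ndjson, and the decode_json_fields processor on message from suggestion 1 should apply to each record.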
