Adding config for custom log integration on an agent

Hello,

I'm trying to index JSON logs with a Fleet-managed agent using the Custom logs integration. I successfully downloaded the agent to my computer and specified the path where the log files live.

However, when I generated a few log files, I noticed that the agent wouldn't pick up the ones that were JSON. I suspect I may need to edit the agent.logging.json and agent.logging.ecs settings in the agent's YAML file, but I don't know how!

I've tried adding them via the 'Custom configurations' field when editing the Custom logs integration in Kibana/Fleet, but the changes don't seem to be applied the way I imagined they would be.
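For reference, these are the two settings I'm trying to add (roughly as I entered them in the Custom configurations box):

  agent.logging.json: true
  agent.logging.ecs: true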

Here is the policy I currently have (notice how agent.logging.json ended up nested under the stream rather than under the agent field):

id: 151cb5e0-5f69-11eb-84d5-c3f10edecab1
revision: 2
outputs:
  default:
    type: elasticsearch
    hosts:
      - 'XXXX'
agent:
  monitoring:
    enabled: false
    logs: false
    metrics: false
inputs:
  - id: 4fd8dfb0-5f69-11eb-84d5-c3f10edecab1
    name: python-log-integration
    revision: 1
    type: logfile
    use_output: default
    meta:
      package:
        name: log
        version: 0.4.6
    data_stream:
      namespace: testing
    streams:
      - id: logfile-log.log
        data_stream:
          dataset: tbd
        paths:
          - /Users/**/briqLogs/*.log
        agent.logging.json: true
        agent.logging.ecs: true
fleet:
  kibana:
    protocol: https
    hosts:
      - XXXX

Here is an example of a json file that I want to index:

And here is the non-json file that made it into the data stream:

How do I index ECS-formatted JSON files with a Custom logs integration?

Perhaps editing these particular fields is a standalone-configuration-only feature.

However, even after adding these settings to the elastic-agent.yml file and running the agent in standalone mode, the JSON logs are still not picked up by the agent.

If I understand correctly, these are basically Filebeat settings under the hood, so I think you want to use the following...

try setting...

  json.keys_under_root: true
  json.add_error_key: true
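For a Fleet-managed agent, these should go in the 'Custom configurations' box of the Custom logs integration. A rough sketch of how the stream in your policy might then look (reusing the IDs and paths from the policy you posted above; you may not need every json.* option):

  streams:
    - id: logfile-log.log
      data_stream:
        dataset: tbd
      paths:
        - /Users/**/briqLogs/*.log
      json.keys_under_root: true
      json.add_error_key: true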

I've successfully added some JSON Filebeat settings, but after re-installing the agent, it's still not able to detect the troublesome JSON logs :confused:

Terminal: sudo elastic-agent inspect output --output default --program filebeat

filebeat:
  inputs:
  - id: logfile-log.log
    index: logs-tbd-testing
    json:
      add_error_key: true
      expand_keys: true
      keys_under_root: true
      overwrite_keys: true
    meta:
      package:
        name: log
        version: 0.4.6
    name: python-logs
    paths:
    - /Users/**/briqLogs/*.log
    processors:
    - add_fields:
        fields:
          dataset: tbd
          namespace: testing
          type: logs
        target: data_stream
    - add_fields:
        fields:
          dataset: tbd
        target: event
    - add_fields:
        fields:
          id: 56e6fd15-b7ec-4c87-9c0e-bcc32549e9c8
          snapshot: false
          version: 7.10.2
        target: elastic_agent
    revision: 1
    type: log
output:
  elasticsearch:
    hosts:
    - XXXX
    password: XXXX
    username: XXXX

---

Ohhh @Tim_Estes
I did not notice that the JSON log file needs to be a single JSON object per line (ndjson), not "pretty" / expanded...
That is the expected format and normal for JSON logs, I believe.

From the docs:

"These options make it possible for Filebeat to decode logs structured as JSON messages. Filebeat processes the logs line by line, so the JSON decoding only works if there is one JSON object per line."

Yes, this seems to be the issue. Once I started saving the log files with a single JSON object per line (and a newline character at the end of the file), the messages were ingested into the data stream. Thank you for your help.
