Filebeat 7.2 shipping to Elastic Cloud .yml - Error decoding JSON logs

Hi, we have moved to Elastic Cloud from our on-prem 6.2 ELK cluster. We are currently having some problems decoding JSON Docker logs and renaming indices. My filebeat.yml test file is as follows:

filebeat.inputs:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  document_type: docker
  json.message_key: log
  json.add_error_key: false
  encoding: utf-8
  tags: ["game-gateway"]

processors:
- add_docker_metadata: ~

json.ignore_decoding_error: true

cloud.id: XXXXX
cloud.auth: XXXXX
index: gamegateway1

Before, we were going through Logstash and renaming the indices was quite easy and flexible. I had to include json.ignore_decoding_error: true to cut off those decoding error messages. Can someone please advise?

Thanks

Please format logs and configs using the </> button.

There should be no decoding errors if the log really is JSON. Have you checked the log files that produce decoding errors to confirm they actually contain JSON? Do they have very long lines by any chance?

You'd need to add json.ignore_decoding_error: true to the input configuration block. It has no effect where you put it.
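For example, a minimal sketch of the input block with that option in the right place (path and tags taken from your config above):

filebeat.inputs:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.add_error_key: false
  # json.* options only take effect here, inside the input definition
  json.ignore_decoding_error: true
  tags: ["game-gateway"]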

Have you tried the docker input?
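Roughly, it could look like this (the docker input parses the Docker JSON log format itself, so the json.* options are not needed):

filebeat.inputs:
- type: docker
  # read logs from all containers on the host
  containers.ids:
    - '*'
  tags: ["game-gateway"]

processors:
- add_docker_metadata: ~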

Did some testing with JSON-only log files and it works; I believe the error was due to other log formats. However, I still cannot change the index names, it is always sending data to filebeat-7.2. Any suggestions?

This is the write alias. Filebeat uses it because you have ILM enabled.
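If you do want a custom index name instead of the write alias, the usual approach is to disable ILM and set the index together with a matching template name and pattern, something along these lines (gamegateway1 taken from your config; the date pattern is just an example):

setup.ilm.enabled: false
setup.template.name: "gamegateway1"
setup.template.pattern: "gamegateway1-*"

output.elasticsearch:
  index: "gamegateway1-%{[agent.version]}-%{+yyyy.MM.dd}"

These output.elasticsearch settings still apply when you connect via cloud.id and cloud.auth.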

Yep managed to fix that. Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.