Collecting Docker logs using Filebeat

Hi, I am trying to collect this kind of log from a Docker container:

[1620579277][642e7adc-74e1-4b89-a705-d271846f7ebc][channel1][afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4][usecase_cc][set] ex02 set
[1620579277][ac9f99b7-0126-45ed-8a74-6adc3a9d6bc5][channel1][afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4][usecase_cc][set][Transaction] Aval =201 Bval =301 after performing the transaction
[1620579277][9211a9d4-3fe6-49db-b245-91ddd3a11cd3][channel1][afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4][usecase_cc][set][Transaction] Transaction makes payment of X units from A to B
[1620579280][0391d2ce-06c1-481b-9140-e143067a9c2d][channel1][1f5752224da4481e1dc4d23dec0938fd65f6ae7b989aaa26daa6b2aeea370084][usecase_cc][get] Query Response: {"Name":"a","Amount":"200"}

I have set up filebeat.yml as follows:

    filebeat.inputs:
    - type: container
      paths:
        - '/var/lib/docker/containers/container-id/container-id.log'

    processors:
    - add_docker_metadata:
        host: "unix:///var/run/docker.sock"
    - dissect:
        tokenizer: '{"log":"[%{time}][%{uuid}][%{channel}][%{id}][%{chaincode}][%{method}] %{specificinfo}\"\n%{}'
        field: "message"
        target_prefix: ""

    output.elasticsearch:
      hosts: ["elasticsearch:9200"]
      username: "elastic"
      password: "changeme"
      indices:
        - index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"
    logging.json: true
    logging.metrics.enabled: false

Although Elasticsearch and Kibana are deployed successfully, I am getting this error when a new log line is generated:

{"error":{"root_cause":[{"type":"index_not_found_exception","reason":"no such index 
[filebeat]","resource.type":"index_or_alias","resource.id":"filebeat","index_uuid":"_na_","index":"filebeat"}],"type":"index_not_found_exception","reason":"no such index 
[filebeat]","resource.type":"index_or_alias","resource.id":"filebeat","index_uuid":"_na_","index":"filebeat"},"status":404}

Note: I am using version 7.12.1, and Kibana, Elasticsearch and Logstash are deployed in Docker.

Thanks in advance,

Santiago.

Hi @sfigueroa :slightly_smiling_face:

Looking at your config and checking the docs, it seems you are missing some settings required to use indices on the Elasticsearch output: Configure the Elasticsearch output | Filebeat Reference [7.12] | Elastic

Maybe you want to just remove the indices key and leave a single index field. Also ensure that your events contain an agent.version field; I'm not fully sure it's included when using Filebeat (it is included when using Elastic Agent).
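
Something like this is what I mean, as a rough sketch (the hosts and credentials are just the ones from your config; the setup.* lines are from memory, so double-check them against the 7.12 docs before relying on them):

    output.elasticsearch:
      hosts: ["elasticsearch:9200"]
      username: "elastic"
      password: "changeme"
      index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"

    # If I remember correctly, once you override index you also need to set the
    # template name/pattern, and ILM will otherwise ignore the custom index:
    setup.ilm.enabled: false
    setup.template.name: "filebeat"
    setup.template.pattern: "filebeat-*"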

Hi @Mario_Castro,

Thanks for the response :wink:.

I have introduced filebeat.autodiscover in the filebeat.yml file, but without results so far:

    filebeat.autodiscover:
      providers:
        - type: docker
          templates:
            - condition:
                contains:
                  docker.container.image: dev-peer0.org1.example.com-usecase_cc-v1.0-b38729db0bb1b247332531ef780a441eb3ea85460180de4eb8ddb7d1ff7dd202
              config:
                - type: container
                  paths:
                    - /var/lib/docker/containers/b202d819663ad0805ebb6360119d4c365cc1b5c03d12516793d1386983d27a89/b202d819663ad0805ebb6360119d4c365cc1b5c03d12516793d1386983d27a89.log
                  exclude_lines: ["^\\s+[\\-`('.|_]"]  # drop asciiart lines

    processors:
    - add_docker_metadata:
       host: "unix:///var/run/docker.sock"
    - dissect:
       tokenizer: '{"log":"[%{type}][%{time}][%{uuid}][%{channel}][%{id}][%{chaincode}][%{method}] %{specificinfo}\"\n%{}'
       field: "message"       
       target_prefix: ""

    output.elasticsearch:
      hosts: ["localhost:9200"]
      username: "elastic"
      password: "changeme"

    logging.json: true
    logging.metrics.enabled: false

I am still getting the same error from Elasticsearch.

The point is that I am not receiving any errors in the Filebeat container. Is there a way to test it? E.g., using curl?
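
For example, I was thinking of something along these lines (just a guess on my side; elastic/changeme and localhost:9200 are from my config, and I believe Filebeat also ships test subcommands):

    # Ask Elasticsearch which filebeat-* indices actually exist:
    curl -u elastic:changeme "http://localhost:9200/_cat/indices/filebeat*?v"

    # From inside the Filebeat container, check the config and the connection
    # to the configured output:
    filebeat test config
    filebeat test output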

Hi @Mario_Castro,

I have an update.

First, I have returned to the previous version where I was using filebeat.inputs.

In addition, I am using output.console to debug the Filebeat events:

    output.console:
      enabled: true
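
To actually see those events I just tail the container output (assuming the Filebeat service is literally named filebeat; adjust to your own compose/service name):

    docker logs -f filebeat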

Now I have two scenarios:

  1. When I use type: log
    filebeat.inputs:
    - type: log
      enabled: true
    

returning:

{"@timestamp":"2021-05-11T21:28:09.545Z","@metadata": "beat":"filebeat","type":"_doc","version":"7.12.1"},"ecs":{"version":"1.8.0"},"log": {"offset":0,"file": {"path":"/var/lib/docker/containers/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18-json.log"}},"stream":"stderr","time":"2021-05-11T20:01:00.818112662Z","input":{"type":"log"},"host":{"name":"filebeat"},"agent":{"ephemeral_id":"84e8da1e-d2a7-4be8-9828-35ea81dc88c4","id":"860fbab2-a23e-4241-8675-144ae9e5a353","name":"filebeat","type":"filebeat","version":"7.12.1","hostname":"filebeat"}}
{"@timestamp":"2021-05-11T21:28:09.545Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.12.1"},"time":"2021-05-11T20:01:00.823706945Z","input":{"type":"log"},"ecs":{"version":"1.8.0"},"host":{"name":"filebeat"},"agent":{"name":"filebeat","type":"filebeat","version":"7.12.1","hostname":"filebeat","ephemeral_id":"84e8da1e-d2a7-4be8-9828-35ea81dc88c4","id":"860fbab2-a23e-4241-8675-144ae9e5a353"},"log":{"file":{"path":"/var/lib/docker/containers/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18-json.log"},"offset":249},"stream":"stderr"}
  2. When I use type: container
    filebeat.inputs:
    - type: container
      enabled: true
    

returning:

{"@timestamp":"2021-05-11T20:01:00.818Z","@metadata": "beat":"filebeat","type":"_doc","version":"7.12.1"},"ecs": {"version":"1.8.0"},"stream":"stderr","error":{"message":"Error decoding JSON: json: cannot unmarshal number into Go value of type map[string]interface {}","type":"json"},"log":{"offset":0,"file": {"path":"/var/lib/docker/containers/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18-json.log"}},"input":{"type":"container"},"host":{"name":"filebeat"},"agent":{"ephemeral_id":"3604feac-de36-43de-a304-36c770b3fc60","id":"56b1d846-9112-49f2-86e4-6b87befaf59a","name":"filebeat","type":"filebeat","version":"7.12.1","hostname":"filebeat"}}
{"@timestamp":"2021-05-11T20:01:00.823Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.12.1"},"input":{"type":"container"},"ecs":{"version":"1.8.0"},"host":{"name":"filebeat"},"agent":{"id":"56b1d846-9112-49f2-86e4-6b87befaf59a","name":"filebeat","type":"filebeat","version":"7.12.1","hostname":"filebeat","ephemeral_id":"3604feac-de36-43de-a304-36c770b3fc60"},"log":{"offset":249,"file":{"path":"/var/lib/docker/containers/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18/d89f4cf865e6dd24d275c15660b7ef7cfdb2afb867c59bcb471c7dc8c9ceeb18-json.log"}},"stream":"stderr","error":{"message":"Error decoding JSON: json: cannot unmarshal number into Go value of type map[string]interface {}","type":"json"}}

However, I still have not solved the issue.

I ended up using an alternative approach, which was to send the logs to Logstash directly. One mistake I was making was that the path I added in the Filebeat config was not actually mapped into the Filebeat container as a volume, so Filebeat could not see the Docker log files.
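
For anyone who runs into the same thing, the kind of mapping that was missing looks roughly like this in docker-compose (the service name and layout here are illustrative, not my exact setup):

    services:
      filebeat:
        image: docker.elastic.co/beats/filebeat:7.12.1
        user: root
        volumes:
          - ./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
          # Without this mount, the /var/lib/docker/containers/... paths in
          # filebeat.yml point at nothing inside the Filebeat container:
          - /var/lib/docker/containers:/var/lib/docker/containers:ro
          - /var/run/docker.sock:/var/run/docker.sock:ro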

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.