Filebeat docker help

Sorry if these questions have been answered before; I am new to Elastic and cannot find the answers.

I have an entry in my docker logs that looks like this

Wed, 28 Aug 2019 15:40:23 GMT - info: Prematch events sync process started

When Filebeat ships it to Elasticsearch it ends up like this:

Wed, 28 Aug 2019 15:40:23 GMT - e[32minfoe[39m: Prematch events sync process started

Firstly, can I get rid of the "e[32m" and "e[39m"?

Secondly, can I get rid of the timestamp, as there is already a field for this?

Thirdly I am seeing these errors. How can I stop them?

ERROR readjson/json.go:52 Error decoding JSON: invalid character 'a' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'a' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'W' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: json: cannot unmarshal number into Go value of type map[stri
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character 'c' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: json: cannot unmarshal number into Go value of type map[stri
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value
ERROR readjson/json.go:52 Error decoding JSON: invalid character ':' looking for beginning of value

I forgot to include my config

#=========================== Filebeat inputs =============================
filebeat.inputs:
{% if inventory_hostname in groups['containers'] | default() %}
- type: container
  paths:
    - '/var/lib/docker/containers/*/*.log'
  multiline.pattern: '^[[:space:]]'
  multiline.negate: false
  multiline.match: after
  json.message_key: msg
  exclude_lines:
    - "^There are no requests to send"
    - "^Request method 'GET' not supported"
  fields:
    spenvironment.name: {{ env }}
{% endif %}

#============================= Filebeat modules ===============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 1
  index.number_of_replicas: 1

setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"

#============================== Kibana =====================================
setup.kibana:
{% for host in groups['kibana'] %}
  host: {{ host }}:5601
{% endfor %}

#============================== Elastic Output =====================================
output.elasticsearch:
  hosts:
{% for host in groups['elastic_stack'] %}
    - {{ host }}:9200
{% endfor %}
  ilm.enabled: auto
  ilm.rollover_alias: "filebeat"
  ilm.pattern: "{now/M{YYYY.MM}}-000001"

#================================ Processors =====================================

processors:
- add_host_metadata: ~
- add_fields:
    target: spenvironment
    fields:
      name: {{ env }}
{% if inventory_hostname in groups['containers'] | default() %}
- add_docker_metadata:
    host: "unix:///var/run/docker.sock"
{% endif %}

#================================ Logging =====================================
logging.level: warning
logging.to_files: true
logging.to_syslog: false
logging.files:
  path: /var/log/filebeat
  name: filebeat.log
  keepfiles: 7


Hi @Phil_Brady and welcome :)

They look like ANSI escape sequences, which are used to define colors; is it possible that these logs are colored?
You can probably configure your application not to log colored output, as color codes are usually not useful in log files.
If not, Filebeat doesn't have any feature to remove parts of the logs, so you may need to use an ingest pipeline with the gsub processor.
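A minimal sketch of such a pipeline, assuming the log line ends up in the `message` field (the pipeline name `strip-ansi` is made up for the example; the `e` you see in "e[32m" is the escape character 0x1b rendered as text, which is what the pattern below matches):

```
PUT _ingest/pipeline/strip-ansi
{
  "description": "Remove ANSI color escape sequences from the log line",
  "processors": [
    {
      "gsub": {
        "field": "message",
        "pattern": "\u001b\\[[0-9;]*m",
        "replacement": ""
      }
    }
  ]
}
```

Filebeat can then be told to send events through it with `pipeline: strip-ansi` under `output.elasticsearch`.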

You can define an ingest pipeline to parse your logs; there you could separate the date from the rest of the log message and use the date in the logs as the timestamp.
Before starting to implement your own pipeline, take a look at the existing Filebeat modules to see if there is already a module for the service generating these logs. Filebeat modules include predefined pipelines for well-known services.
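As a hedged sketch, a pipeline for a line like the one above could look as follows (the pipeline name and the `log_date` field are made up for the example, and the grok pattern assumes the exact `Wed, 28 Aug 2019 15:40:23 GMT - info: ...` layout shown):

```
PUT _ingest/pipeline/parse-app-log
{
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "^%{DATA:log_date} - %{LOGLEVEL:log.level}: %{GREEDYDATA:message}"
        ]
      }
    },
    {
      "date": {
        "field": "log_date",
        "formats": ["EEE, dd MMM yyyy HH:mm:ss zzz"],
        "target_field": "@timestamp"
      }
    },
    {
      "remove": { "field": "log_date" }
    }
  ]
}
```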

It looks like Filebeat is trying to parse as JSON something that is not a JSON document. Your configuration would help to diagnose this; I see you have pasted it, but it seems incorrectly formatted. Could you paste it again as preformatted text? There is a button in the toolbar with an icon like </> that can help with that.

Thanks, I will have a look at ingest pipelines.

For number 3, I fixed it by adding "encoding: plain" to filebeat.yml.
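For anyone finding this later, that option sits at the input level; a sketch assuming the same container input as in the config above:

```
filebeat.inputs:
- type: container
  encoding: plain
  paths:
    - '/var/lib/docker/containers/*/*.log'
```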

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.