Filebeat errors after update to version 7

I've been getting a flood of errors since updating the cluster and Filebeat to version 7.
My setup consists of a 2-node Elasticsearch cluster and a bunch of servers running Filebeat with modules (system, auditd and nginx) shipping logs directly to the ES cluster.

Every module produces errors like `Cannot write to a field alias [host.hostname].` and `Can't get text on a START_OBJECT at 1:***`.

Things I already tried:

  • Running `filebeat setup`

  • Deleting the index

  • Updating config a bit

      Apr 26 08:43:13 app-flo22 filebeat: 2019-04-26T08:43:13.729Z#011WARN#011elasticsearch/client.go:526#011Cannot index event publisher.Event{....."fileset":common.MapStr{"name":"syslog"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000ec84e0), Source:"/var/log/messages", Offset:5097222783, Timestamp:time.Time{wall:0xbf28ce3fdc8b58b0, ext:4760377953, loc:(*time.Location)(0x2576ec0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x314e, Device:0xfd01}}}, Flags:0x1} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Cannot write to a field alias [host.hostname]."}}
      Apr 26 08:43:13 app-flo22 filebeat: 2019-04-26T08:43:13.729Z#011WARN#011elasticsearch/client.go:526#011Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbf28cf0825bbb7bc, ext:805914543076, loc:(*time.Location)(0x2576ec0)}, Meta:common.MapStr{"pipeline":"filebeat-7.0.0-nginx-access-default"}, Fields:common.MapStr{"log":common.MapStr{"file":common.MapStr{"path":"/var/log/nginx/flo_access.log"}, "offset":1455336}, "service":common.MapStr{"type":"nginx"}, "fileset":common.MapStr{"name":"access"}, "ecs":common.MapStr{"version":"1.0.0"}, "host":common.MapStr{"name":"app-flo22.host.com"}, "message":"80.240.16.174 - - [26/Apr/2019:08:43:10 +0000] \"GET /api/public/check HTTP/1.1\" 200 0 \"-\" \"-\"", "input":common.MapStr{"type":"log"}, "event":common.MapStr{"module":"nginx", "dataset":"nginx.access"}, "agent":common.MapStr{"type":"filebeat", "ephemeral_id":"e34ee83e-a281-48a5-93e8-f882756db201", "hostname":"app-flo22.host.com", "id":"e4ba4f29-da17-4d99-9403-169b03c4c1be", "version":"7.0.0"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000288270), Source:"/var/log/nginx/flo_access.log", Offset:1455430, Timestamp:time.Time{wall:0xbf28ce3fdd0c5957, ext:4768832121, loc:(*time.Location)(0x2576ec0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x5e54a, Device:0xfd01}}}, Flags:0x1} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [source] of type [keyword] in document with id 'xd3QWGoBxrvTgY5hF3Wj'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:317"}}
      Apr 26 08:43:13 app-flo22 filebeat: 2019-04-26T08:43:13.729Z#011WARN#011elasticsearch/client.go:526#011Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbf28cf0825bd4ebd, ext:805914647384, loc:(*time.Location)(0x2576ec0)}, Meta:common.MapStr{"pipeline":"filebeat-7.0.0-nginx-access-default"}, Fields:common.MapStr{"log":common.MapStr{"offset":1455430, "file":common.MapStr{"path":"/var/log/nginx/flo_access.log"}}, "service":common.MapStr{"type":"nginx"}, "input":common.MapStr{"type":"log"}, "ecs":common.MapStr{"version":"1.0.0"}, "agent":common.MapStr{"version":"7.0.0", "type":"filebeat", "ephemeral_id":"e34ee83e-a281-48a5-93e8-f882756db201", "hostname":"app-flo22.host.com", "id":"e4ba4f29-da17-4d99-9403-169b03c4c1be"}, "message":"80.240.16.174 - - [26/Apr/2019:08:43:10 +0000] \"GET /api/public/check HTTP/1.1\" 200 0 \"-\" \"-\"", "event":common.MapStr{"module":"nginx", "dataset":"nginx.access"}, "fileset":common.MapStr{"name":"access"}, "host":common.MapStr{"name":"app-flo22.host.com"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc000288270), Source:"/var/log/nginx/flo_access.log", Offset:1455524, Timestamp:time.Time{wall:0xbf28ce3fdd0c5957, ext:4768832121, loc:(*time.Location)(0x2576ec0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x5e54a, Device:0xfd01}}}, Flags:0x1} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [source] of type [keyword] in document with id 'xt3QWGoBxrvTgY5hF3Wj'","caused_by":{"type":"illegal_state_exception","reason":"Can't get text on a START_OBJECT at 1:317"}}
    

(The first error is shortened because of the character limit.)
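(As a diagnostic, the field-mapping API shows how `host.hostname` is mapped in the Filebeat indices; this is a sketch, with `<host>` as a placeholder for the Elasticsearch host:)

```shell
# Show the mapping of host.hostname in the filebeat indices.
# If its type comes back as "alias", events cannot write to it directly,
# which matches the "Cannot write to a field alias" error.
curl "http://<host>:9200/filebeat-*/_mapping/field/host.hostname?pretty"
```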

My config:

  ########################### Filebeat Configuration ############################

  #==========================  Modules configuration ============================

  filebeat.modules:

  #------------------------------- System Module -------------------------------
  - module: system
    # Syslog
    syslog:
      enabled: true

    # Authorization logs
    auth:
      enabled: true

  #-------------------------------- Audit Module -------------------------------
  - module: auditd
    log:
      enabled: true

  #-------------------------------- Nginx Module -------------------------------
  - module: nginx
    # Access logs
    access:
      enabled: true

      # Set custom paths for the log files. If left empty,
      # Filebeat will choose the paths depending on your OS.
      var.paths: 
        - /var/log/nginx/access.log
        - /var/log/nginx/flo_access.log

      # Input configuration (advanced). Any input configuration option
      # can be added under this section.
      #input:

    # Error logs
    error:
      enabled: true

      # Set custom paths for the log files. If left empty,
      # Filebeat will choose the paths depending on your OS.
      var.paths:
        - /var/log/nginx/error.log
        - /var/log/nginx/flo_error.log

      # Input configuration (advanced). Any input configuration option
      # can be added under this section.
      #input:

  #=========================== Filebeat inputs =============================

  filebeat.inputs:

  # Each - is an input. Most options can be set at the input level, so
  # you can use different inputs for various configurations.
  # Below are the input specific configurations.

  - type: log
    enabled: true
    paths:
      - /opt/tomcat/logs/flo-gui-web.log
      - /opt/tomcat/logs/flo-integration.log

    ### Multiline options
    multiline.pattern: ^([0-9]{4}-[0-9]{2}-[0-9]{2}\s[0-9]{2}:[0-9]{2}:[0-9]{2},[0-9]{3})
    multiline.negate: true
    multiline.match: after

  - type: log
    enabled: true
    paths:
      - /opt/tomcat/logs/catalina*.log

    ### Multiline options
    multiline.pattern: ^\s{8}
    multiline.negate: false
    multiline.match: after

  #============================= Filebeat modules ===============================

  filebeat.config.modules:
    # Glob pattern for configuration loading
    path: ${path.config}/modules.d/*.yml

    # Set to true to enable config reloading
    reload.enabled: false

    # Period on which files under path should be checked for changes
  #reload.period: 10s

  #================================ General =====================================

  # The name of the shipper that publishes the network data. It can be used to group
  # all the transactions sent by a single shipper in the web interface.
  name: server1

  # The tags of the shipper are included in their own field with each
  # transaction published.
  #tags: ["service-X", "web-tier"]

  # Optional fields that you can specify to add additional information to the
  # output.
  #fields:
  #  env: staging


  #============================== Dashboards =====================================

  # These settings control loading the sample dashboards to the Kibana index. Loading
  # the dashboards is disabled by default and can be enabled either by setting the
  # options here, or by using the `-setup` CLI flag or the `setup` command.
  setup.dashboards.enabled: false

  # The URL from where to download the dashboards archive. By default this URL
  # has a value which is computed based on the Beat name and version. For released
  # versions, this URL points to the dashboard archive on the artifacts.elastic.co
  # website.
  #setup.dashboards.url:

  #============================== Kibana =====================================

  # Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
  # This requires a Kibana endpoint configuration.
  setup.kibana:

    # Kibana Host
    # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
    # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
    host: "logs.host.com"

  #================================ Outputs =====================================

  # Configure what output to use when sending the data collected by the beat.

  #----------------------------- Elasticsearch output --------------------------------
  output.elasticsearch:
    # The Elasticsearch hosts
    hosts: ["logs.host.com:9200"]
    index: "filebeat-%{[beat.version]}-%{+xxxx.ww}"

  setup.template.name: "filebeat-7"
  setup.template.pattern: "filebeat-7*"
  setup.template.settings:
    index.number_of_shards: 2
    index.number_of_replicas: 1
  #================================ Logging =====================================

  # Sets log level. The default log level is info.
  # Available log levels are: error, warning, info, debug
  #logging.level: debug

  # At debug level, you can selectively enable logging only for some components.
  # To enable all selectors use ["*"]. Examples of other selectors are "beat",
  # "publish", "service".
  #logging.selectors: ["*"]

Which filebeat versions did you upgrade from?

Do you have some more complete logs in the Elasticsearch log files?

From 6.7 or 6.6.1

Here you can have a look at the startup log of filebeat:
https://pastebin.com/raw/U2YuPLjw

The errors are so plentiful that they fill up the disks really fast.

Which templates are installed? This looks like a mapping error.

With `filebeat export template` I get this:
https://pastebin.com/raw/MM5w16hD

Or what do you mean?

I mean the templates already registered with Elasticsearch.

`curl http://<host>:9200/_template/filebeat*`

Have you had a Filebeat 5.x installation in the past? In this case you might have a template that overlaps with the Filebeat 6.x/7.x templates.
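The overlap problem can be illustrated with a small sketch (not from the thread; the template bodies here are hypothetical, trimmed-down `_template` output). Elasticsearch applies every template whose `index_patterns` match a new index and merges them by `order`, so a leftover broad `filebeat` template can override the versioned one:

```python
from fnmatch import fnmatch

# Hypothetical, trimmed output of GET /_template: a stale catch-all
# "filebeat" template alongside the versioned 7.0.0 template.
templates = {
    "filebeat": {"order": 1, "index_patterns": ["filebeat-*"]},
    "filebeat-7.0.0": {"order": 1, "index_patterns": ["filebeat-7.0.0-*"]},
}

# Example name of a newly created write index.
index_name = "filebeat-7.0.0-2019.17"

# Every template whose pattern matches the index contributes mappings.
matching = [
    name
    for name, body in templates.items()
    if any(fnmatch(index_name, pat) for pat in body["index_patterns"])
]
print(matching)  # both templates match, so their mappings get merged
```

Because both templates match, the stale template's mappings are merged in and can break the ECS field aliases that the 7.x template defines.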

No, Filebeat 5.x was never installed; I think 6.4 was current when I set things up for the first time.

Anyway, here's the output:
https://pastebin.com/raw/SU6xBUdj

And here's the output of curling just for `filebeat-7*`:
https://pastebin.com/raw/4QVKrHXZ

Thanks for the support so far :smile:

I found this likely very old template in your output:

  "filebeat": {
    "order": 1,
    "index_patterns": [
      "filebeat-*"
    ],
    "settings": {
      "index": {
        "mapping": {
          "total_fields": {
            "limit": "10000"
          }
        },
        "refresh_interval": "5s",
        "number_of_routing_shards": "30",
        "number_of_shards": "1",
        "number_of_replicas": "1"
      }
    },
    ...
 }
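If that stale template is the culprit, a possible fix looks like this (a sketch only; `<host>` is a placeholder, and you should verify the template contents before deleting anything):

```shell
# Remove the legacy catch-all "filebeat" template that matches filebeat-*
# and shadows the versioned filebeat-7.0.0 template.
curl -XDELETE "http://<host>:9200/_template/filebeat"

# Recreate the 7.x index template and related setup.
filebeat setup
```

Note that only indices created after this point pick up the corrected mappings; existing indices keep their old mapping until they are deleted, reindexed, or rolled over.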

Thanks, deleting that and running filebeat setup again seems to have fixed the issue.
I would have never found this, thanks a bunch!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.