Filebeat - accessing event data and fields in the configuration

Logstash documentation:
https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html

I want to define a field on the input like this:


    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - /path/to/logs/**/*.log
      fields:
        type: "application-name"
      encoding: utf-8
      multiline.pattern: ^(\[|\{|t)
      multiline.negate: true
      multiline.match: after
      fields_under_root: true

to use it in an Elasticsearch output:


    output.elasticsearch:
      hosts: ["hostOne:9200", "HostTwo:9200"]
      index: "%{[fields][type]}-%{+YYYY.MM.dd}"

But when I do this, I get the following errors:


2018-10-05T14:54:00.464Z        ERROR   pipeline/output.go:121  Failed to publish events: temporary bulk send failure
2018-10-05T14:54:00.464Z        INFO    pipeline/output.go:95   Connecting to backoff(elasticsearch(http://HostOne:9200))
2018-10-05T14:54:00.464Z        DEBUG   [elasticsearch] elasticsearch/client.go:688     ES Ping(url=http://HostOne:9200)
2018-10-05T14:54:00.489Z        ERROR   pipeline/output.go:121  Failed to publish events: temporary bulk send failure
2018-10-05T14:54:00.489Z        INFO    pipeline/output.go:95   Connecting to backoff(elasticsearch(http://HostTwo:9200))
2018-10-05T14:54:00.489Z        DEBUG   [elasticsearch] elasticsearch/client.go:688     ES Ping(url=http://HostTwo:9200)
2018-10-05T14:54:00.490Z        DEBUG   [elasticsearch] elasticsearch/client.go:711     Ping status code: 200
2018-10-05T14:54:00.490Z        INFO    elasticsearch/client.go:712     Connected to Elasticsearch version 6.4.1
2018-10-05T14:54:00.490Z        INFO    pipeline/output.go:105  Connection to backoff(elasticsearch(http://HostTwo:9200)) established
2018-10-05T14:54:00.492Z        DEBUG   [elasticsearch] elasticsearch/client.go:321     PublishEvents: 3 events have been published to elasticsearch in 1.239505ms.
2018-10-05T14:54:00.492Z        DEBUG   [elasticsearch] elasticsearch/client.go:525     Bulk item insert failed (i=0, status=500): {"type":"string_index_out_of_bounds_exception","reason":"String index out of range: 0"}
2018-10-05T14:54:00.492Z        DEBUG   [elasticsearch] elasticsearch/client.go:525     Bulk item insert failed (i=1, status=500): {"type":"string_index_out_of_bounds_exception","reason":"String index out of range: 0"}
2018-10-05T14:54:00.492Z        DEBUG   [elasticsearch] elasticsearch/client.go:525     Bulk item insert failed (i=2, status=500): {"type":"string_index_out_of_bounds_exception","reason":"String index out of range: 0"}
2018-10-05T14:54:00.532Z        DEBUG   [elasticsearch] elasticsearch/client.go:711     Ping status code: 200
2018-10-05T14:54:00.532Z        INFO    elasticsearch/client.go:712     Connected to Elasticsearch version 6.4.1

You have configured fields_under_root: true. With this setting, all fields added via the fields setting are put in the top-level document. That is, fields.type does not exist, but type does exist.
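For illustration, here is roughly the shape of the published event in each case (a sketch only; the field value comes from your config, and the other metadata Filebeat adds is omitted):

```yaml
# With fields_under_root: true, the custom field is promoted
# to the top level of the event:
message: "some log line"
type: "application-name"
---
# Without fields_under_root (the default), the custom field
# stays nested under "fields":
message: "some log line"
fields:
  type: "application-name"
```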

I'm a little confused. Your link points to the Logstash configuration docs, not the Beats docs. The formats are quite different.

Either remove fields_under_root and update the output to say index: '%{[fields.type]}-%{+yyyy.MM.dd}', or change your output config to say index: '%{[type]}-%{+yyyy.MM.dd}'.
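Concretely, the two working combinations would look roughly like this (hosts copied from your config; the two documents below are alternatives, not one file):

```yaml
# Option 1: drop fields_under_root, keep the field nested,
# and reference it with Beats dot notation:
output.elasticsearch:
  hosts: ["hostOne:9200", "HostTwo:9200"]
  index: "%{[fields.type]}-%{+yyyy.MM.dd}"
---
# Option 2: keep fields_under_root: true in the input and
# reference the field at the top level of the event:
output.elasticsearch:
  hosts: ["hostOne:9200", "HostTwo:9200"]
  index: "%{[type]}-%{+yyyy.MM.dd}"
```

Note that Beats format strings use dot notation inside a single pair of brackets (%{[fields.type]}), unlike Logstash's %{[fields][type]} syntax from the docs you linked.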

