Setting @timestamp in filebeat

Recent versions of Filebeat can dissect log messages directly, without needing Logstash or an ingest pipeline. I would therefore like to avoid that overhead and send the dissected fields straight to Elasticsearch.

Currently I have two timestamps: @timestamp, containing the processing time, and my parsed timestamp, containing the actual event time.

Is it possible to set @timestamp directly to the parsed event time?
(Or is there a good reason, why this would be a bad idea?)


For reference, this is my current config.

filebeat.yml:

filebeat.inputs:
- type: log
  tags: ["ingestion"]
  multiline.pattern: '^\d{4}-\d{2}-\d{2}T'
  multiline.negate: true
  multiline.match: after
  paths:
  - '/druid/var/druid/task/*/log'

processors:
- dissect:
    tokenizer: "%{timestamp} %{loglevel} [%{component}] %{class} - %{message}"
    field: "message"
    target_prefix: "ingest"
- dissect:
    tokenizer: "/druid/var/druid/task/%{task-id}/log"
    field: "source"
    target_prefix: "ingest"
- include_fields:
    fields: [ "ingest", "message" ]
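To illustrate what the two dissect processors above produce, here is a hypothetical log line and the resulting event (the sample values, including the task id, are invented for illustration; the first dissect fills `ingest.*` from the message, the second fills `ingest.task-id` from the source path, and include_fields then keeps only `ingest` and `message`):

```json
{
  "message": "2018-07-09T11:23:45,678 INFO [task-runner-0] org.apache.druid.cli.CliPeon - Task completed",
  "ingest": {
    "timestamp": "2018-07-09T11:23:45,678",
    "loglevel": "INFO",
    "component": "task-runner-0",
    "class": "org.apache.druid.cli.CliPeon",
    "message": "Task completed",
    "task-id": "index_kafka_example_0"
  }
}
```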

output.elasticsearch:
  hosts: ["${ES_URL}"]
  index: "ingest-%{+yyyy.MM.dd}"

setup.template.enabled: true
setup.template.overwrite: true
setup.template.name: "ingest"
setup.template.pattern: "ingest-*"
setup.template.fields: "fields.yml"

fields.yml:

- name: main
  type: group
  description: >
    What is this additional main level good for?
  fields:
  - name: ingest
    type: group
    description: >
      Parsed values of the ingestion tasks.
    fields:
    - name: timestamp
      type: date
      description: >
        The actual timestamp.
    - name: loglevel
      type: keyword
      description: >
        The loglevel.
    - name: message
      type: text
      description: >
        The actual message.

At the current time it's not possible to change the @timestamp via dissect or even rename. See https://github.com/elastic/beats/issues/7351.

In the meantime you could use an Ingest Node pipeline to parse the timestamp. See https://www.elastic.co/guide/en/elasticsearch/reference/master/date-processor.html. Then once you have created the pipeline in Elasticsearch you will add pipeline: my-pipeline-name to your Filebeat input config so that data from that input is routed to the Ingest Node pipeline.
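A minimal sketch of that approach, assuming a pipeline named druid-task-logs and an ISO8601-style timestamp (adjust the name and the date formats to the actual logs):

```shell
# Sketch: create an ingest pipeline whose date processor parses the
# dissected ingest.timestamp field and writes it into @timestamp.
# The pipeline name and the format list are assumptions.
curl -X PUT "${ES_URL}/_ingest/pipeline/druid-task-logs" \
  -H 'Content-Type: application/json' -d'
{
  "description": "Set @timestamp from the dissected ingest.timestamp",
  "processors": [
    {
      "date": {
        "field": "ingest.timestamp",
        "target_field": "@timestamp",
        "formats": ["ISO8601"]
      }
    }
  ]
}'
```

Then add `pipeline: druid-task-logs` to the `- type: log` input in filebeat.yml so events from that input are routed through the pipeline.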


The charm of the above solution is that Filebeat itself is able to set up everything needed, and all the parsing logic can easily live next to the application producing the logs.

As soon as I need to reach out and configure Logstash or an ingest node anyway, I can probably just do the dissection there as well. I guess an option to set @timestamp directly in Filebeat would go really well with the new dissect processor.

