FileBeat Processing @timestamp instead of log @timestamp

I'm using the flow FileBeat -> Elastic -> Kibana on Windows 7 with v7.5.2, and the timestamp shown in the Kibana Discover window always corresponds to the time the log was processed rather than the log's own timestamp. The log looks as below:

{"stored_proc_name":"sp_prepexecrpc","@timestamp":"2020-01-29T09:51:30.2298715","duration":216,"client_app_name":"TEST","sql_server":"DB01","database":"Cust_R"}
{"stored_proc_name":"sp_prepexecrpc","@timestamp":"2020-01-29T09:51:30.2306409","duration":221,"client_app_name":"TEST","sql_server":"DB01","database":"Cust_R"}

filebeat.yml

filebeat.inputs:
- type: log

  enabled: true

  paths:
    - 'D:\Scratch\C#\T3\Monitor.SP_Executions\bin\Debug\*.log'
#    - 'D:\Scratch\C#\T2\LiveStreaming\bin\Debug\*.log'


json.keys_under_root: true
json.overwrite_keys: true

processors:
- decode_json_fields:
    fields: ["message"]
    max_depth: 1
    process_array: true
    add_error_key: true
    overwrite_keys: true

From the FileBeat log file I see that the timestamp is not being updated to use the timestamp from my log:

2020-01-29T10:10:44.758Z	DEBUG	[processors]	processing/processors.go:186	Publish event: {
  "@timestamp": "2020-01-29T10:10:44.758Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.5.2"
  },
  "host": {
    "id": "75bb063d-5ad1-4c5a-9fb0-db3b045fcfc5",
    "hostname": "EUD2362",
    "architecture": "x86_64",
    "os": {
      "name": "Windows 7 Enterprise",
      "kernel": "6.1.7601.24540 (win7sp1_ldr_escrow.191127-1706)",
      "build": "7601.24533",
      "platform": "windows",
      "version": "6.1",
      "family": "windows"
    },
    "name": "EUD2362"
  },
  "agent": {
    "type": "filebeat",
    "ephemeral_id": "26200367-10ad-4566-8f12-c38dd987f4b2",
    "hostname": "D2362",
    "id": "7cf1c15a-0903-4551-8dd7-4f21947ed61f",
    "version": "7.5.2"
  },
  "ecs": {
    "version": "1.1.0"
  },
  "log": {
    "offset": 15906770,
    "file": {
      "path": "D:\\Scratch\\C#\\T3\\Monitor.SP_Executions\\bin\\Debug\\STGSQL05_SP_EXECUTIONS.log"
    }
  },
  "message": "{\"stored_proc_name\":\"sp_prepexecrpc\",\"@timestamp\":\"2020-01-29T09:45:31.9330272\",\"duration\":292,\"client_app_name\":\"TEST\",\"sql_server\":\"DB01\",\"database\":\"Cust_R\"}",
  "input": {
    "type": "log"
  }
}

I toggled json.overwrite_keys: true, but this did not help; after each toggle I made sure I stopped and restarted Filebeat.

I'm a newbie to ELK and I believe I'm missing something obvious. Any help in the right direction is much appreciated.
Also, what is the recommended practice for making filebeat.yml changes take effect?

Hi!

Two problems here:

  1. First of all, your configuration seems not to be working, since the JSON messages are not parsed at all (the json.* options need to sit under the input definition). Here is what I use:
filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /tmp/filebeattest
  json.keys_under_root: true
  json.add_error_key: true
  json.overwrite_keys: true
  2. The timestamp format looks problematic (not supported), so I changed it to:
{"stored_proc_name":"sp_prepexecrpc","@timestamp":"2020-01-28T13:05:13.672Z","duration":216,"client_app_name":"TEST","sql_server":"DB01","database":"Cust_R"}
{"stored_proc_name":"sp_prepexecrpc","@timestamp":"2020-01-29T09:51:30.230Z","duration":221,"client_app_name":"TEST","sql_server":"DB01","database":"Cust_R"}

Then the output I see looks like:

{
  "@timestamp": "2020-01-28T13:05:13.672Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "8.0.0"
  },
  "log": {
    "offset": 0,
    "file": {
      "path": "/tmp/filebeattest"
    }
  },
  "input": {
    "type": "log"
  },
  "host": {
    "name": "local"
  },
  "agent": {
    "version": "8.0.0",
    "type": "filebeat",
    "ephemeral_id": "f3f9fab7-3a44-4182-b43d-5f07fba147fa",
    "hostname": "local",
    "id": "368f6f6b-a6e6-4af2-94c8-72efe7e8ef55"
  },
  "stored_proc_name": "sp_prepexecrpc",
  "client_app_name": "TEST",
  "database": "Cust_R",
  "duration": 216,
  "sql_server": "DB01",
  "ecs": {
    "version": "1.4.0"
  }
}

Regarding the timestamp format issue, if you want to keep your current one, I guess you should go with something like what is proposed in https://discuss.elastic.co/t/get-timestamp-from-log-lines/117816.
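For reference, the general idea in that thread is to let Elasticsearch parse the timestamp in an ingest pipeline instead of Filebeat. A minimal sketch under these assumptions: the pipeline name sp-exec-logs and the hosts value are placeholders, and the raw lines are shipped without the json.* input options so the message field still holds the original JSON string:

PUT _ingest/pipeline/sp-exec-logs
{
  "description": "Expand the JSON log line and parse its 7-digit @timestamp",
  "processors": [
    {
      "json": {
        "field": "message",
        "add_to_root": true
      }
    },
    {
      "date": {
        "field": "@timestamp",
        "formats": ["yyyy-MM-dd'T'HH:mm:ss.SSSSSSS"],
        "timezone": "UTC"
      }
    }
  ]
}

Then point the Elasticsearch output at the pipeline in filebeat.yml:

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: sp-exec-logs

The "UTC" timezone assumes the application logs in UTC; if it writes local time, set the appropriate zone instead.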

Hi Chris,
Thanks for the tip.
It was indeed the wrong timestamp format; I had to explicitly serialize the DateTime values as UTC.
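For anyone landing here later, a minimal sketch of that serialization fix, assuming the logs are written with Json.NET (the serializer is not named in this thread, and the class and property names are made up for illustration):

using System;
using Newtonsoft.Json;

class SpExecutionLogEntry
{
    [JsonProperty("stored_proc_name")]
    public string StoredProcName { get; set; }

    [JsonProperty("@timestamp")]
    public DateTime Timestamp { get; set; }

    [JsonProperty("duration")]
    public int Duration { get; set; }
}

class Program
{
    static void Main()
    {
        var entry = new SpExecutionLogEntry
        {
            StoredProcName = "sp_prepexecrpc",
            Timestamp = DateTime.UtcNow,  // UTC, so the Z suffix below is truthful
            Duration = 216
        };

        // Millisecond precision with an explicit Z suffix, matching the format
        // that worked above, instead of the default 7-digit local-time output.
        var settings = new JsonSerializerSettings
        {
            DateFormatString = "yyyy-MM-dd'T'HH:mm:ss.fff'Z'",
            DateTimeZoneHandling = DateTimeZoneHandling.Utc
        };

        Console.WriteLine(JsonConvert.SerializeObject(entry, settings));
        // => {"stored_proc_name":"sp_prepexecrpc","@timestamp":"2020-01-29T09:51:30.230Z","duration":216}
    }
}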
