I am trying to use the Filebeat timestamp processor to overwrite the @timestamp field with the timestamp from my logs before sending data to Elasticsearch, but the timestamp in Kibana still shows the ingest time. I am not sure where I am going wrong.
Appreciate any help or guidance.
Sample event:
{"timestamp":"15/05/2020 08:24:05.836","level":"ERROR","className":"serviceLOgs","methodName":"GetEntry","message":"GetEntry has an ObjectNotFoundException for EntryId = 282--ERROR","exception":"Svcs.Services.Exceptions.ObjectNotFoundException: Entry [ID=282] not found\r\n at Svcs.Services.DirectoryService.GetEntry(Int32 entryId) in D:\\GIT_C#\\Svcs\\Svcs.Services\\Svcs.cs:line 900"}
Filebeat configuration (filebeat.yml):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /opt/svcs-logs/*/Log.txt
  json.keys_under_root: true
  json.add_error_key: true
  processors:
    - timestamp:
        field: timestamp
        layouts:
          - '15/05/2020 08:23:46.603'
    - rename:
        fields:
          - from: "level"
            to: "log.level"
        ignore_missing: true
    - rename:
        fields:
          - from: "className"
            to: "class.name"
        ignore_missing: true
    - rename:
        fields:
          - from: "exception"
            to: "error.message"
        ignore_missing: true
    - rename:
        fields:
          - from: "methodName"
            to: "method.name"
        ignore_missing: true
  exclude_files: ['.gz$']
  ignore_older: 72h
  fields:
    name: svcs
    environment: stng
  fields_under_root: true
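One thing I am unsure about: the Beats documentation describes layouts as Go time layouts built from the reference time Mon Jan 2 15:04:05 MST 2006, not literal sample dates, so the processor may be failing to parse my value and leaving @timestamp as the ingest time. A minimal sketch of the processor with a Go-style layout (the layout and test strings below are my guesses for the dd/MM/yyyy format in my logs):

processors:
  - timestamp:
      field: timestamp
      # Go reference layout assumed to match "15/05/2020 08:24:05.836"
      layouts:
        - '02/01/2006 15:04:05.000'
      # test values are checked against the layouts when Filebeat starts
      test:
        - '15/05/2020 08:24:05.836'

If that is right, the literal date I used as a layout in my config above would never match, which might be why Kibana keeps showing the ingest time.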