Filebeat - Date processor - Log file content without date

Hello there,

I'm new to this product.

Here is my question:

I use Filebeat to crawl log files and send their content to a Logstash frontend.

Usually my log files contain a full date on each row, so Filebeat is able to read the file and store each document with the correct timestamp info in ES.

But I have a log file that doesn't have a full date on each row, only the time (the date is in the log file name).

Each day a new log file is created that contains that day's events.

Example:
Log file name: mylogFile-20200423.log

Log file content, one event per row:
01:00:07.802 (18924:16416) U-PE: 20000013 MWIOff

The regexp could be:
[Hours]:[Min]:[Sec].[msec] ([specific_ID]) [unparseable various data]

The online documentation says that I could use the "Date processor":

https://www.elastic.co/guide/en/elasticsearch/reference/7.6/date-processor.html

Is it possible to combine the current system date (year, month, day) with the time retrieved from the log file?

Whenever Filebeat ingests an entry from a log file, it will create a field in the resulting event called log.file.path. This field will contain the complete path to the log file from which the entry was ingested. So, in your case, this would have values like /path/to/my/logs/mylogFile-20200423.log.

In your Logstash pipeline, you can perhaps use something like a grok filter to extract the date part from this field, and similarly extract the time part from each log entry. Then combine the two and pass the combined value to the Logstash date filter to set the correct @timestamp field value for each of your log entries / events.
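In case a concrete example helps, here is a rough sketch of what that could look like on the Logstash side. The "mylogFile-" prefix, the grok patterns and the temporary field names (file_date, file_time, file_timestamp) are only assumptions based on the example file name and log line above, so adjust them to your actual setup:

filter {
  # Filebeat reports the full path of the source file in log.file.path,
  # e.g. /path/to/my/logs/mylogFile-20200423.log. Pull the yyyyMMdd part
  # out of the file name.
  grok {
    match => { "[log][file][path]" => "mylogFile-%{INT:file_date}\.log" }
  }

  # Pull the time (HH:mm:ss.SSS) from the start of each log line.
  grok {
    match => { "message" => "^%{TIME:file_time}" }
  }

  # Combine the two pieces into one temporary field...
  mutate {
    add_field => { "file_timestamp" => "%{file_date} %{file_time}" }
  }

  # ...and parse that value into @timestamp.
  date {
    match => [ "file_timestamp", "yyyyMMdd HH:mm:ss.SSS" ]
  }

  # Drop the temporary fields once @timestamp is set.
  mutate {
    remove_field => [ "file_date", "file_time", "file_timestamp" ]
  }
}

Without a timezone option the date filter assumes the Logstash host's local timezone, so set it explicitly if the log files are written in a different one.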

@shaunak thanks for this answer 🙂

Indeed, I could rebuild a proper date from two different strings on the Logstash side.

I'll check this

Thanks
