Date filter in Logstash


I am trying to use the Logstash date filter to create a timestamp on my data stored in Elasticsearch. The timestamp in my data is in Unix format and I want to change it to a standard format such as "dd MMM yyyy HH:mm:ss". I have two questions:
1- Is it possible to convert the Unix time to the format I want within the date filter itself, or should I first convert the format and then parse the new format in the filter?

2- I have used the filter as below, but I cannot see any timestamp in my data stored in the Elasticsearch index! (start_time is the field in my data containing the Unix time)

filter {
    date {
        match => ["start_time", "UNIX"]
        target => "start_time"
    }
}

Could anybody please help me out?


Can you share an example of what the value of the start_time field looks like?

Maybe you have epoch time with milliseconds; in that case you would need to use UNIX_MS.

Thank you @leandrojmp for the response. This is an example of my time:
"start_time": 1618484099000. I know it is in milliseconds, and I can convert it to the format I want with this command in Linux (date -d @time). Should I use UNIX_MS?


Yes, just change UNIX to UNIX_MS in your date filter; it should work.
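For completeness, the adjusted filter would look something like this, keeping start_time as the target as in your original snippet:

```
filter {
    date {
        match => ["start_time", "UNIX_MS"]
        target => "start_time"
    }
}
```

With UNIX_MS the value is read as epoch milliseconds and stored in start_time as an ISO8601 timestamp.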

@leandrojmp Thank you so much for your quick and very helpful response. It is working with UNIX_MS. The only problem I have now is that when I send a bulk of 3 docs to my index, it seems only the last one is stored. I have tried reading my index content both with a curl command in my terminal and by creating the index pattern in Kibana to visualize it; both results are the same!
I also save this data in a file (as a Logstash output) and I can see all 3 docs in the output file. Could you please help with this too? Is there anything I may have missed?

Since this is a different issue, I would suggest that you open another topic, describe the problem, and share your Logstash pipeline.

Ok. Thank you so much.