Hi Team,
I'm trying to load log data into Elasticsearch through Logstash. My log file is backdated; however, the date from the log file is not indexed. How can I use the log file's date for searching? Your response would be highly appreciated. Thanks
Input data sample:
31/10/2017 6:21:04 PM : Test 1
31/10/2017 6:21:05 PM : Test 2
31/10/2017 6:21:06 PM : Test 3
Logstash Pipeline
input {
  file {
    path => "C:\Technology\SampleData\input\log.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{DATESTAMP:LogDate} %{GREEDYDATA}" }
  }
  date {
    match => [ "timestamp" , "dd/MM/yyyy:HH:mm:ss Z" ]
    target => ["@timestamp"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => ["textdata"]
  }
}

Please copy/paste a complete message in Kibana (expand one of the lines and copy/paste the JSON document from the JSON tab).
Hi Magnus Back,
Thanks for your response. As requested, please find the details below:
{
  "_index": "textdata",
  "_type": "logs",
  "_id": "AWKpwND6rvMhbdXhRSBV",
  "_version": 1,
  "_score": null,
  "_source": {
    "@version": "1",
    "host": "DESKTOP-D4PSC86",
    "path": "C:\Technology\SampleData\input\log.txt",
    "@timestamp": "2018-04-09T09:33:28.251Z",
    "message": "31/10/2017 6:21:04 PM : Test 1\r",
    "LogDate": "31/10/2017 6:21:04"
  },
  "fields": {
    "@timestamp": [
      1523266408251
    ]
  },
  "sort": [
    1523266408251
  ]
}
You have configured your date filter to parse a field named timestamp, but your field is actually named LogDate. Secondly, the pattern isn't quite right: your timestamp has no colon between the date and the time, and it doesn't end with a timezone offset.
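Putting those two points together, the filter block could be adjusted roughly like this (a sketch, not a tested answer from the thread; the custom capture keeps the AM/PM marker so the 12-hour time from the sample can be parsed, and the field name log_text is illustrative):

```
filter {
  grok {
    # capture the full timestamp, including the AM/PM marker, into LogDate
    match => { "message" => "(?<LogDate>%{DATE_EU} %{TIME} %{WORD}) : %{GREEDYDATA:log_text}" }
  }
  date {
    # h = clock hour 1-12, a = AM/PM half-day marker (Joda-style format)
    match => [ "LogDate", "dd/MM/yyyy h:mm:ss a" ]
    target => "@timestamp"
  }
}
```

With this, "31/10/2017 6:21:04 PM : Test 1" should yield an @timestamp of 2017-10-31T18:21:04 (in the pipeline's timezone), rather than the ingest time.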
Hi Magnus,
Excellent, thanks for pointing that out. It works fine now. I have two questions:
- I want to split the message line into separate fields; currently everything is in one field.
Ex, "message": "03/04/2018 13:44:54 Number of rows read:10\r"
- How do I parse lines in the file that don't have a datetime? (They're in a different format.)
Thanks in advance
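For what it's worth, both questions can be approached with grok alone, since the filter accepts a list of patterns and tries them in order. A rough sketch (the field names log_date, row_count, and plain_text are illustrative, and the first pattern assumes the "Number of rows read" line shape from the example above):

```
filter {
  grok {
    # Try the timestamped pattern first; fall back to a catch-all
    # for lines that carry no datetime at all.
    match => { "message" => [
      "%{DATE_EU:log_date} %{TIME:log_time} Number of rows read:%{INT:row_count}",
      "%{GREEDYDATA:plain_text}"
    ] }
  }
}
```

Lines matching the first pattern come out with log_date, log_time, and row_count as separate fields; anything else lands whole in plain_text instead of producing a _grokparsefailure tag.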