Hello all,
I am facing an issue fetching the date and time from a filename.
Input file name: ServerInfo_04022016 1204.csv
Expected DateTime output: 04/02/2016 12:04
input {
  file {
    path => "C:/logstash-2.3.1/CSV/ServerInfo_04022016 1204.csv"
    type => "sql"
  }
}
Use a grok filter to parse the path field. Something like this:
filter {
  grok {
    match => ["path", "_%{MONTHNUM:month}%{MONTHDAY:day}%{YEAR:year} %{HOUR:hour}%{MINUTE:minute}\.csv$"]
    add_field => ["datetime", "%{month}/%{day}/%{year} %{hour}:%{minute}"]
  }
  mutate {
    remove_field => ["month", "day", "year", "hour", "minute"]
  }
}
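If you want to sanity-check the pattern outside Logstash, here's a rough Python sketch with a hand-written regex equivalent to the grok expression above (this is just an illustration, not part of the pipeline):

```python
import re

# Sample path from the question.
path = r"C:\logstash-2.3.1\CSV\ServerInfo_04022016 1204.csv"

# Equivalent of: _%{MONTHNUM:month}%{MONTHDAY:day}%{YEAR:year} %{HOUR:hour}%{MINUTE:minute}\.csv$
pattern = re.compile(
    r"_(?P<month>\d{2})(?P<day>\d{2})(?P<year>\d{4}) "
    r"(?P<hour>\d{2})(?P<minute>\d{2})\.csv$"
)

m = pattern.search(path)
# Same layout as the add_field template: %{month}/%{day}/%{year} %{hour}:%{minute}
datetime_field = "{month}/{day}/{year} {hour}:{minute}".format(**m.groupdict())
print(datetime_field)  # 04/02/2016 12:04
```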
(I'm assuming that the date in the filename is mmddyyyy. If not, the expression needs to be adjusted.)
Hello Magnus,
Thanks for your response; your solution is working fine. But I need one more suggestion from you.
In my mapping document I have already defined the field and its type. Below is the mapping for that field:
"logruntime": {
"type": "date",
"format": "epoch_millis||MM/dd/yyyy HH:mm"
}
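For reference, the second branch of that format string (MM/dd/yyyy HH:mm) accepts values like the one the grok filter builds. A rough Python analogue, using strptime's %m/%d/%Y %H:%M as a stand-in for the Joda-style pattern in the mapping:

```python
from datetime import datetime

# Value in the shape produced by the add_field template.
value = "04/02/2016 12:04"

# %m/%d/%Y %H:%M is the closest strptime equivalent of MM/dd/yyyy HH:mm.
parsed = datetime.strptime(value, "%m/%d/%Y %H:%M")
print(parsed.isoformat())  # 2016-04-02T12:04:00
```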
How can I write the value into the existing logruntime field rather than adding a new datetime field?
Thanks in advance for your help,
Sam
So... change the add_field option to name the new field logruntime instead of datetime? Or am I misunderstanding the question?
Hello Magnus,
Your solution works fine, but now I have to add seconds as well.
Input file name: ServerInfo_04022016 120401.csv
Expected DateTime output: 04/02/2016 12:04:01
filter {
  grok {
    match => ["path", "_%{MONTHNUM:month}%{MONTHDAY:day}%{YEAR:year} %{HOUR:hour}%{MINUTE:minute}%{SECOND:second}\.csv$"]
    add_field => ["ScriptRunTime", "%{month}/%{day}/%{year} %{hour}:%{minute}:%{second}"]
  }
  mutate {
    remove_field => ["month", "day", "year", "hour", "minute", "second"]
  }
}
I used the same code and added "second", but it throws an error:
400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [S
criptRunTime]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"I
nvalid format: "04/02/2016 12:04:00" is malformed at ":00""}}}}, :level=>:wa
rn}←[0m
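The error message itself points at the cause: the mapping's format (MM/dd/yyyy HH:mm) stops at minutes, so the trailing ":00" of a value with seconds cannot be consumed. A small Python sketch, using strptime patterns as stand-ins for the Joda-style formats in the mapping, illustrates the mismatch:

```python
from datetime import datetime

# Value with seconds, as built by the updated ScriptRunTime template.
value = "04/02/2016 12:04:00"

# Analogue of the mapping's MM/dd/yyyy HH:mm: parsing stops at minutes,
# so the trailing ":00" is rejected.
try:
    datetime.strptime(value, "%m/%d/%Y %H:%M")
    failed = False
except ValueError:
    failed = True
print("minutes-only pattern rejected the value:", failed)

# Analogue of a widened MM/dd/yyyy HH:mm:ss pattern: the value parses.
parsed = datetime.strptime(value, "%m/%d/%Y %H:%M:%S")
print(parsed.isoformat())  # 2016-04-02T12:04:00
```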
No worries, it works fine now.