I have a data file I want to ingest using Logstash. Historically I've always used CLI tools to massage the date/time fields into a form that's easier to map in a .conf file (my imports tend to be CSVs). I was wondering if someone here might have some guidance that could save me some frustration.
The original file has the date and time in separate fields; here's a sample:
26,10119,100,60,KW,0,0
26,20119,200,60,KW,0,0
26,30119,300,60,KW,0,0
26,40119,400,60,KW,0,0
26,50119,500,60,KW,0,0
26,60119,600,60,KW,0,0
26,70119,700,60,KW,0,0
26,80119,800,60,KW,0,0
26,90119,900,60,KW,0,0
26,100119,1000,60,KW,0,0
In this sample, the first line is January 1st 2019 at 1am and the last line is October 1st 2019 at 10am.
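For context, the relevant part of my .conf splits the columns with a csv filter along these lines (the column names are placeholders I'm using for illustration; only the date and time columns matter here):

filter {
  csv {
    separator => ","
    # placeholder column names; the 2nd and 3rd columns hold the date and time
    columns => [ "meter", "date", "time", "interval", "unit", "value1", "value2" ]
  }
}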
Going back to old habits, I stripped the trailing zeros from the time field and merged it into the date field:
26,10119 1,60,KW,0,0
26,20119 2,60,KW,0,0
26,30119 3,60,KW,0,0
26,40119 4,60,KW,0,0
26,50119 5,60,KW,0,0
26,60119 6,60,KW,0,0
26,70119 7,60,KW,0,0
26,80119 8,60,KW,0,0
26,90119 9,60,KW,0,0
26,100119 10,60,KW,0,0
with the intention of using something like the following:
date { match => [ "date", "Mddyy H" ] }
but this didn't work: no data was brought in for the single-digit months.
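One workaround I'm considering is zero-padding the month inside the pipeline before the date filter runs. A rough sketch (assuming the merged column has been named date, as in the csv filter above):

filter {
  mutate {
    # pad a 5-digit date (single-digit month) out to 6 digits,
    # e.g. "10119 1" becomes "010119 1"; 6-digit dates are left alone
    gsub => [ "date", "^(\d{5} )", "0\1" ]
  }
  date {
    # with the month padded to two digits, a fixed-width pattern works
    match => [ "date", "MMddyy H" ]
  }
}

I haven't tested this beyond a quick read of the mutate docs, though.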
Should I just awk or Perl-script this correction, or is there a better way to parse this out within the config itself?