Hi all, I am new to Logstash.
I have an input log:
{"message":"08/30/2015 16:17:36.442,16.003765,20.503670","@version":"1","@timestamp":"2015-08-30T13:17:46.745Z","host":"ERIKP-PC","LogDate":"08/30/2015 16:17:36.442","RamUsed":16.003765,"CpuPercent":20.50367}
I would like to change the "LogDate" field from "08/30/2015 16:17:36.442" to "08-30-2015 16:17:36.442".
How can I do it?
Normally one uses the date filter to parse the event's timestamp into the @timestamp field, which has a standardized format in UTC. What do you plan to do with the LogDate field, i.e. why does its date format matter?
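For reference, a basic date filter looks something like this (just a sketch; the field name and pattern are placeholders you would adapt to your data, not anything taken from your config):

date {
  match => ["SomeDateField", "ISO8601"]
}

By default it parses the named field and writes the result into @timestamp as UTC.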
My PC sends the log to the ELK server, and in the logs I get:
"message" => "08/30/2015 15:25:55.353,14.651681,5.618097",
"@version" => "1",
"@timestamp" => "2015-08-30T12:26:05.705Z",
"host" => "ERIKP-PC",
"LogDate" => "08/30/2015 15:25:55.353",
"RamUsed" => 14.651681,
"CpuPercent" => 5.618097,
"tags" => [
[0] "_dateparsefailure"
I can't understand the _dateparsefailure error.
The Logstash log will contain clues about why the date filter couldn't parse the date. If you post your configuration we'll be in a better position to help.
This is the server that sends data to RabbitMQ:
input {
  perfmon {
    interval => 10
    counters => [
      "\Memory\% Committed Bytes In Use",
      "\Processor(_Total)\% Processor Time"
    ]
  }
}
filter {
  grok {
    match => {
      "message" => "%{DATESTAMP:LogDate},%{NUMBER:RamUsed:float},%{NUMBER:CpuPercent:float}"
    }
  }
}
output {
  rabbitmq {
    exchange => "Logs.Logstash"
    exchange_type => "topic"
    host => "blabla"
    user => "blabla"
    password => "blabla"
    port => blabla
    durable => "true"
    ssl => false
    key => "cpu.#"
  }
  file {
    path => "C:\perfmon_output.txt"
  }
}
This is the server that brings the data from RabbitMQ:
input {
  rabbitmq {
    host => "blabla"
    port => blabla
    ssl => false
    user => "blabla"
    password => "blabla"
    queue => "Logstash.Perfmon.CPU"
    durable => true
  }
}
output {
  elasticsearch {
    host => "blabla"
    cluster => "blabla"
    protocol => transport
  }
  stdout { codec => rubydebug }
}
Where's the date filter that adds the _dateparsefailure tag?
I don't have any.
This is what I see in the log:
"message" => "08/31/2015 10:32:05.767,18.271383,6.800951",
"@version" => "1",
"@timestamp" => "2015-08-31T07:32:11.105Z",
"host" => "ERIKP-PC",
"LogDate" => "08/31/2015 10:32:05.767",
"RamUsed" => 18.271383,
"CpuPercent" => 6.800951,
"tags" => [
[0] "_dateparsefailure"
]
The _dateparsefailure tag is only added by the date filter, so there must be a date filter in there somewhere. Do you by any chance have an old file lying around in /etc/logstash/conf.d?
Yes, I found it:
date {
  match => [ "LogDate", "yyyy-MM-dd HH:mm:ss,SSS" ]
  target => "EventDate"
}
It's in a different conf file, not related to my conf file.
Okay. Well, that date format clearly doesn't match what's in your LogDate field, but it should be an easy adjustment. Keep in mind that Logstash concatenates all files in its configuration directory into a single pipeline, so a filter defined in any of those files applies to every event.
Now I understand what happened: the date filter in the other conf file took over my conf file.
Now I understand how it's working.
So how can I change my date format so it will match?
Adjust the match pattern to match what your field contains. This should work:
date {
  match => ["LogDate", "MM/dd/yyyy HH:mm:ss.SSS"]
  remove_field => ["LogDate"]
}
I added a remove_field option to remove the field that was just parsed, and deleted the target option since @timestamp is the default timestamp field, which you should use unless you have a compelling reason not to.
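With that filter in place, an event like your earlier sample should come out roughly like this (a sketch, assuming your Logstash host runs in UTC+3, which is what the earlier @timestamp values suggest; LogDate is gone because of remove_field):

"message" => "08/31/2015 10:32:05.767,18.271383,6.800951",
"@version" => "1",
"@timestamp" => "2015-08-31T07:32:05.767Z",
"host" => "ERIKP-PC",
"RamUsed" => 18.271383,
"CpuPercent" => 6.800951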
Thanks a lot.
How can I manipulate the date so it will be in a different format?
That's not supported out of the box, but you could use a ruby filter. Why do you feel you need to do that, though?
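For illustration, a ruby filter along these lines could rewrite the field (just a sketch, untested; it assumes LogDate is still present in its original MM/dd/yyyy form and uses the event['field'] API of Logstash 1.x/2.x):

ruby {
  code => "
    require 'date'
    # Parse the original LogDate and re-emit it with dashes instead of slashes
    t = DateTime.strptime(event['LogDate'], '%m/%d/%Y %H:%M:%S.%L')
    event['LogDate'] = t.strftime('%m-%d-%Y %H:%M:%S.%L')
  "
}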
Thanks. I removed the date filter and everything is working.