Replace @timestamp with Time field in my log


(Sri Divya Ayalasomayajula) #1

I am trying to replace the @timestamp field with the Time field from my log.
Here is my config file. I am new to Elasticsearch and have tried a couple of things, but failed.
input {
  file {
    path => "C:\logs\testing1.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Message","Thread","Time","Topic","Level","Interaction ID"]
  }
  date {
    match => { "Time" => ["dd/MMM/yyyy:HH:mm:ss Z"] }
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "itest1"
    document_type => "itest1_log"
  }
  stdout {}
}
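
For reference, the Logstash date filter's match setting takes an array consisting of the field name followed by one or more format strings, not a hash, so the date filter above would normally be written along these lines (a sketch that keeps the original pattern unchanged):

  date {
    match => ["Time", "dd/MMM/yyyy:HH:mm:ss Z"]
    target => "@timestamp"
  }

The target => "@timestamp" line is the default and can be omitted.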


(Magnus Bäck) #2

Please show

  • an example line from your CSV file and
  • an example output event after you've modified your stdout{} output to stdout { codec => rubydebug }.

(Sri Divya Ayalasomayajula) #3

Hi Magnus,

Here is the input in my csv file:

Message Thread Time Topic Level Interaction ID

InteractionStartEvent 4580 7/25/2017 2:39:51 AM -- -- --

The Time value is in the form 7/25/2017 2:39:51 AM.

This is how I am writing my config file:
input {
  file {
    path => "C:\logs\testing3.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Message","Thread","Time","Topic","Level","Interaction ID"]
  }
  date {
    match => ["Time", "YYYY-MMM-dd:HH:mm:ss Z"]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "itest4"
    document_type => "itest4_log"
  }
  stdout { codec => rubydebug }
}
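
A value like 7/25/2017 2:39:51 AM will not match the pattern YYYY-MMM-dd:HH:mm:ss Z used above, and the sample contains no timezone offset for Z to match. A date filter closer to the data would look something like this (a sketch, assuming single-digit months/days without leading zeros and a 12-hour clock with an AM/PM marker):

  date {
    match => ["Time", "M/d/yyyy h:mm:ss a"]
    target => "@timestamp"
  }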


(Magnus Bäck) #4

If your CSV file doesn't use commas to separate the columns then separator=>"," is the wrong configuration to use.

Also, you only addressed the request in the first bullet. Please always answer all questions that you're asked.


(Sri Divya Ayalasomayajula) #5

It wasn't a problem with the CSV file. I didn't have the date and the UTC offset in my timestamp, which I have now parsed.
It's working now.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.