Not able to create index based on log date field


(Vikas Gopal) #1

Hi Experts,

I want to create an index based on a log date field. My logs in a CSV file look like:

logtime,name
10/13/2015 2:30,vg

My LS conf has:

filter {
  csv {
    columns => ["logtime","name"]
    separator => ","
  }
}
filter {
  date {
    match => ["logtime","M/d/yyyy h:mm"]
    target => "logtime"
  }
}
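
As a side note, the two filter blocks can be merged into one; Logstash runs filters in the order they appear, so this sketch is equivalent to the configuration above:

filter {
  csv {
    columns => ["logtime","name"]
    separator => ","
  }
  date {
    match => ["logtime","M/d/yyyy h:mm"]
    target => "logtime"
  }
}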

Now, after running this, I get an error:

Failed parsing date from field {:field=>"logtime", :value=>"logtime", :exception=>java.lang.IllegalArgumentException: Invalid format: "logtime", :level=>:warn}

Please help me understand what the problem is.


(Mark Walkom) #2

Shouldn't logtime be MM/dd/yyyy h:mm?


(Vikas Gopal) #3

Sorry Mark, in my post I mentioned "M/d/yyyy h:mm".
Even with "MM/dd/yyyy h:mm" it gives the same error. The index has been created, but I do not see the logtime field in the selection; I can still see the @timestamp field.
My logtime field type is still string; I guess this is the problem.


(Vikas Gopal) #4

Moved one step ahead; it seems it's working even with the error.
First I created a template for this index to define the date type explicitly for this field (logtime). Now I can see both fields in the selection, so I have created the index based on logtime. Any idea why I am still getting this error?
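
For reference, a minimal ES 1.x index template along those lines might look like this (a sketch; the template name, the logstash-* index pattern, and the dateOptionalTime format are assumptions, not the poster's actual template):

curl -XPUT 'localhost:9200/_template/logtime_template' -d '
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "logtime": { "type": "date", "format": "dateOptionalTime" }
      }
    }
  }
}'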

Error

Kibana Index


(Mark Walkom) #5

Check your ES logs; they should mention something about the 400 error.


(Vikas Gopal) #6

They're full of error messages; part of the output says:

Caused by: org.elasticsearch.index.mapper.MapperParsingException: failed to parse date field [logtime], tried both date format [dateOptionalTime], and timestamp number with locale []
at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:617)
at org.elasticsearch.index.mapper.core.DateFieldMapper.innerParseCreateField(DateFieldMapper.java:535)
at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:239)
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:401)
... 13 more
Caused by: java.lang.IllegalArgumentException: Invalid format: "logtime"
at org.elasticsearch.common.joda.time.format.DateTimeParserBucket.doParseMillis(DateTimeParserBucket.java:187)
at org.elasticsearch.common.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:780)
at org.elasticsearch.index.mapper.core.DateFieldMapper.parseStringValue(DateFieldMapper.java:612)


(Vikas Gopal) #7

Nope! No luck yet. I tried many options but still get the same error. It seems that, apart from the default format, ES does not support any other date format.


(Magnus Bäck) #8

Take ES out of the equation and use a simple stdout { codec => rubydebug } output to verify that your logtime field has the expected contents. In the screenshot above showing the "failed action with response of 400" error message, I can also see an error message indicating that the date parsing failed. That would leave logtime untouched, which explains why ES has problems parsing the date.
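
The debugging output described above would simply replace the elasticsearch output section:

output {
  stdout { codec => rubydebug }
}

With this in place, every event Logstash emits is printed to the console as a Ruby-style hash, so you can inspect the logtime field before ES ever sees it.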


(Vikas Gopal) #9

This is what I got after using stdout only. I am using the date filter as follows:

LS Filter

filter {
  date {
    match => ["logtime","yyyy-MM-dd HH:mm:ss"]
    target => "logtime"
  }
}

My Data

logtime,name
2015-10-13 04:10:05,vg

Error

Failed parsing date from field {:field=>"logtime", :value=>"logtime", :exception=>java.lang.IllegalArgumentException: Invalid format: "logtime", :level=>:warn}
{
"message" => [
[0] "logtime,name\r"
],
"@version" => "1",
"@timestamp" => "2015-10-13T07:13:15.271Z",
"host" => "LP-54EE752450D4",
"path" => "E:\ELK_Traning\logstash-1.5.1\bin\time.csv",
"logtime" => "logtime",
"name" => "name"
}
{
"message" => [
[0] "2015-10-13 04:10:05,vg\r"
],
"@version" => "1",
"@timestamp" => "2015-10-13T07:13:15.271Z",
"host" => "LP-54EE752450D4",
"path" => "E:\ELK_Traning\logstash-1.5.1\bin\time.csv",
"logtime" => "2015-10-12T22:40:05.000Z",
"name" => "vg"
}


(Magnus Bäck) #10

Well, if you try to parse the header line ("logtime,name"), things are obviously not going to work. You should drop those lines, perhaps by checking whether the message begins with what looks like a field name rather than a date (to allow for renames of the fields). Something like this:

filter {
  if [message] =~ /^[a-z]*,/ {
    drop { }
  }
  ...
}
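
An alternative sketch, assuming the csv filter has already run by this point, is to key on the parsed field instead of matching the raw message:

filter {
  if [logtime] == "logtime" {
    drop { }
  }
}

This only catches the header row itself, so it is stricter than the regex check, but it depends on the column name staying "logtime".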

(Vikas Gopal) #11

It works like a charm, thank you Magnus. I should have caught this earlier, as the error log directly says that the field's value is the string "logtime", which causes the issue. Thanks again for your support.


(system) #12