Date parse failure

So I'm doing a school project with a scraper, and I'm now trying to visualize the scraped data inside Kibana.

This is what my logstash config file looks like:

input {
	file {
		path => "/home/nod/Desktop/Data/bodybuilding_forum.csv"
		start_position => "beginning"
		sincedb_path => "/dev/null"
	}
}

filter {
	csv {
		separator => ","
		columns => ["PostDate", "Author", "Title"]
	}
	date {
		match => ["PostDate", "dd/MMM/yyyy:HH:mm:ss Z"]
	}
}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "forum"
		document_type => "forum_bodybuilding"
	}
	stdout {}
}

My data is a CSV file containing only strings; it looks like this:

PostDate,Author,Title
18 feb 2005,XXL Nutrition,XXLNutrition.nl
21 jan 2015,Analytic,Supplementlabtest.com - Alle testen op één plaats
23 mrt 2015,BULK POWDERS,BulkPowders.nl

It's just 3 columns in a CSV file, hard to show here.

In Kibana I get this error:

"host": "research",
"Author": "Galen",
"Title": "Boek: Bonds voedingssupplementen",
"message": "12 mei 2017,Galen,Boek: Bonds voedingssupplementen",
"PostDate": "12 mei 2017",
"tags": [
"_dateparsefailure"

That does not match.

If you update the match pattern you will be good :slight_smile:

How would I need to change it?

https://www.elastic.co/guide/en/logstash/5.6/plugins-filters-date.html#plugins-filters-date-match, let us know if you run into further problems.
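For reference, that pattern is the Apache access-log style, so it would only match timestamps along the lines of this (hypothetical) example:

	12/May/2017:10:30:00 +0200

whereas your CSV contains dates like 12 mei 2017, with no time or timezone at all.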

So if I read there, I also need to add locale=NL, right? As well as making it "dd MMM yyyy Z".

Still getting the same error as before.

You do not have a timezone, so something like this should work:

date {
    match => ["PostDate", "dd MMM yyyy"]
    locale => "nl"
}

lol that's awkward.. it worked.. I had the same, except I didn't have nl in quotes lol. Thanks a bunch.

Even though it doesn't give a parsing error anymore, PostDate still seems to be a string inside Kibana, and none of my fields seem aggregatable:

You will need to delete the index and then recreate it so the mappings are correctly set.

Also, for strings, use the .keyword field instead.
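If you are deleting by hand, something like this should do it (assuming the index name forum from your config); then re-run Logstash so the index is recreated with the new mapping:

	curl -XDELETE 'localhost:9200/forum'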

I've already deleted the index multiple times though..

This is what my config looks like atm:

input {
	file {
		path => "/home/ipfit6/Desktop/Data/bodybuilding_forum.csv"
		start_position => "beginning"
		sincedb_path => "/dev/null"
	}
}

filter {
	csv {
		separator => ","
		columns => ["PostDate", "Author", "Title"]
	}
	date {
		match => ["PostDate", "dd MMM yyyy"]
		locale => "nl"
	}
}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "forum"
		document_type => "forum_bodybuilding"
	}
	stdout {}
}

Ah ok. The date filter puts things into @timestamp by default; if you don't want that, you need to tell it so. The docs cover that.
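For example, something like this should parse into the PostDate field itself instead of @timestamp, using the filter's target option:

	date {
		match => ["PostDate", "dd MMM yyyy"]
		locale => "nl"
		target => "PostDate"
	}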

Ah yeah.. figured it was something like that once I looked properly at the timestamp info, so that's fine. Still, I can't get any visualisation from the data I put in for some reason.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.