Mutate Date in CSV

(Sharan Jain) #1

Hi team,

I have data in a CSV file with one column as a DATE.

Since mutate does not allow a date type, how can I set the type DATE in a filter?


(Magnus Bäck) #2

Logstash deals with JSON documents and JSON doesn't have a date type (only string, number, bool, array, and object).

How fields are mapped in Elasticsearch is controlled via the index mappings, which are usually set via index templates.
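For example, a minimal index template for recent Elasticsearch versions could map a date field like this (a sketch only; the template name, index pattern, and field name here are illustrative assumptions, not from the thread):

```
PUT _template/date_mapping
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "properties": {
      "my_date_field": { "type": "date" }
    }
  }
}
```

With such a template in place, any new index matching `logstash-*` maps that field as a date, regardless of what Logstash sends.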

(Sharan Jain) #3

I am using Logstash to feed data into Kibana.
Fecha_Creacion is my date column.
By default, at index creation it is mapped as a string,
so I need to set it to DATE.

input {
    file {
        path => "/data/TEST.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        columns => ["Descripcion", "Disenador", "Fecha_Creacion", "ID"]
        separator => ","
    }
    mutate {
        convert => {"ID" => "integer"}
    }
}

output {
    elasticsearch {}
    stdout {}
}

Any suggestion to achieve this within the .conf file only?

(Kaib) #4

It is possible to parse a date with the date filter; it will then be displayed as a date in Kibana.

Putting it into the @timestamp field:

date {
    match => ["dateFiled", "yyyy-MM-dd HH:mm:ss Z", "ISO8601"]
}

Putting it into a field of your choice:

date {
    match => ["lastmodified", "yyyy-MM-dd HH:mm:ss Z", "ISO8601"]
    target => "TargetField"
}

Maybe this will help you.

(Sharan Jain) #5

Thanks Kaib, I am able to load the data into Kibana.
But there is one more date format that throws a warning;
the data is 1/17/2016 22:35.

I am unable to parse this kind of date.

I have tried:

match => ["F_M","MM/dd/yyyy hh:mm:ss","ISO8601"]
target => "F_M"

match => ["F_M","MM/dd/yyyy hh:mm","ISO8601"]
target => "F_M"

(Kaib) #6

Infos on how to use the date filter: Link
Infos for the Joda time patterns: Link

Just try around a little; the latter one looks closer to what you want. You don't need the ISO8601, it's just another pattern.

match => ["F_M","M/dd/yyyy HH:mm"]

This looks promising, because your date doesn't have a leading zero at the start.
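For reference, a date filter along these lines should parse 1/17/2016 22:35 (a sketch based on the posts above; note `M` for a month without a leading zero and `HH` for the 24-hour clock, since `hh` only covers hours 1-12 and would reject 22):

```
date {
    match => ["F_M", "M/d/yyyy HH:mm"]
    target => "F_M"
}
```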
