Use the right codec field "date"

Hi everyone,
I use the Logstash config file below to send data to Elasticsearch.

input{
    file{
      path => "C:/Users/SOUMAYA/Desktop/bureau.csv"
      start_position => "beginning"
      sincedb_path => "NUL"
      codec => plain { charset => "CP1252" }
    }
}
filter{
    csv {
        separator => ";"
        convert => {
            "longitude" => "float"
            "latitude" => "float"
        }
    }
    date { match => [ "time", "dd MMM yy HH:mm:ss" ] }

    mutate{ add_field => { "location" => "%{latitude},%{longitude}" } }
  
}

output{
    elasticsearch { 
        action => "index"
        hosts => ["http://localhost:9200/"] 
        index => "bureau"
  }
  stdout { codec => rubydebug }
}

The problem is that the data are indexed in a messy way: the field names are saved as column1, column2, and so on, instead of the real field labels.
This is a capture from Elasticsearch:


Also, here is a capture of the data file (in xlsx format, just to have a clearer view).

Could you please help me?

Good evening!

It's because you did not define the header names. The documentation says (emphasis is mine):

The CSV filter takes an event field containing CSV data, parses it, and stores it as individual fields with *optionally-specified field names*. This filter can parse data with any separator, not just commas.

Have a look at the columns and autodetect_column_names options:
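Here is a minimal sketch of your filter block with columns set, assuming the file's header row is time;longitude;latitude (these names are an assumption, replace them with your real labels):

filter {
    csv {
        separator => ";"
        # Name the fields in the same order as the columns in the file.
        # These three names are an assumption; use your file's real labels.
        columns => ["time", "longitude", "latitude"]
        # Skip the header line itself so it is not indexed as an event.
        skip_header => true
        convert => {
            "longitude" => "float"
            "latitude" => "float"
        }
    }
    date { match => [ "time", "dd MMM yy HH:mm:ss" ] }
    mutate { add_field => { "location" => "%{latitude},%{longitude}" } }
}

Alternatively, set autodetect_column_names => true to let the filter read the field names from the first line of the file instead of listing them by hand.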

BTW, I moved your question to the Logstash category.
