Issue with date field in CSV plugin

Hi All,

I'm trying to ingest data into Elasticsearch using the CSV plugin. Data flows smoothly for one or two days, then all of a sudden there is a weird entry that gets updated with some random date and year.

I'm not sure where it's picking up the date 01-01-5541 from; I have searched the CSV file but there is no entry with this date.

Below is my logstash config file

input {
  file {
    path => "/etc/logstash/http_poller/solarwinds/15mins_perfromance/conf.d/data/finaldata.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["data.results.Availability","data.results.DateTime","data.results.AvgResponseTime","data.results.Caption","data.results.MaxResponseTime","data.results.MinResponseTime","data.results.NodeID","data.results.PercentDown","data.results.PercentLoss"]
  }
}

output {
  elasticsearch {
    hosts => [""]
    user => "user"
    password => "*****"
    index => "perfchk"
  }
  #stdout { codec => rubydebug }
}

Any suggestions please.


Hard to say without seeing the input and the mapping.
You probably need to use a date filter as well but that's just a guess.

Anyway I moved your question to #elastic-stack:logstash

@dadoonet Here is the output of stdout. It looks like a date parse error, and I'm not sure how to avoid this. Any advice please?

{
"data.results.vendor" => "Windows",
"@timestamp" => 2021-03-23T12:29:03.529Z,
"data.results.PercentDown" => 0,
"data.results.Node_Category_Type" => "Database",
"message" => "2021-03-23T12:14:21.1162574,1641,0.0,0.0,0.0,0,0.0,100.0,2021-03-23T12:14:21.1162574,PRD01,sep-db-prd01,,Windows,Boulder,United States,Database,Windows 2008 Server,Symantec Endpoint Encryption",
"data.results.Application_Name" => "Symantec Endpoint Encryption",
"data.results.OS" => "Windows 2008 Server",
"host" => "",
"@version" => "1",
"path" => "/etc/logstash/http_poller/perfromance/conf.d/data/finaldata1.csv",
"data.results.MaxResponseTime" => 0,
"data.results.AvgResponseTime" => "0.0",
"data.results.NodeID" => "11",
"data.results.PercentLoss" => 0,
"" => "United States",
"data.results.Availability" => 100,
"data.results.ObservationTimestamp" => "2021-03-23T12:14:21.1162574",
"" => "Boulder",
"data.results.ipaddress" => "",
"data.results.DateTime" => "2021-03-23T12:14:21.1162574",
"data.results.dns" => "",
"data.results.MinResponseTime" => 1,
"data.results.ObservationFrequency" => nil,
"data.results.Caption" => "",
"tags" => [
        [0] "_dateparsefailure"
    ]
}

here is my mapping for date field

date { match => ["data.results.DateTime", "yyyy-MM-dd HH:mm:ss.SSS"] }
date { match => ["data.results.ObservationTimestamp", "yyyy--MM-dd HH:mm:ss.SSS"] }

Any advice please


To parse 2021-03-23T12:14:21.1162574, I think you probably need yyyy-MM-dd'T'HH:mm:ss.SSS. See the documentation at Date filter plugin | Logstash Reference [7.11] | Elastic

For non-formatting syntax, you'll need to put single-quote characters around the value. For example, if you were parsing ISO8601 time, "2015-01-01T01:12:23", that little "T" isn't a valid time format, and you want to say "literally, a T", so your format would be this: "yyyy-MM-dd'T'HH:mm:ss"
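As a language-agnostic illustration of the literal-T idea (a Python sketch, not Logstash), the separator has to appear in the format string as a literal character. Note that Python's %f accepts at most six fractional digits, so the seven-digit timestamp from the log is trimmed to microseconds first:

```python
from datetime import datetime

raw = "2021-03-23T12:14:21.1162574"  # timestamp from the CSV (7 fractional digits)

# Python's %f matches at most 6 fractional digits, so trim to microseconds.
trimmed = raw[:26]  # "2021-03-23T12:14:21.116257"

# The literal 'T' separator is written directly in the format string.
dt = datetime.strptime(trimmed, "%Y-%m-%dT%H:%M:%S.%f")
print(dt.isoformat())  # → 2021-03-23T12:14:21.116257
```

Without the T in the format (e.g. "%Y-%m-%d %H:%M:%S.%f"), the parse fails, which is the Python equivalent of Logstash's _dateparsefailure tag.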


I think just ISO8601 may also work for this. You can check whether it's in that format here.

      date {
        match => [ "data.results.DateTime", "ISO8601" ]
      }

The [message] field in your screenshot has 5541 in the second column, which is "data.results.DateTime". Since that field has type date, Elasticsearch has to decide how to interpret 5541 as a date. It chooses January 1st, 5541.
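A rough Python analogue of that behaviour (purely an illustration, not what Elasticsearch runs internally): when a bare number is matched as a year, the unspecified month and day default to January 1st.

```python
from datetime import datetime

# A bare value matched only as a year leaves month/day at their defaults,
# analogous to a lenient date parser turning "5541" into 5541-01-01.
dt = datetime.strptime("5541", "%Y")
print(dt.isoformat())  # → 5541-01-01T00:00:00
```

So the fix is upstream: the real problem is the stray 5541 value landing in the DateTime column of the CSV, not the date parsing itself.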

Putting in ISO8601 worked out... Thanks!

Usually it runs for one day and then this weird year pops up all of a sudden. I have started ingesting the data again and will check and update back here if I get stuck with the issue.


@Badger You are right, I observed that and it was my main issue, but I'm not able to figure out where it is getting that information from.
As I'm ingesting the data using CSV, those types of values are not present in the CSV at all.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.