Error with datetime field in Kibana

Hi everyone,

I have the following problem. A CSV file is generated daily, and its data is read daily to build a dashboard. One of the fields I include in the CSV is "last_login_date", which I use to extract certain information and display it in a graph.

An example CSV file contains the following information, with fields separated by ",":

user-id,date,last_login_date,assigned_quota,used_quota,number_files,number_shares,number_uploads,number_downloads
usernameXXXX,2024-02-07 10:05:21,2024-02-07 10:04:55,26843545600,23223408,151,2,3,531
usernameYYYY,2024-02-07 10:05:21,2024-02-07 10:04:55,26843545600,23223408,151,2,3,531

In Logstash I have the following configuration to process the data coming from that CSV file:

input {
     file {
         path => "/etc/logstash/conf.d/daily/*.csv"
         start_position => "beginning"
         sincedb_path => "/dev/null"
     }
}

filter {
     csv {
         skip_header => true
         separator => ","
         columns => [
             "user-id",
             "date",
             "last_login_date",
             "assigned_quota",
             "used_quota",
             "number_files",
             "number_shares",
             "number_uploads",
             "number_downloads"
         ]
         remove_field => ["message"]
     }
     date {
         match => [ "date", "yyyy-MM-dd HH:mm:ss"]
         target => ["@timestamp"]
     }
     date {
         match => [ "last_login_date", "yyyy-MM-dd HH:mm:ss"]
         target => ["last_login_date"]
     }
     mutate {convert => ["assigned_quota", "integer"]}
     mutate {convert => ["used_quota", "integer"]}
     mutate {convert => ["number_files", "integer"]}
     mutate {convert => ["number_shares", "integer"]}
     mutate {convert => ["number_uploads", "integer"]}
     mutate {convert => ["number_downloads", "integer"]}
}

output {
    elasticsearch {
         hosts => ["elk.mydomain.local:9200"]
         index => "nextcloud-usage-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}

As you can see, the "last_login_date" field is run through a date filter that matches the format it has in the CSV.

In Kibana, once the file has been processed and I look at the generated index, I see the following warning:

[screenshot of the warning shown in Kibana]

I don't know how to solve it. Can someone help me see where the problem is?

Thanks in advance

Do you have pipeline.workers set to 1, as the csv filter's skip_header option requires? The documentation mentions this. I think on recent versions you will also need to set pipeline.ordered.
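For reference, a minimal sketch of the relevant settings in logstash.yml (assuming a standard single-pipeline setup; if you run multiple pipelines, the same keys go on the entry in pipelines.yml):

# logstash.yml -- skip_header relies on events arriving in order,
# which requires a single worker thread
pipeline.workers: 1

# on recent Logstash versions (7.7+), ordering is a separate setting;
# force it on rather than relying on the "auto" default
pipeline.ordered: true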

Otherwise, one worker may process a log entry and skip it as the header, another worker may process an entry and parse last_login_date as a date, and a third may process an entry and index "last_login_date" as a string. Check whether any documents in your index have a _dateparsefailure tag.
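If you want to catch those failures in the pipeline itself, here is a sketch of a conditional output in the same style as your config (the stdout branch is just for inspection; route the failed events wherever suits you):

output {
    if "_dateparsefailure" in [tags] {
        # events whose "date" or "last_login_date" did not match
        # "yyyy-MM-dd HH:mm:ss" land here instead of the daily index
        stdout { codec => rubydebug }
    } else {
        elasticsearch {
            hosts => ["elk.mydomain.local:9200"]
            index => "nextcloud-usage-%{+YYYY.MM.dd}"
        }
    }
}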
