Error in parsing date field

Hi All,

I'm trying to ingest data from a database into Elasticsearch, for which I have written the following mapping in Kibana:

PUT agent5
{
  "mappings" : {
    "properties": {
      "logged_date": {
        "type": "date",
         "format": "YYYY-MM-dd HH:mm:ss.SSSS"
      }
    }
  }
}

I have also tried "format": "YYYY-MM-dd HH:mm:ss.SSSSXXX", but I still get a date parse error.

The original date field in the database looks like this - 2020-05-20 17:39:04.4766667

Any suggestions, please?

Thanks
Gautham

A few questions here:

  1. Does the index to which that mapping is applied match the index generated in Elasticsearch?

  2. Do you get a mapper_parsing_exception in Elasticsearch?

  3. How are you polling the data from your database and indexing it into Elasticsearch? If you are using Logstash, you can do a date type conversion using the date filter to match the data type of your logged_date field in the mapping (a minimal sketch follows this list). I suspect you have some records in your database that do not match that format, which causes the error.
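
For reference, a minimal date filter sketch; the field name logged_date and the pattern are assumptions taken from the mapping above, not a confirmed fix:

filter {
  date {
    # Parse the string into a proper timestamp; adjust the pattern
    # to whatever your database actually emits.
    match => [ "logged_date", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}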

Here are the answers, @Rahul_Kumar4:

Yes, I have given the same index name.

Not sure where to check this; when I give "PUT index name" and apply the mapping format I don't get any errors.

I'm using the JDBC plugin in Logstash to poll the data. I have given the date format in Logstash as well, but still no luck.

Logstash config file:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\sqljdbc.jar"
    jdbc_connection_string => "jdbc:sqlserver://server.database.windows.net:1433;database=testdb"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "user"
    jdbc_password => "*****"
    statement => "SELECT * , CONCAT(date,' ', time) AS logged_date FROM table1"
    schedule => "*/5 * * * *"
  }
}
filter {
  date {
    match => [ "logdate", "YYYY-MM-dd HH:mm:ss.SSS" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "jdbc1"
  }
  #stdout { codec => rubydebug }
}

Kibana mapping:

PUT jdbc1
{
  "mappings" : {
    "properties": {
      "logged_date": {
        "type": "date",
        "format": "YYYY-MM-dd HH:mm:ss.SSS"
      }
    }
  }
}

The derived field in your SQL query in the JDBC plugin is logged_date, which matches the field name in your mapping, but you are matching on logdate in the match statement of the date filter. Change that to logged_date.
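
That one-word change would make the filter look like this (pattern unchanged from the config above):

filter {
  date {
    # Field name now matches the alias produced by the SQL statement.
    match => [ "logged_date", "YYYY-MM-dd HH:mm:ss.SSS" ]
  }
}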

I just noticed... that was actually a typo... the field name was correctly given in the Logstash config.

I have one more index with a similar format that I was able to ingest... my only problem here is with the milliseconds, which are 7 digits 😒

Thanks
Gautham

If you have seven digits of subsecond precision then you need to use .SSSSSSS in the date filter pattern.
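
Applied to the config above, that would look like this (note that yyyy, the calendar year, is generally safer than YYYY, the week-based year, in these patterns):

filter {
  date {
    # Seven 'S' characters to match 7 digits of subsecond precision,
    # e.g. 2020-05-20 17:39:04.4766667
    match => [ "logged_date", "yyyy-MM-dd HH:mm:ss.SSSSSSS" ]
  }
}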

@Badger it's working, but the problem is I'm getting a 5:30 hrs offset from my original time.

Thanks
Gautham

Elasticsearch always stores times as UTC. Kibana will adjust them to the browser's timezone by default. If your log entries are not in UTC, use the timezone option on the date filter to tell Logstash what timezone they are in.
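
For example; the 5:30 offset suggests IST, so Asia/Kolkata is assumed here, but substitute whatever zone your database actually uses:

filter {
  date {
    match    => [ "logged_date", "yyyy-MM-dd HH:mm:ss.SSSSSSS" ]
    # Tell Logstash the source timestamps are IST; the stored value
    # is still UTC, which Kibana then renders in browser time.
    timezone => "Asia/Kolkata"
  }
}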

@Badger All of a sudden I'm getting my date as 1970; not sure where it is taking this value from.

Any idea about this?

Thanks
Gautham

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.