Logstash - not able to insert date in this format 'YYYY-MM-DD hh:mm:ss'

My date format is YYYY-MM-DD hh:mm:ss, but Logstash is changing it to something like 2020-09-03T05:03:00.000Z.
I have used this filter in my Logstash config file:

filter {
  date {
    match => [ "endDate", "YYYY-MM-DD hh:mm:ss" ]
    timezone => "Asia/Kolkata"
    target => "endDate"
  }
}

The date is still not in the required format.

What does your mapping look like for this date field? If you don't provide a format, Elasticsearch will convert it to what you are seeing by default.

{
  "mappings": {
    "properties": {
      "endDate": {
        "type":   "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}
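
Note that this mapping has to exist before Logstash creates the index, or dynamic mapping wins. One way to guarantee that (a sketch — the template name is illustrative, and I'm assuming the index is named testindex as in your config) is an index template; on Elasticsearch versions before 7.8, use the legacy PUT _template API instead:

PUT _index_template/enddate-template
{
  "index_patterns": ["testindex*"],
  "template": {
    "mappings": {
      "properties": {
        "endDate": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}

Templates only apply at index creation time, so an already-created index must be deleted and reindexed for this to take effect.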

My mapping is:

PUT indexname
{
  "mappings": {
    "dynamic_date_formats": ["YYYY-MM-DD hh:mm:ss"]
  }
}

Can you post a sample of your endDate field?

This is my Logstash config file:
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.16/mysql-connector-java-8.0.16.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb?serverTimezone=Asia/Kolkata&zeroDateTimeBehavior=convertToNull&useCursorFetch=true"
    jdbc_user => "root"
    jdbc_password => ""
    tracking_column => "unix_ts_in_secs"
    use_column_value => true
    jdbc_fetch_size => 50000
    jdbc_default_timezone => "Asia/Kolkata"
    statement => "SELECT
                    applicationNo,
                    endDate,
                    UNIX_TIMESTAMP(modifyOn) AS unix_ts_in_secs
                  FROM
                    ccmaping
                  WHERE
                    (UNIX_TIMESTAMP(modifyOn) > :sql_last_value AND modifyOn < NOW())"
    schedule => "*/5 * * * * *"
  }
}
filter {
  date {
    match => [ "endDate", "YYYY-MM-DD hh:mm:ss" ]
    timezone => "Asia/Kolkata"
    target => "endDate"
  }
}
output {
  elasticsearch {
    document_type => "_doc"
    document_id => "%{id}"
    index => "testindex"
    hosts => ["http://localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}

A sample endDate value would be '2020-01-20 14:01:31'.
Logstash is converting it into '2020-01-20T08:31:31.000Z'.

I want the same format, i.e. YYYY-MM-DD hh:mm:ss.

What are you going to use this field for? If you want it as a date then that format will work great.

If you want it for display purposes then I would just make it a string and never convert to a date.
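
For example (a sketch — keyword is the usual choice for display-only strings):

{
  "mappings": {
    "properties": {
      "endDate": {
        "type": "keyword"
      }
    }
  }
}

A keyword field is stored and returned verbatim, so '2020-01-20 14:01:31' stays exactly in that form, at the cost of date math and proper date semantics.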

I want to filter data on endDate and show the result in 'yyyy-MM-dd HH:mm:ss' format only, e.g.:

{
  "query": {
    "bool": {
      "must": {
        "range": {
          "endDate": {
            "gte": "2020-09-01 13:00:00",
            "lte": "2020-09-04 13:00:00",
            "format": "yyyy-MM-dd HH:mm:ss"
          }
        }
      }
    }
  }
}

I think I understand what you want to do now. The date filter is used to parse a field into an ISO8601 timestamp. I don't think that is what you want.

In Logstash, just leave your field as-is; when it is indexed it will be mapped as a date.
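
To illustrate why: even a date filter with the pattern corrected to Joda syntax (lowercase yyyy and dd, 24-hour HH) would still replace endDate with a Logstash Timestamp, which always serializes as ISO8601 UTC — the pattern controls parsing only, not output. So dropping the filter entirely is the right move:

filter {
  # This parses correctly, but the target is stored as a Timestamp
  # and still emitted as e.g. "2020-01-20T08:31:31.000Z".
  date {
    match    => [ "endDate", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "Asia/Kolkata"
    target   => "endDate"
  }
}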

Data

{
     "endDate": "2020-01-18 14:01:31"
}

Mapping

{
  "mappings": {
    "properties": {
      "endDate": {
        "type":   "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}

Query

{
  "query": {
    "range": {
      "endDate": {
        "gte": "2020-01-17 14:01:32", 
        "lte": "2020-01-19 14:01:31"
      }
    }
  }
}

Result

"_source" : {
  "endDate" : "2020-01-18 14:01:31"
}

Thanks for the reply.

This works fine if I use the Kibana console to insert data into Elasticsearch, but it throws an error if I use Logstash to insert data. Logstash is ignoring my date format mapping and inserting date fields in the default date format:

failed to parse date field [2020-01-24 21:00:00] with format [strict_date_optional_time||epoch_millis]: [failed to parse date field [2020-01-24 21:00:00] with format [strict_date_optional_time||epoch_millis]]
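
That error means the index was created by Logstash with the default dynamic date mapping (strict_date_optional_time||epoch_millis) before your explicit mapping existed — a mapping applied in the Kibana console only affects indices created after it. A sketch of one fix, assuming the index name testindex from the config above (the file path is hypothetical): delete the existing index, then let the elasticsearch output install your mapping as a template on startup:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "testindex"
    # Hypothetical path; the file would contain the template body with
    # "format": "yyyy-MM-dd HH:mm:ss" for the endDate field.
    template => "/etc/logstash/templates/testindex.json"
    template_name => "testindex"
    template_overwrite => true
  }
}

Since templates apply only at index creation, run DELETE testindex first (or reindex) so the next batch from Logstash creates the index with the correct mapping, and remove the date filter from the pipeline.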