_dateparseerror

How to parse 2019-08-12T03:09:13.142635Z?

Sample Data

2019-07-12T03:09:13.142635Z, an interesting message1
2019-07-13T03:09:13.333765Z, an interesting message2
2019-07-14T03:09:13.444987Z, an interesting message3
2019-07-15T03:09:13.555Z, an interesting message4

Sample Conf

filter {
  # Parse the log line
  grok {
    match => { "message" => "%{GREEDYDATA:timestamp_with_usec}, %{GREEDYDATA:message_detail}" }
  }

  # Parse Date
  date { 
    match => ["timestamp_with_usec", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ"]
  }

  # Copy into @timestamp field if you want
  mutate {
    copy => { "timestamp_with_usec" => "timestamp" }
  }

}

Sample Results

{
              "timestamp" => "2019-07-12T03:09:13.142635Z",
                "message" => "2019-07-12T03:09:13.142635Z, an interesting message1",
    "timestamp_with_usec" => "2019-07-12T03:09:13.142635Z",
             "@timestamp" => 2019-07-12T03:09:13.142Z,
                   "path" => "/Users/sbrown/workspace/sample-data/timestamps/timestamp.log",
                   "host" => "ceres",
               "@version" => "1",
         "message_detail" => "an interesting message1"
}
{
              "timestamp" => "2019-07-13T03:09:13.333765Z",
                "message" => "2019-07-13T03:09:13.333765Z, an interesting message2",
    "timestamp_with_usec" => "2019-07-13T03:09:13.333765Z",
             "@timestamp" => 2019-07-13T03:09:13.333Z,
                   "path" => "/Users/sbrown/workspace/sample-data/timestamps/timestamp.log",
                   "host" => "ceres",
               "@version" => "1",
         "message_detail" => "an interesting message2"
}
{
              "timestamp" => "2019-07-14T03:09:13.444987Z",
                "message" => "2019-07-14T03:09:13.444987Z, an interesting message3",
    "timestamp_with_usec" => "2019-07-14T03:09:13.444987Z",
             "@timestamp" => 2019-07-14T03:09:13.444Z,
                   "path" => "/Users/sbrown/workspace/sample-data/timestamps/timestamp.log",
                   "host" => "ceres",
               "@version" => "1",
         "message_detail" => "an interesting message3"
}
{
              "timestamp" => "2019-07-15T03:09:13.555Z",
                "message" => "2019-07-15T03:09:13.555Z, an interesting message4",
    "timestamp_with_usec" => "2019-07-15T03:09:13.555Z",
             "@timestamp" => 2019-07-15T03:09:13.555Z,
                   "path" => "/Users/sbrown/workspace/sample-data/timestamps/timestamp.log",
                   "host" => "ceres",
               "@version" => "1",
         "message_detail" => "an interesting message4"
}
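
A note on the results: the string fields timestamp_with_usec and timestamp keep the full microsecond precision, while @timestamp ends in .142 rather than .142635 because Logstash's @timestamp only carries millisecond precision. That truncation is why the original string is copied aside instead of relying on @timestamp alone.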

Hi Stephen,

I didn't get your solution. I am using the config below and getting _dateparseerror.

filter {
  mutate {
    convert => { "timestamp" => "string" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss'.'SSS'Z'"]
    timezone => "UCT"
    target => "timestamp"
  }
}

I want the date in this format: Aug 12, 2019 @ 23:04:53.167. Just take the same example and send the Logstash configuration.

Try this:

  date { 
    match => ["timestamp_with_usec", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ"]
    target => "timestamp"
  }
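
With target => "timestamp" the parsed date is written into the timestamp field instead of the default @timestamp, so @timestamp itself is left untouched unless you drop the target option.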

Apologies, but I am unclear on what you want.

Is Aug 12, 2019 @ 23:04:53.167 what you want to see in the Kibana UI? Where do you want to see that format, or is that the input / log format?

All dates stored in Elasticsearch are converted to UTC (if the time zone is specified) and kept internally as a long number representing milliseconds since the epoch.
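
For example, 2019-08-12T23:04:53.167Z works out to 1565651093167 milliseconds since the epoch; that long value is what actually gets stored.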

Date formats are used when converting data into Elasticsearch or when formatting output. I am unclear on what you are looking for; please give an example log input line and perhaps I can help.

If you get this date into Elasticsearch correctly, the format Aug 12, 2019 @ 23:04:53.167 will show up in Kibana automatically.

Please surround all logs and confs using the format button above (</>).
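
For completeness, getting the parsed events into Elasticsearch is just an elasticsearch output next to the filter; the host and index name below are placeholders rather than anything from this thread, so treat it as a sketch. Once @timestamp is indexed, Kibana formats it for display on its own:

  output {
    elasticsearch {
      hosts => ["http://localhost:9200"]   # placeholder host
      index => "dates-%{+YYYY.MM.dd}"      # placeholder index name
    }
    # keep rubydebug on stdout for checking the parsed fields locally
    stdout { codec => rubydebug }
  }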

Hi Stephen, please find the details below:

Input file: 2019-08-12T03:10:01.087Z

Logstash configuration:

input {
  file {
    path => "/opt/date.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  mutate {
    convert => { "timestamp" => "string" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss'.'SSS'Z'"]
    timezone => "UCT"
    target => "timestamp"
  }
}
output {
  stdout { codec => rubydebug }
}

Logstash output:

[vraghav_dev@sl73elkapq001 vraghav_dev] /usr/share/logstash/bin/logstash --path.settings /opt/wiprotemp/vraghav_dev/ --path.data /opt/wiprotemp/vraghav_dev/output62 -f /opt/wiprotemp/vraghav_dev/date.conf
Could not find log4j2 configuration at path /opt/wiprotemp/vraghav_dev/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-08-13 03:01:10.782 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[FATAL] 2019-08-13 03:01:10.844 [LogStash::Runner] runner - Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[ERROR] 2019-08-13 03:01:10.874 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
[vraghav_dev@sl73elkapq001 vraghav_dev] /usr/share/logstash/bin/logstash --path.settings /opt/wiprotemp/vraghav_dev/ --path.data /opt/wiprotemp/vraghav_dev/output63 -f /opt/wiprotemp/vraghav_dev/date.conf
Could not find log4j2 configuration at path /opt/wiprotemp/vraghav_dev/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2019-08-13 03:02:21.859 [main] writabledirectory - Creating directory {:setting=>"path.queue", :path=>"/opt/wiprotemp/vraghav_dev/output63/queue"}
[INFO ] 2019-08-13 03:02:21.871 [main] writabledirectory - Creating directory {:setting=>"path.dead_letter_queue", :path=>"/opt/wiprotemp/vraghav_dev/output63/dead_letter_queue"}
[WARN ] 2019-08-13 03:02:22.574 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-08-13 03:02:22.594 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.0.1"}
[INFO ] 2019-08-13 03:02:22.640 [LogStash::Runner] agent - No persistent UUID file found. Generating new UUID {:uuid=>"9ca2669c-9feb-4205-ab9e-19fc826de178", :path=>"/opt/wiprotemp/vraghav_dev/output63/uuid"}
[INFO ] 2019-08-13 03:02:34.756 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x3dcb9145 run>"}
[INFO ] 2019-08-13 03:02:35.482 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2019-08-13 03:02:35.748 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}
[INFO ] 2019-08-13 03:02:35.756 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2019-08-13 03:02:36.742 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9602}
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
          "host" => "sl73elkapq001.visa.com",
      "@version" => "1",
       "message" => "2019-08-12T03:10:01.087Z",
          "path" => "/opt/wiprotemp/vraghav_dev/date.log",
    "@timestamp" => 2019-08-13T03:02:37.022Z
}

Note: I want the date to be in this format: Aug 12, 2019 @ 23:04:53.167
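
For that sample line the whole message is the timestamp, so there is no separate timestamp field for the date filter to work on (the rubydebug output above shows only message, and @timestamp is just the ingest time). A minimal sketch, as an illustration against this sample rather than a tested config, would parse the message field directly:

  filter {
    date {
      # the line contains nothing but the timestamp, so match on message itself;
      # ISO8601 covers the 2019-08-12T03:10:01.087Z form
      match => ["message", "ISO8601"]
      # with no target set, the parsed value lands in @timestamp,
      # which Kibana then renders in its Aug 12, 2019 @ ... display format
    }
  }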
