Help with a grok pattern for RFC 3339 timestamps

Hi, I need millisecond precision in my syslog file. I have commented out the "RSYSLOG_TraditionalFileFormat" template from rsyslog.conf, and now I have timestamps in RFC 3339 format. I need to parse this timestamp, but I don't know what grok pattern to use.

New format is:

2017-10-25T17:30:31.790589+02:00

Does a pattern exist for this match?

TIMESTAMP_ISO8601?
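For example, a minimal sketch using that pattern (the target field names syslog_timestamp and syslog_data are illustrative, not required):

```
filter {
  grok {
    # TIMESTAMP_ISO8601 also matches RFC 3339 timestamps with
    # fractional seconds and a numeric timezone offset.
    match => [
      "message",
      "%{TIMESTAMP_ISO8601:syslog_timestamp} %{GREEDYDATA:syslog_data}"
    ]
  }
}
```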

OK, thanks, that works, but now I have a problem in Kibana. I need to sort logs by millisecond, but when events share the same @timestamp, syslog_timestamp does not sort in the right order.

[screenshot: Selezione_001]

On the right is syslog_timestamp; this ordering is unsortable. I need the same format as @timestamp (left column). Maybe I can build a new timestamp with "mutate"? Is there a less complex way to do this?

You need to use a date filter to parse the field with the extracted timestamp.
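For example, a sketch assuming the timestamp was extracted into a field named syslog_timestamp (the ISO8601 parser handles the +02:00 offset in your samples):

```
date {
  # Parse the extracted RFC 3339 string and write the result to @timestamp.
  match => [ "syslog_timestamp", "ISO8601" ]
}
```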

I have tried it this way:

logstash.conf

filter {
  grok {
    match => [
      "message",
      "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day}[T ]%{HOUR:hour}:?%{MINUTE:minute}(?::?%{SECOND:second})?%{ISO8601_TIMEZONE}? %{GREEDYDATA:syslog_data}"
    ]
  }
  mutate {
    add_field => { "sys_timestamp" => "%{year}-%{month}-%{day}T%{hour}:%{minute}:%{second}Z" }
    remove_field => [ "year", "month", "day", "hour", "minute", "second" ]
  }
}

echo '2017-10-26T14:37:06.540286+02:00 some-data' | logstash -f logstash.conf

Pipeline main started
{
      "message" => "2017-10-26T14:37:06.540286+02:00 some-data",
      "@version" => "1",
      "@timestamp" => "2017-10-26T12:37:21.522Z",
      "sys_timestamp" => "2017-10-26T14:37:06.540286Z"
}
Pipeline main has been shutdown

Now I have sys_timestamp in the same format as @timestamp, but in Kibana the log is displayed two hours ahead, although in the JSON view the timestamp is correct.

[screenshot: Selezione_003]

It does not seem to be a Logstash problem; maybe it's Kibana?

Where's your date filter?

I have tried with:

date {
  match => [ "sys_timestamp", "ISO8601" ]
  timezone => "Europe/Rome"
}

But I get an error:

{:timestamp=>"2017-10-26T12:32:05.920000+0000", :message=>"Failed parsing date from field", :field=>"sys_timestamp", :value=>"%{year}-%{month}-%{day}T%{hour}:%{minute}:%{second}Z", :exception=>"Invalid format: \"%{year}-%{month}-%{day}T%{hour}:...\"", :config_parsers=>"ISO8601", :config_locale=>"default=en_US", :level=>:warn}

I don't know how to handle the match...

It looks like the year, month, etc. fields weren't set when the sys_timestamp field was created. Let's see all of the configuration at once. Did you try using the TIMESTAMP_ISO8601 grok pattern?

Yes, I have tried using TIMESTAMP_ISO8601:

filter {
  grok {
    match => [
      "message",
      "%{TIMESTAMP_ISO8601:sys_timestamp} %{GREEDYDATA:syslog_data}"
    ]
  }
  date {
    match => [ "sys_timestamp", "ISO8601" ]
    timezone => "Europe/Rome"
  }
}

The result is obviously:

Pipeline main started
{
      "message" => "2017-10-26T14:37:06.540286+02:00 some-data",
      "@version" => "1",
      "@timestamp" => "2017-10-26T12:37:21.522Z",
      "sys_timestamp" => "2017-10-26T14:37:06.540286+02:00"
}
Pipeline main has been shutdown

In this way, how can I convert the format 2017-10-26T14:37:06.540286+02:00 into 2017-10-26T14:37:21.522Z?

Works fine here (below). Are you sure your filters are being run? Your grok filter should either a) be successful and produce sys_timestamp and syslog_data fields, or b) be unsuccessful and tag the event with _grokparsefailure. Right now it appears to produce only a sys_timestamp field, and that doesn't make sense.

$ cat test.config 
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => [
      "message",
      "%{TIMESTAMP_ISO8601:sys_timestamp} %{GREEDYDATA:syslog_data}"
    ]
  }
  date {
    match => [ "sys_timestamp", "ISO8601" ]
    timezone => "Europe/Rome"
  }
}
$ echo '2017-10-26T14:37:06.540286+02:00 some-data' | /opt/logstash/bin/logstash -f test.config 
Settings: Default pipeline workers: 8
Pipeline main started
{
          "message" => "2017-10-26T14:37:06.540286+02:00 some-data",
         "@version" => "1",
       "@timestamp" => "2017-10-26T12:37:06.540Z",
             "host" => "lnxolofon",
    "sys_timestamp" => "2017-10-26T14:37:06.540286+02:00",
      "syslog_data" => "some-data"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}

I omitted syslog_data in the example, sorry; the field exists.
If I leave sys_timestamp in this format, will I see it in Kibana in the same format as @timestamp (e.g. October 26th 2017, 14:37:14.025)?

I omitted syslog_data in the example, sorry; the field exists.

Please don't tamper with the evidence.

If I leave sys_timestamp in this format, will I see it in Kibana in the same format as @timestamp (e.g. October 26th 2017, 14:37:14.025)?

Yes, ES should detect that string as a date the next time you create an index and the automapper gets a chance to pick a mapping (existing field mappings can't be changed).

But why would you keep sys_timestamp now that you've parsed it into @timestamp and they contain the same thing?
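If you decide to drop it, the date filter can remove the field itself, e.g. (a sketch based on your configuration):

```
date {
  match => [ "sys_timestamp", "ISO8601" ]
  timezone => "Europe/Rome"
  # remove_field is only applied when the date parse succeeds,
  # so failed events keep sys_timestamp for debugging.
  remove_field => [ "sys_timestamp" ]
}
```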

I have tried it in a local environment with the ELK Docker image and it seems to work, thanks!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.