How to process ResponseTimestamp-ClientIP-ReceivedTimestamp to show ReceivedTimestamp (23/Dec/2021:00:02:53 +0000) as @timestamp

Input example line in log:
2021-12-23T00:02:54.126Z 10.173.7.76 - - [23/Dec/2021:00:02:53 +0000] "POST /uri1/uri2?queryp=00-0A-12-34-56-78 HTTP/1.1" 403 2001 "-" "useragent" 0 [500456]

Logstash filter configuration (grok + date):

filter {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IP:alb_name} %{USER:client_id} %{USER:user_id} \[%{HTTPDATE:request_timestamp}\] \"(?:%{WORD:http_method} %{DATA:request_uri}(?: HTTP/%{NUMBER:http_version:float})?|%{DATA:rawrequest})\" %{NUMBER:target_status_code:int} (?:%{NUMBER:response_bytes:int}|-) %{QS:referrer} %{QS:user_agent} %{NUMBER:target_processing_time_s:int} \[%{NUMBER:target_processing_time_us:int}\]" ]
    }
    date {
        match => [ "request_timestamp", "dd/MMM/yyyy:HH:mm:ss ZZZZ" ]
        target => "@timestamp"
        add_field => { "debug" => "timestampMatched"}
    }
}

Actual output:
@timestamp 2021-12-23T00:02:54.126Z
request_timestamp 23/Dec/2021:00:02:53 +0000

Expected output:
@timestamp 23/Dec/2021:00:02:53 +0000 (i.e., the request received timestamp)
request_timestamp 23/Dec/2021:00:02:53 +0000

What should be done in the grok filter expression to get the expected output?
Where do I find "timestampMatched" being printed: in /var/log/logstash/logstash-plain.log, on stdout, or in Kibana as a field in the index with the key-value "debug": "timestampMatched"?

Note:
There are quite a few similar queries in this forum and others, and the answers do not work for everyone. Since ELK heavily leverages time series, please provide a proper answer that is tested and verified to work with the popular time formats. If you wish not to be answering another hundred similar queries in the coming years, please provide a list of filter code blocks, one per popular time format, that make the parsed value appear as @timestamp in Kibana (tested, please). Whatever documentation exists is not helping, as far as I can see from the various forums.

Change ZZZZ to Z.

Where do I find "timestampMatched" being printed: in /var/log/logstash/logstash-plain.log, on stdout, or in Kibana as a field in the index with the key-value "debug": "timestampMatched"?

When the date filter fails to parse a date, it adds the _dateparsefailure tag to the event. When it succeeds, it applies any add_field or add_tag directives to the event. In both cases, you need to look downstream in your pipeline to see the results. In your case it looks like you are outputting to Elasticsearch and inspecting with Kibana, so you should be able to see them there. Some people also like adding a stdout output during local development so that events are echoed to the STDOUT stream.
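For example, a minimal sketch of such an output section (the Elasticsearch hosts and index name below are placeholders for whatever your pipeline already uses):

output {
    elasticsearch {
        # placeholders: point these at your existing cluster and index
        hosts => ["localhost:9200"]
        index => "my-index"
    }
    # during local development, also echo each event (tags, the debug
    # field, @timestamp, ...) to STDOUT in a human-readable form
    stdout { codec => rubydebug }
}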

What should be done in the grok filter expression to get the expected output?

From what I can tell, your Grok filter is correctly populating the request_timestamp field with the sequence of characters representing a timestamp.

It looks like your date filter may need some tuning to parse that sequence of characters into a timestamp object.

  • The ZZZZ sequence is not defined in the documentation, so it is unclear what exactly it is capable of parsing. Since your input carries a UTC-offset string (+0000), I would expect a single Z in its place:

    Z
    Timezone offset structured as HHmm (hour and minutes offset from Zulu/UTC). Example: -0700.

  • The MMM sequence is particularly sensitive to locale. If your machine is not configured with American English as the default locale, you may need to add locale => "en-US" (or similar) to get it to reliably parse American-English month-name abbreviations.

Putting those together we get:

    date {
        match => [ "request_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        locale => "en-US"
        target => "@timestamp"
        add_field => { "debug" => "timestampMatched"}
    }

Note:

The @timestamp field holds an object that represents a specific millisecond-precision instant on the UTC timeline. We serialize that timestamp using the ISO8601 standard, so your @timestamp after parsing the string 23/Dec/2021:00:02:53 +0000 will actually look like 2021-12-23T00:02:53.000Z.
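You can observe that serialization directly with a minimal date-only test pipeline; the sketch below uses a mutate copy as a test shim (not part of your real pipeline) so the raw line typed on STDIN becomes the request_timestamp field:

input { stdin {} }
filter {
    # test shim: treat the whole STDIN line as the timestamp string
    mutate { copy => { "message" => "request_timestamp" } }
    date {
        match => [ "request_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        locale => "en-US"
        target => "@timestamp"
        add_field => { "debug" => "timestampMatched" }
    }
}
output { stdout { codec => rubydebug } }

Run it with something like /usr/share/logstash/bin/logstash -f test.conf (path per your install), paste the string 23/Dec/2021:00:02:53 +0000, and the printed event's @timestamp will be 2021-12-23T00:02:53.000Z.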

Thanks for the quick response.

I tried your solution, and it still puts the log entry timestamp as @timestamp instead of the request received timestamp.

filter {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IP:alb_name} %{USER:client_id} %{USER:user_id} \[%{HTTPDATE:request_timestamp}\] \"(?:%{WORD:http_method} %{DATA:request_uri}(?: HTTP/%{NUMBER:http_version:float})?|%{DATA:rawrequest})\" %{NUMBER:target_status_code:int} (?:%{NUMBER:response_bytes:int}|-) %{QS:referrer} %{QS:user_agent} %{NUMBER:target_processing_time_s:int} \[%{NUMBER:target_processing_time_us:int}\]" ]
    }
    date {
        match => [ "request_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        locale => "en-US"
        target => "@timestamp"
        add_field => { "debug" => "timestampMatched"}
    }
}

Sample input:
2021-12-23T00:02:54.126Z 10.173.7.76 - - [23/Dec/2021:00:02:53 +0000] "POST /uri1/uri2?queryp=00-0A-12-34-56-78 HTTP/1.1" 403 2001 "-" "useragent" 0 [500456]

Actual output:
@timestamp Dec 23, 2021 @ 05:32:54.126
log_timestamp Dec 23, 2021 @ 05:32:54.126, Dec 23, 2021 @ 05:32:54.126
request_timestamp 23/Dec/2021:00:02:53 +0000, 23/Dec/2021:00:02:53 +0000
tags _grokparsefailure, _geoip_lookup_failure, _dateparsefailure

Expected output:
@timestamp Dec 23, 2021 @ 05:32:53.000
log_timestamp Dec 23, 2021 @ 05:32:54.126
request_timestamp 23/Dec/2021:00:02:53 +0000
debug timestampMatched

Another issue: in the actual output there seem to be 2 sets of values (copies of the same value) in most fields. I don't know why this is happening.
Before every run, I do a full cleanup. In the Dev Tools page in Kibana, "DELETE /<INDEX_NAME>*" is run. Then, in the ELK server terminal, "curl -XDELETE localhost:9200/<INDEX_NAME>" is run; in both cases "acknowledged: true" is returned. Then, in Kibana, I ensure that "Index Management" does not list the index, delete the index pattern in "Index Patterns", and ensure that the Discover page does not list that index. After that, "systemctl restart filebeat && systemctl restart logstash" is run, and I wait until the index is listed with 1 doc on the "Index Management" page in Kibana before proceeding to the Discover page.

Versions installed:
/usr/share/elasticsearch/bin/elasticsearch --version
7.15.2
/usr/share/logstash/bin/logstash --version
7.15.2
/usr/share/kibana/bin/kibana --version --allow-root
7.15.2
/usr/share/filebeat/bin/filebeat version
7.15.2

My reply is descriptive so that you have the necessary info to help me resolve this issue. Thanks for taking the time to read it. I have been stuck for more than a day because I am not getting the expected output.
Please, can you try on your side with my sample input above and check what filter configuration gives the expected output in Kibana? Thanks.

Hey, a new deployment of ELK 7.16.2 was created today, and in it I get the expected output.

Logstash filter configuration (grok + date):

filter {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IP:alb_name} %{USER:client_id} %{USER:user_id} \[%{HTTPDATE:request_timestamp}\] \"(?:%{WORD:http_method} %{DATA:request_uri}(?: HTTP/%{NUMBER:http_version:float})?|%{DATA:rawrequest})\" %{NUMBER:target_status_code:int} (?:%{NUMBER:response_bytes:int}|-) %{QS:referrer} %{QS:user_agent} %{NUMBER:target_processing_time_s:int} \[%{NUMBER:target_processing_time_us:int}\]" ]
    }
    date {
        match => [ "request_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        locale => "en-US"
        target => "@timestamp"
        add_field => { "debug" => "timestampMatched"}
    }
}

Sample input:
2021-12-23T00:02:54.126Z 10.173.7.76 - - [23/Dec/2021:00:02:53 +0000] "POST /uri1/uri2?queryp=00-0A-12-34-56-78 HTTP/1.1" 403 2001 "-" "useragent" 0 [500456]

Actual output (same as expected output):
@timestamp Dec 23, 2021 @ 05:32:53.000
log_timestamp Dec 23, 2021 @ 05:32:54.126
request_timestamp 23/Dec/2021:00:02:53 +0000
debug timestampMatched

The original issue was probably caused by the older 7.15.2 ELK stack; the new 7.16.2 ELK works fine, as shown above.
I am so happy that Badger and yaauie replied to my earlier query. Thank you very much.

Wish you both Happy Holidays!
