Set a filter to parse out a timestamp and use it as the timestamp for the event

Hi,

I am trying to configure Logstash to parse out a field from the event and use it as the timestamp.

My input looks like this:

<Mar 1, 2016 1:54:15 PM IST> <Info> <Security> <BEA-090905> <Disabling the CryptoJ JCE Provider self-integrity check for better startup performance. To enable this check, specify -Dweblogic.security.allowCryptoJDefaultJCEVerification=true.>
<Mar 1, 2016 1:54:15 PM IST> <Info> <Security> <BEA-090906> <Changing the default Random Number Generator in RSA CryptoJ from ECDRBG128 to FIPS186PRNG. To disable this change, specify -Dweblogic.security.allowCryptoJDefaultPRNG=true.>
<Mar 1, 2016 1:54:15 PM IST> <Info> <WebLogicServer> <BEA-000377> <Starting WebLogic Server with Java HotSpot(TM) 64-Bit Server VM Version 25.31-b07 from Oracle Corporation.>
<Mar 1, 2016 1:54:16 PM IST> <Info> <Management> <BEA-141107> <Version: WebLogic Server 12.1.3.0.0  Wed May 21 18:53:34 PDT 2014 1604337 >
<Mar 1, 2016 1:54:16 PM IST> <Warning> <Management> <BEA-141274> <Production mode has been specified at the command line using the the weblogic.ProductionModeEnabled system property. This system property overrides the development mode setting contained in the config.xml file. However, the Administration Console and WLST show the attribute values and defaults that correspond to the develo

My logstash filter:

filter {
  if [type] == "weblogic_log" {
    mutate {
      add_tag => [ "WL_LOGS" ]
      uppercase => [ "severity" ]
    }
    multiline {
      patterns_dir => "/users/mpswrk1/LogStash/impls/patterns/patterns"
      pattern => "^\<%{WEBLOGICTIMESTAMP} "
      negate => true
      what => "previous"
    }
    grok {
      match => { "message" => "\<%{WEBLOGICTIMESTAMP} %{WORD}\> \<%{LOGLEVEL:severity}\> \<%{DATA:module}\> \<%{DATA:error_code}\> \<%{DATA:error_message}\>" }
      patterns_dir => "/users/mpswrk1/LogStash/impls/patterns/patterns"
    }
    date {
      match => [ "timestamp" , "MMM dd, yyyy HH:mm:ss" ]
      #match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

*WEBLOGICTIMESTAMP is a custom pattern I defined:

                      WEBLOGICTIMESTAMP %{MONTH} %{MONTHDAY}, %{YEAR} %{TIME} %{DL}

                      DL (AM|PM)?

I am trying to use Mar 1, 2016 1:54:15 PM IST, which is represented by %{WEBLOGICTIMESTAMP} %{WORD} in the grok, as the timestamp. That way the timestamp will hold exactly the time the message occurred, not the time it was loaded into Elasticsearch.

Thanks
Sharon.

Please edit your post and mark your example log entry as preformatted text (there's a toolbar button for it). If you look at what you actually posted you'll note that it's not the same as what's in your log.

So what happens? Is the grok filter failing, leaving a _grokparsefailure tag on each event?

I edited my post again.

Regards
Sharon

Please answer the question in my second paragraph too. The result of a stdout { codec => rubydebug } output would be useful.

The grok filter isn't failing, but the timestamp I get is the time the document was inserted into Elasticsearch, which affects my monitoring results.

Thanks
Sharon.

You mean to add this to my output and rerun?

Thanks
Sharon

The grok filter isn't failing, but the timestamp I get is the time the document was inserted into Elasticsearch, which affects my monitoring results.

You're not capturing the timestamp into a field. Change %{WEBLOGICTIMESTAMP} to %{WEBLOGICTIMESTAMP:timestamp}. If it still doesn't work, look in your logs for clues. If the date filter fails it'll indicate why.
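As a minimal sketch, the grok filter from the original config would become something like this (only the capture name on the timestamp pattern changes; everything else is as posted):

```
grok {
  patterns_dir => "/users/mpswrk1/LogStash/impls/patterns/patterns"
  # Capture the matched timestamp into a field named "timestamp"
  # so the date filter has a field to parse.
  match => { "message" => "\<%{WEBLOGICTIMESTAMP:timestamp} %{WORD}\> \<%{LOGLEVEL:severity}\> \<%{DATA:module}\> \<%{DATA:error_code}\> \<%{DATA:error_message}\>" }
}
```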

You mean to add this to my output and rerun?

Yes.

Great, I'll try it and let you know.

Sharon.

It looks much better now. I have a field named timestamp and the data is there, but it has a 't' icon next to it rather than the 'clock' icon that appears next to @timestamp. That means the field is understood as a term and not as a timestamp, right? I want to use that field as the timestamp for my X-axis line.

In the following example: https://www.elastic.co/guide/en/logstash/current/config-examples.html I see that both the @timestamp and timestamp fields were updated with the same value. That is the result I want as well.

Anyway, I also can't find logs in my Logstash log directory. I assume something is wrong in my configuration.

[root@eaasrt log]# cd logstash/
[root@eaasrt logstash]# ll
total 0

Thanks
Sharon.

I see this on stdout:

Failed parsing date from field {:field=>"timestamp", :value=>"Feb 28, 2016 3:25:46 AM", :exception=>"Invalid format: \"Feb 28, 2016 3:25:46 AM\" is malformed at \" AM\"", :config_parsers=>"MMM dd, yyyy HH:mm:ss", :config_locale=>"default=en_US", :level=>:warn}
Failed parsing date from field {:field=>"timestamp", :value=>"Feb 28, 2016 6:55:44 AM", :exception=>"Invalid format: \"Feb 28, 2016 6:55:44 AM\" is malformed at \" AM\"", :config_parsers=>"MMM dd, yyyy HH:mm:ss", :config_locale=>"default=en_US", :level=>:warn}
{

You need to include "aa" at the end of your date pattern to recognize AM or PM. Also, change the hours from HH to hh since the timestamp uses a 12-hour clock. I'd expect "MMM dd, yyyy hh:mm:ss aa" to work.
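Put together, the date filter would look something like this (a sketch, assuming the grok capture is named timestamp as discussed above):

```
date {
  # hh = 12-hour clock, aa = AM/PM marker; matches e.g. "Feb 28, 2016 3:25:46 AM"
  match => [ "timestamp", "MMM dd, yyyy hh:mm:ss aa" ]
}
```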

Anyway, I also can't find logs in my Logstash log directory. I assume something is wrong in my configuration.

The location of the Logstash logs depends on how you invoke Logstash.
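For example, with the Logstash 2.x CLI you can direct Logstash's own log to an explicit file with the -l/--log flag (the paths below are just illustrations):

```
bin/logstash agent -f /etc/logstash/conf.d/weblogic.conf -l /var/log/logstash/logstash.log
```

If you start Logstash via an init script or service wrapper instead, that wrapper decides where the log goes.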


It worked!

Thanks
Sharon.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.