Logstash: store information for use in subsequent events

filter {
  grok {
    match => {
      "message" => [
        "I %{GREEDYDATA:cur_ts}     .* cli.*since",
        "Incoming.*: %{NUMBER:[dep][pktper]}/%{NUMBER:[dep][byteper]} %{NUMBER:[dep][pkts]}/%{NUMBER:[dep][bytes]}.*"
      ]
    }
  }

  date {
    match => [ "cur_ts", "yyyy-MM-dd_HH:mm:ss.SSSSSS" ]
    target => "@timestamp"
  }
}

Hi,

I'm currently working with Logstash 5.5.1.

I currently have only the two match patterns shown above. My idea is to capture 'cur_ts' from the first pattern, store it somewhere locally/globally if it exists, and recall it when the second pattern matches.

Ideally, both lines are generated by the same logger at the same time, but the second line does not carry any timestamp.

Here is a sample of the log I'll be parsing.

    I 2017-08-03_10:27:52.939770     0 lpmain     --- FINAL Status: run since 2017-08-03_10:24:47 time-of-flight: 0     days 00:03:05
    Resources(usr/sys/rss/xss/dss/sss/spf/hpf): 36 16 280268 0 0 0 6893 5
    Resources(swp/inb/oub/snd/rcv/sig/vcs/ics): 0 520 134504 0 0 0 10079813 9337
    === DISTRIBUTION ===
    Incoming   (%,total,rate,workers): 100/100 9815/6287303 ----/----    0 /   0  9.81K/6.28M
    Failed     (%,total,rate,workers):   0/  0 22/2880 ----/----    0 /   0  22.00/2.88K
    Passed     (%,total,rate,workers):  99/ 99 9793/6284423 ----/----    0 /   0  9.79K/6.28M

Any clue on how to solve this would be appreciated.

Regards,
Anand

After trying a few things, this is the solution that worked for me.

This is just one way to solve it; there may well be better, more efficient approaches.

filter {
  grok {
    match => {
      "message" => [
        "I %{GREEDYDATA:curts}     .* cli.*since",
        "Incoming.*: %{NUMBER:[dep][pktper]}/%{NUMBER:[dep][byteper]} %{NUMBER:[dep][pkts]}/%{NUMBER:[dep][bytes]}"
      ]
    }
  }

  if [curts] {
    ruby {
      init => "@@cur_ts = ''"
      code => "
        @@cur_ts = event.get('curts')
        event.set('@logtimestamp', @@cur_ts)
      "
    }
  } else {
    # If curts (the current timestamp) is absent, fill it with the previously stored value.
    ruby {
      code => "
        event.set('@logtimestamp', @@cur_ts)
      "
    }
  }

  date {
    match => [ "@logtimestamp", "yyyy-MM-dd_HH:mm:ss.SSSSSS" ]
    target => "@timestamp"
  }

  mutate {
    remove_field => [ "@logtimestamp" ] # Remove the now-unused temporary field.
  }
}
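For anyone curious why this works: the `code` strings are evaluated inside the ruby filter class, so a class variable like `@@cur_ts` persists from one event to the next. A minimal plain-Ruby sketch of that idea, outside Logstash (the `Carrier` class and its values are made up for illustration):

```ruby
# Sketch: a class variable set while processing one "event" is still
# visible when processing the next one, which is how @@cur_ts carries
# the timestamp from the first log line to the later ones.
class Carrier
  def process(ts)
    @@cur_ts = ts if ts   # line with a timestamp: remember it
    @@cur_ts              # line without one: reuse the stored value
  end
end

c = Carrier.new
first  = c.process("2017-08-03_10:27:52.939770")  # the "I ..." line
second = c.process(nil)                           # the "Incoming ..." line
```

One caveat worth keeping in mind: Logstash processes events on multiple worker threads, so relying on cross-event state like this generally only behaves predictably with a single pipeline worker (`-w 1`), since otherwise event ordering is not guaranteed.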

I would probably use multiline so that all of this becomes one event: with the Logstash file input that would be a multiline codec, and with Filebeat there is a multiline option.

Then use grok to extract the metadata, and once that has happened use split on \n to turn this single event back into multiple events.
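A minimal sketch of that setup (the file path is hypothetical, and the start-of-event pattern is an assumption based on the sample above, where only the first line of each block begins with "I <timestamp>"):

```
input {
  file {
    path => "/var/log/lpmain.log"   # hypothetical path
    codec => multiline {
      # A new event starts with "I <timestamp>"; every line that does
      # not match is appended to the previous event.
      pattern => "^I \d{4}-\d{2}-\d{2}_\d{2}:\d{2}:\d{2}"
      negate => true
      what => "previous"
    }
  }
}

filter {
  # ... grok against the combined message here ...

  # Then break the event back into one event per line; "\n" is the
  # split filter's default terminator.
  split {
    field => "message"
  }
}
```

This way the timestamp line and the "Incoming" line arrive in the same event, so there is no need to carry state between events at all.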

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.