How to convert an existing timestamp format to another timestamp format using Logstash?

Hi guys,

I have two different application logs.

One (application1) contains:

16/02/2017 19:21:19:452  INFO - 9DDF7A1C53D518BC9C4F7DAE9045C70E:/RCOM_PREPAID : ** Popped Stack Frame [/RCOM_PREPAID]
16/02/2017 19:21:19:467  INFO - C12A2AB5A5FF56B91AFDF8EF21107573:/RCOM_PREPAID : Storing :session___channel to complex: session:channel  as []
16/02/2017 19:21:19:467  WARN - 1832809EE1D72F1CC4C83A2206D20FFA:/RCOM_PREPAID :  - Particular Property 'rePromptNoInput' is not set!!!
16/02/2017 19:21:19:467 DEBUG - DDEAA825B49F320DE1CEF901032DEE2C:/RCOM_PREPAID : PlatformParams:getPlatformParams:Timestamp is 12/21/16 2:12:12
16/02/2017 19:21:19:467 ERROR - 75EC5FA713110FDC5B15A69A373257E6:/RCOM_PREPAID : session id:cgrmpp04-2017047134807-332 | Error processing request

and the other application (application2) contains these logs:

2017-04-20 11:38:53,751 [Line171] INFO  com.gl.nortel.ivr.service.invoker.cdb.CDBServiceHandlerMOB invokeSOAPRPCService- RequestMsg:<?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><soapenv:Body><getCustomerProfile xmlns="http://webservices.ivr.ibm.com"><profileDTO xmlns=""><errordto xsi:nil="true"/><lob>prepaid</lob><clid>8173077459</clid><firstName xsi:nil="true"/><middleName xsi:nil="true"/><lastName xsi:nil="true"/><nameTitle xsi:nil="true"/><customerEmail xsi:nil="true"/><segmentdto xsi:nil="true"/><accountStatus xsi:nil="true"/><accountNo xsi:nil="true"/><doa xsi:nil="true"/><dob xsi:nil="true"/><language xsi:nil="true"/><tpin xsi:nil="true"/><region xsi:nil="true"/><incomingSource>IVR121</incomingSource><languageUpdated xsi:nil="true"/><tpinUpdated xsi:nil="true"/><dthProfileDTO xsi:nil="true"/><postpaidProfileDTO xsi:nil="true"/><prepaidProfileDTO xsi:nil="true"/><telemediaProfileDTO xsi:nil="true"/><source xsi:nil="true"/><circleId xsi:nil="true"/><airtelOne xsi:nil="true"/><residentialCity xsi:nil="true"/><correspondanceCity xsi:nil="true"/><MobilityServiceItemList xsi:nil="true"/></profileDTO></getCustomerProfile></soapenv:Body></soapenv:Envelope>
2017-04-20 11:38:53,752 [Line171] INFO  com.gl.nortel.ivr.service.invoker.cdb.CDBServiceHandlerMOB invokeSOAPRPCService- ResponseMsg: <?xml version="1.0" encoding="utf-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><soapenv:Body><getCustomerProfileResponse xmlns="http://webservices.ivr.ibm.com">

Here, the two timestamps are different, and I want to convert both timestamp formats to a standard timestamp format,
which would be "dd-mm-yyyy-HH-MM-SS".

I am using different Filebeat multiline settings for the two applications.
application1:
multiline.pattern: '[0-9]{2}/[0-9]{2}/[0-9]{4}'
multiline.negate: true
multiline.match: after

application2:

multiline.pattern: '[0-9]{4}-[0-9]{2}-[0-9]{2}'
multiline.negate: true
multiline.match: after

and my Logstash filter settings are:

For application 1:

grok {
  match => { "message" => "%{DATESTAMP:timestamp} (?<logLevel>(?:DEBUG|FATAL|ERROR| WARN| INFO)) - (?<sessionId>(%{WORD})):\/(?<appName>(%{USERNAME})) : %{GREEDYDATA:errMsg}" }
}
date {
  # In the date filter, MM is month and mm is minutes, so the pattern is dd/MM/yyyy.
  match => ["timestamp", "dd/MM/yyyy hh:mm:ss a", "dd/MM/yyyy HH:mm:ss:SSS"]
}

and for application 2:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[(?<lineNum>(%{WORD}))\] (?<logLevel>(?:DEBUG|FATAL|ERROR|WARN|INFO)) %{SPACE}%{JAVACLASS:className} %{WORD:serviceMethod}- %{GREEDYDATA:errMsg}" }
}
date {
  # Again, MM is month and mm is minutes, so the pattern is yyyy-MM-dd.
  match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd hh:mm:ss a"]
}

You can test these patterns in a pattern tester; when testing, please enable multiline negation (set it to true).

Please suggest a way that would help me convert these logs' timestamp formats to another format,
for example if the format is
"yyyy-mm-dd HH:mm:ss,SSS" => "dd-mm-yyyy-HH-MM-SS".

A good answer will be appreciated.



There's no stock plugin for this. You'll have to write a snippet of Ruby code in a ruby filter.
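
For example, a minimal sketch of such a ruby filter (assuming a date filter has already parsed the original string into @timestamp, and using a hypothetical field name formatted_timestamp for the reformatted string) could look like this:

ruby {
  # @timestamp is a LogStash::Timestamp (in UTC); .time gives a Ruby Time we can strftime.
  # "formatted_timestamp" is just an example field name.
  code => "event.set('formatted_timestamp', event.get('@timestamp').time.strftime('%d-%m-%Y-%H-%M-%S'))"
}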

Alright @magnusbaeck.

Is it possible to break the existing timestamp into parts? For example, suppose we have this format:
2017-04-20 11:38:53,751. Then I can do something like this in a grok pattern:
%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:min}:%{SECOND:sec},%{INT:usec}

and then we combine the parts into the desired format with something like:

mutate {
  add_field => {
    "timestamp" => "%{day}-%{month}-%{year}-%{hour}-%{min}-%{sec}"
  }
  # The option is remove_field, not remove_fields.
  remove_field => ["day", "month", "year", "hour", "min", "sec", "usec"]
}

Is it possible? And if it is possible,
how do I then assign this field as the timestamp at the same time?

Is it possible?

Sure, you can do that.

how do I then assign this field as the timestamp at the same time?

You mean how to get a date field in Elasticsearch? Update the index template used so that the field in question is recognized as a date type with the format you've picked.
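
For example (just a sketch, assuming a hypothetical field name formatted_timestamp and the Joda-style pattern dd-MM-yyyy-HH-mm-ss), the relevant property in the template's mappings could look like:

"formatted_timestamp": {
  "type": "date",
  "format": "dd-MM-yyyy-HH-mm-ss"
}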

But why does this even matter?

Yeah @magnusbaeck,

But why does this even matter?

I want the field as a date type.

It would be a great help if you could show an example snippet of code, using the above scenario, for converting the field to a date type.
And yes,
these events matter to me because when I do the same thing with both kinds of logs, I get a standard timestamp format in one place for both timestamps, in consolidated form, so I can visualize them easily in a Kibana dashboard and it is easier for the end user to understand.

I want the field as a date type.

I meant why it matters what exact format you store the timestamps as.

These events matter to me because when I do the same thing with both kinds of logs, I get a standard timestamp format in one place for both timestamps, in consolidated form, so I can visualize them easily in a Kibana dashboard and it is easier for the end user to understand.

The whole point of the date filter is that it converts different kinds of timestamps into a single format that Elasticsearch happens to understand and treat correctly out of the box. It's not clear to me why you were overcomplicating things by insisting on a different date format that ES doesn't understand without extra work.
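
To illustrate (a sketch only, reusing the timestamp field that the grok filters above already extract): a single date filter can accept both source formats and normalize them into @timestamp:

date {
  # Whichever pattern matches is used, depending on which application the event came from.
  match => ["timestamp", "dd/MM/yyyy HH:mm:ss:SSS", "yyyy-MM-dd HH:mm:ss,SSS"]
  # Drop the raw string once it has been parsed.
  remove_field => ["timestamp"]
}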


Hi @magnusbaeck,
The exact format doesn't matter, but it should be some standard format that is the same for both of the different timestamps and is what gets stored in Elasticsearch.

I just want to know how I can achieve this.

As I said, use the date filter.

Thanks @magnusbaeck

I had another question, so I created another topic for it.
Can you reply to that topic and point me in the right direction?

what-would-be-the-filetype-for-our-custom-patterns-in-pattern-directory
