Logs not parsed as they are in Splunk

Hi,
I am a newbie to ELK stack and I am trying to read logs from an application, so please pardon my ignorance.

This is a Citrix-based application from a vendor, and its logs are in the format shown below.

===============================================New Entry===============================================
Log Time: 10/08/2018 12:12:43 PM
frmUserLogin(XXXXXX)

===============================================New Entry===============================================
Log Time: 10/08/2018 12:12:55 PM
Login - Initialize Successfully

When I view this log in Kibana, each line of the log, including the "== New Entry ==" separator, is parsed as a separate row. I need everything within a "== New Entry ==" block to be a single message, since it contains both the log entry time and the message text.
In the example above, the Log Time is 10/08/2018 12:12:43 PM and the message is frmUserLogin(XXXXXX).

When viewing this log in Splunk, the parsing is correct and each message appears as a separate row, as can be seen in the snapshot below.

I need help getting the complete message, with its timestamp, from the log into a single row. Is there a grok filter plugin to merge the messages, or any other way to get the logs displayed as they are in Splunk?

My logstash.conf file is as follows.

input {
  file {
    path => "C:/XXX/XXX/XXX/XXX/*"
    sincedb_path => "C:/XXX/XXX/XXX/XXX/logstash-6.3.2/sincedb"
    start_position => "beginning"
  }
}

filter {
  # The filter part of this file is commented out to indicate that it is optional.
  # Need to know what filter can help here??
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Thanks in advance.

As the log entry is spread across multiple lines, you will need to use a multiline codec to group the relevant lines into a single event. Then you can apply grok to it.
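As a minimal sketch of what that could look like (untested, and the exact pattern will depend on how your separator line is written), the codec goes on the input and treats every line that is not a "New Entry" separator as part of the previous event:

```
input {
  file {
    path => "C:/XXX/XXX/XXX/XXX/*"
    start_position => "beginning"
    codec => multiline {
      # Lines that do NOT start with the "New Entry" separator are
      # appended to the previous event
      pattern => "^=+New Entry=+$"
      negate => true
      what => "previous"
    }
  }
}
```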

Thanks Christian for the reply.

I tried using the multiline codec; however, the logs do not have a set pattern, except that each log message sits between lines of the form ===============================================New Entry===============================================

To use the multiline codec I need a pattern, and in my case the pattern amounts to concatenating the fields between those separator lines.

Can you suggest how I can use the multiline codec here?

I tried the following and it did not help me.

stdin {
  codec => multiline {
    pattern => "===============================================New Entry==============================================="
    what => "next"
  }
}

or is there another way to concatenate these messages?

thanks

Assuming the new entry line is the only one that starts with a series of =, I think it should look something like this:

codec => multiline {
  pattern => "^=========="
  negate => true
  what => "previous"
}
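For intuition, here is a rough, hypothetical re-implementation of what that codec configuration does (this is not the codec's actual code, just a sketch of the negate => true / what => "previous" behaviour):

```python
import re

# With negate => true and what => "previous", any line that does NOT match
# the pattern is appended to the previous event; a matching separator line
# starts a new event.
SEPARATOR = re.compile(r"^==========")

def group_events(lines):
    events = []
    for line in lines:
        if SEPARATOR.match(line) or not events:
            events.append(line)           # separator starts a new event
        else:
            events[-1] += "\n" + line     # other lines join the previous event
    return events

log = [
    "===============================================New Entry===============================================",
    "Log Time: 10/08/2018 12:12:43 PM",
    "frmUserLogin(XXXXXX)",
    "===============================================New Entry===============================================",
    "Log Time: 10/08/2018 12:12:55 PM",
    "Login - Initialize Successfully",
]
events = group_events(log)
print(len(events))  # one event per "New Entry" block
```

Each resulting event then contains the separator, the Log Time line, and the message together, which is what grok needs to work on.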

Hi Christian, thanks for the codec. I tried it, but I still do not see the logs concatenated into a single message with the log time and message in one row.

My logstash.conf file is as follows:

input {
  file {
    path => "C:/Rahul/Softwares/EKL/logs/*"
    sincedb_path => "C:/Rahul/Softwares/EKL/logstash-6.3.2/sincedb"
    start_position => "beginning"
  }

  stdin {
    codec => multiline {
      pattern => "^=========="
      negate => true
      what => "previous"
    }
  }
}

filter {
  # The filter part of this file is commented out to indicate that it is optional.
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mvlogs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Is there anything you can suggest so that the logs are segregated based on the Log Time and message?

FYI, a snapshot of my logs for your reference:

===============================================New Entry===============================================
Log Time: 10/04/2018 12:12:43 PM
frmUserLogin(Form_Load)

===============================================New Entry===============================================
Log Time: 10/04/2018 12:12:55 PM
Login - Initialize Successfully

===============================================New Entry===============================================
Log Time: 10/04/2018 12:12:55 PM
Check if user XXXX exist

===============================================New Entry===============================================

The logs in Kibana come up like this:

Hi,
My issue is that the Log Time in the log is the actual time of the event that was captured in the log.
After parsing, when I search for errors in the results, I see that the timestamp attached to the search result is not the Log Time but the time Logstash parsed the logs. Please see the example below:

Following is the message in Kibana after parsing.

message: Inner exception:
host: XXXXXXXX
path: C:/XXXX.log
@version: 1
@timestamp: August 31st 2018, 13:13:29.291
_id: 8mr4jWUBZ-ddumIQro7v
_type: doc
_index: mvlogs1-2018.08.31

When I search for "exception", this message pops up in the results with the timestamp of the time the log was parsed by Logstash.
However, the actual time of this exception is mentioned in the line above this message in the application log, which is Log Time: 12/07/2018 10:07:46 AM.

Hence I need a way to capture this log so that one log entry (which spans multiple lines) comes up as a single row in Kibana, and the timestamp is taken from the Log Time rather than the time Logstash parsed it.

Example from the application log for the case above:

===============================================New Entry===============================================
Log Time: 12/07/2018 10:07:46 AM
encountered a run-time error.
Error Information:
  Error Time: 12/07/2018 10:07:46 AM
  Error Type: Infrastructures.Exceptions.MvSqlException
  Description: Error executing query.
  Error Number: 53
  Server: XXXXXXXXXXX
  Database: XXXXXXXX
================Start Query================

select dbo.is_production_db_fn()

================End Query================
Inner exception:
A network-related or instance-specific error occurred while establishing a connection to SQL Server.

I have tried what Christian suggested in the logstash.conf file, but my Logstash does not parse the multiple lines into a single-row entry.

Is there anything I am missing here?

Thanks

Once you have the message captured correctly using the multiline codec, you need to extract the relevant fields, e.g. the timestamp of the event. Have a look at this practical introduction to Logstash, which walks you through this.
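As a rough, untested sketch (the field name log_time is a placeholder), a filter along these lines could pull the Log Time out of the combined event and use it as the event timestamp:

```
filter {
  grok {
    # Capture e.g. "10/08/2018 12:12:43 PM" from the combined multiline event
    match => { "message" => "Log Time: (?<log_time>%{DATESTAMP} (AM|PM))" }
  }
  date {
    # Parse the captured string and write it to @timestamp
    match => [ "log_time", "MM/dd/yyyy hh:mm:ss a" ]
  }
}
```

With the date filter in place, Kibana should show the event at the Log Time rather than at the time Logstash ingested the line.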

Hi Christian, thanks for replying to my queries.
My issue of combining messages is still not solved.
That's why I am not able to use any grok filters to capture the timestamp or other fields.

Is there anything else you can suggest for the multiline codec?

I do the following to get changes in logstash.config to come into effect:

  • Stop ELK services

  • Change logstash.config to include the new multiline codec and new index

  • Delete the sincedb file

  • Restart ELK services

  • Add the new index and view logs in the dashboard

The logs still do not get parsed into a single message per event.

Copying my config file again below:

input {
  file {
    path => "C:/Rahul/Softwares/EKL/logs/*"
    sincedb_path => "C:/Rahul/Softwares/EKL/logstash-6.3.2/sincedb"
    start_position => "beginning"
  }

  stdin {
    codec => multiline {
      pattern => "^=============================================="
      negate => true
      what => "previous"
    }
  }
}

filter {

}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mvlogs1-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Is there anything else you can suggest to get multiline codec working ?

Can you show what the events written to stdout look like (together with the original raw lines)?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.