How to define patterns for filters in Logstash


(shravankodipaka) #1

Hi all,

I am new to Logstash. I've just set up a new ELK stack for analyzing Tomcat and EC2 instance logs in Kibana.
How do I define the patterns, and how do I configure custom patterns, in the logstash.conf file? Please help me.

Thanks in advance,
Shravan K.


(Magnus Bäck) #2

You'll want to use the grok filter for this. Have you read its documentation, including the section on custom patterns?
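To give you a rough idea, here's a minimal sketch (the directory, file name, pattern name, and field names are all placeholders, not something tailored to your logs). A custom pattern file is a plain text file in a directory you point the filter at, with one "NAME regexp" definition per line, and you reference the pattern by name in the match expression:

# Contents of the hypothetical file ./patterns/tomcat:
# TOMCAT_DATESTAMP %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND}

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => [ "message", "%{TOMCAT_DATESTAMP:timestamp} %{LOGLEVEL:severity} %{GREEDYDATA:logmessage}" ]
  }
}

The built-in patterns already cover a lot, so read the documentation before inventing your own.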


(shravankodipaka) #3

Hi Magnus, thank you for the reply.

Yes, I want to use the grok filter for this.

Could you please explain in a little more depth how to set up the custom patterns/filters for this?

Thanks,
Shravan K.


(Magnus Bäck) #4

If you ask concrete questions I will attempt to answer them, but I won't write about things that are already documented. If you don't understand the documentation then please ask questions about the parts that you don't understand.


(shravankodipaka) #5

Thank you, Magnus.

Please share that documentation, or any other related links for this.

Thanks,
Shravan K.


(Magnus Bäck) #6

I pointed you to the grok documentation in my last post!

The Logstash documentation also contains a couple of complete configuration examples that you should be able to find easily.


(shravankodipaka) #7

Hi Magnus,

I have given Logstash the Tomcat log path and the default patterns path, and now I want to filter the Tomcat log file.
If any changes are required in this configuration, please tell me where I need to change it.

Here is my logstash.conf file:
###########################################
input {
  file {
    path => "/usr/share/apache-tomcat-8.0.23/logs/*.log"
    type => "logs"
    start_position => "beginning"
  }
}
filter {
  if [type] == "logs" {
    grok {
      patterns_dir => "/opt/ELK/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.3.0/patterns"
match => [ "message", "[%{TIMESTAMP_ISO8601:timestamp}][%{DATA:severity}%{SPACE}][%{DATA:source}%{SPACE}]%{SPACE}[%{DATA:node}]%{SPACE}(?(.|\r|\n)
)" ]
    }
    date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss,SSS" ]
    }
  }
}
output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    port => "9200"
  }
}

This is my Kibana output:

@timestamp   November 10th 2015, 07:41:37.307
@version     1
_id          AVDwV9CJ6v7rE3-RskI1
_index       logstash-2015.11.10
_type        logs
host         ip-10-129-52-27.apsoutheast-.compute.internal
message      09-Nov-2015 00:55:32.678 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 21808 ms
path         /usr/share/apache-tomcat-8.0.23/logs/catalina.2015-11-09.log
tags         _grokparsefailure
type         logs
Thanks,
Shravan K.


(Magnus Bäck) #8

I think you'll find http://grokconstructor.appspot.com/ and https://grokdebug.herokuapp.com/ useful for experimenting and learning how to write grok expressions.

One immediate problem with your current expression is that you're not escaping the square brackets (which have a special meaning in regular expressions). To avoid surprises I also suggest that you avoid using more than one DATA pattern in the same expression.
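As a rough, untested sketch against the "Server startup" line in your Kibana output (the field names are just suggestions; verify it in one of the debuggers above before relying on it):

filter {
  grok {
    # Escape the literal brackets around the thread name and use at most one DATA pattern.
    match => [ "message", "(?<timestamp>%{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND}) %{LOGLEVEL:severity} \[%{DATA:thread}\] %{JAVACLASS:class} %{GREEDYDATA:logmessage}" ]
  }
  date {
    # Tomcat 8's default timestamp looks like "09-Nov-2015 00:55:32.678",
    # which doesn't match the "YYYY-MM-dd HH:mm:ss,SSS" pattern you have now.
    match => [ "timestamp", "dd-MMM-yyyy HH:mm:ss.SSS" ]
  }
}

Your current expression also expects a TIMESTAMP_ISO8601 inside square brackets at the start of the line, which won't match a Tomcat 8 line like that at all.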


(shravankodipaka) #9

Hi Magnus,

I need information about patterns/filters for heap dumps, memory leaks, exceptions, and so on in the Tomcat log.

Could you please look at my query and help me with this?

Thanks,
Shravan K.


(Magnus Bäck) #10

If you post a question I might be able to help. Clear questions with input, expected output, and a minimal configuration sample will improve the chances.

