Defining grok patterns

In Logstash, certain patterns are already defined, like COMBINEDAPACHELOG. But what if I want to use my own regular expression? Where do I define it, and how do I use it?

Grok expressions don't need to contain grok patterns like COMBINEDAPACHELOG. In the end it just gets expanded to a normal regular expression anyway. See https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#_regular_expressions.

If you want to define your own grok patterns that you can reuse, see https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#_custom_patterns.
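For reference, the approach on that page is: put your pattern in a file inside a patterns directory, then point the grok filter at that directory with patterns_dir. A minimal sketch (the directory, pattern name, and regex below are placeholder examples, not anything from this thread):

```
# File ./patterns/extra, one pattern per line: NAME <regex>
#   POSTFIX_QUEUEID [0-9A-F]{10,11}

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{POSTFIX_QUEUEID:queue_id}: %{GREEDYDATA:msg}" }
  }
}
```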

Can you give me an example where you define the pattern inside grok?

Hi @Anmol
You define your grok pattern like this:

grok {
  match => { "fieldname" => "your own regular expression" }
}
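A custom pattern can also be defined inline, directly inside the match expression, using Oniguruma named-capture syntax. A sketch (the queue_id field and its regex are placeholder examples taken from the grok docs, not from this thread):

```
filter {
  grok {
    # (?<name>regex) captures whatever the regex matches
    # into a field called queue_id
    match => { "message" => "(?<queue_id>[0-9A-F]{10,11})" }
  }
}
```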

Can you give me an example where you define the pattern inside grok?

I'm not sure what you're asking. Do you want to define your own grok pattern, similar to COMBINEDAPACHELOG, and use that in a grok filter? If so the second documentation link I posted contains an example.

Hey,
I don't know how I got into this conversation, but I find it a really
meaningful discussion. And hey anmol_garg, I also think you should
really check that link:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#_custom_patterns

This should solve your problem, and if not, I'd like to know what exactly you
want.

I got it. Thanks a lot.

This doesn't work. I am not sure why, but using this syntax was giving me an error:

match => { "field name" => "your own regular expression" }

This is the syntax that worked:

match => [ "field name", "your own regular expression" ]

The main problem with the documentation is the syntax. Suppose my logs contain lines in three different formats. How can I parse each of them using if..else statements? Can you give me an example?

if some condition {
  grok {
    match => ["message", "some expression"]
  }
} else {
  grok {
    match => ["message", "some other expression"]
  }
}

If this doesn't answer your question, please be more specific.

Suppose the lines in my log file start either with a letter or with a "-".
If they start with a letter I want to parse them one way, and if they start with a "-", another way.

Then you want a conditional similar to this:

if [message] =~ /^-/ {
  ...
}

See https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html for details and more examples.
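Putting the two pieces together, the two formats could be handled with something like the sketch below (the grok expressions are placeholders to be replaced with your own patterns):

```
filter {
  if [message] =~ /^-/ {
    grok {
      # placeholder: pattern for lines starting with "-"
      match => { "message" => "your expression for - lines" }
    }
  } else {
    grok {
      # placeholder: pattern for lines starting with a letter
      match => { "message" => "your expression for letter lines" }
    }
  }
}
```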

Suppose my log is

a=b
x=y
m=n

if I merge them like:

codec => multiline {
  pattern => "^[A-Za-z]"
  what => "previous"
}

I get a single log.

But what if I want the tags parsed in the way :
a => b
x=> y
m => n

Neither add_field nor add_tag is working, because the lines just get combined with the previous log and only show up in the message field. How do I do that?

I probably don't understand what you mean, but have you looked at the kv filter for parsing key/value pairs? If that doesn't help, please provide examples; what do your events look like now and how would you like them to look?


-
UpTime=5 s
HeapMemory=93/245 Mb
SystemMemory=3151/3956 Mb

This is my log file, including the top line with the '-'

This is one log entry. So, while parsing, my input code is :
codec => multiline {
  pattern => "^[A-Za-z]"
  what => "previous"
}

i.e. any line starting with a letter is part of the previous log entry. Whenever a line starts with '-', that is a new log entry.
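For context, the complete input section implied above might look like this (the file path is a made-up example):

```
input {
  file {
    path => "/var/log/myapp/status.log"   # hypothetical path
    codec => multiline {
      pattern => "^[A-Za-z]"   # lines starting with a letter...
      what => "previous"       # ...are appended to the previous event
    }
  }
}
```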

else if [message] =~ /^[A-Za-z]/ {
  grok {
    match => [ "message", "%{WORD:key}=(?[^\n]+)" ]
    add_field => { "%{key}" => "%{value}" }
  }
}

Now this part of the code (the add_field part) is not being executed. These fields do not show up when I run this through Logstash.

You said that each line that does not start with '-' belongs to the previous line. I assume that means that each assembled message will start with '-', not /^[A-Za-z]/, which is what you are checking for in your conditional. As Magnus pointed out earlier, this should be /^-/.

else if [message] =~ /^[A-Za-z]/ {
  grok {
    match => [ "message", "%{WORD:key}=(?[^\n]+)" ]
    add_field => { "%{key}" => "%{value}" }
  }
}

Where does %{value} come from? Aren't you dealing with the case of multiple key=value tokens in the same string? If so, why not use the kv filter?
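A kv filter for this kind of input could be as simple as the sketch below. The field_split value assumes the multiline codec joins the lines with newlines; check the kv filter documentation for your version:

```
filter {
  kv {
    source => "message"   # the field holding the key=value lines
    field_split => "\n"   # pairs are separated by newlines
    value_split => "="    # keys and values are separated by =
  }
}
```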

match => [ "message", "%{WORD:key}=(?<value>[^\n]+)" ]

Sorry, the code was supposed to be like this: there is a named capture called value, but the forum was not showing it in my earlier posts. Anyway..

Ya, it worked with KV. I didn't know about it. Cool

One more thing: can you also guide me through Logstash agents, like Beats? To be more specific, suppose I have Logstash installed on my system and I want to process logs from another system. How can I push logs from that other system to my system, where they can be parsed by Logstash?

Please read the documentation and start a new thread for any follow-up questions.

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-getting-started.html
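On the Logstash side, receiving events from Filebeat typically just needs a beats input; a minimal sketch (5044 is the conventional default port):

```
input {
  beats {
    port => 5044   # Filebeat connects to this port
  }
}
```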

What are the typical use cases for the ELK stack in log analysis?