How to assign extracted data from a log file to newly added fields

Hi,

I have the following Logstash config for adding two new fields, "timestamp1" and "response1". With the mutate block I can see the new fields, but parsing the data from the log file is failing.

    filter {
        grok {
            patterns_dir => ["D:\hp_deloitte\sudhsrivastava\ELK_setup\logstash-6.0.1.OLD\patterns"]
            match => { "message" => "%{GREEDYDATA:message}%{RESPPATTERN:response1}%{DATEPATTERN:timestamp1}" }
        }
        mutate {
            add_field => { "timestamp1" => "%{timestamp1}" }
            add_field => { "response1" => "%{response1}" }
        }
        date {
            match => ["timestamp1", "yyyy-MM-dd HH:mm:ss,SSS"]
            target => "timestamp1"
        }
    }

Below is the log sample:

`2020-08-02 10:53:02,052 DEB [0.TriggerInputAdapter:1)]] c.r.f.aAnalyticsApiRequestor      -Sending API request :: ResponseTime : 6 ms - Recieved`

But in Kibana this is what I am seeing:

`timestamp1:%{timestamp1}` 
`response1:%{response1}`

I am also getting these tags in Kibana, and not the actual values: `tags: beats_input_codec_plain_applied, _grokparsefailure, _dateparsefailure`

Please help. Where am I going wrong?

That sets the [timestamp1] field to the value of the [timestamp1] field (i.e. it does nothing) and similarly for [response1]. What are you trying to do with that?
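For example, if you want to copy the value grok extracted into a differently named field, the target has to be a new name (`timestamp_copy` here is just an illustrative name):

    mutate {
        add_field => { "timestamp_copy" => "%{timestamp1}" }
    }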

If you are getting a _grokparsefailure tag then your grok patterns do not match. How are RESPPATTERN and DATEPATTERN defined?

Thanks @Badger for the response.

Below are the two patterns, as defined in the patterns file:

    RESPPATTERN [0-9]{1,2}ms
    DATEPATTERN [0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3}

RESPPATTERN is meant to match "6ms" and DATEPATTERN to match the timestamp; I am trying to match the timestamp and "6ms" in the log.

Using mutate I get the new fields in Kibana, but because of the parse errors the values are not coming through.

How do I make Logstash assign the extracted values to the new fields? My understanding is that mutate creates a new field and assigns a value to it, while grok patterns extract the matched values from the logs.

Please suggest what changes I need to make.

The grok pattern has to match the log message. Your message starts with DATEPATTERN; it does not end with it. Also, the '6 ms' has a space in it, so you could try:

    grok {
        pattern_definitions => {
            "RESPPATTERN" => "[0-9]{1,2} ms"
            "DATEPATTERN" => "[0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3}"
        }
        match => { "message" => "%{DATEPATTERN:timestamp1}%{GREEDYDATA:message}%{RESPPATTERN:response1}" }
    }
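If it helps, you can sanity-check the pattern against a single sample line before wiring it into the beats pipeline. A minimal stdin/stdout sketch (not your full setup) would be:

    input { stdin {} }
    filter {
        grok {
            pattern_definitions => {
                "RESPPATTERN" => "[0-9]{1,2} ms"
                "DATEPATTERN" => "[0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3}"
            }
            match => { "message" => "%{DATEPATTERN:timestamp1}%{GREEDYDATA:message}%{RESPPATTERN:response1}" }
        }
    }
    # rubydebug prints every field of the parsed event, so a _grokparsefailure tag shows up immediately
    output { stdout { codec => rubydebug } }

Paste the sample log line on stdin and check whether timestamp1 and response1 appear in the output.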

Thanks @Badger.

I made the changes as per your suggestion. Below is the full Logstash config:

    input {
        beats {
            port => 5044
            type => "log"
            host => "127.0.0.1"
        }
    }
    output {
        elasticsearch {
            hosts => "127.0.0.1:9200"
            manage_template => false
            index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
            document_type => "%{[@metadata][type]}"
        }
    }
    filter {
        grok {
            pattern_definitions => {
                "RESPPATTERN" => "[0-9]{1,2} ms"
                "DATEPATTERN" => "[0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3}"
            }
            match => { "message" => "%{DATEPATTERN:timestamp1}%{GREEDYDATA:message}%{RESPPATTERN:response1}" }
        }
        mutate {
            add_field => { "timestamp2" => "%{timestamp1}" }
            add_field => { "response2" => "%{response1}" }
        }
        date {
            match => ["timestamp2", "yyyy-MM-dd HH:mm:ss,SSS"]
            target => "timestamp2"
        }
    }

I changed the field names in the mutate block, but I am still getting the _grokparsefailure and _dateparsefailure tags.

Ok, got it @Badger. Thanks.

I had to match the last part of the log message after "6 ms" as well.
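Roughly, the match ended up along these lines (the trailing `%{GREEDYDATA:rest}` capture is illustrative, not the exact field name I used):

    match => { "message" => "%{DATEPATTERN:timestamp1}%{GREEDYDATA:message}%{RESPPATTERN:response1}%{GREEDYDATA:rest}" }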

It is working now.
