Grok in logstash.conf doesn't work!


(Swantje Biehler) #1

Hi everyone,

I need your help because I have to finish a project at my company.

I've got this log:

Recovery Manager: Release 12.1.0.2.0 - Production on Tue Apr 24 18:30:01 2018
Copyright (c) 1982, 2014, Oracle and/or its affiliates. All rights reserved.

xxx #privateinformationofthecompany

channel c1: backup set complete, elapsed time: 00:00:01
Finished backup at 24-APR-18

released channel: c1

Recovery Manager complete.

I want to extract the strings shown in italics in the log above:

  • Tue Apr 24 18:30:01 2018
    as date

  • 00:00:01
    as duration

  • Recovery Manager complete
    as Status

This is my logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  if [tag] == "CLD1" {
    grok {
      match => { "message" => ["\ARecovery Manager: Release 12.1.0.2.0 - Production on %{HTTPDERROR_DATE:timestamp}(?<linebreak>[\r\n]+)"] }
    }
    date {
      match => [ "timestamp", "E MMM dd HH:mm:ss yyyy" ]
    }
    #mutate {
    #  remove_field => ["day", "month", "monthday", "time", "year"]
    #}
  }
}

output {
  if [@metadata][beat] == "filebeat" {
    elasticsearch {
      hosts => ["localhost:9200"]
      user => "elastic"
      password => "elastic"
      sniffing => true
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}

No matter how I change the grok filter, no new fields appear in my Elasticsearch events.

Can somebody please help me?

Thank you very much!!


(Magnus Bäck) #2

if [tag] == "CLD1"{

Is this condition ever true? What does an example event produced by Logstash look like?
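To see what an event actually looks like, a temporary stdout output with the rubydebug codec can be added alongside the elasticsearch output (a debugging sketch, not part of the original config):

output {
  stdout { codec => rubydebug }
}

This prints every event with all of its fields, so you can check whether a [tag] field with the value "CLD1" exists at all. Note that Filebeat normally puts its tags into the [tags] array, in which case the condition would need to be written as: if "CLD1" in [tags].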


(Swantje Biehler) #3

Hello Magnus,

Can you speak German? Your name sounds like it...?

No, this condition is never true. There are more tags, but for now I just want to focus on this example.

The log above gets divided into these two events in Kibana.

Recovery Manager: Release 12.1.0.2.0 - Production on Fri May 4 09:30:01 2018

Copyright (c) 1982, 2014, Oracle and/or its affiliates. All rights reserved.

RMAN> connect target *
2> show all;
3> run
4> {
5> sql "alter system archive log current";
6> backup as compressed backupset archivelog all delete input format '/pfde-netapp5-v1/oracle/CLD1/rman/20180504_093001_%d_%s_al.bak';
7> }
8> run
9> {
10> allocate channel c1 type disk;
connected to target database:
...

using target database control file instead of recovery catalog....
CONFIGURE
.....
Starting backup at 04-MAY-18

....
Finished backup at 04-MAY-18

released channel: c1

Recovery Manager complete.

Fields: timestamp, version, id, index, score, type, beat.hostname, beat.name, beat.version, host, message (containing the text above), offset, prospector.type, source, tags, ...

Now I want to merge these two events into one and create new fields like 'date', 'status', and 'duration'.

Could someone please help me?


(Magnus Bäck) #4

Can you speak German?

I'm Swedish, but I do speak some German.

No, this condition isn't ever true.

Okay, but then it's pretty obvious why the filters aren't doing anything.

Now I want to merge these two events into one and create new fields like 'date', 'status', and 'duration'.

Parsing multi-line free-form text isn't one of Logstash's strengths. I suppose you'd have to use a multiline codec to merge the group of lines into a single event, but it'll be icky.
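With the beats input, the line merging is usually configured on the Filebeat side rather than with the Logstash multiline codec. A sketch, assuming Filebeat 6.x syntax and a hypothetical log path; every line that does not start a new RMAN report is appended to the previous event:

filebeat.prospectors:
- type: log
  paths:
    - /var/log/rman/*.log    # hypothetical path
  multiline.pattern: '^Recovery Manager: Release'
  multiline.negate: true
  multiline.match: after

With this, everything from one "Recovery Manager: Release ..." header up to the next header arrives in Logstash as a single event.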


(Swantje Biehler) #5

So it won't be possible to create new fields like status, date, duration, etc.?

But what can you use grok for, then?


(Magnus Bäck) #6

You can certainly use grok to extract parts of a string, but the trickier part is joining multiple physical lines into a smaller number of Logstash events.
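Once the lines are merged into one event, the three values could be extracted with a single grok filter; a sketch, assuming the whole RMAN report is in the [message] field:

filter {
  grok {
    # apply all three patterns instead of stopping at the first match
    break_on_match => false
    match => {
      "message" => [
        "Production on %{HTTPDERROR_DATE:timestamp}",
        "elapsed time: %{TIME:duration}",
        "(?<status>Recovery Manager complete)"
      ]
    }
  }
}

HTTPDERROR_DATE matches strings like "Tue Apr 24 18:30:01 2018" and TIME matches "00:00:01", so this would yield the timestamp, duration, and status fields asked for above.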


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.