Doubt Regarding Data Extraction From Same Line - Logstash

Hi,
I am extracting information from a log and creating fields.

Sample Log Line: Keepalive set (10 sec) aliveset (9 sec)

Now how can I parse the above line to get two rows like
{
"parameter" => "Keepalive set",
"duration" => 10,
"@version" => "1",
"path" => "/home/sathya/Downloads/Vodafone/vodafone_log_1.txt",
"@timestamp" => 2019-03-13T10:46:40.570Z,
"host" => "sathya-OptiPlex-3050",
"message" => " Keepalive set (10 sec)",
"unit" => "sec"
}
{
"parameter" => "aliveset",
"duration" => 9,
"@version" => "1",
"path" => "/home/sathya/Downloads/Vodafone/vodafone_log_1.txt",
"@timestamp" => 2019-03-13T10:46:40.570Z,
"host" => "sathya-OptiPlex-3050",
"message" => " Keepalive set (10 sec)",
"unit" => "sec"
}

Note: I have already parsed the line "Keepalive set (10 sec)" to get a single JSON. How can I parse "Keepalive set (10 sec) aliveset (9 sec)" to get two JSONs?

It's ugly, but it works...

    # Capture both key/value pairs into [@metadata] so the temporary fields never reach the output
    dissect { mapping => { "message" => "%{[@metadata][k1]} (%{[@metadata][v1]} sec) %{[@metadata][k2]} (%{[@metadata][v2]} sec)" } }
    # Build an array with one { parameter, duration } hash per pair
    ruby {
        code => '
            event.set("a",
                [
                    { "parameter" => event.get("[@metadata][k1]"), "duration" => event.get("[@metadata][v1]") },
                    { "parameter" => event.get("[@metadata][k2]"), "duration" => event.get("[@metadata][v2]") }
                ]
            )
        '
    }
    # Split the array into two separate events
    split { field => "a" }
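(The split filter then clones the event once per element of the a array, so this single log line becomes two events, each carrying one of the { parameter, duration } hashes in the a field.)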

Thanks Badger,
It works

I have one more doubt:
My intention is to get the string "Queueing strategy" in the parameter field and "fifo" in the value field from the following sample.
Sample Line: Queueing strategy: fifo
Matching query:

    grok {
        match => { "message" => "\s+%{GREEDYDATA:parameter}:\s+%{GREEDYDATA:value}" }
    }

This works, but is this the correct way, or is there any other way to extract the info, like using metadata?
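(I assume a non-greedy variant like the sketch below would also match, given that each such line holds a single "key: value" pair:)

    grok {
        match => { "message" => "^\s*%{DATA:parameter}:\s+%{GREEDYDATA:value}" }
    }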

And when matching, is there any way to fit the original string?

Thanks
Sanriya

The reason I use [@metadata] is that there are fields for both the key and the value. I could have just used k1, v1, k2, v2, but then I would have four extra fields in the event. I could mutate+remove_field them, but fields in [@metadata] are automatically discarded when the event is output.
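For comparison, a sketch of that non-metadata alternative (not the version used above, just illustrating the trade-off):

    dissect { mapping => { "message" => "%{k1} (%{v1} sec) %{k2} (%{v2} sec)" } }
    ruby {
        code => '
            event.set("a",
                [
                    { "parameter" => event.get("k1"), "duration" => event.get("v1") },
                    { "parameter" => event.get("k2"), "duration" => event.get("v2") }
                ]
            )
        '
    }
    # The temporary fields now have to be removed by hand
    mutate { remove_field => [ "k1", "v1", "k2", "v2" ] }
    split { field => "a" }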

OK.
I am new to data transformation in Logstash, so I might ask some basic doubts.

I have a sample line given below:
Sample Line: x xxx xxxxx dddddd xxxxxx, dd xxxxx

From this sample line, how can I extract "input rate" into the parameter field and "dddddd" into the value field?

Thanks
Sanriya

    grok { match => { "message" => "^5 minute %{DATA:parameter} %{NUMBER:value:int} " } }

Thanks a lot Badger.
I was stuck between the GREEDYDATA and WORD semantics.

Thanks
Sanriya

Hi, below is my sample input log:

aa set (10 sec) ------> Block 1 starts here
aa type: aa, aa Timeout 04:00:00
aa input 100Mb, aa output 100Mb ------> Block 1 ends here
aa set (10 sec) ------> Block 2 starts here
aa type: aa, aa Timeout 04:00:00
aa input 100Mb, aa output 100Mb ------> Block 2 ends here
The system rebooted
aa set (10 sec) ------> Block 3 starts here
aa type: aa, aa Timeout 04:00:00
aa input 100Mb, aa output 100Mb ------> Block 3 ends here

How can I extract multiple blocks, each into a single JSON (i.e. a single row in Kibana)?

Note: I am able to extract it using multiline if only one block is available,
but I am stuck when multiple blocks are present in the same file!

Any ideas?

Thanks in advance

Regards
Sanriya

Generally, is there any way to ignore all the lines between two blocks [pattern-matching multiline]?

I used the multiline filter below.

    filter {
        multiline {
            pattern => "aa set"
            what => "previous"
            negate => true
            flush => "value"
        }
    }

I am trying to set the end line of a block, but it's not working.
Regards
Sanriya
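For reference, grouping blocks like these is usually done with the multiline codec on the input rather than with a filter. A sketch, assuming the file input used earlier in this thread, re-reading the file from the beginning for testing, and that every block starts with "aa set":

    input {
        file {
            path => "/home/sathya/Downloads/Vodafone/vodafone_log_1.txt"
            start_position => "beginning"
            sincedb_path => "/dev/null"
            codec => multiline {
                # Any line that does not match "aa set" is appended to the previous event,
                # so each "aa set" line starts a new event (one event per block)
                pattern => "aa set"
                negate => true
                what => "previous"
                # Flush the final block even though no further "aa set" line arrives
                auto_flush_interval => 2
            }
        }
    }

Note that with this grouping a stray line between blocks, such as "The system rebooted", gets appended to the preceding block's event, so it may still need to be dropped or stripped in a filter.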

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.