Split output of fping properly with grok

Hello,

I am trying to split the output of fping with grok, but I am not sure how to do it properly.
This is the output I get from fping:
google.de : xmt/rcv/%loss = 16/16/0%, min/avg/max = 23.6/24.3/25.2

I only need the hostname (in this case google.de) and the values of loss, min, avg and max. So far I have extracted the hostname with the grok pattern "%{IPORHOST:target_host}", but I don't know how to split out the other values. Or is this not possible with this output?

Maybe someone can give me a hint or help me with how to do this properly.

If this isn't the correct forum, please move the topic.

Thank you.

Regards
hitman22

Hello there,

what you want to do does not sound difficult (if I got it right), but could you please post here (properly spaced and formatted) the output of the following pipeline first?

input {
  whatever your input is
}

filter {}

output {
  stdout{}
}

I'd like to see what your logs look like to Logstash. Thanks.

Hello,

thanks for your response. This is the output I get from Logstash:

{
       "command" => "/usr/sbin/fping -q -c 2 -B1 -r1 < /etc/logstash/fping.conf 2>&1",
          "tags" => [
        [0] "fping"
    ],
      "@version" => "1",
          "type" => "fping",
       "message" => "8.8.8.8 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 23.9/24.5/25.1\n8.8.4.4 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 25.8/26.0/26.1\n",
    "@timestamp" => 2020-03-16T16:15:05.163Z,
          "host" => "VILELK001"
}

This is what my config looks like for Logstash:

input {
    exec {
        command => "/usr/sbin/fping -q -c 2 -B1 -r1 < /etc/logstash/fping.conf 2>&1"
        interval => 10
        type => "fping"
        tags => [ "fping" ]
    }
}

filter {}

output {
  stdout{}
}

In this example I pinged 8.8.8.8 and 8.8.4.4; these two IP addresses are just for testing, but in the future I want to ping more hosts.

Thanks for your help.

Ok so, if that's your input, you can easily extract what you want with the following grok filter:

filter {
  grok {
    match => { "message" => "%{IPORHOST:test}.*loss = %{DATA}\/%{DATA}\/%{NUMBER:loss}.*= %{NUMBER:min}\/%{NUMBER:avg}\/%{NUMBER:max}" }
  }
}

NOTE: if you want the percentage symbol after loss, replace %{NUMBER:loss}.* with %{DATA:loss},.* (note the comma after the closing curly bracket).
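
For anyone who wants to sanity-check the pattern outside Logstash, here is a rough plain-Ruby regex sketch of the grok expression above, run against the sample line. The named captures mirror the grok field names; the character classes are my own approximations (%{NUMBER} as [\d.]+, %{IPORHOST} as \S+), not what grok compiles to internally:

```ruby
# Rough plain-regex equivalent of the grok pattern above (a sketch only).
line = "8.8.8.8 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 23.9/24.5/25.1"

pattern = %r{^(?<test>\S+)\s.*loss = \d+/\d+/(?<loss>\d+).*= (?<min>[\d.]+)/(?<avg>[\d.]+)/(?<max>[\d.]+)}

m = line.match(pattern)
puts m[:test]  # 8.8.8.8
puts m[:loss]  # 0
puts m[:min]   # 23.9
puts m[:avg]   # 24.5
puts m[:max]   # 25.1
```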

Obviously, you need to find a way to split your input so that Logstash processes multiple events.

If you don't find a way to do that and you end up with a single big line of logs, you can always use a Ruby filter to split it on the \n character, assign the result to an array field, then use the split filter on that field, and finally apply the grok to each individual event.

Something like the following:

input {
  generator { 
    count => 1
    lines => [ '8.8.8.8 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 23.9/24.5/25.1\n8.8.4.4 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 25.8/26.0/26.1\n' ]
  }
}

filter {
  ruby {
    code => "
      message = event.get('message')
      events = message.split('\n')
      event.set('events', events)
    "
  }

  split {
    field => "events"
  }

  grok {
    match => { "events" => "%{IPORHOST:test}.*loss = %{DATA}\/%{DATA}\/%{NUMBER:loss}%{DATA}= %{NUMBER:min}\/%{NUMBER:avg}\/%{NUMBER:max}" }
  }

  mutate {
    remove_field => ["message"]
  }

  mutate {
    rename => ["events", "message"]
  }
}

output {
  stdout{}
}

which outputs this:

{
           "avg" => "24.5",
      "sequence" => 0,
          "loss" => "0",
           "min" => "23.9",
    "@timestamp" => 2020-03-16T18:25:47.240Z,
           "max" => "25.1",
      "@version" => "1",
          "test" => "8.8.8.8",
       "message" => "8.8.8.8 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 23.9/24.5/25.1",
          "host" => "fabio"
}
{
           "avg" => "26.0",
      "sequence" => 0,
          "loss" => "0",
           "min" => "25.8",
    "@timestamp" => 2020-03-16T18:25:47.240Z,
           "max" => "26.1",
      "@version" => "1",
          "test" => "8.8.4.4",
       "message" => "8.8.4.4 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 25.8/26.0/26.1",
          "host" => "fabio"
}

NOTE: the input generator is simply there to simulate your input (according to what you posted).

Thanks for your help.

I have tested it now and when I change my Config to:

input {
    exec {
        command => "/usr/sbin/fping -q -c 2 -B1 -r1 < /etc/logstash/fping.conf 2>&1"
        interval => 10
        type => "fping"
        tags => [ "fping" ]
    }
}

filter {
  ruby {
    code => "
      message = event.get('message')
      events = message.split('\n')
      event.set('events', events)
    "
  }

  split {
    field => "events"
  }

  grok {
    match => { "events" => "%{IPORHOST:test}.*loss = %{DATA}\/%{DATA}\/%{NUMBER:loss}%{DATA}= %{NUMBER:min}\/%{NUMBER:avg}\/%{NUMBER:max}" }
  }

  mutate {
    remove_field => ["message"]
  }

  mutate {
    rename => ["events", "message"]
  }
}

output {
  stdout{}
}

then I only get this output for the first host, but not for the second one:

{
          "tags" => [
        [0] "fping"
    ],
           "avg" => "97.8",
           "max" => "114",
           "min" => "81.0",
       "command" => "/usr/sbin/fping -q -c 2 -B1 -r1 < /etc/logstash/fping.conf 2>&1",
       "message" => "8.8.8.8 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 81.4/97.0/112\n8.8.4.4 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 81.0/97.8/114\n",
      "@version" => "1",
          "host" => "VILELK001",
          "test" => "8.8.8.8",
          "loss" => "0",
    "@timestamp" => 2020-03-16T19:10:10.098Z,
          "type" => "fping"
}

When I do it with the input generator, the output looks like yours.
Or is there another error I have made? I still have to learn grok and so on, because I am new to this and am trying to understand how it works.

Edit: I got it to work now with this: events = message.split(/\n/)

Thanks.

Ah sure, in Ruby '\n' in single quotes is a literal backslash followed by n, while your data contains real newlines, which the regex /\n/ matches. That's why your version works. Silly me.
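
In case it helps future readers, here is the single-quote vs. regex behaviour in plain Ruby, using the sample message from above:

```ruby
# In Ruby, '\n' in single quotes is a literal backslash followed by "n",
# not a newline, so split('\n') finds nothing to split on. Double quotes
# or a regex match the real newline characters in the data.
msg = "8.8.8.8 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 23.9/24.5/25.1\n" \
      "8.8.4.4 : xmt/rcv/%loss = 2/2/0%, min/avg/max = 25.8/26.0/26.1\n"

puts msg.split('\n').length  # 1 -- no literal backslash-n in the data
puts msg.split("\n").length  # 2 -- splits on the real newlines
puts msg.split(/\n/).length  # 2 -- same, as in the fix above
```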

Anyway, the important thing is that the problem is solved. Please mark the post that solved it as the solution, so future readers will see the thread has been resolved.

Thanks