Grok works (no grok parse failure) but doesn't create the fields

Hi, I have a field called "if_speed_in_out" that contains strings like "100 Mbps:100 Mbps". I have tested the grok pattern in the Kibana Dev Tools and it works, but it never creates the fields defined in the grok; I just get the original field.

filter {
     grok {
         match => { "if_speed_in_out" => "%{DATA:if_speed_in} %{DATA:if_speed_unit_in}:%{DATA:if_speed_out} %{GREEDYDATA:if_speed_unit_out}" }
     }
}

the output

"if_speed_in_out":"100 Mbps:100 Mbps"

Any ideas on what's going on?

Does the input really have unbalanced " quotes?

Your sample has 2 leading quotes and 1 trailing quote:

""100 Mbps:100 Mbps"

Did you already parse into that field before?

Your output has unbalanced quotes as well?

Also, just use DATA for your last field, if_speed_unit_out.

I just tested this pattern

"%{DATA:if_speed_in} %{DATA:if_speed_unit_in}:%{DATA:if_speed_out} %{DATA:if_speed_unit_out}"

With this data

"100 MBS:256 MBS"

Looks good; perhaps you have some other issue in your Logstash conf.
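If you want to reproduce the test outside Kibana, here is a minimal sketch of a standalone pipeline (hypothetical file name test-grok.conf, run with bin/logstash -f test-grok.conf; note the stdin input puts each line in the message field, not if_speed_in_out):

input { stdin {} }
filter {
    grok {
        match => { "message" => "%{DATA:if_speed_in} %{DATA:if_speed_unit_in}:%{DATA:if_speed_out} %{DATA:if_speed_unit_out}" }
    }
}
output { stdout { codec => rubydebug } }

Paste 100 Mbps:100 Mbps on stdin and check whether the four fields show up in the rubydebug output.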


Hi @stephenb, sorry about the extra quotes, that was a typo. The pipeline is for testing, so it's really simple. I get the data from the DB and I don't see any problem with it:

input {
    jdbc {
        jdbc_connection_string => "connection-string"
        jdbc_user => "myuser"
        jdbc_password => "mypass"
        jdbc_driver_class => "Java::com.sybase.jdbc4.jdbc.SybDriver"
        jdbc_default_timezone => "America/Lima"
        statement => "SELECT [Node Name] as node_name, [Interface Speed (In:Out)] as if_speed_in_out
                FROM InterfaceMetrics;"
    }
}
filter {

     grok {
         match => { "if_speed_in_out" => "%{DATA:if_speed_in} %{DATA:if_speed_unit_in}:%{DATA:if_speed_out} %{GREEDYDATA:if_speed_unit_out}" }
     }

     if [if_speed_unit_in] == [if_speed_unit_out] {
          grok { match => [ "if_speed_unit_in" , "%{DATA:if_speed_unit}"]}
          mutate {
             remove_field => [ "if_speed_unit_in", "if_speed_unit_out" ]
          }
     }

     if [if_speed_in] == [if_speed_out] {
          grok { match => [ "if_speed_in" , "%{DATA:if_speed}"]}
          mutate {
             remove_field => [ "if_speed_in", "if_speed_out" ]
          }
     }
}
output {
   file { path => "/etc/logstash/conf.d/test_deleteme.json" codec => json_lines }
}

You don't need those 2nd and 3rd groks; just use mutate with copy, which is much more efficient.
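For example, a sketch assuming the fields from your first grok already exist (mutate's copy option duplicates a field without re-running a pattern match, and remove_field is applied afterwards, so both can live in one mutate):

     if [if_speed_unit_in] == [if_speed_unit_out] {
          mutate {
             copy => { "if_speed_unit_in" => "if_speed_unit" }
             remove_field => [ "if_speed_unit_in", "if_speed_unit_out" ]
          }
     }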

What does the output doc look like?


this is the output:

{"@version":"1","if_speed_in_out":"1 Gbps:1 Gbps","node_name":"LMmp-02","@timestamp":"2021-04-05T14:37:23.829Z"}
{"@version":"1","if_speed_in_out":"1 Gbps:1 Gbps","node_name":"ANAGG-01","@timestamp":"2021-04-05T14:37:23.831Z"}
{"@version":"1","if_speed_in_out":"0 bps:0 bps","node_name":"C01-Default","@timestamp":"2021-04-05T14:37:23.834Z"}
{"@version":"1","if_speed_in_out":"1.41 Gbps:1.41 Gbps","node_name":"CORE-O1(2)","@timestamp":"2021-04-05T14:37:23.836Z"}

Try this

filter {

  dissect {
    mapping => {
      "if_speed_in_out" => "%{if_speed_in} %{if_speed_unit_in}:%{if_speed_out} %{if_speed_unit_out}"
    }
  }

  if [if_speed_unit_in] == [if_speed_unit_out] {
    mutate {
      add_field => { "if_speed_unit" => "%{if_speed_unit_in}" }
      remove_field => [ "if_speed_unit_in", "if_speed_unit_out" ]
    }
  }

  if [if_speed_in] == [if_speed_out] {
    mutate {
      add_field => { "if_speed" => "%{if_speed_in}" }
      remove_field => [ "if_speed_in", "if_speed_out" ]
    }
  }
}

I took your output above and ran it through, so as long as the input fields are there it should work.

Sample output

{
         "@timestamp" => 2021-04-05T14:37:23.831Z,
           "if_speed" => "1",
          "node_name" => "ANAGG-01",
               "path" => "/Users/sbrown/workspace/elastic-install/7.12.0/logstash-7.12.0/test.json",
    "if_speed_in_out" => "1 Gbps:1 Gbps",
               "host" => "ceres",
      "if_speed_unit" => "Gbps",
           "@version" => "1"
}

Thanks @stephenb, it works nicely. :clap: :clap: :clap:
