Grok - Integer data not being passed into fields when using :int

I've been using https://grokdebug.herokuapp.com/ to debug and learn grok patterns.

Input:
10-07-19 14:06:00.025 SomeString 2963867553 3028425000 205809770 209513296 2298 2292 8 37 90 1562792747810 1562791897820 1562792701362 1562791807023

Pattern:
%{TIMESTAMP_ISO8601:logdate:} %{NOTSPACE:svc} %{INT:thisROPbytes:int} %{INT:lastROPbytes:int} %{INT:thisROPbytesTX:int} %{INT:lastROPbytesTX:int} %{INT:thisROPfiles:int} %{INT:lastROPfiles:int} %{INT:thisROPfailed:int} %{INT:lastROPfailed:int} %{INT:ROPcollectTime:int} %{INT:thisROPEndTime:int} %{INT:lastROPEndTime:int} %{INT:thisROPStartTime:int} %{INT:lastROPStartTime:int}

All of the fields defined with :int come back empty, but when I take the :int off the end of each definition, the fields populate with the correct values. Can anyone explain what's going on?

Additionally, is something like %{INT:data:int} redundant if I want the value indexed as an integer in Elasticsearch? Is the :int cast mainly intended for cases like %{NUMBER:data:int}? I want to aggregate on this data later, so it's important that it isn't stored as a string.

I'm new to Logstash/grok so any help/explanation would be greatly appreciated.

Your pattern looks good to me. With this configuration

input { generator { count => 1 lines => [ '10-07-19 14:06:00.025 SomeString 2963867553 3028425000 205809770 209513296 2298 2292 8 37 90 1562792747810 1562791897820 1562792701362 1562791807023' ] } }

filter {
    grok { match => { "message" => "%{TIMESTAMP_ISO8601:logdate:} %{NOTSPACE:svc} %{INT:thisROPbytes:int} %{INT:lastROPbytes:int} %{INT:thisROPbytesTX:int} %{INT:lastROPbytesTX:int} %{INT:thisROPfiles:int} %{INT:lastROPfiles:int} %{INT:thisROPfailed:int} %{INT:lastROPfailed:int} %{INT:ROPcollectTime:int} %{INT:thisROPEndTime:int} %{INT:lastROPEndTime:int} %{INT:thisROPStartTime:int} %{INT:lastROPStartTime:int}" } }
}
output { stdout { codec => rubydebug { metadata => false } } }

I get

    "thisROPbytes" => 2963867553,
    "thisROPfiles" => 2298,
  "ROPcollectTime" => 90,
             "svc" => "SomeString",
   "thisROPfailed" => 8,
"lastROPStartTime" => 1562791807023,
        "sequence" => 0,
    "lastROPbytes" => 3028425000,
  "lastROPbytesTX" => 209513296,
  "thisROPEndTime" => 1562792747810,
    "lastROPfiles" => 2292,
   "lastROPfailed" => 37,
  "lastROPEndTime" => 1562791897820,

etc., and there are no quotes around the numbers, so they are integers, not strings.

So what is the actual configuration and data you are testing with?
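As a side note on the redundancy question: grok's inline casts currently support only int and float, and %{INT:data:int} is not redundant, since without the cast the captured value is still a string even though the INT pattern only matches digits. If you ever need to convert after the match instead (for example when the pattern lives in a shared pattern file without casts), a mutate filter does the same job. A minimal sketch covering two of your fields (extend the list as needed):

```
filter {
  # Equivalent to the :int casts in the grok pattern:
  # convert the captured strings to integers after the match.
  mutate {
    convert => {
      "thisROPbytes" => "integer"
      "lastROPbytes" => "integer"
    }
  }
}
```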


I guess something must be up with the debugging application I've been using. It just doesn't seem to like to spit out integers as field values.

Configuration:

input {
  beats {
    port => 5044 # listening for logs in the same format as the previous post
  }
}

filter {
  if [message] =~ /=FileCollectionInstrumentation/ {

    mutate { add_field => { "[@metadata][ddctype]" => "pmservinstr" } }

    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:logdate:} %{DATA:svc} %{INT:thisROPbytes:int} %{INT:lastROPbytes:int} %{INT:thisROPbytesTX:int} %{INT:lastROPbytesTX:int} %{INT:thisROPfiles:int} %{INT:lastROPfiles:int} %{INT:thisROPfailed:int} %{INT:lastROPfailed:int} %{INT:ROPcollectTime:int} %{INT:thisROPEndTime:int} %{INT:lastROPEndTime:int} %{INT:thisROPStartTime:int} %{INT:lastROPStartTime:int}" }
    }

    date {
      match => [ "logdate", "dd-MM-YY HH:mm:ss.SSS" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["someIP"]
    index => "%{[@metadata][ddctype]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

It seems to work with Kibana just fine. The graphs are being populated with the data. It was just odd to me that the values weren't showing up in the debugger.
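In case it helps anyone else checking the same thing: you can confirm what type Elasticsearch actually mapped the fields as by requesting the index mapping (the index pattern here just matches what my config above produces):

```
GET pmservinstr-*/_mapping
```

Dynamically mapped integers show up as long; if a field appears as text or keyword, the value was indexed as a string.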

Thanks for your help!