Parsing string and assigning fields

How would I use a logstash filter to parse the message:

0.00 pool 3041

and assign fields CPU_USAGE, PROCESS_NAME, and PROCESS_ID to it? For example

CPU_USAGE: 0.00 (type float)
PROCESS_NAME: pool (type string)
PROCESS_ID: 3041 (type float)

Here's an untested example that assumes the string you want to parse is stored in the message field, and that the process id is an integer rather than a float:

grok {
  match => {
    "message" => "^%{NUMBER:CPU_USAGE:float} %{NOTSPACE:PROCESS_NAME} %{INT:PROCESS_ID:int}$"
  }
}
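
If it doesn't behave as expected, a quick way to test the pattern in isolation is a minimal pipeline that reads sample lines from stdin and prints the parsed event. This is just a sketch; the test.conf filename is a placeholder:

# Minimal test pipeline: feed sample lines via stdin and
# print the parsed event with the rubydebug codec.
input {
  stdin { }
}

filter {
  grok {
    match => {
      "message" => "^%{NUMBER:CPU_USAGE:float} %{NOTSPACE:PROCESS_NAME} %{INT:PROCESS_ID:int}$"
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

Run it with bin/logstash -f test.conf, type 0.00 pool 3041, and the CPU_USAGE, PROCESS_NAME, and PROCESS_ID fields should show up in the rubydebug output.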

That gives a _grokparsefailure, with none of the newly created fields showing up.

Please add a stdout { codec => rubydebug } output and show the results.

Here's one parsed event:

{
       "message" => "0.00               colord              2924",
      "@version" => "1",
    "@timestamp" => "2016-06-28T13:34:37.553Z",
          "beat" => {
        "hostname" =>************,
            "name" => ************
    },
        "source" => *********,
    "input_type" => "log",
         "count" => 1,
        "fields" => nil,
        "offset" => 806707,
          "type" => "log",
          "host" => **********
          "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ]
}

And here's the configuration I'm using:

input {
    beats {
        port => 5044
    }
}

filter {
    grok {
      match => {
        "message" => "^%{NUMBER:CPU_USAGE:float} %{NOTSPACE:PROCESS_NAME} %{INT:PROCESS_ID:int}$"
      }
    }

}

output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
    }
}

Looking at my original post, the actual message contains a lot more whitespace between the fields, like:

"0.00_________pool_________3041"

with the underscores representing spaces.

Stripping the extra whitespace in the log generator made it work. Thanks for the help!

You could of course also have adjusted the grok expression to accept more than one space.
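
For example, replacing the literal single spaces in the pattern with \s+ (again untested, just a sketch) would tolerate any run of whitespace between the fields:

grok {
  match => {
    # \s+ matches one or more whitespace characters between the fields
    "message" => "^%{NUMBER:CPU_USAGE:float}\s+%{NOTSPACE:PROCESS_NAME}\s+%{INT:PROCESS_ID:int}$"
  }
}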

This shows why it's crucial to pay attention to formatting details when posting questions. Formatting text as code is usually enough.