Grok filter for Logstash/Filebeat

I'm trying to extract fields from my logs using grok, but it doesn't work.

Here's my grok filter:

filter {
  # parse the CSV structure generated from the log file into fields
  grok {
    match => { "message" => "%{GREEDYDATA:Job_Name} %{HAPROXYTIME:ElapsedRenderTime} %{NUMBER:FrameCount} %{WORD:FrameList} %{WORD:Render_Type} %{WORD:UserSubmitter} %{HAPROXYTIME:TotalTaskRenderTime} %{HAPROXYTIME:JobAverageFrameRenderTime} %{DATESTAMP:SubmitDate} %{DATESTAMP:CompletedDate} %{WORD:MachineSubmitted} %{NUMBER:JobPriority} %{WORD:Job_ID}" }
  }
}

Here's my logfile:

Job_Name: TESTj
ElapsedRenderTime: 00:00:27.8370000
FrameCount: 1
FrameList: 21
Render_Type: AfterEffects
UserSubmitter: user1
TotalTaskRenderTime: 00:00:23.2210000
JobAverageFrameRenderTime: 00d:00h:00m:23s
SubmitDate: 01/24/2020 09:56:17
CompletedDate: 01/24/2020 09:56:47
MachineSubmitted: D1-RN-XX01
JobPriority: 50
Job_ID: 5e2b2fc1dddcae24084784b7

I've also tried %{WORD:Job_Name}, but the result is incorrect: the field contains "Job_Name" when it should have been "TESTj".

I would have grok match that against a list of patterns. If you are using a multiline filter to combine the log file into a single event, you will need break_on_match => false.

grok {
    match => {
        "message" => [
            "Job_Name: %{WORD:Job_Name}",
            "ElapsedRenderTime: %{HAPROXYTIME:ElapsedRenderTime}",
            "FrameCount: %{NUMBER:FrameCount}",
            ...
        ]
    }
}
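As for combining the log file into a single event: that can be done either in Filebeat or on the Logstash side. A rough sketch of the Logstash approach, using the multiline codec on the file input (the path below is just a placeholder), would be

input {
  file {
    path => "/path/to/render_logs/*.txt"   # placeholder path
    start_position => "beginning"
    codec => multiline {
      pattern => "^Job_Name:"   # a new record starts at the Job_Name line
      negate => true            # lines NOT matching the pattern...
      what => "previous"        # ...are appended to the previous event
    }
  }
}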

I am not convinced that 00:00:27.8370000 matches the HAPROXYTIME pattern.
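If it does not, one fallback (an untested sketch) is to spell the time out with the basic HOUR, MINUTE and SECOND patterns inside a custom capture, since SECOND already allows a fractional part. That entry in the list would then look like

"ElapsedRenderTime: (?<ElapsedRenderTime>%{HOUR}:%{MINUTE}:%{SECOND})"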

I would recommend trying kv filter instead of grok.

Would "break_on_match" go under the file section or the filter section? I have used the online Grok Debugger, and HAPROXYTIME is what it showed for the ElapsedRenderTime field.

How would I use the kv filter? Thanks.

You would add

break_on_match => false

to the grok filter.
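As for the kv filter: since each line of the multiline event is a "Key: value" pair, a rough sketch (untested against this exact log) would be

filter {
  kv {
    field_split => "\n"   # one key/value pair per line of the multiline event
    value_split => ":"    # split each line at the colon
    trim_key => " "       # drop stray spaces around the key
    trim_value => " "     # drop the space after the colon
  }
}

Note that the timestamp values also contain colons, so check whether they come through intact; if not, stay with grok for those fields.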

Does it require a pattern directory?

No, it does not.

filter {
  # parse the CSV structure generated from the log file into fields

  break_on_match => false
  grok {
    match => { "message" => [ "Job_Name: %{WORD:Job_Name}",
                              "ElapsedRenderTime: %{HAPROXYTIME:ElapsedRenderTime}",
                              "FrameCount: %{NUMBER:FrameCount}",
                              "FrameList: %{WORD:FrameList}",
                              "Render_Type: %{WORD:Render_Type}",
                              "UserSubmitter: %{WORD:UserSubmitter}",
                              "TotalRenderTime: %{HAPROXYTIME:TotalRenderTime}",
                              "JobAverageFrameRenderTime: %{HAPROXYTIME:JobAverageFrameRenderTime}",
                              "SubmitDate: %{TIMESTAMP:SubmitDate}",
                              "CompletedDate: %{TIMESTAMP:CompletedDate}",
                              "MachineSubmitted: %{WORD:MachineSubmitted}",
                              "JobPriority: %{NUMBER:JobPriority}",
                              "Job_ID: %{WORD:Job_ID}" ] }
  }
}

Here's my grok filter. Running the Logstash service shows it starting successfully, but then it shuts down. I'm also running Filebeat, but logstash.yml is disabled.
I still don't see the result I'm looking for.

The break_on_match option has to be inside the grok filter.
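That is, roughly this shape (shortened to two of your patterns):

filter {
  grok {
    break_on_match => false
    match => { "message" => [ "Job_Name: %{WORD:Job_Name}",
                              "FrameCount: %{NUMBER:FrameCount}" ] }
  }
}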

Here's my complete config:

input {
  file {
    path => "\d1motion-fs\kibana_logs\render_logs*.txt"
    start_position => "beginning"
  }
}

filter {
  grok {
    break_on_match => false
    match => { "message" => [ "Job_Name: %{WORD:Job_Name}",
                              "ElapsedRenderTime: %{HAPROXYTIME:ElapsedRenderTime}",
                              "FrameCount: %{NUMBER:FrameCount}",
                              "FrameList: %{WORD:FrameList}",
                              "Render_Type: %{WORD:Render_Type}",
                              "UserSubmitter: %{WORD:UserSubmitter}",
                              "TotalRenderTime: %{HAPROXYTIME:TotalRenderTime}",
                              "JobAverageFrameRenderTime: %{HAPROXYTIME:JobAverageFrameRenderTime}",
                              "SubmitDate: %{TIMESTAMP:SubmitDate}",
                              "CompletedDate: %{TIMESTAMP:CompletedDate}",
                              "MachineSubmitted: %{WORD:MachineSubmitted}",
                              "JobPriority: %{NUMBER:JobPriority}",
                              "Job_ID: %{WORD:Job_ID}" ] }
  }
}

output {
  elasticsearch {
    hosts => ["D1motion-fs:9200"]
  }
}

Please let me know if there's any error in this config. I'm still learning this.

Do not use backslash in the path option of a file input. Use forward slash.

It's a Windows platform. I have tried //, but it doesn't grab my files from that location.

It's actually this:

path => "\\d1motion-fs\kibana_logs\render_logs\*.txt"

As I said, do not use backslash. Try

path => "/d1motion-fs/kibana_logs/render_logs/*.txt"

Changed path to:
path => "//d1motion-fs/kibana_logs/render_logs/*.txt"

I'll run some tests and post the results, thanks.

Here's my result:

log.flags multiline
log.offset 0
message Job_Name: TESTq
ElapsedRenderTime: 00:00:28.8190000
FrameCount: 1
FrameList: 8
Render_Type: AfterEffects
UserSubmitter: gssuboc
TotalTaskRenderTime: 00:00:24.3090000
JobAverageFrameRenderTime: 00d:00h:00m:24s
SubmitDate: 01/27/2020 10:34:42
CompletedDate: 01/27/2020 10:35:12
MachineSubmitted: D1-RN-XX01
JobPriority: 50

The fields are still all inside the "message" field; they have not been separated out.

What do you get when you use

output { stdout { codec => rubydebug } }

?

Same result as above.