Grok filter with file input

Hi,
I have a set of logs that are written to .log files, and I want to index them in Elasticsearch using Logstash.
The log file contains many lines, and each line looks like the following:

2018-09-26 17:05:34,060 INFO: ProcessName: ProcessX, File: XFile.txt

How can I insert this into an Elasticsearch index using the file input plugin for Logstash, so that each line becomes a document like this:
{
  "timestamp": "2018-09-26 17:05:34,060",
  "level": "INFO",
  "process_name": "ProcessX",
  "File": "XFile.txt"
}

I finally found the solution!
%{DATESTAMP:timestamp} %{GREEDYDATA:level}: ProcessName: %{GREEDYDATA:process}, File: %{GREEDYDATA:file_name}\r
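
For completeness, here is a minimal pipeline sketch showing where that pattern goes; the log path, Elasticsearch host, and index name are just placeholders for my setup:

input {
  file {
    path => "C:/path/to/logs/*.log"      # placeholder; forward slashes also work on Windows
    start_position => "beginning"
    sincedb_path => "NUL"                # Windows equivalent of /dev/null, so files are re-read on every run
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{GREEDYDATA:level}: ProcessName: %{GREEDYDATA:process}, File: %{GREEDYDATA:file_name}\r" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # placeholder host
    index => "process-logs"              # placeholder index name
  }
}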

Try not to use GREEDYDATA so much, as it's very expensive and can cause performance problems.
Use more specific patterns like NOTSPACE or WORD instead.
Be sure to use the Grok Debugger to help you figure out the simplest pattern:

https://grokdebug.herokuapp.com/

Try this:
%{DATESTAMP:timestamp} %{WORD:Level}: ProcessName: %{WORD:ProcessName}, File: %{JAVAFILE:Filename}
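
If you also want the event's @timestamp to come from the log line rather than the ingest time, you can add a date filter after the grok. A sketch, assuming the field names above (adjust them to whatever you settle on):

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{WORD:Level}: ProcessName: %{WORD:ProcessName}, File: %{JAVAFILE:Filename}" }
  }
  date {
    # parses "2018-09-26 17:05:34,060" into @timestamp
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}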


Thank you for the advice!
This seems to be working, but I have one problem: my actual file field is a path to the file, like:
"C:\path\to\file.txt"
Is there any pattern to handle the path?

Yep! It's called %{WINPATH}

🙂

You can find all the patterns here:
https://grokdebug.herokuapp.com/patterns#
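
For example, to capture the full Windows path instead of just the file name, something along these lines should work (a sketch; rename the fields as you like):

%{DATESTAMP:timestamp} %{WORD:Level}: ProcessName: %{WORD:ProcessName}, File: %{WINPATH:Filename}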

