# Grok filter with file input

(M. Alsioufi) #1

Hi
I have a set of logs that are written to .log files, and I want to index them in Elasticsearch using Logstash.
The log file contains many lines, and each line looks like the following:

2018-09-26 17:05:34,060 INFO: ProcessName: ProcessX, File: XFile.txt

How can I insert this into an Elasticsearch index using the file input plugin for Logstash, so that each event looks like:
{
  "timestamp": "2018-09-26 17:05:34,060",
  "level": "INFO",
  "process_name": "ProcessX",
  "file_name": "XFile.txt"
}

(M. Alsioufi) #2

I finally found the solution!
%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:level}: ProcessName: %{GREEDYDATA:process}, File: %{GREEDYDATA:file_name}\r
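For anyone else hitting this, here is the minimal pipeline I ended up with (the path, host, and index name are placeholders for my setup, adjust to yours):

```
input {
  file {
    path => "C:/logs/*.log"          # placeholder; point at your log directory
    start_position => "beginning"
    sincedb_path => "NUL"            # Windows equivalent of /dev/null
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:level}: ProcessName: %{GREEDYDATA:process}, File: %{GREEDYDATA:file_name}\r" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # placeholder host
    index => "process-logs"              # placeholder index name
  }
}
```

The trailing `\r` absorbs the carriage return that Windows line endings leave on the last field.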

(Andreas H) #3

Try not to use GREEDYDATA so much, as it is expensive and can cause performance problems.
Use tighter patterns like NOTSPACE or WORD instead.
The Grok Debugger will help you figure out the simplest pattern:

https://grokdebug.herokuapp.com/

Try this:
%{TIMESTAMP_ISO8601:timestamp} %{WORD:Level}: ProcessName: %{WORD:ProcessName}, File: %{JAVAFILE:Filename}
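In a full filter block, you would usually also add a date filter so the parsed timestamp becomes the event's @timestamp (the `yyyy-MM-dd HH:mm:ss,SSS` format matches your sample line):

```
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:Level}: ProcessName: %{WORD:ProcessName}, File: %{JAVAFILE:Filename}" }
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]   # parses the field into @timestamp
  }
}
```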

(M. Alsioufi) #4

This seems to be working, but I have one problem: my actual file field is a path to the file, like:
"C:\path\to\file.txt"
Is there a pattern to handle the path?

(Andreas H) #5

Yep! It's called %{WINPATH}

You can find all the patterns here:
https://grokdebug.herokuapp.com/patterns#
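With that, the pattern from my earlier reply becomes:

```
%{TIMESTAMP_ISO8601:timestamp} %{WORD:Level}: ProcessName: %{WORD:ProcessName}, File: %{WINPATH:Filename}
```

WINPATH matches a drive letter (or leading backslash) followed by backslash-separated segments, so it covers paths like C:\path\to\file.txt.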

(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.