Grok succeeds in the debugger and through input { stdin }, but with this config file I'm getting _grokparsefailure:

    input {
      file {
        path => "C:/ELK/LocalLogs/*"
        start_position => "beginning"
      }
    }
    filter {
      grok {
        match => { "message" => ["%{GREEDYDATA:first}%{DATE_US:date_}-%{TIME:time_}%{GREEDYDATA:last}"] }
      }

      date {
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
      }
    }

    output {
      elasticsearch {
        index => "tst-%{+YYYY.MM.dd}"
        hosts => ["localhost:9200"]
      }
      stdout { codec => rubydebug }
    }


This is an example of the input:
library!WindowsService_3!2ec0!10/25/2020-00:00:26:: i INFO: Schedule ace8b126-f566-4324-aa48-a6123f81f28f executed at 10/21/2020 00:00:02.

While it works with input { stdin } and on any grok debugger, it fails on the output to Elasticsearch.
Please help. What am I doing wrong?

Hopefully this will help somebody...
So I opened the log file with Notepad++, and in the bottom right corner it shows the type of encoding (attached).

Then I went to this link: https://www.elastic.co/guide/en/logstash/current/plugins-codecs-plain.html

and found the closest name in the list, which is "UCS-2BE".
Then I added this line to the input { file { ... } } block:

    codec => plain { charset => "UCS-2BE" }
Now it works.
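
For anyone hitting the same issue, this is a minimal sketch of what the input block ends up looking like with that codec line in place (same path and start_position as in the config above):

    input {
      file {
        path => "C:/ELK/LocalLogs/*"
        start_position => "beginning"
        codec => plain { charset => "UCS-2BE" }
      }
    }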



In general, if you have a 2-byte little-endian encoding and tell the codec to use a 2-byte big-endian encoding, I would not expect it to work. My guess is that because the file has a byte order mark (BOM), the codec works in UCS and decides whether to use BE or LE based on the BOM.
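
If the file really is little-endian, as that suggests, then a charset that names the byte order explicitly should also work. This is an assumption based on the reply above, not something verified in this thread:

    input {
      file {
        path => "C:/ELK/LocalLogs/*"
        start_position => "beginning"
        # assumption: UTF-16LE matches the little-endian encoding Notepad++ reported
        codec => plain { charset => "UTF-16LE" }
      }
    }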
