Logstash - Grok and stdout -> rubydebug doesn't work

Hi everyone,

I've been spending days trying to get ELK + Filebeat working with my grok filter.
I installed ELK + Filebeat, then defined logstash.conf:

input {
    file {
        path => "D:\ElisticLogs\MyLog.log"
        start_position => "beginning"
    }
}
filter {
    grok {
        match => { "message" => "(?<test>.*)" }
    }
}

output {
	stdout { codec => rubydebug }
	elasticsearch { hosts => ["localhost:9200"] }
}

Everything seems to be working, but there are no rubydebug messages in the Logstash console, and the grok filter doesn't create my field.

Logstash log:

[2019-11-05T15:31:13,280][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-11-05T15:31:13,311][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.2"}
[2019-11-05T15:31:14,998][INFO ][org.reflections.Reflections] Reflections took 47 ms to scan 1 urls, producing 20 keys and 40 values
[2019-11-05T15:31:15,780][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-11-05T15:31:15,780][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0x1ea9b1e1 run>"}
[2019-11-05T15:31:16,139][INFO ][logstash.inputs.beats ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
[2019-11-05T15:31:16,155][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-11-05T15:31:16,217][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-11-05T15:31:16,233][INFO ][org.logstash.beats.Server][main] Starting server on port: 5043
[2019-11-05T15:31:16,514][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Stack version info:
logstash-7.4.2
kibana-7.4.2
elasticsearch-7.4.2
filebeat-7.4.2

Do you have any idea why this might be happening?

Your named-capture pattern seems a bit unconventional. Try replacing it with:

match => { "message" => "%{GREEDYDATA:test}"}

This just puts your entire message field in a new field called "test".

Here's a good beginner's guide for GROK: https://logz.io/blog/logstash-grok/
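In context, the whole filter block would look like this (a minimal sketch using the same "message" and "test" field names from your config):

filter {
    grok {
        # %{GREEDYDATA:test} matches the rest of the line (it is the
        # built-in pattern for .*) and stores it in a field named "test"
        match => { "message" => "%{GREEDYDATA:test}" }
    }
}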

Hi St3inbeiss,

Thanks for the fast response!
I believe I tried that before and it didn't work out for me. The field did not show up in Kibana, and Logstash didn't log anything about matching patterns.

Do not use backslashes in the path option of a file input; use forward slashes.
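For example, with the path from your config (this is just your original input block with the slashes flipped):

input {
    file {
        # forward slashes work for Windows paths in the file input
        path => "D:/ElisticLogs/MyLog.log"
        start_position => "beginning"
    }
}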

YOU ARE MY HERO, @Badger .

So simple and so unexpected!
Now it works just as shown in so many pages and tutorials.

{
          "test" => "  Distributing 3 actions to XGE\r",
          "path" => "D:/ElisticLogs/MyLog.log",
    "@timestamp" => 2019-11-05T16:24:16.252Z,
          "host" => "qqq",
       "message" => "  Distributing 3 actions to XGE\r",
      "@version" => "1"
}
{
          "test" => "  --------------------Project: Default-------------------------------------------\r",
          "path" => "D:/ElisticLogs/MyLog.log",
    "@timestamp" => 2019-11-05T16:24:16.254Z,
          "host" => "qqq",
       "message" => "  --------------------Project: Default-------------------------------------------\r",
      "@version" => "1"
}

Thank you!

You've made my day :smiley:

@Badger Maybe you also know how to make my field from the grok filter show up in the Kibana log view?
Or should it appear automatically, which would mean something is misconfigured on my side?

It should show up automatically.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.