Yet another date parse problem (noob)

Hi, I want to send log lines to Elasticsearch with Logstash. I've been burning hours on this and can't get it right.

My log lines all look like:

yyyymmdd hh:mm:ss ev:1 rn:3
(for example: 20170312 14:03:55 ev:1 rn:5, and so on)

In my patterns_dir I defined a file, waxtimestamp, containing:
WAXTIMESTAMP %{YEAR}%{MONTHNUM}%{MONTHDAY} %{TIME}

In my filter.conf I have:

input {
  file {
    path => ["/var/log/wax/main.log"]
    type => "wax"
  }
}

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{WAXTIMESTAMP:datetime} %{GREEDYDATA:message}" }
  }
  date {
    match => { "datetime" => "yyyyMMdd HH:mm:ss" }
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["elastichost:9200"]
    index => "wax-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

What am I doing wrong?
In my logstash log I get:

[............]
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["./patterns"]
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{WAXTIMESTAMP:datetime} %{GREEDYDATA:message}"}
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@id = "5b5d9ebf8eb9cb9e2cd333d92e43b66b76587dab67f16ef87ac31e9906e94a76"
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@enable_metric = true
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_tag = []
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_tag = []
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_field = {}
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_field = []
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@periodic_flush = false
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@pattern_definitions = {}
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@break_on_match = true
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@named_captures_only = true
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@keep_empty_captures = false
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@timeout_millis = 30000
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2018-04-13T15:19:13,271][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@overwrite = []
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@match = {"datetime"=>"yyyyMMdd HH:mm:ss"}
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@target = "@timestamp"
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@id = "3988916c15c511ee38c321a06c6a3513310fb62327e17efd50461c75ce7c1226"
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@enable_metric = true
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_tag = []
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_tag = []
[2018-04-13T15:19:13,272][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_field = {}
[2018-04-13T15:19:13,273][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_field = []
[2018-04-13T15:19:13,273][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@periodic_flush = false
[2018-04-13T15:19:13,273][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-04-13T15:19:13,273][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register",

grok will successfully match that pattern against your example data, which makes me think the example is not typical. Can you copy and paste an example of a _grokparsefailure from the JSON tab in Kibana Discover?

Hi, thank you for your prompt response.
Logstash and Elasticsearch are running on a (non-graphics) Linux server; Kibana is running on my laptop.
Because of the error, nothing is put into Elasticsearch. The index mentioned above doesn't exist.
Every single line starts with yyyymmdd hh:mm:ss, so every single line fails.
I can't understand the error message:
(message=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register")
Is my Logstash missing something?
Is there another way to parse these data fields?
Thanks in advance.

That's not a very friendly error message, but the problem is in your date filter: there, match takes an array, not a hash. Change it to

match => [ "datetime", "yyyyMMdd HH:mm:ss" ]
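
In full, the corrected date filter (using the field and format already in your config) would read:

```
date {
  # match is an array of [ field, format, format, ... ], not a hash
  match => [ "datetime", "yyyyMMdd HH:mm:ss" ]
  target => "@timestamp"
}
```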

Hi Badger, you nailed it! Thanks a bundle. As you can see, this is all new to me.
Your help brought me further along this obstacle course, right up to:

[2018-04-16T14:11:14,397][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_WORDDASH"=>"\b[\w-]+\b"}
[2018-04-16T14:11:14,397][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_SEVERITY"=>"\w"}
[2018-04-16T14:11:14,397][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_COMPONENT"=>"%{WORD}|-"}
[2018-04-16T14:11:14,397][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{MONGO3_SEVERITY:severity} %{MONGO3_COMPONENT:component}%{SPACE}(?:\[%{DATA:context}\])? %{GREEDYDATA:message}"}
[2018-04-16T14:11:14,397][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x6ddd72d3 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="9cb4743ebb4326504e91a48cf9682dae4306bd3ad235fc864afa6cee19a73388", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x54465efc @metric=#<LogStash::Instrument::Metric:0x23c84e4 @collector=#<LogStash::Instrument::Collector:0x3ee50ae5 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x3c5d2c19 @store=#<Concurrent::map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x19353990, @fast_lookup=#<Concurrent::map:0x00000000000fb8 entries=73 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"9cb4743ebb4326504e91a48cf9682dae4306bd3ad235fc864afa6cee19a73388", :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>["./patterns"], match=>{"message"=>"%{WAXTIMESTAMP:datetime} %{GREEDYDATA:message}"}, id=>"9cb4743ebb4326504e91a48cf9682dae4306bd3ad235fc864afa6cee19a73388", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{WAXTIMESTAMP:datetime} not defined", :thread=>"#<Thread:0x742df586@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 run>"}
[2018-04-16T14:11:14,398][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{WAXTIMESTAMP:datetime} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:341:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:352:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:352:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:736:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:362:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:289:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:249:in `block in start'"], :thread=>"#<Thread:0x742df586@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 run>"}

I would be much obliged if you or anyone else could explain that one to me or, failing that, tell me how to get around it.

Is ./patterns really in the directory where logstash is running? Perhaps try an absolute path.
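
For example, if your pattern file lives somewhere like /etc/logstash/patterns (a hypothetical location; substitute wherever your waxtimestamp file actually is), the grok filter would become:

```
grok {
  # absolute path, so it does not depend on logstash's working directory
  patterns_dir => ["/etc/logstash/patterns"]
  match => { "message" => "%{WAXTIMESTAMP:datetime} %{GREEDYDATA:message}" }
}
```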

Well Badger, you nailed it again. My ./patterns was relative to the config/settings directory, which I erroneously thought was correct.
I see I need to read error messages more carefully (I didn't notice the "not defined" part in all that jazz, or I might have had an inkling) and re-study a lot of docs.
Bottom line: it works. Thank you very much.
Kudos to you.
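
For anyone hitting this thread later, the two fixes that got it working were: match as an array in the date filter, and an absolute patterns_dir (shown here with the example path /etc/logstash/patterns; use your own):

```
filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns"]   # absolute path
    match => { "message" => "%{WAXTIMESTAMP:datetime} %{GREEDYDATA:message}" }
  }
  date {
    match => [ "datetime", "yyyyMMdd HH:mm:ss" ]   # array, not hash
    target => "@timestamp"
  }
}
```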

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.