I'm trying to parse an error log, but my date filter keeps failing.
Example log entry:
Wed Nov 28 02:16:24.654 [MCSS:7497:0x7F17018E7700]: Warning: invalid cid (-1) in Msg::receive for unknown ip, with msg id (0x2b0300)
I want the date to go into @timestamp, and if that works, I also want the bracketed text [MCSS:7497:0x7F17018E7700]: filtered out as useless.
Below is my configuration:
input {
  file {
    path => "C:/cygwin64/home/Logstashfiles/IAC/iqs.log"
    type => "iqs.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP:timestamp} %{GREEDYDATA:useless} %{GREEDYDATA:message}" }
  }
  date {
    match => [ "timestamp", "EEE MMM dd HH:mm:ss.SSS" ]
  }
}
output {
  stdout {}
}
Below is the error from the Logstash logs:
[2019-01-18T14:27:56,740][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-01-18T14:27:57,019][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x1db0ddad @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="c04f67adb828c4c42d8d6592d98b10597fb883c99f12c642fc64df949a440c61", @klass=LogStash::Filters::Grok, @metric_events=#LogStash::Instrument::NamespacedMetric:0xbecbfda, @filter=<LogStash::Filters::Grok match=>{"message"=>"%{TIMESTAMP:timestamp} %{GREEDYDATA:useless} %{GREEDYDATA:message}"}, id=>"c04f67adb828c4c42d8d6592d98b10597fb883c99f12c642fc64df949a440c61", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{TIMESTAMP:timestamp} not defined", :thread=>"#<Thread:0x2f64e4d1 run>"}
[2019-01-18T14:27:57,029][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{TIMESTAMP:timestamp} not defined>, :backtrace=>["C:/ELK/logstash-6.5.4/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "C:/ELK/logstash-6.5.4/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "C:/ELK/logstash-6.5.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "C:/ELK/logstash-6.5.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/ELK/logstash-6.5.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.4/lib/logstash/filters/grok.rb:270:in `register'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:595:in `maybe_setup_out_plugins'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "C:/ELK/logstash-6.5.4/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x2f64e4d1 run>"}
[2019-01-18T14:27:57,058][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2019-01-18T14:27:57,519][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
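For reference, here is an untested sketch of a filter block that should get past the "pattern %{TIMESTAMP:timestamp} not defined" error. TIMESTAMP is not one of the grok patterns shipped with Logstash, so the timestamp is assembled from the stock DAY, MONTH, MONTHDAY, and TIME patterns instead; the bracketed token is captured with DATA so it can be dropped, and overwrite replaces the original message field with just the trailing text (field names like "useless" are only placeholders):

filter {
  grok {
    # %{TIMESTAMP} does not exist; build the timestamp from stock patterns.
    match => { "message" => "(?<timestamp>%{DAY} %{MONTH} %{MONTHDAY} %{TIME}) \[%{DATA:useless}\]: %{GREEDYDATA:message}" }
    # Replace the original message with only the captured remainder.
    overwrite => [ "message" ]
  }
  date {
    # The date filter writes to @timestamp by default.
    match => [ "timestamp", "EEE MMM dd HH:mm:ss.SSS" ]
  }
  mutate {
    # Drop the scratch fields once @timestamp is set.
    remove_field => [ "useless", "timestamp" ]
  }
}

Note that the log line carries no year, so the date filter has to assume one; you may want to verify the resulting @timestamp on a sample event before relying on it.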