I got an error while adding a custom pattern.

Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x1862b6c2 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="7a188c6f1442fcd91dc7a55c0f843241d501051f33a721000957ab12ed439529", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x5d3fcef2 @metric=#<LogStash::Instrument::Metric:0x16c2c399 @collector=#<LogStash::Instrument::Collector:0x7b679d78 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x763ee070 @store=#<Concurrent::map:0x00000000000fe4 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x12c0ecb7, @fast_lookup=#<Concurrent::map:0x00000000000fe8 entries=227 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"7a188c6f1442fcd91dc7a55c0f843241d501051f33a721000957ab12ed439529", :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>["E:\\\\logstash-6.2.2\\\\logASA.txt"], match=>{"message"=>["%{LOGASA113004}", "%{LOGASA111008}", "%{LOGASA111010}", "%{LOGASA502103}", "%{LOGASA605005}", "%{LOGASA611103}", "%{LOGASA113011}", "%{LOGASA737034}", "%{LOGASA722041}", "%{LOGASA734001}", "%{LOGASA113005}", "%{LOGASA111007}", "%{LOGASA106017}", "%{LOGASA711004}", "%{LOGASA717055}", "%{LOGASA733100}", "%{LOGASA313001}", "%{LOGASA113019}", "%{LOGASA722051}", "%{LOGASA722028}", "%{LOGASA722032}", "%{LOGASA722034}", "%{LOGASA313005}", "%{LOGASA419002}"]}, id=>"7a188c6f1442fcd91dc7a55c0f843241d501051f33a721000957ab12ed439529", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, 
tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{LOGASA113004} not defined", :thread=>"#<Thread:0x699314ab run>"}
[2018-04-26T13:41:01,444][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{LOGASA113004} not defined>, :backtrace=>["E:/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "E:/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "E:/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.2/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "E:/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.2/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "E:/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.2/lib/logstash/filters/grok.rb:270:in `register'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:341:in `register_plugin'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:352:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:352:in `register_plugins'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:736:in `maybe_setup_out_plugins'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:362:in `start_workers'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:289:in `run'", "E:/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:249:in `block in start'"], :thread=>"#<Thread:0x699314ab run>"}
[2018-04-26T13:41:01,485][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

Where is LOGASA113004 supposed to be defined? It doesn't appear to be a standard pattern.

I had defined it in a file pointed to by patterns_dir.

It looks like your patterns_dir is pointing to a file, not a directory.

@filter=<LogStash::Filters::Grok patterns_dir=>["E:\\\\logstash-6.2.2\\\\logASA.txt"]

You need to point to a directory. My suggestion would be to create a directory named patterns, put your pattern files there, and point the patterns_dir option to that directory.

patterns_dir=>["E:\\\\logstash-6.2.2\\\\patterns\\\\"]

I have tried that, but I am getting the same error.

Can you paste your grok filters and patterns file here?

logASA.txt:

LOGASA113004 %{LOGASAMSG} AAA user %{DATA:aaa_type} Successful \: server =  %{IP:server_ip_address} \: user = %{WORD:user}
LOGASA111008 %{LOGASAMSG} User %{WORD:user} executed the command string
LOGASA111010 %{LOGASAMSG} User %{WORD:username} , running %{DATA:application_name} from IP %{IP:ip_addres} , executed %{GREEDYDATA:cmd}
LOGASA605005 %{LOGASAMSG} Login permitted from %{IP:src_address}/%{NUMBER:src_port} to inside:%{IP:dst}/%{WORD:service} for user %{GREEDYDATA:user}
CISCOFW305012 Teardown %{CISCO_XLATE_TYPE:xlate_type} %{WORD:protocol} translation from %{DATA:src_interface}:%{IP:src_ip}/%{DATA:src_port} to %{DATA:src_xlated_interface}:%{IP:src_xlated_ip}/%{DATA:src_xlated_port} duration %{TIME:duration}
CISCOFW305011 %{CISCO_ACTION:edited_action} %{CISCO_XLATE_TYPE:edited_xlate_type} %{WORD:edited_protocol} translation from %{DATA:edited_src_interface}:%{IP:edited_src_ip}(/%{INT:edited_src_port})?(\(%{DATA:edited_src_fwuser}\))? to %{DATA:edited_src_xlated_interface}:%{IP:edited_src_xlated_ip}/%{WORD:edited_src_xlated_port}
LOGASAMSG %{SYSLOGTIMESTAMP:timestamp} %{IP:ip} \: %ASA-%{INT:severity_value}-%{INT:message_id}\:

conf grok:

filter {
  grok {
    patterns_dir => ["C:\\Users\\Desktop\\sand\\patterns.txt"]
    match => { "message" => ["%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW305011}",
                             "%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW305012}",
                             "%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW302013_302014_302015_302016}",
                             "%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW302020_302021}",
                             "%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW106015}",
                             "%{SYSLOGTIMESTAMP:timestamp} %{IP:host_ip} %{DATA:program}(?:\[%{POSINT:id}\])?: Error processing log message:%{DATA:error_processing} %{NUMBER:msg_id} %{WORD:log_id} %{DATA:msg} src=%{IP:src_ip}:%{INT:src_port} dst=%{IP:dst_ip}:%{INT:dst_port} mac=%{DATA:mac} request: %{GREEDYDATA:request}",
                             "%{SYSLOGTIMESTAMP:timestamp} %{IP:host_ip} %{DATA:program}(?:\[%{POSINT:id}\])?: Error processing log message:%{DATA:error_processing} %{NUMBER:msg_id} %{WORD:log_id} %{DATA:msg} src=%{IP:src_ip} dst=%{IP:dst_ip} mac=%{DATA:mac} protocol=%{WORD:protocol} sport=%{INT:sport} dport=%{INT:dport}"] }
  }
}
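For background on what "pattern %{LOGASA113004} not defined" means: at startup, grok expands every %{NAME} reference recursively against the definitions loaded from the standard patterns and your patterns_dir, so any name used in match or inside another pattern must exist in a loaded file. A toy sketch of that expansion in Python (hypothetical USERLINE/WORD names; not the real grok implementation):

```python
import re

# Toy sketch of grok's %{NAME} expansion (not the real implementation):
# each %{NAME} or %{NAME:field} reference is replaced by its definition,
# recursively, until only plain regex remains. Referencing a name missing
# from `defs` raises KeyError -- the analogue of "pattern not defined".
defs = {
    "WORD": r"\w+",
    # USERLINE references WORD, just as LOGASA113004 references LOGASAMSG
    "USERLINE": r"user = %{WORD:user}",
}

REF = re.compile(r"%\{(\w+)(?::(\w+))?\}")

def expand(pattern):
    def repl(m):
        name, field = m.group(1), m.group(2)
        body = expand(defs[name])  # KeyError here == "pattern not defined"
        return f"(?P<{field}>{body})" if field else f"(?:{body})"
    return REF.sub(repl, pattern)

rx = re.compile(expand("%{USERLINE}"))
print(rx.search("user = alice").group("user"))
```

The point being: the error fires during this expansion step, before any event is processed, so it is purely about which definitions got loaded.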

Is that the right grok filter? Where is the LOGASA113004 pattern that is giving you the error? Also, the patterns_dir here points to a different location than the one in your error log.

Just a tip on a better way of organizing the patterns: instead of matching against multiple patterns in your grok filter configuration, you could use a single pattern in the grok filter and combine those multiple patterns in the patterns file.

In the grok filter you showed, you are matching your message against seven patterns, five of which differ only in the final sub-pattern.

"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW305011}",
"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW305012}",
"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW302013_302014_302015_302016}",
"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW302020_302021}",
"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW106015}",
"%{SYSLOGTIMESTAMP:timestamp} %{IP:host_ip} %{DATA:program}(?:\[%{POSINT:id}\])?: Error processing log message:%{DATA:error_processing} %{NUMBER:msg_id} %{WORD:log_id} %{DATA:msg} src=%{IP:src_ip}:%{INT:src_port} dst=%{IP:dst_ip}:%{INT:dst_port} mac=%{DATA:mac} request: %{GREEDYDATA:request}",
"%{SYSLOGTIMESTAMP:timestamp} %{IP:host_ip} %{DATA:program}(?:\[%{POSINT:id}\])?: Error processing log message:%{DATA:error_processing} %{NUMBER:msg_id} %{WORD:log_id} %{DATA:msg} src=%{IP:src_ip} dst=%{IP:dst_ip} mac=%{DATA:mac} protocol=%{WORD:protocol} sport=%{INT:sport} dport=%{INT:dport}"

You could change the match line of your grok filter to the following:

match => { "message" => "%{CISCOFW}" }

And in your patterns file you would have the following:

CISCOFW %{CISCOFW01}|%{CISCOFW02}|%{CISCOFW03}|%{CISCOFW04}|%{CISCOFWXX}

CISCOFW01 FIRST PATTERN
CISCOFW02 SECOND PATTERN
CISCOFW03 THIRD PATTERN
CISCOFW04 FOURTH PATTERN
CISCOFWXX XX PATTERN

Using this approach, if you need to match a new message format, you only need to add the new pattern to the patterns file.
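To make the combination concrete, here is a minimal sketch of the same idea in plain Python regex terms (CISCOFW01/CISCOFW02 here are made-up stand-ins, not the real ASA patterns):

```python
import re

# Hypothetical, simplified versions of two sub-patterns (the real
# CISCOFW patterns are much longer).
CISCOFW01 = r"Teardown (?P<protocol>\w+) connection"
CISCOFW02 = r"Built (?P<direction>inbound|outbound) connection"

# Combine them the same way the suggested CISCOFW pattern does: a single
# top-level alternation, so the filter needs only one match entry.
CISCOFW = re.compile(f"(?:{CISCOFW01})|(?:{CISCOFW02})")

print(bool(CISCOFW.search("Teardown TCP connection")))    # True
print(bool(CISCOFW.search("Built outbound connection")))  # True
print(bool(CISCOFW.search("something else")))             # False
```

Whichever alternative matches, only its named captures are populated, which is exactly how the combined grok pattern behaves.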

I tried that, but then I got this error:

>     [2018-04-27T11:09:40,747][ERROR][logstash.pipeline        ] Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x15d42985 @id=\"836628343cd2cdb5d519f1fbc563f903941452d1-2\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x77867482 @metric=#<LogStash::Instrument::Metric:0x5e257a1d @collector=#<LogStash::Instrument::Collector:0x2a7fa615 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x1ed32ade @store=#<Concurrent::Map:0x00000000062050 entries=2 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x5a357fe2>, @fast_lookup=#<Concurrent::Map:0x00000000062054 entries=58 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"836628343cd2cdb5d519f1fbc563f903941452d1-2\", :events]>, @logger=#<LogStash::Logging::Logger:0x78ad57c6 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x46619276>>, @filter=<LogStash::Filters::Grok patterns_dir=>[\"E:\\\\\\\\logstash-5.5.0\\\\\\\\patterns.txt\"], match=>{\"message\"=>[\"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW305011}\", \"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW305012}\", \"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW302013_302014_302015_302016}\", \"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW302020_302021}\", \"%{SYSLOGTIMESTAMP:timestamp} %{IP:src_ip} %ASA-%{INT:severity_value}-%{INT:message_id}: %{CISCOFW106015}\", \"%{SYSLOGTIMESTAMP:timestamp} %{IP:host_ip} %{DATA:program}(?:\\\\[%{POSINT:id}\\\\])?: Error processing log message:%{DATA:error_processing} %{NUMBER:msg_id} %{WORD:log_id} %{DATA:msg} src=%{IP:src_ip}:%{INT:src_port} dst=%{IP:dst_ip}:%{INT:dst_port} mac=%{DATA:mac} request: %{GREEDYDATA:request}\", \"%{SYSLOGTIMESTAMP:timestamp} %{IP:host_ip} %{DATA:program}(?:\\\\[%{POSINT:id}\\\\])?: 
Error processing log message:%{DATA:error_processing} %{NUMBER:msg_id} %{WORD:log_id} %{DATA:msg} src=%{IP:src_ip} dst=%{IP:dst_ip} mac=%{DATA:mac} protocol=%{WORD:protocol} sport=%{INT:sport} dport=%{INT:dport}\"]}, id=>\"836628343cd2cdb5d519f1fbc563f903941452d1-2\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"invalid byte sequence in UTF-8"}
>     [2018-04-27T11:09:40,754][ERROR][logstash.agent           ] Pipeline aborted due to error {:exception=>#<ArgumentError: invalid byte sequence in UTF-8>, :backtrace=>["org/jruby/RubyRegexp.java:1657:in `=~'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:72:in `add_patterns_from_file'", "org/jruby/RubyIO.java:3565:in `each'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:70:in `add_patterns_from_file'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:409:in `add_patterns_from_files'", "org/jruby/RubyArray.java:1613:in `each'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:405:in `add_patterns_from_files'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:284:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:280:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "E:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:275:in `register'", "E:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:281:in `register_plugin'", "E:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "E:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in `register_plugins'", "E:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:302:in `start_workers'", "E:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:226:in `run'", "E:/logstash-5.5.0/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}

As the error message indicates, your patterns file isn't valid UTF-8. Start with an empty file and add the current contents back line by line; that will narrow down which line is the problem.
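A quicker alternative to rebuilding the file by hand is to scan it for the offending bytes. A small sketch (the file name is just an example):

```python
# Locate the lines in a patterns file that are not valid UTF-8, so you
# don't have to rebuild the file line by line.
def find_bad_lines(path):
    bad = []
    with open(path, "rb") as f:  # read raw bytes, no decoding
        for lineno, raw in enumerate(f, start=1):
            try:
                raw.decode("utf-8")
            except UnicodeDecodeError:
                bad.append(lineno)
    return bad

# Demo: line 2 contains a Latin-1 byte (0xE9), which is invalid UTF-8.
with open("patterns_demo.txt", "wb") as f:
    f.write(b"GOODPATTERN %{WORD:user}\n")
    f.write(b"BADPATTERN caf\xe9\n")

print(find_bad_lines("patterns_demo.txt"))  # [2]
```

Non-UTF-8 bytes often sneak in when a file is saved from a Windows editor in a legacy encoding such as Windows-1252; re-saving the file as UTF-8 (without BOM) usually fixes it.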

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.