Grok pattern for custom application log

Hi Team,

I have an error log as shown below. Can you suggest a grok pattern for it?

20180209 00:00:08,696 ERROR WebContainer : 34989 dao.OrderServiceDAOImpl Acct_Nr=724377 Imp_Acct_Nr= Exception in the method checkCampaignForPUPorg.springframework.dao.EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0

I tried:
%{TIMESTAMP_ISO8601:timestamp}*

but the output comes back as timestamp: null.

Can you please suggest a solution for this?

TIA

The date field in your log doesn't match the TIMESTAMP_ISO8601 pattern: that pattern expects dashes between the date parts (yyyy-MM-dd), while your log uses yyyyMMdd.
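If you just want that timestamp in a single field, one option (a minimal sketch, not tested against your full log line) is to combine the built-in date sub-patterns inside an Oniguruma named capture, which grok supports inline without a patterns file:

```
filter {
  grok {
    # yyyyMMdd HH:mm:ss,SSS — %{TIME} expands to %{HOUR}:%{MINUTE}:%{SECOND},
    # and %{SECOND} also accepts the ",696" fractional part
    match => { "message" => "^(?<timestamp>%{YEAR}%{MONTHNUM}%{MONTHDAY} %{TIME})" }
  }
}
```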

Hi Sezgin,

Thanks for the info.

This is my error log:
20180209 00:00:08,696 ERROR WebContainer : 34989 dao.OrderServiceDAOImpl Acct_Nr=724377 Imp_Acct_Nr= Exception in the method checkCampaignForPUPorg.springframework.dao.EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0

I have used grok pattern as
%{YEAR}%{MONTHNUM}%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{WORD:file} : %{NUMBER:linenumber}%{SPACE} %{WORD}.%{WORD}%{SPACE}%{DATA:account}=%{NUMBER:acct_nr}%{SPACE}%{DATA:errordet}=%{GREEDYDATA:log}

I am getting each field I mentioned above as a separate output, but I want the %{YEAR}%{MONTHNUM}%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} portion captured as a single field, perhaps named time or timestamp.

Can you suggest how to add that custom pattern in the Logstash config or a patterns file? I'm not familiar with the patterns-file concept, so any guidance would help.

TIA

Hi,

You can check the link below on creating custom patterns.

https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#_custom_patterns

Add the line below to a file and save that file in any folder. Then set the patterns_dir value to that folder's path.

MYDATEPATTERN %{YEAR}%{MONTHNUM}%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}

A sample grok filter using the custom pattern:

filter {
  grok {
    patterns_dir => ["C:\development\elk\logstash\custom_pattern"]
    match => { "message" => "^%{MYDATEPATTERN:my_date}" }
  }
}
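If you also want that captured date to drive the event's @timestamp, you can follow the grok with a date filter. A sketch (the my_date field name comes from the grok above; the Joda format string yyyyMMdd HH:mm:ss,SSS matches the sample log line):

```
filter {
  date {
    # parse my_date and write the result to @timestamp (the default target)
    match => ["my_date", "yyyyMMdd HH:mm:ss,SSS"]
  }
}
```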

Hi,

I tried as you mentioned, but I am facing a grok pattern error; it's not picking up the custom patterns.

Code:
input {
  file {
    path => "D:\logstash-6.2.4\bin\webe.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    patterns_dir => ["D:\logstash-6.2.4\bin\custom_pattern"]
    match => { "message" => "%{DATE:timestamp}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{LINENUMBER:file}%{SPACE} %{WORD}.%{WORD}%{SPACE}%{DATA:account}=%{NUMBER:acct_nr}%{SPACE}%{DATA:errordet}=%{GREEDYDATA:log}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs_parameter_index"
  }
  stdout { codec => rubydebug }
}

Error:
[2018-06-04T11:36:17,780][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x39d6bae9 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="9f6b1fb0721974fe26ef58f6626709fdb2832c6b5f081a6fcac6ec7e8b0e329b", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x2c183e59 @metric=#<LogStash::Instrument::Metric:0x23a3ad60 @collector=#<LogStash::Instrument::Collector:0x148ca8e7 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x1881596d @store=#<Concurrent::map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#Mutex:0x3a4a6fa6, @fast_lookup=#<Concurrent::map:0x00000000000fb8 entries=63 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"9f6b1fb0721974fe26ef58f6626709fdb2832c6b5f081a6fcac6ec7e8b0e329b", :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>["D:\\avoN\\logstash-6.2.4\\bin\\custom_pattern"], match=>{"message"=>"^%{AVONDATE:timestamp}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{LINENUMBER:file}%{SPACE} %{WORD}.%{WORD}%{SPACE}%{DATA:account}=%{NUMBER:acct_nr}%{SPACE}%{DATA:errordet}=%{GREEDYDATA:log}"}, id=>"9f6b1fb0721974fe26ef58f6626709fdb2832c6b5f081a6fcac6ec7e8b0e329b", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{DATE:timestamp} not defined", :thread=>"#<Thread:0x6f6570b5 run>"}
[2018-06-04T11:36:17,795][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{AVONDATE:timestamp} not defined>, :backtrace=>["D:/avoN/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "D:/avoN/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "D:/avoN/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "D:/avoN/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "D:/avoN/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:342:in `register_plugin'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:353:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:353:in `register_plugins'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:731:in `maybe_setup_out_plugins'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:363:in `start_workers'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:290:in `run'", "D:/avoN/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:250:in `block in start'"], :thread=>"#<Thread:0x6f6570b5 run>"}
[2018-06-04T11:36:17,842][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

Hi,

I added your grok patterns to my pattern file and they compiled without problems.

patterns_dir should be set to the directory containing the custom pattern files, not the absolute path of a file.

[2018-06-04T11:08:46,070][DEBUG][logstash.filters.grok    ] Adding pattern {"SQUID3"=>"%{NUMBER:timestamp}\\s+%{NUMBER:duration}\\s%{IP:client_address}\\s%{WORD:cache_result}/%{POSINT:status_code}\\s%{NUMBER:bytes}\\s%{WORD:request_method}\\s%{NOTSPACE:url}\\s(%{NOTSPACE:user}|-)\\s%{WORD:hierarchy_code}/%{IPORHOST:server}\\s%{NOTSPACE:content_type}"}
[2018-06-04T11:08:46,075][DEBUG][logstash.filters.grok    ] Adding pattern {"MYDATEPATTERN"=>"%{YEAR}%{MONTHNUM}%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}"}
[2018-06-04T11:08:46,075][DEBUG][logstash.filters.grok    ] Adding pattern {"AVONDATE"=>"%{YEAR}%{MONTHNUM}%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}"}
[2018-06-04T11:08:46,076][DEBUG][logstash.filters.grok    ] Adding pattern {"LINENUMBER"=>"%{WORD} : %{NUMBER}"}
[2018-06-04T11:08:46,076][DEBUG][logstash.filters.grok    ] Adding pattern {"FILENAME"=>"%{WORD}.%{WORD}"}
[2018-06-04T11:08:46,076][DEBUG][logstash.filters.grok    ] Adding pattern {"ACCOUNT"=>"%{DATA}=%{NUMBER}"}

Hi,

I changed the patterns path from the file to the directory and I'm not facing that issue any more, but I don't see any logs flowing through Logstash.

[2018-06-04T14:39:30,095][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-06-04T14:39:31,340][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x641ecfe0 run>"}
[2018-06-04T14:39:31,517][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

After this line it doesn't go any further. Do you think it's working fine? Previously, whenever I ran Logstash I could see on screen my logs going to Elasticsearch, parsed according to the parameters defined. Can you suggest anything?

Everything looks OK. If your logs are stored in an ES index, your pipeline is working fine.
To see detailed Logstash logs, change the log level to debug in the logstash.yml file.
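For reference, that is a one-line change in logstash.yml (log.level is the documented setting name):

```
# logstash.yml
log.level: debug
```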

It doesn't create an index in Elasticsearch either. What could cause Logstash to get stuck after
[2018-06-04T14:39:31,517][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
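One common cause of this symptom (not confirmed in this thread, since it ends with just "Solved") is the file input's sincedb: Logstash records how far it has read each file, so a file that has already been read once will not be re-read on the next run, even with start_position => "beginning". A sketch of an input that forces a fresh read on Windows:

```
input {
  file {
    # forward slashes are safer for the file input's path on Windows
    path => "D:/logstash-6.2.4/bin/webe.log"
    start_position => "beginning"
    # "NUL" (the Windows equivalent of /dev/null) discards the recorded
    # read position, so the file is re-read from the start on every run
    sincedb_path => "NUL"
  }
}
```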

Solved.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.