Logstash / WebLogic / Filebeat: issues while starting Logstash

Hi,

I am trying to configure the ELK stack with WebLogic logs, and I am running into a plugin registration issue. When I run the config test it reports that the configuration looks good, but when I actually start Logstash it fails to start.

./logstash --config.test_and_exit -f weblogic-logstash-pipeline.conf
Sending Logstash logs to /u01/logstash/logs which is now configured via log4j2.properties
[2018-10-25T12:00:21,724][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
Configuration OK
[2018-10-25T12:00:31,875][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

Error:

./logstash -f weblogic-logstash-pipeline.conf --config.reload.automatic

Sending Logstash logs to /u01/logstash/logs which is now configured via log4j2.properties
[2018-10-25T12:05:36,732][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-10-25T12:05:37,432][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
[2018-10-25T12:05:48,513][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-10-25T12:05:49,148][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-10-25T12:05:49,164][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-10-25T12:05:49,401][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-10-25T12:05:49,464][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-25T12:05:49,474][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-10-25T12:05:49,515][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-10-25T12:05:49,547][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-25T12:05:49,587][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-25T12:05:49,771][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x5752e034 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="4f5f2bd9cdf0a0a1e22d07068c0299b6e3979ed0b8c38b53ea1baf82bad2d9a3", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x358cd7cd>, @filter=<LogStash::Filters::Grok patterns_dir=>["./patterns"], match=>{"message"=>"####<%{WLS_SERVERLOG_DATE:wls_timestamp}%{SPACE}%{DATA:wls_timezone}>%{SPACE}<%{LOGLEVEL:wls_level}>%{SPACE}<%{DATA:wls_subsystem}>%{SPACE}<%{DATA:wls_host}>%{SPACE}<%{DATA:wls_server}>%{SPACE}<%{DATA:wls_thread}>%{SPACE}<([<>a-zA-Z ]*)>%{SPACE}<%{DATA:wls_transactionid}>%{SPACE}<%{DATA:wls_diagcontid}>%{SPACE}<%{DATA:wls_rawtime}>%{SPACE}<%{DATA:wls_code}>%{SPACE}<%{GREEDYDATA:wls_message}"}, id=>"4f5f2bd9cdf0a0a1e22d07068c0299b6e3979ed0b8c38b53ea1baf82bad2d9a3", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"pattern %{WLS_SERVERLOG_DATE:wls_timestamp} not defined", :thread=>"#<Thread:0x52ac15d2 run>"}
[2018-10-25T12:05:49,777][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{WLS_SERVERLOG_DATE:wls_timestamp} not defined>, :backtrace=>["/u01/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/u01/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:595:in `maybe_setup_out_plugins'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "/u01/logstash/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x52ac15d2 run>"}
[2018-10-25T12:05:49,797][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[2018-10-25T12:05:50,129][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Config file:
input {
  beats {
    port => "5400"
    ssl => true
    ssl_certificate_authorities => ["/u01/SSL/elk-ssl.crt"]
    ssl_certificate => "/u01/SSL/elk-ssl.crt"
    ssl_key => "/u01/SSL/elk-ssl.key"
    ssl_verify_mode => "force_peer"
  }
}

filter {

  if "weblogic-server-log" in [tags] {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "####<%{WLS_SERVERLOG_DATE:wls_timestamp}%{SPACE}%{DATA:wls_timezone}>%{SPACE}<%{LOGLEVEL:wls_level}>%{SPACE}<%{DATA:wls_subsystem}>%{SPACE}<%{DATA:wls_host}>%{SPACE}<%{DATA:wls_server}>%{SPACE}<%{DATA:wls_thread}>%{SPACE}<([<>a-zA-Z ]*)>%{SPACE}<%{DATA:wls_transactionid}>%{SPACE}<%{DATA:wls_diagcontid}>%{SPACE}<%{DATA:wls_rawtime}>%{SPACE}<%{DATA:wls_code}>%{SPACE}<%{GREEDYDATA:wls_message}" ]
    }
    # CEST does not exist in Joda-Time, changed to CET
    translate {
      field => 'wls_timezone'
      destination => 'wls_timezone'
      fallback => '%{wls_timezone}'
      override => "true"
      dictionary => [
        'CEST', 'CET'
      ]
    }
    date {
      match => [ "wls_timestamp", "dd-MMM-yyyy HH'H'mm'''" ]
      locale => "es-ES"
      timezone => "%{wls_timezone}"
      target => "wls_timestamp"
    }
    mutate {
      remove_field => [ 'wls_timezone', 'message' ]
    }
  }

  if "weblogic-access-log" in [tags] {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "%{ACCESSDATE:acc_date}\s+%{TIME:acc_time}\s+%{WORD:acc_verb}\s+%{DATA:acc_transactionid}\s+%{DATA:acc_num}\s+%{URIPATHPARAM:acc_uri}\s+%{NUMBER:acc_status}\s+%{NUMBER:acc_response_time}" ]
    }
    mutate {
      replace => ['acc_timestamp', '%{acc_date} %{acc_time}']
    }
    date {
      match => [ "acc_timestamp", "yyyy-MM-dd HH:mm:ss" ]
      target => "acc_timestamp"
      timezone => "UTC"
    }
    mutate {
      remove_field => [ 'acc_date', 'acc_time', 'message' ]
    }
  }

  if "weblogic-diagnostic-log" in [tags] {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "[%{TIMESTAMP_ISO8601:diag_timestamp}]%{SPACE}[%{WORDNOBRACKET:diag_server}]%{SPACE}[%{WORDNOBRACKET:diag_msgType}]%{SPACE}[%{WORDNOBRACKET:diag_add}]%{SPACE}[%{WORDNOBRACKET:diag_compId}]%{SPACE}[%{DATA:diag_threadId}]%{SPACE}[%{WORDNOBRACKET:diag_userId}]%{SPACE}[%{WORDNOBRACKET:diag_ecid}]%{SPACE}[%{WORDNOBRACKET:diag_suppleAttr}]%{SPACE}%{GREEDYDATA:diag_msgText}" ]
    }
    date {
      match => [ "diag_timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
      target => "diag_timestamp"
    }
    mutate {
      remove_field => ['message']
    }
  }

  if "weblogic-stdout-log" in [tags] {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "<%{WLS_SERVERLOG_DATE:out_timestamp}%{SPACE}%{DATA:out_timezone}>%{SPACE}<%{LOGLEVEL:out_level}>%{SPACE}<%{DATA:out_subsystem}>%{SPACE}<%{DATA:wls_code}>%{SPACE}<%{GREEDYDATA:out_message}" ]
    }
    # CEST does not exist in Joda-Time, changed to CET
    translate {
      field => 'out_timezone'
      destination => 'out_timezone'
      fallback => '%{out_timezone}'
      override => "true"
      dictionary => [
        'CEST', 'CET'
      ]
    }
    date {
      match => [ "out_timestamp", "dd-MMM-yyyy HH'H'mm'''" ]
      locale => "es-ES"
      timezone => "%{out_timezone}"
      target => "out_timestamp"
    }
    mutate {
      remove_field => [ 'out_timezone', 'message' ]
    }
  }

  if "weblogic-gc-log" in [tags] {
    grok {
      patterns_dir => "./patterns"
      match => [ "message", "%{TIMESTAMP_ISO8601:gc_timestamp}:%{SPACE}%{FLOAT:elapsed_time}:.*\[%{WORDNOBRACKET:gc_type}%{SPACE}\(%{WORDNOBRACKET:cause}\)%{SPACE}\[%{WORDNOBRACKET:gc_name}:%{SPACE}%{MEM:young_mem_before_gc}->%{MEM:young_mem_after_gc}\(%{MEM:young_mem_total}\)\]%{SPACE}\[%{WORDNOBRACKET:old_gc_name}:%{SPACE}%{MEM:old_mem_before_gc}->%{MEM:old_mem_after_gc}\(%{MEM:old_mem_total}\)\]%{SPACE}%{MEM:heap_mem_before_gc}->%{MEM:heap_mem_after_gc}\(%{MEM:heap_mem_total}\),%{SPACE}\[%{WORDNOBRACKET:meta_gc_name}:%{SPACE}%{MEM:meta_mem_before_gc}->%{MEM:meta_mem_after_gc}\(%{MEM:meta_mem_total}\)\],%{SPACE}%{FLOAT:pause}%{SPACE}%{WORDNOBRACKET:pause_time_type}\].*user\=%{FLOAT:user_time}%{SPACE}sys\=%{FLOAT:sys_time},%{SPACE}real\=%{FLOAT:real_time}" ]

      match => [ "message", "%{TIMESTAMP_ISO8601:gc_timestamp}:%{SPACE}%{FLOAT:elapsed_time}:.*\[%{WORDNOBRACKET:gc_type}%{SPACE}\(%{WORDNOBRACKET:cause}\)%{SPACE}\[%{WORDNOBRACKET:gc_name}:%{SPACE}%{MEM:young_mem_before_gc}->%{MEM:young_mem_after_gc}\(%{MEM:young_mem_total}\)\]%{SPACE}%{MEM:heap_mem_before_gc}->%{MEM:heap_mem_after_gc}\(%{MEM:heap_mem_total}\),%{SPACE}%{FLOAT:pause}%{SPACE}%{WORDNOBRACKET:pause_time_type}\].*user\=%{FLOAT:user_time}%{SPACE}sys\=%{FLOAT:sys_time},%{SPACE}real\=%{FLOAT:real_time}" ]
    }
    date {
      match => [ "gc_timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
      target => "gc_timestamp"
    }
    mutate {
      remove_field => ['message']
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
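The decisive line in the log is :error=>"pattern %{WLS_SERVERLOG_DATE:wls_timestamp} not defined". WLS_SERVERLOG_DATE, WORDNOBRACKET, ACCESSDATE and MEM are not patterns shipped with Logstash, so grok must be able to load them from a file inside patterns_dir. Note that patterns_dir => "./patterns" is resolved relative to the directory Logstash is started from, not relative to the config file, so an absolute path is safer. For reference, a minimal sketch of what such a patterns file needs to contain (the file name and every regex below are illustrative assumptions and must be adapted to the actual log format):

```
# ./patterns/weblogic -- custom grok pattern definitions (sketch; regexes are assumptions)
# WebLogic server-log timestamp, later parsed by date { "dd-MMM-yyyy HH'H'mm'''" }
WLS_SERVERLOG_DATE %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}'H'%{MINUTE}''
# a token containing no square brackets, for the bracket-delimited diagnostic fields
WORDNOBRACKET [^\[\]]+
# access-log date, later parsed as "yyyy-MM-dd"
ACCESSDATE %{YEAR}-%{MONTHNUM}-%{MONTHDAY}
# a JVM memory figure such as 512K or 1024M
MEM %{NUMBER}[KMG]?
```

If grok cannot find (or cannot read) a file defining these names under patterns_dir, registration fails with exactly the "pattern ... not defined" error shown above, even though --config.test_and_exit reports the configuration as OK, because the syntax check does not compile the grok patterns.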

I installed the translate filter plugin as a prerequisite for the filtering above. Please review and suggest what I am doing wrong.

Install the logstash-filter-translate plugin:

logstash-plugin install logstash-filter-translate

Please help.

Closing this issue; I resolved it myself.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.