How can I extract a directory name from the path below?
source path = "/var/log/companyname/servicename/jjds/smtgxyz/smtgabc.log"
From the above I need to get "servicename" as another field. Can someone please provide an example?
Like this?
/var/log/.*?/(?<logfolder>.*?)/
{
"logfolder": [
[
"servicename"
]
]
}
Exactly.
How do I extract the same from the existing field "source"?
Oh sorry. I forgot to mention grok.
filter {
grok {
match => { "source" => "/var/log/.*?/(?<logfolder>.*?)/" }
}
}
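If the path always has the same directory depth, the same field can also be pulled out without a regex using the dissect filter. This is just a sketch, assuming the path always starts with /var/log/&lt;company&gt;/&lt;service&gt;/ and that the field name "rest" (a throwaway capture for the remainder) is acceptable:

```
filter {
  dissect {
    # %{} is an empty (discarded) field, so this skips the first
    # directory after /var/log/ and captures the second as "logfolder";
    # %{rest} soaks up everything after it.
    mapping => { "source" => "/var/log/%{}/%{logfolder}/%{rest}" }
  }
}
```

Dissect is faster than grok for fixed layouts, but it breaks if the directory depth ever varies, so the grok pattern above is the safer choice for mixed paths.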
Please don't send me private messages. I would have answered your question anyway.
[2018-11-06T10:47:46,868][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#&lt;RegexpError: undefined group option: /\/var\/log\/.?\/(?.?)\//m>, :backtrace=>[
"org/jruby/RubyRegexp.java:928:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:127:in `compile'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'",
"org/jruby/RubyHash.java:1343:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:241:in `register_plugin'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `block in register_plugins'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `register_plugins'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:594:in `maybe_setup_out_plugins'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:262:in `start_workers'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:199:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:159:in `block in start'"], :thread=>"#&lt;Thread:0x2d1a4dc1 run>"}
[2018-11-06T10:47:46,890][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
I'm getting the error below after adding that:
[2018-11-06T10:50:11,840][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-11-06T10:50:12,279][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x5585ac92 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="02ebfe22ba3c411cc37239b7f5cbce01ca59f2a2682af87f0dd1132b12f453ae", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x37f07152>, @filter=<LogStash::Filters::Grok patterns_dir=>["/etc/logstash/conf.d/patterns"], match=>{"message"=>"%{LOGLEVEL:log_lvl}\\s+\\[%{TIMESTAMP_ISO8601:time_stamp}\\]\\s+\\[%{REQID:req_id}\\]\\s+%{JAVACLASS:class_name}:\\s+%{JAVALOGMESSAGE:log_msg}\\s+%{services:services}", "source"=>"/var/log/.?/(?.?)/"}, id=>"02ebfe22ba3c411cc37239b7f5cbce01ca59f2a2682af87f0dd1132b12f453ae", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>"*", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>["_grokparsefailure"], timeout_millis=>30000, tag_on_timeout=>"_groktimeout">>", :error=>"undefined group option: /\/var\/log\/.?\/(?.?)\//m", :thread=>"#<Thread:0x1e1d573c run>"}
[2018-11-06T10:50:12,288][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#&lt;RegexpError: undefined group option: /\/var\/log\/.?\/(?.?)\//m>, :backtrace=>[
"org/jruby/RubyRegexp.java:928:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:127:in `compile'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'",
"org/jruby/RubyHash.java:1343:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:241:in `register_plugin'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `block in register_plugins'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:252:in `register_plugins'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:594:in `maybe_setup_out_plugins'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:262:in `start_workers'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:199:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:159:in `block in start'"], :thread=>"#&lt;Thread:0x1e1d573c run>"}
[2018-11-06T10:50:12,318][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
I don't really know what's wrong.
My little test works fine:
input {
stdin {}
}
filter {
grok {
match => { "message" => "/var/log/.*?/(?<logfolder>.*?)/" }
}
}
output {
stdout { codec => rubydebug }
}
The stdin plugin is now waiting for input:
/var/log/companyname/servicename/jjds/smtgxyz/smtgabc.log
{
"@version" => "1",
"message" => "/var/log/companyname/servicename/jjds/smtgxyz/smtgabc.log",
"@timestamp" => 2018-11-06T10:54:51.182Z,
"logfolder" => "servicename",
"host" => "elasticsearch-vm"
}
What does your configuration look like?
input {
beats {
port => 5044
host => "0.0.0.0"
}
}
filter {
grok {
patterns_dir => ["/etc/logstash/conf.d/patterns"]
match => { "message" => "%{LOGLEVEL:log_lvl}\s+\[%{TIMESTAMP_ISO8601:time_stamp}\]\s+\[%{REQID:req_id}\]\s+%{JAVACLASS:class_name}:\s+%{JAVALOGMESSAGE:log_msg}\s+%{services:services}"}
match => { "servicename" => "/var/log/.?/(?.?)/" }
}
mutate {
remove_field => ["message"]
#remove_field => ["beat"]
remove_field => ["[host]"]
}
mutate {
add_field => {
"host" => "%{[beat][hostname]}"
}
}
}
output {
elasticsearch {
user => "elastic"
password => "changeme"
hosts => "localhost:9200"
manage_template => false
index => "logstash-%{+YYYY.MM.dd}"
document_type => "%{[@metadata][type]}"
}
}
This isn't showing any data for "logfolder"!
For a start you shouldn't define the same option twice in one plugin. That won't work.
Furthermore: Is that really your config?
That doesn't look like my suggestion at all...
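One way to run both patterns (a sketch, assuming the path extraction should run regardless of whether the message pattern matched) is to use two separate grok filters instead of defining `match` twice in one:

```
filter {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => { "message" => "%{LOGLEVEL:log_lvl}\s+\[%{TIMESTAMP_ISO8601:time_stamp}\]\s+\[%{REQID:req_id}\]\s+%{JAVACLASS:class_name}:\s+%{JAVALOGMESSAGE:log_msg}\s+%{services:services}" }
  }
  # A second grok so the path extraction runs independently of the first match
  grok {
    match => { "source" => "/var/log/.*?/(?<logfolder>.*?)/" }
  }
}
```

Putting both patterns into a single `match` hash also works syntactically, but with the default `break_on_match => true` grok stops after the first field that matches, so separate filters are the safer way to guarantee both captures happen.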
input {
beats {
port => 5044
host => "0.0.0.0"
}
}
filter {
grok {
patterns_dir => ["/etc/logstash/conf.d/patterns"]
match => { "message" => "%{LOGLEVEL:log_lvl}\s+\[%{TIMESTAMP_ISO8601:time_stamp}\]\s+\[%{REQID:req_id}\]\s+%{JAVACLASS:class_name}:\s+%{JAVALOGMESSAGE:log_msg}\s+%{services:services}"}
match => { "source" => "/var/log/.*?/(?<logfolder>.*?)/" }
}
mutate {
remove_field => ["message"]
#remove_field => ["beat"]
remove_field => ["[host]"]
}
mutate {
add_field => {
"host" => "%{[beat][hostname]}"
"servicename" => "%{[logfolder]}"
}
}
}
output {
elasticsearch {
user => "elastic"
password => "changeme"
hosts => "localhost:9200"
manage_template => false
index => "logstash-%{+YYYY.MM.dd}"
document_type => "%{[@metadata][type]}"
}
}
This is how my config file looks.
Basically we are already getting our data into ELK, but we also want the service name from the source path, so we thought of taking it from the existing field "source" using a regex.
We want to get the information from the existing Logstash field, which is the "source" field.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.