Logstash filter for JBoss Fuse 6.2.1

I've tried to create a Logstash filter that can consume my JBoss Fuse log file.

My filter works and I get data into Kibana, but I was wondering if it is possible to split the message into:
- Date
- Message type
- Log subsystem
- Route
- Java class
- Message text

Example from fuse.log:

2018-03-15 06:04:18,795 | INFO | ../../ftp/input | route105 ID-domain.com-227-10-27620 | com.domain.JavaClassImpl | Some message from Java

2018-03-15 15:27:35,464 | WARN | ActiveMQ Task-1 | | org.apache.activemq.transport.failover.FailoverTransport | Failed to connect to [tcp://localhost:61616] after: 10 attempt(s) continuing to retry.

My filter:

input {
        beats {
                port => "5044"
        }
}
filter {

   grok {
     match =>
        {
          "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:level}\s+\[%{DATA:className}\]%{SPACE}%{GREEDYDATA:message}"
        }
   }

}
output {
        elasticsearch {
                hosts => ["localhost:9200"]
                index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
                document_type => "%{[metadata][type]}"
        }
}

What I see in Kibana:

host:
    selite
tags:
    beats_input_codec_plain_applied, _grokparsefailure
@timestamp:
    March 15th 2018, 15:27:36.381
source:
    /usr/lib/jboss-fuse/data/log/fuse.log
@version:
    1
message:
    2018-03-15 15:27:35,464 | WARN | ActiveMQ Task-1 | | org.apache.activemq.transport.failover.FailoverTransport | Failed to connect to [tcp://localhost:61616] after: 10 attempt(s) continuing to retry.
offset:
    76,910
beat.hostname:
    selite
beat.name:
    selite
beat.version:
    6.2.2
prospector.type:
    log
_id:
    hEgPKmIBNzHwb5UAITuV
_type:
    %{[metadata][type]}
_index:
    filebeat-2018.03.15
_score:
    -

The _grokparsefailure tag indicates that your grok pattern failed to parse the input. You can use the Grok Constructor to test an expression against a number of log lines.
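If you want to stay with grok, the pattern needs to key on the pipe delimiters rather than square brackets. Something along these lines should match both sample lines, including the one with an empty route field (an untested sketch; the field names are illustrative):

grok {
  match => {
    "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}\|%{SPACE}%{LOGLEVEL:level}%{SPACE}\|%{SPACE}%{DATA:subsystem}%{SPACE}\|%{SPACE}%{DATA:route}%{SPACE}\|%{SPACE}%{DATA:class}%{SPACE}\|%{SPACE}%{GREEDYDATA:logmessage}"
  }
}

That said, grok is regex-based and overkill for fixed delimiters, which is why I'd suggest the following instead.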


Since your data is well-formatted and consistently delimited, you may have better luck with logstash-filter-dissect, which splits on delimiters under the hood and is much easier to reason about, because you don't have to worry about the exact format of each field:

# ...
filter {
  dissect {
    mapping => {
      "message" => "%{timestamp} | %{level} | %{subsystem} | %{route} | %{class} | %{message}"
    }
  }
  date {
    match => [ "timestamp", "ISO8601" ]
  }
}

EDIT: original was missing => in date filter
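Note that the timestamps in fuse.log use a space separator and comma-delimited milliseconds (2018-03-15 06:04:18,795). If the ISO8601 shorthand in the date filter doesn't match that form, you can supply an explicit Joda-style pattern instead (untested sketch):

date {
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
}

Events that fail to parse will be tagged _dateparsefailure, which is an easy way to check in Kibana whether the shorthand worked.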

Thank you yaauie.

Updated filter:

input {
        beats {
                port => "5044"
        }
}
filter {
  dissect {
    mapping => {
      "message" => "%{timestamp} | %{level} | %{subsystem} | %{route} | %{class} | %{message}"
    }
  }
  date {
    match [ "timestamp", "ISO8601" ]
  }
}
output {
        elasticsearch {
                hosts => ["localhost:9200"]
                index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
                document_type => "%{[metadata][type]}"
        }
}

I tried to use that filter, but I got an error on startup:

sindre@selite:~$ tail -f /var/log/logstash/logstash-plain.log
[2018-03-16T10:34:52,532][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-16T10:34:52,665][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.2"}
[2018-03-16T10:34:52,821][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-16T10:34:52,857][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 13, column 11 (byte 205) after filter {\n  dissect {\n    mapping => {\n      \"message\" => \"%{timestamp} | %{level} | %{subsystem} | %{route} | %{class} | %{message}\"\n    }\n  }\n  date {\n    match ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", 
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

I found out that the filter was missing => in the date filter's match option.
Now it works!

Derp. I've updated my previous comment to be correct.
