Logstash dissect plugin doesn't parse correctly with empty fields in content

I use the dissect plugin with Logstash to parse logs that have empty fields, but it doesn't match the fields correctly. The following are my log pattern, my Logstash config file, and the Logstash output:
log pattern:
2017-10-24|||500|
logstash config file:

input {
  beats {
    port => 5044
  }
}

filter {
   ruby {
        code => "
            event.timestamp.time.localtime
            tstamp = event.get('@timestamp').to_i
            event.set('date_str', Time.at(tstamp).strftime('%Y-%m-%d'))
        "
   }

   dissect {
          mapping => {
              "message" => "%{time_local}|%{server_ip}|%{request}|%{status_code}|%{upstrame}"
          }
  }
}
output {
    stdout { codec => rubydebug }
}

logstash parse result:

{
       "date_str" => "2017-10-23",
        "request" => "",
    "status_code" => "",
       "business" => "nginx",
         "offset" => 217,
     "input_type" => "log",
     "time_local" => "2017-10-24",
         "source" => "/home/ec2-user/filebeat/test.log",
        "message" => "2017-10-24|||500|",
           "type" => "access",
           "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
     "@timestamp" => 2017-10-24T04:00:22.130Z,
       "@version" => "1",
           "beat" => {
        "hostname" => "awsuw7-50.opi.com",
            "name" => "awsuw7-50.opi.com",
         "version" => "5.5.1"
    },
           "host" => "awsuw7-50.opi.com",
      "server_ip" => "500",
       "upstrame" => ""
}

So, as we have seen, the parse result is not correct: the server_ip field should be "", the request field should be "", and the status_code field should be 500, but that is not what I get. What's the reason, and how can I fix this problem?
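To make the expected behavior concrete, here is a minimal sketch in plain Ruby (outside Logstash) of the split the dissect mapping should reproduce: adjacent delimiters yield empty fields, and a trailing delimiter yields a trailing empty field.

```ruby
line = "2017-10-24|||500|"

# A negative limit tells String#split to keep trailing empty strings,
# so every "|" boundary produces a field, even an empty one.
fields = line.split("|", -1)

# Assign the fields in the same order as the dissect mapping.
time_local, server_ip, request, status_code, upstrame = fields
# expected: time_local == "2017-10-24", server_ip == "",
#           request == "", status_code == "500", upstrame == ""
```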

  • Version: 5.5.1
  • Operating System: CentOS 7

This looks like a bug. As a workaround use a csv filter instead.
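A minimal sketch of that workaround, assuming the same field names as the dissect mapping above (`separator` and `columns` are standard csv-filter options):

```
filter {
  csv {
    separator => "|"
    columns => ["time_local", "server_ip", "request", "status_code", "upstrame"]
  }
}
```

Unlike dissect, the csv filter handles consecutive separators as empty columns, which is exactly the case that triggers this bug.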

Thanks for your reply. 5.5.1 is a relatively new version, so I was wondering if there was a problem with how I was using it. :grinning:

If I use the csv filter, can I still write to Elasticsearch?

Yes, of course.

This bug has been fixed in version 1.1.1 of the dissect plugin: GitHub - logstash-plugins/logstash-filter-dissect at v1.1.1
Changelog: https://github.com/logstash-plugins/logstash-filter-dissect/blob/v1.1.1/CHANGELOG.md
But:
I wanted to install the new version 1.1.1 of dissect and followed the instructions in README.md, but that failed; when I start Logstash it reports the following ERROR:

[2017-11-01T21:17:18,053][ERROR][logstash.plugins.registry] Problems loading a plugin with {:type=>"filter", :name=>"dissect", :path=>"logstash/filters/dissect", :error_message=>"\n\n\tyou might need to reinstall the gem which depends on the missing jar or in case there is Jars.lock then resolve the jars with lock_jars command\n\nno such file to load -- org/logstash/dissect/jruby-dissect-library/1.1.1/jruby-dissect-library-1.1.1 (LoadError)", :error_class=>RuntimeError, :error_backtrace=>["/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/jar-dependencies-0.3.11/lib/jar_dependencies.rb:348:in `do_require'", "/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/jar-dependencies-0.3.11/lib/jar_dependencies.rb:255:in `require_jar'", "/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/jar-dependencies-0.3.11/lib/jar_dependencies.rb:0:in `require_jar_with_block'", "/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/jar-dependencies-0.3.11/lib/jar_dependencies.rb:254:in `require_jar'", "/home/web/logstash-5.5.1/lib/bootstrap/patches/jar_dependencies.rb:6:in `require_jar'", "/home/web/logstash-filter-dissect/lib/jruby-dissect-library_jars.rb:4:in `(root)'", "org/jruby/RubyKernel.java:1040:in `require'", "/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'", "/home/web/logstash-filter-dissect/lib/logstash/filters/dissect.rb:1:in `(root)'", "org/jruby/RubyKernel.java:1040:in `require'", "/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'", "/home/web/logstash-filter-dissect/lib/logstash/filters/dissect.rb:6:in `(root)'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/plugins/registry.rb:1:in `(root)'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/plugins/registry.rb:156:in `legacy_lookup'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/plugins/registry.rb:138:in `lookup'", 
"/home/web/logstash-5.5.1/logstash-core/lib/logstash/plugins/registry.rb:180:in `lookup_pipeline_plugin'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/plugin.rb:140:in `lookup'", "org/jruby/RubyKernel.java:1079:in `eval'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/pipeline.rb:100:in `plugin'", "(eval):37:in `initialize'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/pipeline.rb:72:in `initialize'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/pipeline.rb:156:in `initialize'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/agent.rb:286:in `create_pipeline'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/agent.rb:95:in `register_pipeline'", "/home/web/logstash-5.5.1/logstash-core/lib/logstash/runner.rb:314:in `execute'", "/home/web/logstash-5.5.1/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/home/web/logstash-5.5.1/lib/bootstrap/environment.rb:71:in `(root)'"]}

The following are my installation steps:


Related Information

  • Logstash Version: 5.5.1
  • Operating System: CentOS 7
  • Part of Gemfile:

    gem "logstash-filter-dissect", :path => "/home/web/logstash-filter-dissect"
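The "no such file to load -- org/logstash/dissect/jruby-dissect-library" error suggests the plugin's bundled jar was not resolved when building from a local `:path` checkout. As an alternative, a sketch of installing the published gem with the plugin manager (the `--version` flag selects a specific release), which packages the jar for you:

```
bin/logstash-plugin install --version 1.1.1 logstash-filter-dissect
bin/logstash-plugin list --verbose logstash-filter-dissect
```

This avoids the Gemfile `:path` entry entirely; remove that line before reinstalling.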

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.