Logstash filter (extractnumbers) install does not work


(Caspar Lourens) #1

Installed ELK 5.0.2 (Build 14476, Commit SHA 8f2ace74) on a clean server. Logstash is accepting data and Elasticsearch is showing its data. Now I want to reuse the config I already built in an older version. It needs the extractnumbers filter (logstash-filter-extractnumbers) for Logstash. Unfortunately, installing the filter is not as straightforward as the Elastic doc page makes it seem... I did not find any other people having issues with this filter. Can somebody please have a look at this and solve it? It seems the dependencies are wrong somewhere.

[root@ELK501 conf.d]# /usr/share/logstash/bin/logstash-plugin install logstash-filter-extractnumbers
Validating logstash-filter-extractnumbers
Installing logstash-filter-extractnumbers
Plugin version conflict, aborting
ERROR: Installation Aborted, message: Bundler could not find compatible versions for gem "logstash-core-plugin-api":
  In snapshot (Gemfile.lock):
    logstash-core-plugin-api (= 2.1.16)

  In Gemfile:
    logstash-devutils (~> 1.1) java depends on
      logstash-core-plugin-api (~> 2.0) java

    logstash-filter-extractnumbers (>= 0) java depends on
      logstash-core-plugin-api (~> 1.0) java

    logstash-core-plugin-api (>= 0) java

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash-core":
  In snapshot (Gemfile.lock):
    logstash-core (= 5.0.2)

  In Gemfile:
    logstash-core-plugin-api (>= 0) java depends on
      logstash-core (= 5.0.2) java

    logstash-filter-extractnumbers (>= 0) java depends on
      logstash-core (< 2.0.0, >= 1.4.0) java

    logstash-core (>= 0) java

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash":
  In Gemfile:
    logstash-filter-extractnumbers (>= 0) java depends on
      logstash (< 2.0.0, >= 1.4.0) java
Could not find gem 'logstash (< 2.0.0, >= 1.4.0) java', which is required by gem 'logstash-filter-extractnumbers (>= 0) java', in any of the sources.


(Caspar Lourens) #2

I tried running the update, which updated the installed plugins, but this did not help with the installation of the extractnumbers filter.


(Magnus Bäck) #3

The extractnumbers plugin isn't yet compatible with Logstash 5.0. See pull request #2.


(Caspar Lourens) #4

Hello Magnus, thanks for your reply. Any news on the progress of the plugin update process? It seems like this is quite a big and long-running process.


(Magnus Bäck) #5

Any news on the progress of the plugin update process? It seems like this is quite a big and long-running process.

I think all Elastic-supported plugins have been updated, but there are quite a few community-supported plugins, including extractnumbers, and you should set your expectations accordingly.


(Caspar Lourens) #6

OK, thanks for the answer. Is there a way around extractnumbers? And will the documentation be updated to note that this filter does not work in 5.0? Currently the documentation just says to install the plugin, so I guess I won't be the only one having issues with non-working (community-supported) plugins.


(Magnus Bäck) #7

Is there a way around extractnumbers?

Can't you use a grok filter?
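
A minimal sketch of that idea (untested; the "size=" message layout and the file_size field name are made-up examples, not from this thread): in Logstash, a grok capture with an :int suffix both extracts and casts the number.

```
filter {
  grok {
    # %{NUMBER:file_size:int} captures the digits and stores them
    # as an integer field instead of a string
    match => { "message" => "size=%{NUMBER:file_size:int}" }
  }
}
```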

And will the documentation be updated to note that this filter does not work in 5.0?

I don't know if there are any such plans but I'll pitch it to the core team.


(Caspar Lourens) #8

What I used is %{NUMBER:AVFileSize:int}, but the value does not seem to end up in Elasticsearch, as I can't find the field. Until now it was quite easy just using the extractnumbers plugin, as it did the trick for me in the "old" ELK environment.


(Magnus Bäck) #9

Without knowing what your events look like it's impossible to help.


(Caspar Lourens) #10

The data comes from a CSV file generated by an anti-virus script and is in this format: /largefiles/Studio_26/inbox/Sucker;Sucker_Trailer_1080p_235_LtRt.mxf;mxf;672183964;02/12/2016 20:44,
and I use this Logstash config:
filter {
  if [type] == "avscanlog" {
    grok {
      match => { "message" => "%{PATH:Scanlocation};%{HTTPDUSER:Filename};%{HOSTNAME:File-extension};%{NUMBER:AVFilesize:int};%{JAVALOGMESSAGE:Scan_timestamp}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "Scan_timestamp", "dd/MM/yyyy HH:mm", "MMM dd HH:mm" ]
    }
  }
}


(Magnus Bäck) #11

That grok expression doesn't make any sense. Why use HTTPDUSER for the second column? And HOSTNAME for a file extension? I suggest you use the csv filter instead. Use a mutate filter's convert option to make sure AVFilesize is an integer.
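
An untested sketch of that suggestion, with column names taken from the grok expression earlier in the thread (adjust them to taste):

```
filter {
  if [type] == "avscanlog" {
    # split the semicolon-separated line into named fields
    csv {
      separator => ";"
      columns => [ "Scanlocation", "Filename", "File-extension", "AVFilesize", "Scan_timestamp" ]
    }
    # make sure the file size is indexed as an integer, not a string
    mutate {
      convert => { "AVFilesize" => "integer" }
    }
    date {
      match => [ "Scan_timestamp", "dd/MM/yyyy HH:mm" ]
    }
  }
}
```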


(system) #12

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.