Installed ELK 5.0.2 (Build 14476, Commit SHA 8f2ace74) on a clean server. Logstash is accepting data and Elasticsearch is showing its data. Now I want to reuse the config I built in an older version. That config needs the extractnumbers filter (logstash-filter-extractnumbers) for Logstash. Unfortunately, installing the filter is not as straightforward as the Elastic doc page makes it seem... I did not find any other people having issues with this filter. Can somebody please have a look at this? It seems the dependencies are wrong somewhere.
[root@ELK501 conf.d]# /usr/share/logstash/bin/logstash-plugin install logstash-filter-extractnumbers
Validating logstash-filter-extractnumbers
Installing logstash-filter-extractnumbers
Plugin version conflict, aborting
ERROR: Installation Aborted, message: Bundler could not find compatible versions for gem "logstash-core-plugin-api":
  In snapshot (Gemfile.lock):
    logstash-core-plugin-api (= 2.1.16)
  In Gemfile:
    logstash-devutils (~> 1.1) java depends on
      logstash-core-plugin-api (~> 2.0) java
Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash-core":
  In snapshot (Gemfile.lock):
    logstash-core (= 5.0.2)
  In Gemfile:
    logstash-core-plugin-api (>= 0) java depends on
      logstash-core (= 5.0.2) java
Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash":
  In Gemfile:
    logstash-filter-extractnumbers (>= 0) java depends on
      logstash (< 2.0.0, >= 1.4.0) java
Could not find gem 'logstash (< 2.0.0, >= 1.4.0) java', which is required by gem 'logstash-filter-extractnumbers (>= 0) java', in any of the sources.
I tried running the update, which did update the already-installed plugins, but this did not help with installing the extractnumbers filter.
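In case it matters, the update step looks roughly like this (same install path as in the session above; the list command just shows the plugin versions that ended up installed):
/usr/share/logstash/bin/logstash-plugin update
/usr/share/logstash/bin/logstash-plugin list --verbose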
Hello Magnus, thanks for your reply. Any news on the progress of the plugin update process? It seems like this is quite a big and long-running process.
I think all Elastic-supported plugins have been updated, but there are quite a few community-supported plugins, including extractnumbers, and you should set your expectations accordingly.
OK, thanks for the answer. Is there a way around extractnumbers? And will the documentation be updated to note that this filter does not work in 5.0? Currently the documentation just says to install the plugin, so I guess I won't be the only one running into non-working (community-supported) plugins.
What I used is %{NUMBER:AVFileSize:int}, but this does not seem to end up in Elasticsearch, as I can't find the field. Up till now it was quite easy just using the extractnumbers plugin, as that did the trick for me in the "old" ELK environment.
The data is coming from a CSV file generated by an anti-virus script and it's in this format: /largefiles/Studio_26/inbox/Sucker;Sucker_Trailer_1080p_235_LtRt.mxf;mxf;672183964;02/12/2016 20:44,
and I use this Logstash config:
filter {
  if [type] == "avscanlog" {
    grok {
      match => { "message" => "%{PATH:Scanlocation};%{HTTPDUSER:Filename};%{HOSTNAME:File-extension};%{NUMBER:AVFilesize:int};%{JAVALOGMESSAGE:Scan_timestamp}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "Scan_timestamp", "dd/MM/yyyy HH:mm", "MMM dd HH:mm" ]
    }
  }
}
That grok expression doesn't make any sense. Why use HTTPDUSER for the second column? And HOSTNAME for a file extension? I suggest you use the csv filter instead. Use a mutate filter's convert option to make sure AVFilesize is an integer.
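A rough sketch of that approach (untested; it reuses the field names from the grok expression above and assumes the semicolon-separated layout shown in the sample line):
filter {
  if [type] == "avscanlog" {
    # Split the semicolon-separated line into named fields
    csv {
      separator => ";"
      columns   => ["Scanlocation", "Filename", "File-extension", "AVFilesize", "Scan_timestamp"]
    }
    # Make sure the file size is stored as an integer, not a string
    mutate {
      convert => { "AVFilesize" => "integer" }
    }
    # Use the scan timestamp as the event's @timestamp
    date {
      match => [ "Scan_timestamp", "dd/MM/yyyy HH:mm" ]
    }
  }
}
The csv filter splits on the semicolons directly, so there is no need to press grok patterns like HTTPDUSER or HOSTNAME into service for plain CSV columns.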