Logstash & VirusTotal lookup


We're currently building an ELK instance whose purpose is to present forensic-level data nicely, by reading the data from files and shipping it to an Elasticsearch instance.

A subset of that data contains file hashes. We're trying to use the VirusTotal plugin https://www.rubydoc.info/gems/logstash-filter-virustotal/0.1.3 to query VT and generate matches, if any.

The plugin was installed and it works (we see VT data in Kibana) up to a certain point, at which Logstash crashes with this error:

[2019-12-23T14:18:57,741][ERROR][org.logstash.execution.WorkerLoop] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.StandardError: (ConnectionFailed) Net::OpenTimeout

My reading is that the connection to VirusTotal is timing out, causing the Logstash pipeline to fill up and eventually crash. That's my reading at least, but I'm not sure whether that is actually the case, or whether there is something we can do to keep getting VT data while keeping Logstash functional.
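One way to confirm whether VT itself is the bottleneck is to query it directly from the same host, outside Logstash, with an explicit timeout. A minimal sketch, assuming the v2 file-report endpoint (which the 0.1.3-era plugin targets); `YOUR_API_KEY` and `HASH_FROM_YOUR_DATA` are placeholders:

```python
# Standalone check: is VirusTotal slow/unreachable from this host,
# independent of Logstash? Uses the public v2 file/report endpoint.
import json
import urllib.parse
import urllib.request

VT_REPORT_URL = "https://www.virustotal.com/vtapi/v2/file/report"

def build_report_url(api_key: str, file_hash: str) -> str:
    """Build the GET URL for a v2 file-report lookup."""
    params = {"apikey": api_key, "resource": file_hash}
    return VT_REPORT_URL + "?" + urllib.parse.urlencode(params)

def check_hash(api_key: str, file_hash: str, timeout: float = 10.0) -> dict:
    """Query VT with an explicit socket timeout; raises URLError on timeout."""
    url = build_report_url(api_key, file_hash)
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Substitute a real key and one of the hashes from your log data.
    print(check_hash("YOUR_API_KEY", "HASH_FROM_YOUR_DATA"))
```

If repeated calls here start timing out after a burst of requests, that points at VT-side throttling rather than a Logstash misconfiguration.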

Logstash 7.3.1
Elasticsearch 7.3.0
Kibana 7.3.0
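Since the crash happens in a pipeline worker, one mitigation worth trying (a sketch, assuming the timeouts come from too many concurrent VT lookups) is to reduce concurrency in logstash.yml so fewer events hit the API at once; the values below are illustrative:

```
# logstash.yml -- serialize work so VT lookups are less bursty
pipeline.workers: 1
pipeline.batch.size: 25
```

This trades throughput for stability, which may be acceptable for a forensic pipeline reading from files.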

Logstash config related to the VT data:

input {
  file {
    type => "sometype"
    path => "/somepath/**/*.log"
    start_position => "beginning"
    sincedb_path => "/somepath/since.db.file"
  }
}

filter {
  if [type] == "sometype" {
    if [field] == "somevalue" {
      virustotal {
        apikey => 'key'
        field => "hash_field"
        lookup_type => "hash"
        target => "virustotal_hash"
      }
    }
  }
}

output {
  if [type] == "sometype" {
    elasticsearch {
      hosts => [""]
      index => "index-%{+YYYY.MM.dd}"
    }
  }
}
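If the timeouts are caused by VT rate limiting (the free public API allows 4 requests per minute), another option is to throttle events with Logstash's standard sleep filter before the VT lookup. A sketch, reusing the conditional from the config above:

```
filter {
  if [type] == "sometype" and [field] == "somevalue" {
    # Pause 15s per event that will hit VT, staying under the
    # public API limit of 4 requests per minute.
    sleep {
      time => "15"
    }
  }
}
```

This will slow the pipeline considerably, so it only makes sense if only a small subset of events carries hashes.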
Is there a good Samaritan who can shed some light, please?
