Logstash 6.6.2 on Ubuntu 18.04 loops indefinitely and does nothing

Hi all,

I am working with Logstash version 6.6.2 on Ubuntu 18.04 (elementary OS, which is based on Ubuntu 18.04). I tried to load my file into Elasticsearch, but based on the debug output below it seems to just repeat the same loop and do nothing. I have seen people hitting this issue on 6.2.4 without finding a solution, and interestingly I have not found anyone reporting it on this newest version (6.6.2).

[DEBUG] 2019-03-13 17:18:18.712 [pool-3-thread-3] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2019-03-13 17:18:18.713 [pool-3-thread-3] jvm - collector name {:name=>"ConcurrentMarkSweep"}
[DEBUG] 2019-03-13 17:18:23.175 [Ruby-0-Thread-15: :1] pipeline - Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x3827365b sleep>"}
[DEBUG] 2019-03-13 17:18:23.721 [pool-3-thread-3] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2019-03-13 17:18:23.722 [pool-3-thread-3] jvm - collector name {:name=>"ConcurrentMarkSweep"}
[DEBUG] 2019-03-13 17:18:28.175 [Ruby-0-Thread-15: :1] pipeline - Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x3827365b sleep>"}

Based on the compatibility matrix, Logstash versions up to 6.6.x are not listed as supported on Ubuntu 18.04 (but Logstash does run, so I assume 6.6.2 simply has not been added there yet). Could that actually be the cause of this problem?

Any help would be highly appreciated.

Checking for an updated configuration every 5 seconds is normal. What does your configuration look like?

Hi, this is my config file. I pass it to Logstash with the -f flag:

input {
  file {
    path => "/media/darren/Windows/Users/lukasd/Documents/Horizon-scanning/logstash_data/ai_no_duplicate2.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => plain {
      charset => "ISO-8859-1"
    }
    stat_interval => 60
  }
}

filter {
  csv {
    separator => "	"
    columns => ["ID","extracted_text"]
  }
}

filter {
  mutate {
    remove_field => ["message","host","path","@timestamp","@version","tags"]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "do"
    template_overwrite => true
    document_type => "metrics"
    template => "/home/darren/Documents/do.json"
  }
}

Last time I looked, an elasticsearch output requires each event to have an @timestamp field. Do not remove it.
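
For example, a minimal fix is to drop @timestamp (and @version, which the output also uses) from the remove_field list in the second filter block, leaving the rest of the pipeline unchanged. A sketch, assuming your config above:

filter {
  mutate {
    # Keep @timestamp and @version so the elasticsearch output can index the event
    remove_field => ["message","host","path","tags"]
  }
}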

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.