Logstash Issue: Glob did not work, collector name {:name=>"ConcurrentMarkSweep"}

Hello,

I'm parsing SQL logs with Logstash version 6.2.3 on Windows 10.

This is my logstash.conf file. I've used it many times, but unfortunately something happened and it stopped working.

input {
  file {
    path => "C:\Users\Administrator\Desktop\sample_rds_logs_test.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "[^0-9]"
      negate => true
      what => "previous"
    }
  }
}

filter {

}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logs_rds_test"
  }
  stdout { codec => "json" }
}

I've gone through all the topics of the same kind on the forum but haven't been able to sort out my issue.

Logs when I run Logstash in debug mode:

[2018-05-31T04:22:30,463][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:22:30,463][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:22:34,810][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x32090ed4 sleep>"}
[2018-05-31T04:22:35,480][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:22:35,480][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:22:36,870][DEBUG][logstash.inputs.file     ] _globbed_files: C:\Users\Administrator\Desktop\sample_rds_logs_test.txt: glob is: []
[2018-05-31T04:22:36,872][DEBUG][logstash.inputs.file     ] _globbed_files: C:\Users\Administrator\Desktop\sample_rds_logs_test.txt: glob is: ["C:\\Users\\Administrator\\Desktop\\sample_rds_logs_test.txt"] because glob did not work
[2018-05-31T04:22:39,811][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x32090ed4 sleep>"}
[2018-05-31T04:22:40,503][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:22:40,504][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:22:44,813][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x32090ed4 sleep>"}
[2018-05-31T04:22:45,511][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:22:45,512][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:22:49,817][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x32090ed4 sleep>"}
[2018-05-31T04:22:50,522][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:22:50,523][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:22:51,951][DEBUG][logstash.inputs.file     ] _globbed_files: C:\Users\Administrator\Desktop\sample_rds_logs_test.txt: glob is: []
[2018-05-31T04:22:51,952][DEBUG][logstash.inputs.file     ] _globbed_files: C:\Users\Administrator\Desktop\sample_rds_logs_test.txt: glob is: ["C:\\Users\\Administrator\\Desktop\\sample_rds_logs_test.txt"] because glob did not work
[2018-05-31T04:22:54,821][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x32090ed4 sleep>"}
[2018-05-31T04:22:55,539][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:22:55,539][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}

What went wrong? Please help.



  • In the filename pattern, use forward slashes instead of backslashes.
  • In sincedb_path, use "nul" instead of "/dev/null".
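
For illustration, a minimal sketch of what those two changes look like with the path from the original post (the multiline codec block stays as it is; on Linux/macOS "/dev/null" would remain the right value):

input {
  file {
    path => "C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"  # forward slashes, even on Windows
    start_position => "beginning"
    sincedb_path => "nul"  # the Windows equivalent of /dev/null
    # multiline codec unchanged
  }
}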

@magnusbaeck

I tried, but I can't see any difference. Can you please help with any other available options?
logstash.conf file:

input {
  file {
    path => "C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"
    start_position => "beginning"
    sincedb_path => "null"
    codec => multiline {
      pattern => "^[0-9]"
      negate => true
      what => "previous"
    }
  }
}

filter {

}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logs_rds_test"
  }
  stdout { codec => "json" }
}

Logs:

[2018-05-31T04:38:17,405][DEBUG][logstash.inputs.file     ] _globbed_files: C:/Users/Administrator/Desktop/sample_rds_logs_test.txt: glob is: ["C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"]
[2018-05-31T04:38:17,420][DEBUG][logstash.inputs.file     ] _discover_file: C:/Users/Administrator/Desktop/sample_rds_logs_test.txt: new: C:/Users/Administrator/Desktop/sample_rds_logs_test.txt (exclude is [])
[2018-05-31T04:38:17,579][DEBUG][logstash.inputs.file     ] _open_file: C:/Users/Administrator/Desktop/sample_rds_logs_test.txt: opening
[2018-05-31T04:38:17,579][DEBUG][logstash.inputs.file     ] C:/Users/Administrator/Desktop/sample_rds_logs_test.txt: sincedb last value 5576, cur size 5576
[2018-05-31T04:38:17,579][DEBUG][logstash.inputs.file     ] C:/Users/Administrator/Desktop/sample_rds_logs_test.txt: sincedb: seeking to 5576
[2018-05-31T04:38:21,295][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:38:21,295][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:38:22,186][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x20de62fa sleep>"}
[2018-05-31T04:38:26,326][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:38:26,326][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:38:27,186][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x20de62fa sleep>"}
[2018-05-31T04:38:31,358][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:38:31,358][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:38:31,920][DEBUG][logstash.inputs.file     ] _globbed_files: C:/Users/Administrator/Desktop/sample_rds_logs_test.txt: glob is: ["C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"]
[2018-05-31T04:38:32,201][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x20de62fa sleep>"}
[2018-05-31T04:38:36,420][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-05-31T04:38:36,436][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-05-31T04:38:37,201][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x20de62fa sleep>"}

Hello @magnusbaeck,

It actually worked. It's "nul", but I had tried "null", which wasted a lot of time. Anyway, thanks for the suggestions.

If you have some time, can you please take a few minutes to answer these questions?

1. Why did that happen without changing any configuration?

2. What's the importance of the sincedb file? I don't think specifying it in a configuration file is mandatory.

3. What's the difference between forward and backward slashes when specifying a file path?

4. What exactly are globbed files, and what does "glob did not work" mean?

Any help would be appreciated.

Thanks for all your time :slight_smile:

  1. Why did that happen without changing any configuration?

Why did what happen? If it started working after you changed "null" to "nul" then you certainly changed the configuration.

  2. What's the importance of the sincedb file? I don't think specifying it in a configuration file is mandatory.

No, but setting it to "nul" or "/dev/null" (depending on your OS) will make Logstash oblivious to any old saved state about the current position in the file. The file input documentation has a few paragraphs on sincedb files. Please read them.
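
For example, a minimal sketch of the two common ways to set it (the persistent path below is only an illustrative location, not something from this thread):

file {
  path => "C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"
  start_position => "beginning"
  # Option 1: keep the read position across restarts (Logstash creates and updates this file)
  sincedb_path => "C:/Users/Administrator/sincedb_rds_test"
  # Option 2: throw the saved position away so the file is re-read from the beginning every run
  # sincedb_path => "nul"        # Windows
  # sincedb_path => "/dev/null"  # Linux/macOS
}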

  3. What's the difference between forward and backward slashes when specifying a file path?

Well, it seems Ruby or some Ruby library used by Logstash works better with forward slashes, even on Windows.
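
In practice that means writing the Windows path like this (a small illustration using the path from this thread):

# Matched reliably by the file input's glob handling:
path => "C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"

# Can end up not matching anything, which is what the first debug log ("glob is: []") showed:
path => "C:\Users\Administrator\Desktop\sample_rds_logs_test.txt"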

  4. What exactly are globbed files, and what does "glob did not work" mean?

A filename glob is a filename pattern that may or may not include wildcards. A glob that doesn't work is a glob that doesn't expand to any actual files. See glob (programming) - Wikipedia.
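
For example, both of these path values are globs; a glob "works" only if it expands to at least one existing file (the wildcard example is hypothetical, just to show the syntax):

path => "C:/Users/Administrator/Desktop/sample_rds_logs_test.txt"  # no wildcard, matches exactly one file
path => "C:/Users/Administrator/Desktop/*.txt"                     # matches every .txt file on the Desktop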

Why did what happen? If it started working after you changed "null" to "nul" then you certainly changed the configuration.

Before, I used "/dev/null". It had worked for the past 5-6 days, but today it didn't, and then I changed it to "null".

Sure, thanks for all your suggestions. I will go through the documentation as well.

Thanks
Rahul Nama

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.