NoMethodError: "undefined method 'to_hash'" with CSV input in Logstash 5.6.2

I'm trying to send data from a .csv file containing JMeter results to an online Elasticsearch instance. My .conf file looks like this:

input {
	file {
		path => ["\path\to\somefile.csv"]
		start_position => "beginning"
		sincedb_path => "NUL"
		ignore_older => 0
	}
}
output {
  elasticsearch {
    hosts => ["my.elasticsearch.instance"]
    index => "jmeter-%{+YYYY.MM.dd}"
  }
  stdout { 
    codec => rubydebug 
  }
}
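
For reference, a csv filter would normally sit between the input and output blocks so the JMeter columns become separate fields; below is only a rough sketch, and the column names are an assumption based on typical JMeter result headers, so adjust them to your actual header row:

filter {
	csv {
		# column names are a guess at the default JMeter CSV header; match them to your file
		columns => ["timeStamp", "elapsed", "label", "responseCode", "responseMessage", "threadName", "success", "bytes", "Latency"]
		separator => ","
	}
	mutate {
		# convert numeric columns so Elasticsearch maps them as numbers
		convert => { "elapsed" => "integer" "bytes" => "integer" "Latency" => "integer" }
	}
}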

When I run Logstash with the debug option, I get the following error and no data is transferred:

NoMethodError: undefined method 'to_hash' for []:Array
    filter_func at (eval):22
    filter_batch at .../pipeline.rb:398
    worker_loop at ...
    start_workers at ...

How can I fix this? I'd be grateful for any help.


I'm also having a similar issue. Here is the debug output and the configuration file I was using. When I don't use the --log.level=debug option, Logstash seems to work but doesn't exit, and when I do use the debug option, it actually fails to start and run.

I installed the following plugins:

 bin/logstash-plugin install logstash-input-file
 bin/logstash-plugin install logstash-output-file
 bin/logstash-plugin install logstash-filter-csv

Since the output is too long for the message window, here is a Gist with the errors and output.

I have the same issue.

Yeah, don't set the log level to debug, or it will exit with that missing to_hash method error.

Without the debug flag, Logstash seems to just sit and wait for something, but it doesn't actually parse the CSV file I've given it. When I look at the process, it does seem to have a lock on the file, though.

with log.level set to debug:

logstash_1 | [2017-10-16T00:36:57,088][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"undefined method `to_hash' for []:Array", "backtrace"=>["(eval):45:in `filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:398:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:379:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:342:in `start_workers'"]}

Any advice?
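
Regarding the "just sits and waits" behaviour above: the file input tails files by default, so Logstash not exiting is expected; it holds the file open and watches for newly appended lines. The settings that control where it starts reading look roughly like this (the path is a placeholder, and sincedb_path would be "NUL" on Windows rather than "/dev/null"):

input {
	file {
		path => "/path/to/results.csv"
		start_position => "beginning"	# read the existing contents instead of only new lines
		sincedb_path => "/dev/null"	# don't persist the read position between runs ("NUL" on Windows)
	}
}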

bump

I too am facing this issue.

Also, I believe this might be relevant: Logstash v5.6.3 won't start if debug logging is on


This happened to me when I upgraded to 5.6.3 as well. The CSV plugin worked fine for me in 5.3 before I upgraded. I'll try downgrading to 5.6.1 and see if that solves my problem.

Oddly enough, I turned off debugging (I had it set on the command line I was using for test purposes) and the CSV to_hash errors went away. I was finally able to get my data slurped in by Logstash!

That issue is related only to debug logging, which is why it went away once you set a regular log level.

Logstash 5.6.2 does not present this problem, so if that's an option, you can stick with that version for now.

I ended up writing a Ruby script that slurped the CSV, parsed it, and POSTed it to Elasticsearch. However, I think Logstash might not have processed my CSV because some of the columns contained newline characters in their values.

Still, since Logstash is written in JRuby, it should have been able to do the same thing.
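
If the real blocker was newlines embedded inside quoted CSV fields, one possible workaround (a sketch only, not something confirmed in this thread) is a multiline codec on the file input that joins any line not starting with a JMeter timestamp onto the previous event. The 13-digit pattern below assumes the default epoch-milliseconds timeStamp column:

input {
	file {
		path => "/path/to/results.csv"
		start_position => "beginning"
		codec => multiline {
			# assumes the first column is a 13-digit epoch-millis timestamp, as in default JMeter output
			pattern => "^\d{13},"
			negate => true
			what => "previous"
		}
	}
}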
