Failing to parse CSV

Good day,

I am trying to parse multiple CSV files in a directory for trend analysis and report generation.
The index is being created, but not all of the information is being parsed. The error below is displayed:

[ERROR] 2017-11-21 21:27:27.067 [Ruby-0-Thread-3: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.23/lib/stud/task.rb:22] agent - An unexpected error occurred! {:message=>"Permission denied - /etc/logstash/conf.d/logstash-simple.conf", :class=>"Errno::EACCES", :backtrace=>["org/jruby/RubyIO.java:3804:in `read'", "org/jruby/RubyIO.java:3987:in `read'", "/usr/share/logstash/logstash-core/lib/logstash/config/loader.rb:80:in `local_config'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/config/loader.rb:73:in `local_config'", "/usr/share/logstash/logstash-core/lib/logstash/config/loader.rb:46:in `load_config'", "/usr/share/logstash/logstash-core/lib/logstash/config/loader.rb:15:in `format_config'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:304:in `fetch_config'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `reload_pipeline!'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:111:in `reload_state!'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:108:in `reload_state!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:107:in `reload_state!'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:75:in `execute'", "org/jruby/RubyProc.java:281:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.23/lib/stud/interval.rb:20:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:75:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:320:in `execute'", "org/jruby/RubyProc.java:281:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.23/lib/stud/task.rb:24:in `initialize'"]}

Please find the config file (logstash-simple.conf) below:

input {
  file {
    path => "/home/africom/backup3/data*.csv"
    start_position => "end"
    sincedb_path => "/dev/null"
    type => "data_record"
  }
}

filter {
  if [type] == "data_record" {
    csv {
      separator => "|"
      skip_empty_columns => true
      # with no columns list, the csv filter auto-names fields column1, column2, ...
    }
    # mutate is a separate filter, not an option of csv
    mutate {
      rename => { "column1" => "index" }
      rename => { "column3" => "system_time" }
      rename => { "column5" => "msdn" }
      rename => { "column8" => "imsi" }
      rename => { "column13" => "session_start" }
      rename => { "column16" => "session_end" }
      rename => { "column17" => "data_volume" }
      rename => { "column74" => "needs_to_be_id" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["192.168.88.101:9200"]
    index => "fourth"
  }
  stdout { codec => rubydebug }
}
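
(For reference, a config like this can be syntax-checked before restarting the service. A minimal check, assuming the standard package install under /usr/share/logstash:

/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/logstash-simple.conf

It exits with an error if the braces or options don't parse.)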

Permission denied - /etc/logstash/conf.d/logstash-simple.conf

So have you checked the permissions of that file?
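
Something like this would show the mode and owner of the file, and which user the Logstash process actually runs as (assuming a standard service install):

ls -l /etc/logstash/conf.d/logstash-simple.conf
ps -ef | grep logstash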

Good day Magnus,

The current permissions on the file are listed below:

-rwxr-xr-- 1 root root 3167 Nov 22 05:48 logstash-simple.conf

Do they need to be changed any further?

No, that looks okay (although mode 644 would've made more sense than 754). Not sure what's up here.
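
If you want to switch to 644 anyway, something like this should do it (assuming the file stays owned by root):

sudo chmod 644 /etc/logstash/conf.d/logstash-simple.conf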

Magnus,

I have re-installed Logstash, and that seems to have resolved the permission issue (using your suggestion of mode 644).
The problem now is this error:

:message=>"Unhandled IOException: java.io.IOException: unhandled errno: Too many open files"

An index is being created and some of the data is present, but not all.
I am sourcing the data from multiple files. A snippet of the data format is below, in case that helps:

54456138 0 20171115143642 1 2.63864E+12 -1 648448644218931 20171115164808 0 20171115163232 6144
54456145 0 20171115143750 1 2.63864E+12 -1 648448644215611 20171115164916 0 20171115164433 139264
54456146 0 20171115143754 1 2.63864E+12 -1 648448644206356 20171115164920 0 20171115164729 159744
54456147 0 20171115143755 1 2.63864E+12 -1 648448644216975 20171115164921 0 20171115163501 1024
54456149 0 20171115143759 1 2.63864E+12 -1 648448644224596 20171115164925 0 20171115163846 0

Please assist.
Thank you

Does the filename pattern match thousands of files? Otherwise, why might you be getting the "too many open files" error?
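
If it does, one option is to let the file input close idle files sooner and cap how many it keeps open at once. A sketch, assuming the close_older and max_open_files options of the file input (the values here are illustrative):

input {
  file {
    path => "/home/africom/backup3/data*.csv"
    sincedb_path => "/dev/null"
    type => "data_record"
    close_older => 3600      # seconds of inactivity before an open file is closed
    max_open_files => 512    # illustrative cap on concurrently open files
  }
}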
