Logstash csv - [[main]<file] DEBUG logstash.inputs.file - each: file grew:

I'm trying to load a CSV file into Elasticsearch, but before going there I'd like to display the data on screen first. Yet when I run Logstash, I get an error and can't figure out what I'm doing wrong...

$ cat ./etc/logstash/conf.d/* | grep -v \#

input {
  file {
    path => "/root/us_postal_codes.csv"
  }
}

filter {
  csv {
    columns => ["Postal Code","Place Name","State","State Abbreviation","County","Latitude","Longitude",""]
  }
}

output {
  stdout {codec => rubydebug}
}
$  


Below is the relevant part of the full output of docker-compose up; to narrow it down, the log keeps repeating these lines:

logstash_1  | 15:12:52.744 [[main]<file] DEBUG logstash.inputs.file - each: file grew: /root/us_postal_codes.csv: old size 0, new size 2408030
logstash_1  | 15:12:53.746 [[main]<file] DEBUG logstash.inputs.file - each: file grew: /root/us_postal_codes.csv: old size 0, new size 2408030
logstash_1  | 15:12:54.747 [[main]<file] DEBUG logstash.inputs.file - each: file grew: /root/us_postal_codes.csv: old size 0, new size 2408030
logstash_1  | 15:12:55.748 [[main]<file] DEBUG logstash.inputs.file - each: file grew: /root/us_postal_codes.csv: old size 0, new size 2408030
logstash_1  | 15:12:56.749 [[main]<file] DEBUG logstash.inputs.file - each: file grew: /root/us_postal_codes.csv: old size 0, new size 2408030
logstash_1  | 15:12:57.433 [pool-2-thread-5] DEBUG logstash.instrument.periodicpoller.cgroup - Error, cannot retrieve cgroups information {:exception=>"Errno::ENOENT", :message=>"No such file or directory - /sys/fs/cgroup/cpuacct/system.slice/docker-284b94e8dced053b2f62bf8738fe8c2dd57853b6688d8513edfd44eb223c9113.scope/cpuacct.usage"}
logstash_1  | 15:12:57.731 [Ruby-0-Thread-14: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline

... needless to say, the file is static and is not growing as Logstash suggests ...

Please advise...
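For anyone hitting the same repeating "file grew" loop: one likely cause (an assumption on my part, not confirmed in this thread) is that the file input plugin defaults to start_position => "end" and records how far it has read in a sincedb file, so a CSV that already exists when Logstash starts is never read from the beginning. A minimal sketch of the input block with those two settings made explicit:

input {
  file {
    path => "/root/us_postal_codes.csv"
    start_position => "beginning"   # read the existing file from the start, not just new lines
    sincedb_path => "/dev/null"     # don't persist the read position between runs (useful while testing)
  }
}

When running under Docker, it's also worth checking that /root/us_postal_codes.csv is actually mounted into the Logstash container and readable by the logstash user.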

It is stdout, not stout.

I corrected that error myself, but thank you anyway for trying to help me out)

I also updated my question with a different issue that I'm encountering at the moment; if you could take a look at it, I'd much appreciate it)

Thanks again!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.