I am looking to use Logstash to ingest a CSV file that contains some stats and then graph them. I have set up a file input but I am unable to get it to work.
Then it's most likely one of two things:
Your path pattern does not match any files.
-Check your path and the files to make sure they are OK.
OR
The sincedb shows that all the files have already been processed.
-Delete the .sincedb file and restart Logstash. If that works, you'll need to add some additional parameters to your config, such as setting sincedb_path to "/dev/null" ("NUL" on Windows) so the read positions aren't persisted; see the sketch below.
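A minimal file input sketch along those lines (the path is a placeholder for your own):

input {
  file {
    path => "/path/to/stats/*.csv"   # absolute path that must match the files you expect
    start_position => "beginning"    # read existing files from the top, not only new lines
    sincedb_path => "/dev/null"      # don't remember read positions ("NUL" on Windows)
  }
}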
Following the steps above, I set the sincedb_path as suggested and checked the files; everything seemed OK. I then turned on Logstash's debug logging and received the following message in a loop:
[2018-12-02T22:59:01,649][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
Downgrading to ELK 6.0.0 resolved the problem and the CSV now uploads OK. I have added a mutate convert block to the config, but after import the columns all show 0.
The file has columns like col1: 16591.07535, col2: 2310, col3: 7950, col4: 1.5, col5: 180000, col6: 38300000, but when imported into Elasticsearch they all show 0.
The mutate section of the config is:
convert => {
  "Apdex" => "integer"
  "Count" => "integer"
  "Avg (ms)" => "integer"
  "SD (ms)" => "integer"
  "Min (ms)" => "integer"
  "Max (ms)" => "integer"
  "Total (ms)" => "integer"
  "Total (% time)" => "integer"
  "Dissat (%)" => "integer"
}
What am I doing incorrectly?
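A few things to check: mutate's convert only touches fields that already exist, converting a non-numeric string (for example a header row that slipped through) typically yields 0, and converting a decimal string such as "1.5" to "integer" truncates it. A minimal csv-plus-mutate sketch along those lines (the column list is an assumption taken from the convert block above, and skip_header depends on your csv filter plugin version):

filter {
  csv {
    separator => ","
    columns => ["Apdex", "Count", "Avg (ms)", "SD (ms)", "Min (ms)", "Max (ms)", "Total (ms)", "Total (% time)", "Dissat (%)"]
    skip_header => true              # skip the header row so its text isn't converted to 0
  }
  mutate {
    convert => {
      "Apdex" => "float"             # decimal values such as 1.5 need float, not integer
      "Count" => "integer"
      "Avg (ms)" => "float"          # 16591.07535 would be truncated as an integer
      "SD (ms)" => "float"
      "Min (ms)" => "integer"
      "Max (ms)" => "integer"
      "Total (ms)" => "float"
      "Total (% time)" => "float"
      "Dissat (%)" => "float"
    }
  }
}

Also note that if these fields were already mapped as text or long in the index from earlier attempts, the new types won't take effect until you reindex or write to a fresh index.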