We have around 40GB of disk space available. Below is how the error looks in the Logstash log:
{:timestamp=>"2017-07-24T05:41:07.568000-0400", :message=>"A plugin had an unrecoverable error. Will restart this plugin.\n Plugin: <LogStash::Inputs::File type=>"Sample", path=>["/mySampleDir/log/*"], codec=><LogStash::Codecs::Plain charset=>"UTF-8">, stat_interval=>1, discover_interval=>15, sincedb_write_interval=>15, start_position=>"end", delimiter=>"\n">\n Error: Disc quota exceeded", :level=>:error}
What is the minimum and maximum disk space Logstash needs for its own usage?
Do we need to specify a maximum size for log entries in the Logstash configuration file?
40GB should be fine. Are you sure there isn't a disk quota set for the user? You may have free space on the filesystem but still have run out of your user quota.
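To tell the two cases apart, a quick sketch like the following compares free space on the filesystem with the per-user quota (the `LOGDIR` path is a placeholder; `quota` may print nothing if no quotas are configured on the system):

```shell
#!/bin/sh
# Placeholder for a directory Logstash writes to (sincedb, queue, logs)
LOGDIR="${LOGDIR:-/tmp}"

# Free space on the filesystem that holds $LOGDIR
df -h "$LOGDIR"

# Per-user quota for the account running Logstash;
# prints a fallback message if no quota is reported
quota -s 2>/dev/null || echo "no quota reported for $(id -un)"
```

If `df` shows plenty of space but `quota` shows the user at its block or inode limit, the "Disc quota exceeded" error is coming from the quota, not from the disk itself.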
How much space Logstash itself needs, I can't say exactly. The sincedb files (which store where the file pointers are kept) take only a few kilobytes. The queue size depends on how large your data is and how good or bad the connection to Elasticsearch is.
Can you try starting Logstash as another user, or as root?
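If the quota does turn out to be the culprit, one possible workaround (a sketch, not a confirmed fix; `/var/lib/logstash/sincedb_sample` is an assumed path) is to point the file input's `sincedb_path` at a directory on a filesystem where the Logstash user still has quota:

```
input {
  file {
    path => ["/mySampleDir/log/*"]
    start_position => "end"
    # Assumed path: any directory writable by the Logstash user
    # on a filesystem with available quota
    sincedb_path => "/var/lib/logstash/sincedb_sample"
  }
}
```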
The error is as below:
{:timestamp=>"2017-07-27T02:45:44.407000-0400", :message=>"A plugin had an unrecoverable error. Will restart this plugin.\n Plugin: <LogStash::Inputs::File type=>"Audit_2", path=>["/mySampleDir2/log/*"], codec=><LogStash::Codecs::Plain charset=>"UTF-8">, stat_interval=>1, discover_interval=>15, sincedb_write_interval=>15, start_position=>"end", delimiter=>"\n">\n Error: Disc quota exceeded", :level=>:error}
Please help, as I am clueless and could not find any source for this error on the internet.