Need help freeing up space in Elasticsearch

When I replay logs from Logstash to Elasticsearch, I no longer see them in Kibana.
I think there is no more storage space.
So I went to Stack Management -> Index Management to delete indices, but that did not solve my problem.
I need your help.

Is your system showing there is no disk space left?
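
For example, you could check on each Elasticsearch node with something like this (the path below is the default data directory for package installs and is just an assumption; adjust it to your setup):

```
# Check free space on the filesystem holding the Elasticsearch data directory
df -h /var/lib/elasticsearch
```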

Also, what do you mean by "I no longer see them in Kibana"?

No. But my large compressed files are no longer imported into Elasticsearch. So I deleted three months of indices, and there are still twelve months of indices left. I don't know if this is related to the indices, or how else to free up space so I can import my compressed files.

For your second question, I meant that the logs are not imported, so I cannot view them in Kibana.

What is the output from the `_cat/allocation?v` API in Elasticsearch?
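
For example, from a terminal (assuming Elasticsearch is listening on localhost:9200; adjust the host and add credentials if your cluster is secured):

```
# Show per-node disk usage and shard allocation
curl -s 'http://localhost:9200/_cat/allocation?v'
```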

This is the output from the `_cat/allocation?v` API in Elasticsearch:

[screenshot of the allocation output]

(In future, please don't post pictures of text or code. They are difficult to read, impossible to search and replicate (if it's code), and some people may not even be able to see them :smiley:)

So you've got heaps of space on the nodes by the looks of it.

But what is causing all of my files to stop being pushed to Elasticsearch?

You'd need to check your Beats logs.

I looked at the logs. Below is an excerpt:

```
[2021-07-09T15:40:40,085][INFO ][filewatch.observingread ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] QUIT - closing all files and shutting down.
[2021-07-09T15:40:40,088][INFO ][filewatch.observingread ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] START, creating Discoverer, Watch with file and sincedb collections
[2021-07-09T15:40:40,091][ERROR][logstash.javapipeline ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:cpe_b2b
Plugin: <LogStash::Inputs::File mode=>"read", path=>["/data/logstash/cpe/b2b-tacacs-*"], add_field=>{"[event][module]"=>"cpe", "[event][dataset]"=>"tacacs", "[@metadata][idx]"=>"CPE_B2B", "[@metadata][vendor]"=>"tac_plus"}, id=>"25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f", sincedb_path=>"/dev/null", file_sort_by=>"path", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_6b387e0b-12f8-4cfd-8ae3-7c5186a1c13c", enable_metric=>true, charset=>"UTF-8">, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, start_position=>"end", delimiter=>"\n", close_older=>3600.0, file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_direction=>"asc", exit_after_read=>false, check_archive_validity=>false>
Error:
Exception: Java::JavaIo::EOFException
Stack: java.util.zip.GZIPInputStream.readUByte(java/util/zip/GZIPInputStream.java:269)
java.util.zip.GZIPInputStream.readUShort(java/util/zip/GZIPInputStream.java:259)
java.util.zip.GZIPInputStream.readHeader(java/util/zip/GZIPInputStream.java:165)
java.util.zip.GZIPInputStream.<init>(java/util/zip/GZIPInputStream.java:80)
java.util.zip.GZIPInputStream.<init>(java/util/zip/GZIPInputStream.java:92)
jdk.internal.reflect.GeneratedConstructorAccessor91.newInstance(jdk/internal/reflect/GeneratedConstructorAccessor91)
jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(jdk/internal/reflect/DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:490)
org.jruby.javasupport.JavaConstructor.newInstanceDirect(org/jruby/javasupport/JavaConstructor.java:285)
org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:918)
org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)
```

This error comes up every time.
How do you interpret it?
What solution can you offer me?

Regards,

It's a little hard to follow here as you're now dropping Logstash logs into the thread.

Can you please provide a description of your setup, including versions?
If Logstash is erroring, which it seems to be, please provide more of the log.
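
One observation on the trace you already posted: a `Java::JavaIo::EOFException` raised from `GZIPInputStream.readHeader` usually means the file input opened a gzip file that is empty, truncated, or otherwise corrupt. As a sketch (using the glob from your config), you could test the archives outside Logstash:

```
# Test every archive matched by the input's glob;
# gzip -t prints an error for any corrupt or truncated file
find /data/logstash/cpe -name 'b2b-tacacs-*' -print0 | xargs -0 -n 1 gzip -t
```

Your config dump also shows `check_archive_validity=>false`; setting that option to `true` on the file input should make the plugin verify gzip files before trying to read them.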

Please also format your code/logs/config using the </> button, or markdown-style backticks. It helps to make things easy to read, which helps us help you.

Can I put it in a file and send it as an attachment?

No, sorry. If it's too large then use gist/pastebin/etc.

This is version 6
I will therefore provide more logs.
Below are the logs:

```
[2021-07-09T15:40:40,085][INFO ][filewatch.observingread ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] QUIT - closing all files and shutting down.
[2021-07-09T15:40:40,088][INFO ][filewatch.observingread ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] START, creating Discoverer, Watch with file and sincedb collections
[2021-07-09T15:40:40,091][ERROR][logstash.javapipeline ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:cpe_b2b
Plugin: <LogStash::Inputs::File mode=>"read", path=>["/data/logstash/cpe/b2b-tacacs-*"], add_field=>{"[event][module]"=>"cpe", "[event][dataset]"=>"tacacs", "[@metadata][idx]"=>"CPE_B2B", "[@metadata][vendor]"=>"tac_plus"}, id=>"25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f", sincedb_path=>"/dev/null", file_sort_by=>"path", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_6b387e0b-12f8-4cfd-8ae3-7c5186a1c13c", enable_metric=>true, charset=>"UTF-8">, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, start_position=>"end", delimiter=>"\n", close_older=>3600.0, file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_direction=>"asc", exit_after_read=>false, check_archive_validity=>false>
Error:
Exception: Java::JavaIo::EOFException
Stack: java.util.zip.GZIPInputStream.readUByte(java/util/zip/GZIPInputStream.java:269)
java.util.zip.GZIPInputStream.readUShort(java/util/zip/GZIPInputStream.java:259)
java.util.zip.GZIPInputStream.readHeader(java/util/zip/GZIPInputStream.java:165)
java.util.zip.GZIPInputStream.<init>(java/util/zip/GZIPInputStream.java:80)
java.util.zip.GZIPInputStream.<init>(java/util/zip/GZIPInputStream.java:92)
jdk.internal.reflect.GeneratedConstructorAccessor91.newInstance(jdk/internal/reflect/GeneratedConstructorAccessor91)
jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(jdk/internal/reflect/DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:490)
org.jruby.javasupport.JavaConstructor.newInstanceDirect(org/jruby/javasupport/JavaConstructor.java:285)
org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:918)
org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.handlers.read_zip_file.handle_specifically(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/handlers/read_zip_file.rb:27)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.handlers.base.handle(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/handlers/base.rb:26)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.read_zip_file(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:39)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.process_active(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:102)
org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.process_active(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:88)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.process_all_states(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:45)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.watch.iterate_on_state(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/watch.rb:68)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.watch.subscribe(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/watch.rb:45)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.observing_read.subscribe(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/observing_read.rb:12)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.logstash.inputs.file.run(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/logstash/inputs/file.rb:363)
usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:345)
RUBY.start_input(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:336)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)
java.lang.Thread.run(java/lang/Thread.java:834)
[2021-07-09T15:40:41,094][INFO ][filewatch.observingread ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] QUIT - closing all files and shutting down.
[2021-07-09T15:40:41,095][INFO ][filewatch.observingread ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] START, creating Discoverer, Watch with file and sincedb collections
[2021-07-09T15:40:41,098][ERROR][logstash.javapipeline ][cpe_b2b][25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:cpe_b2b
Plugin: <LogStash::Inputs::File mode=>"read", path=>["/data/logstash/cpe/b2b-tacacs-*"], add_field=>{"[event][module]"=>"cpe", "[event][dataset]"=>"tacacs", "[@metadata][idx]"=>"CPE_B2B", "[@metadata][vendor]"=>"tac_plus"}, id=>"25989e194c2b3b6fdaf9b77853519b22edf519002a5e5db6c31ac71b71740e5f", sincedb_path=>"/dev/null", file_sort_by=>"path", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_6b387e0b-12f8-4cfd-8ae3-7c5186a1c13c", enable_metric=>true, charset=>"UTF-8">, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, start_position=>"end", delimiter=>"\n", close_older=>3600.0, file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_direction=>"asc", exit_after_read=>false, check_archive_validity=>false>
Error:
Exception: Java::JavaIo::EOFException
Stack: java.util.zip.GZIPInputStream.readUByte(java/util/zip/GZIPInputStream.java:269)
java.util.zip.GZIPInputStream.readUShort(java/util/zip/GZIPInputStream.java:259)
java.util.zip.GZIPInputStream.readHeader(java/util/zip/GZIPInputStream.java:165)
java.util.zip.GZIPInputStream.<init>(java/util/zip/GZIPInputStream.java:80)
java.util.zip.GZIPInputStream.<init>(java/util/zip/GZIPInputStream.java:92)
jdk.internal.reflect.GeneratedConstructorAccessor91.newInstance(jdk/internal/reflect/GeneratedConstructorAccessor91)
jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(jdk/internal/reflect/DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:490)
org.jruby.javasupport.JavaConstructor.newInstanceDirect(org/jruby/javasupport/JavaConstructor.java:285)
org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:918)
org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.handlers.read_zip_file.handle_specifically(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/handlers/read_zip_file.rb:27)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.handlers.base.handle(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/handlers/base.rb:26)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.read_zip_file(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:39)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.process_active(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:102)
org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.process_active(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:88)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.read_mode.processor.process_all_states(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/read_mode/processor.rb:45)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.watch.iterate_on_state(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/watch.rb:68)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.watch.subscribe(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/watch.rb:45)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.filewatch.observing_read.subscribe(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/filewatch/observing_read.rb:12)
usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_file_minus_4_dot_1_dot_18.lib.logstash.inputs.file.run(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.18/lib/logstash/inputs/file.rb:363)
usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:345)
RUBY.start_input(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:336)
org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)
java.lang.Thread.run(java/lang/Thread.java:834)
```

The version is 7.8.0
