Problems outputting files to a shared drive with Logstash on Windows

Hello all:

This is my first post in this forum; I hope I'm not breaking any guidelines.

We have Logstash 7.11.1 installed on Windows Server 2016, ingesting log files via Filebeat.
For output, the Elasticsearch and file plugins are used in parallel.

Our Elasticsearch output works properly, and our file output also works when writing to a local drive folder (C:, for instance).
However, we want to write the files to a DFS shared folder on the same network. This shared folder has been mapped on the machine where Logstash is running as a persistent drive via net use (drive O:).

The problem we've found is that, when using a path on this O:\ drive with either \ or / separators, Logstash throws (different) errors that cause the pipeline to crash and stop:

  1. When using the \ path format, the full path provided is interpreted as a relative path inside the bin folder of the Logstash installation, as shown here (also available at https://pastebin.com/YYMuSz8k):
[2021-11-15T19:13:31,963][INFO ][logstash.outputs.file    ][my_pipeline][66c0492c0128f174f5eab55f7acd00d076d057a5d20f3ed0315da8c56791bc48] Opening file {:path=>"C:/logstash-7.11.1/bin/O:\temp/myfile.txt"}
[2021-11-15T19:13:31,979][INFO ][logstash.outputs.file    ][my_pipeline][66c0492c0128f174f5eab55f7acd00d076d057a5d20f3ed0315da8c56791bc48] Creating directory {:directory=>"C:/logstash-7.11.1/bin/O:\temp"}
[2021-11-15T19:13:31,979][ERROR][logstash.javapipeline    ][my_pipeline] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"my_pipeline", :error=>"(SystemCallError) Unknown error (SystemCallError) - Unknown Error (20109) - C:\\logstash-7.11.1\\bin\\O:\temp", :exception=>Java::OrgJrubyExceptions::SystemCallError, :backtrace=>["org.jruby.RubyDir.mkdir(org/jruby/RubyDir.java:632)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.fu_mkdir(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:235)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.mkdir_p(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:213)", "org.jruby.RubyArray.reverse_each(org/jruby/RubyArray.java:1891)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.mkdir_p(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:211)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.mkdir_p(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:196)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.open(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:264)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:119)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1415)", 
"C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:118)", "org.jruby.ext.thread.Mutex.synchronize(org/jruby/ext/thread/Mutex.java:164)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:117)", "C_3a_.logstash_minus_7_dot_11_dot_1.logstash_minus_core.lib.logstash.outputs.base.multi_receive(C:/logstash-7.11.1/logstash-core/lib/logstash/outputs/base.rb:103)", "org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:143)", "org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121)", "C_3a_.logstash_minus_7_dot_11_dot_1.logstash_minus_core.lib.logstash.java_pipeline.start_workers(C:/logstash-7.11.1/logstash-core/lib/logstash/java_pipeline.rb:295)"], :thread=>"#<Thread:0xd81e74a run>"}
  2. When using the / path format, an error like the following is thrown, with a "No such process" message (also available at https://pastebin.com/q1wE301n):
Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"my_pipeline", :error=>"(ESRCH) No such process - O:\\temp", :exception=>Java::OrgJrubyExceptions::SystemCallError, :backtrace=>["org.jruby.RubyDir.mkdir(org/jruby/RubyDir.java:632)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.fu_mkdir(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:235)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.mkdir_p(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:213)", "org.jruby.RubyArray.reverse_each(org/jruby/RubyArray.java:1891)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.mkdir_p(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:211)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)", "uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.fileutils.mkdir_p(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:196)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.open(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:264)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:119)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1415)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:118)", 
"org.jruby.ext.thread.Mutex.synchronize(org/jruby/ext/thread/Mutex.java:164)", "C_3a_.logstash_minus_7_dot_11_dot_1.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(C:/logstash-7.11.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:117)", "C_3a_.logstash_minus_7_dot_11_dot_1.logstash_minus_core.lib.logstash.outputs.base.multi_receive(C:/logstash-7.11.1/logstash-core/lib/logstash/outputs/base.rb:103)", "org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:143)", "org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121)", "C_3a_.logstash_minus_7_dot_11_dot_1.logstash_minus_core.lib.logstash.java_pipeline.start_workers(C:/logstash-7.11.1/logstash-core/lib/logstash/java_pipeline.rb:295)"], :thread=>"#<Thread:0x4373d35c sleep>"}

Here's a simplified example of my pipeline (also available at https://pastebin.com/WgG72fan):

input { pipeline { address => my_main_pipeline } }
filter {
  mutate {
    copy => { "[log][file][path]" => "log_file_path" }
  }
}
output {
  elasticsearch {
    index => "i-my_index-%{[fields][project]}-%{[fields][instance]}-%{[fields][tenant]}-%{[fields][context]}-%{[fields][subcontext]}-%{+yyyy.MM.dd}"
    cloud_id => "${MY_ELASTIC_CLOUD_ID}"
    cloud_auth => "${MY_ELASTIC_CLOUD_AUTH}"
  }
  if [fields][subcontext] == "my_metrics" {
    file {
      path => "<PATH TO USE. DESIRABLY, I'D USE [log][file][path] HERE, BUT THE PATH ISSUE IS MAKING IT IMPOSSIBLE FOR ME>"
      codec => line { format => "{\"Timestamp\": \"%{Timestamp}\", \"Timezone\": \"%{Timezone}\", \"Body\": %{Body}}" }
    }
  }
}

Is there any known limitation or issue regarding Logstash and shared folders on Windows systems?
Could it be a problem with the formatting of the path separators being used?
Do you know of any way to solve this problem?

Thank you very much in advance, and sorry if I missed any forum guidelines.

Best regards,
Roberto Rodríguez.

Creating directory {:directory=>"C:/logstash-7.11.1/bin/O:\temp"}

There are places where a backslash in a path is treated as an escape character. Try using O:\\temp
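A minimal sketch of that suggestion in a file output block (the myfile.txt name is only an example; the doubled backslashes keep the config parser from treating them as escapes):

```
output {
  file {
    # Doubled backslashes so the backslash is not interpreted as an escape
    path => "O:\\temp\\myfile.txt"
  }
}
```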

Hello Badger,
Thanks for your quick reply.
I tried that; in fact, the [log][file][path] field that Filebeat's processing creates for each event contains the path of the original file (which could suit my final path too) with every single backslash escaped, and this was the path I tried to use at first.
Seeing that this was not working, I tried shorter and simpler paths (always escaping the backslashes), with the aforementioned result.
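For illustration, rewriting the separators in that copied field with mutate's gsub would look roughly like this (a sketch only, reusing the log_file_path field from my pipeline above; I'm not certain this sidesteps the drive-letter issue):

```
filter {
  mutate {
    # Replace every backslash in the copied path field with a forward slash
    gsub => [ "log_file_path", "\\\\", "/" ]
  }
}
```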

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.