Ruby pipeline permission denied sending to logstash

Hi,

So I've set up an ELK stack.
It's my first time setting one up, but I've installed Elasticsearch, Kibana and Logstash, and when I go to localhost:5601 it takes me to the Elastic dashboard.
The problem I'm having is when I try to send logs from my vSphere environment.

So I've set up a simple config as below:

input {
  tcp {
    type => "syslog"
    port => 5003
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
  }
}

output {
  elasticsearch {
  }
  if [type] == "syslog" and "_grokparsefailure" in [tags] {
    file { path => "/var/log/failed_syslog_events-%{+YYYY-MM-dd}" }
  }
}

With this config, port 5003 is opened (I can see it in netstat), and if I enter Host : TCP : 5003 in my vSphere syslog settings it reaches Logstash; I can see the connection come in via the Logstash logs. However, it immediately causes the pipeline worker to crash, although the port and pipeline otherwise stay up.

Apologies for the huge error message:

May 10 13:06:46 vmwarelogs logstash[48297]: [2021-05-10T13:06:46,119][ERROR][logstash.javapipeline    ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"(EACCES) Permission denied - /var/log/failed_syslog_events-2021-05-10", :exception=>Java::OrgJrubyExceptions::SystemCallError, :backtrace=>["org.jruby.RubyIO.sysopen(org/jruby/RubyIO.java:1237)", "org.jruby.RubyFile.initialize(org/jruby/RubyFile.java:365)", "org.jruby.RubyIO.new(org/jruby/RubyIO.java:876)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.open(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:276)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:119)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1415)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:118)", "org.jruby.ext.thread.Mutex.synchronize(org/jruby/ext/thread/Mutex.java:164)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_file_minus_4_dot_3_dot_0.lib.logstash.outputs.file.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-file-4.3.0/lib/logstash/outputs/file.rb:117)", "usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:103)", "org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:143)", "org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:295)"], :thread=>"#<Thread:0x53556946 sleep>"}

So I've tried a few different ports and different configurations now, including RDP and other things, but I've been at this for 7 hours, so I'm hoping someone can tell me what I can't see.
For all my other servers, something like Filebeat works a treat, but trying to configure this for the vCenter Server Appliance log files seems to have me beat.

Thank you,
Stan

:error=>"(EACCES) Permission denied - /var/log/failed_syslog_events-2021-05-10"

The output is failing because it cannot open the file it wants to write to. It has nothing to do with the input.

With something like Filebeat it's relatively easy to get set up: ELK stack < remote server using Beats. However, with ESXi I'm having this issue because I'm using the remote syslog feature in the vCenter GUI, so I'm confused as to why it would be writing to different locations at all if I'm pointing them at the same server?

So, for anyone wondering, I had to make the Logstash configuration like the below so it wouldn't crash:

input {
  tcp {
    type => "syslog"
    port => 30100
    tags => ["syslog", "tcp", "vsphere"]
  }
  udp {
    type => "syslog"
    port => 30101
    tags => ["syslog", "udp", "vsphere"]
  }
}

filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{DATA:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    add_field => [ "received_from", "%{host}" ]
  }
  date {
    # parse the timestamp that grok extracted into @timestamp
    match => [ "syslog_timestamp", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstash-vsphere"
  }
}

This lets Kibana receive the documents. However, I'm still working on pulling in useful information; right now all it shows me is how many notifications there have been in the syslogs, rather than things like guest name and alert information etc.

I will update when I've completely figured it out, as this has been a nightmare for me, so hopefully I can help someone else.
