Rubydebug output is not displayed

Hello,

I am running Logstash on a Windows server and have enabled rubydebug to see how the data is being processed, but I don't see anything being logged to the console or to the rubydebug file. I have tried both the stdout and file options, as shown below, without any luck.

Any thoughts on fixing this are appreciated. Thanks in advance.

output {
  # Uncomment the file output below if you wish to view
  # event data in a debug file. Specify the path for the file.
  stdout {
    codec => rubydebug
  }
  #file {
  #  codec => rubydebug
  #  path => "C:\logstash2.2.1\log\ruby-debug.log"
  #}
}

Sam

rubydebug only works on stdout. It does not work with the file output.
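
If you also want a copy of the events on disk for inspection, a codec the file output does handle, such as json_lines, should do the job. A minimal sketch, reusing the path from your snippet:

output {
  stdout {
    codec => rubydebug
  }
  # Writes one JSON document per line; readable in any text editor.
  file {
    path => "C:\logstash2.2.1\log\ruby-debug.log"
    codec => json_lines
  }
}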

As for why you're not seeing it at the command-line, how is it being launched?

I am running Logstash from the command prompt; below is the command I am using.

logstash agent --verbose -f C:\logstash2.2.1\logstash\plugins\logstash\config\logstash-scala_rubydebug.conf -l C:\logstash2.2.1\log\console.log

For stdout to work with the rubydebug codec, remove the -l C:\logstash2.2.1\log\console.log portion. It needs to log to the console.

Yes, I have already tried that; I only see the regex patterns being logged in the console.

Below is an excerpt of the output on the console.

C:\logstash2.2.1\bin>logstash agent --verbose -f C:\logstash2.2.1\logstash\plugins\logstash\config\logstash-scala_rubydebug.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 1
Registering file input {:path=>["C:\Windows\System32\winevt\Logs\ForwardedEvents.evtx"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"C:\Users\itmuser/.sincedb_d1c5aedc0be3c7fc50ce39bb2e81ca62", :path=>["C:\Windows\System32\winevt\Logs\ForwardedEvents.evtx"], :level=>:info}
Grok patterns path {:patterns_dir=>["C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns", "C:/logstash2.2.1/patterns/"], :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/aws", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/bacula", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/bro", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/exim", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/firewalls", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/grok-patterns", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/haproxy", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/java", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/junos", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/linux-syslog", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/mcollective", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/mcollective-patterns", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/mongodb", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/nagios", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/postgresql", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/rails", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/redis", :level=>:info}
Grok loading patterns from file {:path=>"C:/logstash2.2.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-2.0.2/patterns/ruby", :level=>:info}
Match data {:match=>{"TimeGenerated"=>"%{DATA:TIMEGEN_DATE} %{DATA:TIMEGEN_TIME} %{ISO8601_TIMEZONE:TZ}", "message"=>[]}, :level=>:info}
Grok compile {:field=>"TimeGenerated", :patterns=>["%{DATA:TIMEGEN_DATE} %{DATA:TIMEGEN_TIME} %{ISO8601_TIMEZONE:TZ}"], :level=>:info}
Adding pattern {"S3_REQUEST_LINE"=>"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})", :level=>:info}
Adding pattern {"S3_ACCESS_LOG"=>"%{WORD:owner} %{NOTSPACE:bucket} \[%{HTTPDATE:timestamp}\] %{IP:clientip} %{NOTSPACE:requester} %{NOTSPACE:request_id} %{NOTSPACE:operation} %{NOTSPACE:key} (?:"%{S3_REQUEST_LINE}"|-) (?:%{INT:response:int}|-) (?:-|%{NOTSPACE:error_code}) (?:%{INT:bytes:int}|-) (?:%{INT:object_size:int}|-) (?:%{INT:request_time_ms:int}|-) (?:%{INT:turnaround_time_ms:int}|-) (?:%{QS:referrer}|-) (?:"?%{QS:agent}"?|-) (?:-|%{NOTSPACE:version_id})", :level=>:info}
Adding pattern {"ELB_URIPATHPARAM"=>"%{URIPATH:path}(?:%{URIPARAM:params})?", :level=>:info}
Adding pattern {"ELB_URI"=>"%{URIPROTO:proto}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST:urihost})?(?:%{ELB_URIPATHPARAM})?", :level=>:info}
Adding pattern {"ELB_REQUEST_LINE"=>"(?:%{WORD:verb} %{ELB_URI:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})", :level=>:info}
Adding pattern {"ELB_ACCESS_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb} %{IP:clientip}:%{INT:clientport:int} (?:(%{IP:backendip}:?:%{INT:backendport:int})|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{INT:response:int} %{INT:backend_response:int} %{INT:received_bytes:int} %{INT:bytes:int} "%{ELB_REQUEST_LINE}"", :level=>:info}
Adding pattern {"BACULA_TIMESTAMP"=>"%{MONTHDAY}-%{MONTH} %{HOUR}:%{MINUTE}", :level=>:info}
Adding pattern {"BACULA_HOST"=>"[a-zA-Z0-9-]+", :level=>:info}
Adding pattern {"BACULA_VOLUME"=>"%{USER}", :level=>:info}
Adding pattern {"BACULA_DEVICE"=>"%{USER}", :level=>:info}
Adding pattern {"BACULA_DEVICEPATH"=>"%{UNIXPATH}", :level=>:info}
Adding pattern {"BACULA_CAPACITY"=>"%{INT}{1,3}(,%{INT}{3})*", :level=>:info}
Adding pattern {"BACULA_VERSION"=>"%{USER}", :level=>:info}
Adding pattern {"BACULA_JOB"=>"%{USER}", :level=>:info}
Adding pattern {"BACULA_LOG_MAX_CAPACITY"=>"User defined maximum volume capacity %{BACULA_CAPACITY} exceeded on device \"%{BACULA_DEVICE:device}\" \(%{BACULA_DEVICEPATH}\)", :level=>:info}
Adding pattern {"BACULA_LOG_END_VOLUME"=>"End of medium on Volume \"%{BACULA_VOLUME:volume}\" Bytes=%{BACULA_CAPACITY} Blocks=%{BACULA_CAPACITY} at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}.", :level=>:info}
Adding pattern {"BACULA_LOG_NEW_VOLUME"=>"Created new Volume \"%{BACULA_VOLUME:volume}\" in catalog.", :level=>:info}
Adding pattern {"BACULA_LOG_NEW_LABEL"=>"Labeled new Volume \"%{BACULA_VOLUME:volume}\" on device \"%{BACULA_DEVICE:device}\" \(%{BACULA_DEVICEPATH}\).", :level=>:info}

You shouldn't need the agent argument; also remove --verbose for this test.

I still don't see anything being logged to the console. Below is the output.

C:\logstash2.2.1\bin>logstash -f C:\logstash2.2.1\logstash\plugins\logstash\config\logstash-scala_rubydebug.conf
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 1
Logstash startup completed

I am AFK at the moment. There was something about how Windows launched Java that caused this, but I don't remember offhand. Is there a reason you're not using a more recent version, like 5.6.1?

I am using Logstash with an IBM log management product, so I have to use the custom output plugin developed for that product. I had to go with the officially supported Logstash version, which is 2.2.1, and I am not sure whether the plugin supports the latest version of Logstash.

Do you have any thoughts on using Filebeat or Winlogbeat with custom plugins to divert the output to a different tool?
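
For context, on the Logstash side I am picturing just swapping the file input for a beats input; a rough sketch (the port is a placeholder):

input {
  beats {
    port => 5044
  }
}

Winlogbeat would then be configured to ship the Forwarded Events channel to that host and port.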

Hi,

As you suggested, I am trying Logstash 5.6.2. Now I see a different issue: it is unable to locate JRuby.

C:\logstash-5.6.2\bin>logstash.bat --configtest -f C:\logstash2.2.1\logstash\plugins\logstash\config\logstash-scala_rubydebug.conf
"could not find jruby in C:\logstash-5.6.2\vendor\jruby"

Do we have to set any environment variables to point to the JRuby that ships with Logstash? Any help is appreciated.

Thanks.

Have you tried running setup.bat in the same directory? Is JAVA_HOME properly set?
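
One quick sanity check, assuming the archive might not have extracted completely, is to confirm the bundled JRuby is actually on disk:

C:\logstash-5.6.2\bin>dir ..\vendor\jruby\bin

If that directory is missing or empty, re-extracting the Logstash zip would be the first thing to try.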

Yes, I just tried setup.bat, and it again complains about JRuby.

C:\logstash-5.6.2\bin>setup.bat
"could not find jruby in C:\logstash-5.6.2\vendor\jruby"

Below is the JAVA_HOME path.

C:\logstash-5.6.2\bin>echo %JAVA_HOME%
C:\logstash2.2.1\eclipseDevelopmentPackage\ibm_sdk80

That JVM is not a supported one. I'm sorry this is not straightforward. :frowning: I'm not sure what to say at this point. It seems a very custom installation was created to support the IBM JDK and your specific plugin. Perhaps the people who created it know more?
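
If you want to rule the JVM out, you could point JAVA_HOME at a supported Oracle JDK or OpenJDK 8 for the current command prompt session before retrying logstash.bat; a sketch, with a hypothetical install path:

C:\logstash-5.6.2\bin>set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_144
C:\logstash-5.6.2\bin>set PATH=%JAVA_HOME%\bin;%PATH%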

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.