Not able to filter certain Ganglia gmond output using Logstash


#1

The objective of this exercise is to visualise Ganglia gmond output in Kibana. I am trying to filter out certain lines from the gmond output through Logstash.

If I listen to the UDP port in Logstash and apply the filter, no messages are filtered. However, the filter itself is correct: when I use a Logstash-generated text file of the gmond output as input, instead of listening to gmond on UDP directly, the messages are filtered as expected.

To work around this, I have used two conf files:

  1. Logstash listens to gmond on a UDP port and writes the results to a file.
  2. Logstash uses the generated file as input, applies the filters, and sends the output to Elasticsearch. The configs below should make this clearer.

Conf1

input {
  ganglia {
    port => "8686"
    type => "ganglia"
  }
}
output {
  file {
    path => "/tmp/elastic_gmond-log.txt"
  }
}

Conf2

input {
  file {
    path => "/tmp/elastic_gmond-log.txt"
  }
}
filter {
  # basically filtering out all the string-valued metrics so that I can visualise the remaining float values in Kibana
  if ([message] =~ /machine_type/ or [message] =~ /os_name/ or [message] =~ /location/ or [message] =~ /os_release/) {
    drop {}
  }
}
output {
  file {
    path => "/tmp/gmond-log.txt"
  }
  elasticsearch {
    cluster => "elasticsearch"
    port => 9300
    index => "ganglialog-%{+YYYY.MM.dd}"
  }
}

However, Logstash dies as soon as I restart gmond:

Logstash startup completed
Exception in thread ">output" java.lang.UnsupportedOperationException
at java.lang.Thread.stop(Thread.java:869)
at org.jruby.RubyThread.exceptionRaised(RubyThread.java:1221)
at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:112)
at java.lang.Thread.run(Thread.java:745)

The command used is
./logstash --config /etc/logstash/conf.d/

How do I get this working? Is there an easier/better alternative?


(Mark Walkom) #2

The problem here is that LS will merge both configs, so you will end up with 3 outputs.
You need to add a tag, and then a conditional around the filter and the second set of outputs.
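If the two files are kept, that suggestion could look something like the sketch below (the tag names `raw` and `fromfile` are illustrative, not from the thread): tag events in each input, then wrap the filter and the second set of outputs in conditionals so the merged configuration keeps the two pipelines apart.

```
# conf1: tag events coming from the ganglia input
input {
  ganglia {
    port => "8686"
    type => "ganglia"
    tags => ["raw"]
  }
}
output {
  if "raw" in [tags] {
    file { path => "/tmp/elastic_gmond-log.txt" }
  }
}

# conf2: only handle events read back from the intermediate file
input {
  file {
    path => "/tmp/elastic_gmond-log.txt"
    tags => ["fromfile"]
  }
}
filter {
  if "fromfile" in [tags] and ([message] =~ /machine_type/ or [message] =~ /os_name/ or [message] =~ /location/ or [message] =~ /os_release/) {
    drop {}
  }
}
output {
  if "fromfile" in [tags] {
    file { path => "/tmp/gmond-log.txt" }
    elasticsearch {
      cluster => "elasticsearch"
      port => 9300
      index => "ganglialog-%{+YYYY.MM.dd}"
    }
  }
}
```

`tags` is a common option available on every input, and `"x" in [tags]` is standard Logstash conditional syntax, so each event only passes through the branch it was tagged for.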


#3

Apologies!!
I am a newbie to this stuff.
Can you help with how this can be achieved?


(Mark Walkom) #4

This is basically what you want.

input {
  ganglia { 
    port => "8686"
    type => "ganglia"
  }
}

filter {
  if ([message] =~ /machine_type/ or [message] =~ /os_name/ or [message] =~ /location/ or [message] =~ /os_release/ ) {
    drop{}
  }
}

output {
  file {
    path => "/tmp/gmond-log.txt"
  }
  elasticsearch {
    cluster => "elasticsearch"
    port => 9300
    index => "ganglialog-%{+YYYY.MM.dd}"
  }
}

#5

As mentioned before, the issue with this kind of configuration is that the filter does not work when I am reading messages directly from the UDP port.
That is why I am using a two-step process: create an output file while listening on the UDP port, then apply the filters on the generated file, the output of which is then sent to Elasticsearch.

It seems the filters are not working with the ganglia UDP input. How can this be overcome?


(Mark Walkom) #6

Try removing the quotes from the port?
Otherwise see if it works with a udp input?


#7

Removing the quotes on the port does not change anything.
When I use a udp input instead of the ganglia input, the messages are not formatted and I can't make sense of the output:

\u0000\u0000\u0000\u0000\u0004%.1f\u0000\u0000\u0000\u0000","@version":"1","@timestamp":"2015-09-08T08:35:41.350Z","type":"ganglia","host":"10.40.94.157"}
{"message":"\u0000\u0000\u0000\x86\u0000\u0000\u0000\u000FE8a7-IBMHS23-13\u0000\u0000\u0000\u0000\bload_one\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0004%.2f>\u0005\u001E\xB8","@version":"1","@timestamp":"2015-09-08T08:35:41.351Z","type":"ganglia","host":"10.40.94.157"}
{"m


#8

Just to add: the messages on the stdout output appear as

2015-09-08T08:39:10.423Z 10.40.94.157 %{message}
2015-09-08T08:39:10.426Z 10.40.94.157 %{message}
2015-09-08T08:39:10.432Z 10.40.94.157 %{message}
2015-09-08T08:39:10.434Z 10.40.94.157 %{message}
2015-09-08T08:39:10.435Z 10.40.94.157 %{message}
2015-09-08T08:39:10.436Z 10.40.94.157 %{message}
2015-09-08T08:39:10.438Z 10.40.94.157 %{message}

However, when I view them in the log file, the %{message} is decoded:

"@version":"1","@timestamp":"2015-09-04T11:36:18.494Z","log_host":"E8a7-IBMHS23-13","metric":"mem_total","value":2.64503584E8,"dmax":0,"tmax":1200,"slope":"z ero","type":"float","units":"KB","host":"10.40.94.157"}
{"@version":"1","@timestamp":"2015-09-04T11:36:18.498Z","log_host":"E8a7-IBMHS23-13","metric":"swap_total","value":4194296.0,"dmax":0,"tmax":1200,"slope":"zer o","type":"float","units":"KB","host":"10.40.94.157"}


(Mark Walkom) #9

OK, let's try working with the consolidated config. Run that with LS and --debug, and see what gets returned.
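A debug run can be captured to a file for sharing, for example (the config path is the one used earlier in the thread; the log file name is illustrative):

```shell
# run Logstash with the merged config directory and debug logging enabled,
# redirecting both stdout and stderr into a file that can be uploaded
./logstash --config /etc/logstash/conf.d/ --debug > /tmp/logstash-debug.log 2>&1
```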


#10

How can I share the debug information?
I am not able to paste it here.


(Christian Dahlqvist) #11

Upload the debug information somewhere, e.g. in a gist, and provide a link to it here.


#12

Thanks Christian

Here is the debug information

