I am trying to ship data from two different sources: log data from the application server itself using the GELF interface, and JMX metrics data using the JMX input plugin. My question is: is it possible to ship data from different sources to the same Logstash instance? If yes, how can I do that, by creating two separate pipelines or by another method? For now my logstash.conf file looks like this:
input {
  tcp {
    port => 5000
  }
  gelf {
    port => 12201
    use_tcp => true
    type => gelf
  }
}

# Add your filters / logstash plugins configuration here
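For the JMX side, the input would sit alongside the others in the same input block; a minimal sketch, assuming the logstash-input-jmx plugin is installed, where the directory path and polling interval are assumptions:

input {
  jmx {
    path => "/usr/share/logstash/jmx"   # assumed directory holding the JMX JSON connection files
    polling_frequency => 15             # assumed: seconds between polls of the JMX beans
    type => "jmx"
  }
}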
Logstash will read data from those two inputs, send the events from both through the filters, and then on to the outputs. If you want to use the same filters for both sets of events, then using a single pipeline and a conditional output makes sense. If you want different filters for the two sets of events, then using two pipelines may be better.
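For the two-pipeline route, the split is declared in pipelines.yml; a minimal sketch, where the pipeline ids and config paths are assumptions:

- pipeline.id: gelf-logs
  path.config: "/usr/share/logstash/pipeline/gelf.conf"
- pipeline.id: jmx-metrics
  path.config: "/usr/share/logstash/pipeline/jmx.conf"

Each pipeline then gets its own inputs, filters, and outputs, so the two event streams never have to share conditionals.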
Do you think I should use the "type" field to differentiate them and then create an output like the following:
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "%{[type]}-%{+YYYY.MM.dd}"
  }
}
I would not call it type, since Elasticsearch has used that field in the past and is trying to retire it. But you could do that with another field name, such as doctype. The two kinds of events sound different enough that two pipelines is probably the better solution.
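If you do stay with a single pipeline, a minimal sketch of the doctype idea, where the field name, its values, and the jmx path are illustrative assumptions (add_field is a common option available on all inputs):

input {
  gelf {
    port => 12201
    use_tcp => true
    add_field => { "doctype" => "applog" }
  }
  jmx {
    path => "/usr/share/logstash/jmx"   # assumed directory of JMX JSON configs
    add_field => { "doctype" => "jmx" }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "%{[doctype]}-%{+YYYY.MM.dd}"
  }
}

Each event then lands in an index named after its source, without relying on the reserved type field.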