I have 2 .conf files in /etc/logstash/conf.d:
- jdbc.conf connects to my DB2 database to run some SQL queries and push data to Elasticsearch; it works perfectly fine by itself.
- http.conf opens port 31000 listening for JSON POSTs and pushes data to Elasticsearch; it also works perfectly fine by itself.
I've added conditionals to keep the two types of data separate (if [type] == "jdbc_data" and if [type] == "http_data"), and the inputs come from two different sources: one from a DB2 connection, the other from a curl -XPOST of a JSON file.
So, all is great when they run individually, but if I put both files in /etc/logstash/conf.d and the Logstash service pipeline handles them together, the curl -XPOST fails repeatedly; if I retry enough times it eventually goes through. Is there some kind of conflict between having a jdbc input and an http input in the same pipeline?
OBSERVATION: Logstash seems to check the "health" of the connection every 5 seconds; could that be interfering with the curl -XPOST? I need this solution to work consistently. I can't tell teams to just keep posting the JSON until it works.
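For context, everything in /etc/logstash/conf.d gets concatenated into a single pipeline, so both inputs feed the same queue and outputs. One thing I haven't tried yet is splitting them into two isolated pipelines via pipelines.yml (available in Logstash 6+); a sketch, with pipeline IDs I made up:

```yaml
# /etc/logstash/pipelines.yml -- hypothetical pipeline ids
- pipeline.id: jdbc_pipeline
  path.config: "/etc/logstash/conf.d/jdbc.conf"
- pipeline.id: http_pipeline
  path.config: "/etc/logstash/conf.d/http.conf"
```

With separate pipelines, each input gets its own queue and workers, and the type conditionals would no longer be needed.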
jdbc.conf:
input {
  jdbc {
    type => "jdbc_data"
    jdbc_driver_library => "${JDBC_DRIVER_LIB}"
    jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
    jdbc_connection_string => "jdbc:db2://${DB2_HOST}:${DB2_PORT}/${DB2_DATABASE}:securityMechanism=13;"
    jdbc_user => "${DB2_USER}"
    jdbc_password => "${DB2_PASSWORD}"
    statement_filepath => "${JDBC_SQL}"
    schedule => "*/15 * * * * *"
  }
}
output {
  if [type] == "jdbc_data" {
    elasticsearch {
      action => "index"
      hosts => "http://${ES_LOGSTASH_HOST}:${ES_LOGSTASH_PORT}"
      index => "jdbcdata"
      document_id => "%{SERVICE_REQUEST}%{SR_ACTUALFINISH_TIME}"
    }
    stdout { codec => rubydebug }
  }
}
http.conf:
input {
  http {
    port => 31000
    type => "http_data"
  }
}
output {
  if [type] == "http_data" {
    elasticsearch {
      hosts => "http://x.x.x.x:9200"
      action => "index"
      index => "http_data"
      document_id => "%{Date}%{Cluster}"
    }
  }
}
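If it helps, my understanding is that when both files sit in conf.d, Logstash builds one effective pipeline roughly equivalent to this (elided sections are exactly the blocks above):

```
input {
  jdbc { ... }                      # from jdbc.conf
  http { port => 31000 type => "http_data" }  # from http.conf
}
output {
  if [type] == "jdbc_data" { ... }  # from jdbc.conf
  if [type] == "http_data" { ... }  # from http.conf
}
```

So both inputs share the same pipeline workers and queue, which is why I added the type conditionals in the first place.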