input {
  kafka {
    # Pass a value from the kafka input to the jdbc input plugin
  }
  jdbc {
  }
}
No, that is not possible. What is it you are looking to achieve?
I want to fetch a message (and an index value, the primary key of a table) from a Kafka queue and pass it to the jdbc input so that I can query the table and send the data to Elasticsearch.
How many records/documents would each query return? If you are simply looking up a value based on the key coming from Kafka, you may be able to use the jdbc_streaming plugin for this.
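A minimal, untested sketch of that layout, assuming a MySQL source (the broker address, driver path, connection details, topic, and field names are placeholders you would need to adapt):

input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topics => ["my_topic"]                  # placeholder topic
    codec => "json"
  }
}
filter {
  jdbc_streaming {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"   # placeholder driver path
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # Look up the row whose primary key arrived in the Kafka message
    statement => "SELECT * FROM my_table WHERE id = :key"
    parameters => { "key" => "primary_key" }   # 'primary_key' is the event field holding the key
    target => "db_row"                         # enrichment result lands under this field
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
  }
}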
Only one key at a time.
You mean I can use Kafka as the input plugin, use the jdbc_streaming plugin in the filter section to take the value from Kafka and run a query, and then use Elasticsearch in the output section, right?
input {
  kafka {}
}
filter {
  jdbc_streaming {}
}
output {
  elasticsearch {}
}
Is this right? Can you share an example if you have any?
Something like that. I do not have a complete working example, though.
But is this possible to implement?
@Christian_Dahlqvist, is it not possible to configure a TCP output for the first 'stage' and then a TCP input for the second 'stage'?
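A rough sketch of what I mean, assuming two pipelines on the same host (the host and port are placeholders):

# First stage: pipeline ending in a TCP output
output {
  tcp {
    host => "localhost"   # placeholder; the host running the second stage
    port => 5044          # placeholder port
    mode => "client"
  }
}

# Second stage: pipeline starting with a TCP input
input {
  tcp {
    port => 5044          # must match the first stage's port
    mode => "server"
  }
}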
@patrick007 I think it should be possible, but I do not know anything about your data or queries.
@wwalker I am not sure I understand what you are suggesting. Could you please add some more details on why TCP inputs would be useful here?
@Christian_Dahlqvist
Thanks for your input.
Used the wrong emoticon at first.
Input from a Kafka topic, enrich, output to another Kafka topic... rinse, lather, repeat.
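A minimal sketch of that pattern, assuming the enrichment step is the jdbc_streaming lookup discussed above (broker address and topic names are placeholders):

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["raw_events"]          # placeholder source topic
  }
}
filter {
  # enrichment step, e.g. the jdbc_streaming lookup sketched earlier
}
output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "enriched_events"     # placeholder destination topic
  }
}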
@sm00thindian I did not follow. Can you explain more, please?