Sequential processing of JDBC inputs into Elasticsearch

Hi, I am new to Logstash and Elasticsearch.
I have several separate JDBC queries that cannot be joined in SQL because they run against different sources.

I want to process them in Logstash in this sequence:
run query A, and if its result is not empty, run query B (which returns field1, field2) and query C (which returns field1, field3, field4).

Then the rows from query B and query C that share the same id (field1) should be combined, so that each document ends up with {field1, field2, field3, field4} and is inserted into Elasticsearch.

How can I do this whole process in a single Logstash config file?

input {
    # Query A
    jdbc {
        # connection settings and statement for query A go here
    }
    # Query B
    jdbc {
        # connection settings and statement for query B go here
    }
    # Query C
    jdbc {
        # connection settings and statement for query C go here
    }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "product"
    document_type => "product"
    document_id => "%{product_number}"
    action => "update"        # needed for doc_as_upsert to take effect
    doc_as_upsert => true
  }
}
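
The closest I can get on my own is something like the sketch below: give each jdbc input a tag so I can tell the events apart, and let the Elasticsearch upsert merge the fields from query B and query C into one document keyed on field1. The connection strings, driver settings, statements, and the query_b / query_c tag names are placeholders I made up for illustration. I left query A out of the sketch because that is exactly the part I don't know how to express: how do I make query B and query C run only when query A returns rows?

input {
  # Query B: returns field1, field2
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://db-b:5432/source_b"  # placeholder
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/jdbc-driver.jar"                 # placeholder
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT field1, field2 FROM table_b"                 # placeholder
    schedule => "*/5 * * * *"
    tags => ["query_b"]
  }
  # Query C: returns field1, field3, field4
  jdbc {
    jdbc_connection_string => "jdbc:mysql://db-c:3306/source_c"       # placeholder
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/other-jdbc-driver.jar"           # placeholder
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT field1, field3, field4 FROM table_c"         # placeholder
    schedule => "*/5 * * * *"
    tags => ["query_c"]
  }
}

output {
  # Each event carries only its own columns; action => "update" together
  # with doc_as_upsert should merge them into a single document per field1.
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "product"
    document_id => "%{field1}"
    action => "update"
    doc_as_upsert => true
  }
}

Does something like this make sense, or is there a better way to handle the dependency on query A?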
