Can Logstash's JDBC input plugin run multiple SQL tasks?

I currently have a JDBC task that reads from the database like this:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://${MYSQL_MAIN_HOST}/${MYSQL_DATABASE}"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_page_size => 10000
    jdbc_paging_enabled => true
    jdbc_password => "${MYSQL_PASSWORD}"
    jdbc_user => "${MYSQL_USER}"
    schedule => "0 1 * * *"
    statement_filepath => "/usr/share/logstash/pipeline/sql/select_posts.sql"
    tracking_column => "updated_at"
    tracking_column_type => "numeric"
    use_column_value => true
    last_run_metadata_path => "/usr/share/logstash/jdbc_last_run/select_posts_last_value"
  }
}


FROM posts

This task gets heavy when the body column holds very large data, so I want to drop body from the first query:

FROM posts

Then take the IDs from that result and use them to fetch body again:

FROM posts
WHERE id in (IDs)

and send that to the output target. Even updating the output target afterwards would be okay.

So can Logstash run the statements one after another in this case?
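Spelled out, the split I have in mind looks roughly like this (column names other than id and body are placeholders for my real columns):

```sql
-- Pass 1: everything except the heavy body column
SELECT id, title, updated_at FROM posts;

-- Pass 2: body only, restricted to the IDs collected in pass 1
SELECT id, body FROM posts WHERE id IN ( /* IDs from pass 1 */ );
```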

Does that mean you want to query the data incrementally, or just get all the data in one go?
I see that your SQL statement queries all the data in one go; with some changes to the query you could fetch only the "delta" since the latest value.

I want to fetch all the data except body in one go, then fetch body again using the IDs from that first result.

Searching with body costs a lot of time in the whole process, so I want to split it into two statements.

You could use a jdbc input to SELECT id, title FROM posts, then a jdbc_streaming filter to fetch the body for each event (i.e. each row of the database). Hard to see how that would actually be cheaper, although it would distribute the work more.
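A minimal sketch of that approach, reusing the connection settings from the original input (the target field name `body_lookup` is my choice; jdbc_streaming writes the query result into the target field as an array of row hashes, so the actual body text would need to be copied out of it):

```
filter {
  jdbc_streaming {
    jdbc_connection_string => "jdbc:mysql://${MYSQL_MAIN_HOST}/${MYSQL_DATABASE}"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_user => "${MYSQL_USER}"
    jdbc_password => "${MYSQL_PASSWORD}"
    # Look up the heavy column one row at a time, keyed by the event's id field
    statement => "SELECT body FROM posts WHERE id = :id"
    parameters => { "id" => "id" }
    # The result arrives as an array of row hashes under this field
    target => "body_lookup"
  }
}
```

Note this issues one query per event, which is why it may well end up slower than the single big SELECT.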


I tested it. It became even slower.

As per your statement:

In the end you get all the data, not just the id but also the body, right? Or wrong?

I can get all the data, including id, title, and body, with @Badger's method.

I'm looking at jdbc_static now to see if it performs better.
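For reference, a rough jdbc_static sketch under the same connection assumptions (the local table name, column types, and sizes are guesses and would need adjusting): it bulk-loads id/body into a local lookup table on a schedule, then enriches each event locally instead of querying MySQL per event. The trade-off is that the whole body column has to fit in the local store, which may be prohibitive for very large bodies.

```
filter {
  jdbc_static {
    jdbc_connection_string => "jdbc:mysql://${MYSQL_MAIN_HOST}/${MYSQL_DATABASE}"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_user => "${MYSQL_USER}"
    jdbc_password => "${MYSQL_PASSWORD}"
    # Bulk-load the heavy column once per schedule run
    loaders => [
      {
        id => "load_bodies"
        query => "SELECT id, body FROM posts"
        local_table => "bodies"
      }
    ]
    # Definition of the local lookup table (types/sizes are placeholders)
    local_db_objects => [
      {
        name => "bodies"
        index_columns => ["id"]
        columns => [
          ["id", "varchar(64)"],
          ["body", "varchar(65535)"]
        ]
      }
    ]
    # Per-event enrichment hits the local table, not MySQL
    local_lookups => [
      {
        query => "SELECT body FROM bodies WHERE id = :id"
        parameters => { id => "[id]" }
        target => "body_lookup"
      }
    ]
    loader_schedule => "0 1 * * *"
  }
}
```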

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.