ORDER BY clause invalid in views, inline functions (Logstash JDBC)

Hi Team,

I am using the config file below to index documents from MS SQL Server into Elasticsearch via Logstash. I am using an ORDER BY clause in the SQL statement, but it throws an error (screenshot attached for reference).
How can I order the result set in ascending order? The row with the highest date value has to be indexed last.

input {
  jdbc {
    jdbc_driver_library => "D:\Users\mmurugesan\Desktop\driver\mssql-jdbc-7.4.1.jre12-shaded.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://EC2AMAZ-J90JR4A\SQLEXPRESS:1433;databaseName=xxxx;"
    jdbc_user => "xxxx"
    jdbc_password => "xxxx"
    jdbc_paging_enabled => true
    tracking_column => "modified_date"
    use_column_value => true
    clean_run => true
    tracking_column_type => "timestamp"
    schedule => "*/1 * * * *"

    statement => "SELECT pl.Policyholdername, pl.Age, pl.Dob, pl.Client_Address,
                         cl.claimnumber, cl.claimtype, cl.is_active, cl.Modified_date
                  FROM claim cl
                  INNER JOIN Policy pl ON pl.Policynumber = cl.policynumber
                  WHERE cl.Modified_date > :sql_last_value
                  ORDER BY cl.Modified_date ASC"

    last_run_metadata_path => "D:\Users\xxxx\Desktop\logstash-7.5.2\jdbc_lastrun\jdbc_last_run.txt"
    jdbc_default_timezone => "UTC"
  }
}
filter {
  if [is_active] {
    mutate {
      add_field => { "[@metadata][elasticsearch_action]" => "index" }
    }
  } else {
    mutate {
      add_field => { "[@metadata][elasticsearch_action]" => "delete" }
    }
  }
  mutate {
    remove_field => [ "is_active", "@version", "@timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => "https://e5a4a4a4de7940d9b12674d62eac9762.eastus2.azure.elastic-cloud.com:1234"
    user => "xxxx"
    password => "xxxx"
    index => "duckcreek"
    action => "%{[@metadata][elasticsearch_action]}"
    document_type => "_doc"
    document_id => "%{claimnumber}"
  }
  stdout { codec => rubydebug }
}

Is there any option to achieve this?

Logstash is getting an exception when it tries to COUNT the number of rows that the query is going to return: with jdbc_paging_enabled, the plugin wraps your statement in a derived table to run a SELECT count(*) over it, and SQL Server does not allow ORDER BY inside a derived table unless TOP or OFFSET/FETCH is also specified. That should cause the plugin to log an error, stop trying to COUNT, and then carry on executing the query.
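One workaround (a hedged sketch, not tested against your schema, and version dependent) is to make the ORDER BY legal inside a derived table by adding an OFFSET clause, which SQL Server accepts:

```sql
-- Sketch: ORDER BY ... OFFSET 0 ROWS is accepted by SQL Server even
-- when the paging code wraps this statement in a derived table for
-- its COUNT. Column and table names are copied from the question.
SELECT pl.Policyholdername, pl.Age, pl.Dob, pl.Client_Address,
       cl.claimnumber, cl.claimtype, cl.is_active, cl.Modified_date
FROM claim cl
INNER JOIN Policy pl ON pl.Policynumber = cl.policynumber
WHERE cl.Modified_date > :sql_last_value
ORDER BY cl.Modified_date ASC
OFFSET 0 ROWS
```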

That said, Logstash generally does not preserve the order of events. If you need to maintain order, set pipeline.workers to 1 and (for now) disable the java_execution engine.
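In logstash.yml that would look something like this (a sketch for Logstash 7.x; these setting names are assumed from that version):

```yaml
# logstash.yml (Logstash 7.x) - sketch, assuming these setting names
pipeline.workers: 1            # single worker thread so event order is preserved
pipeline.java_execution: false # fall back to the Ruby execution engine
```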

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.