hey team,
I have the following Logstash config file:
input {
  jdbc {
    jdbc_driver_library => "C:\Program Files\sqljdbc_6.2\enu\mssql-jdbc-6.2.2.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://xxxxxx.xxxxx..xx\Strongmail:1433"
    jdbc_user => "xxx"
    jdbc_password => "xxxxxxxxxx"
    sql_log_level => "debug"
    lowercase_column_names => false
    schedule => "*/1 * * * *"
    use_column_value => true
    tracking_column => "recordId"
    last_run_metadata_path => "logstash_jdbc_last_run_SM_SUCCESS_LOG"
    statement => "exec strongmail.dbo.CRM_EMAIL_DELIVERY_LOGS_GET :sql_last_value, null"
  }
}
output {
  stdout { codec => json }
  #file {
  #  path => "f:/logstash_output_logs/test1.log"
  #}
  #stdout { codec => rubydebug }
}
When the SQL results get printed to standard output, the field ordering no longer matches the column ordering of the SELECT in the stored procedure. How do we make the output of the jdbc input keep the same column ordering as the actual stored proc in SQL Server? Basically, we want the timestamp to be the first field.
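For illustration (these field names are made up, not our real schema): if the stored proc returns its columns in the order timestamp, recordId, status, the json codec currently emits something like

{"recordId":123,"status":"SENT","timestamp":"2023-01-01T00:00:00Z","@version":"1"}

whereas what we want is the timestamp first, matching the proc:

{"timestamp":"2023-01-01T00:00:00Z","recordId":123,"status":"SENT","@version":"1"}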
thanks