Hi, below is my config. I am trying to index data from MSSQL into Elasticsearch using Logstash, but the field order is getting changed after the data is indexed into Elasticsearch.
input {
  jdbc {
    jdbc_driver_library => "D:\Users\mmurugesan\Desktop\driver\mssql-jdbc-7.4.1.jre12-shaded.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://EC2AMAZ-J90JR4A\SQLEXPRESS:1433;databaseName=xxxx;"
    jdbc_user => "xxx"
    jdbc_password => "xxxx"
    jdbc_paging_enabled => true
    # Track the last processed Modified_date so only new/changed rows are fetched
    tracking_column => "modified_date"
    use_column_value => true
    clean_run => true
    tracking_column_type => "timestamp"
    schedule => "*/1 * * * *"
    statement => "Select pl.Policyholdername, pl.Age, pl.Dob, pl.Client_Address, cl.claimnumber, cl.claimtype, cl.is_active, cl.Modified_date
                  from claim cl
                  inner join Policy pl on pl.Policynumber = cl.policynumber
                  where cl.Modified_date > :sql_last_value"
    last_run_metadata_path => "D:\Users\mmurugesan\Desktop\logstash-7.5.2\jdbc_lastrun\jdbc_last_run.txt"
    jdbc_default_timezone => "UTC"
  }
}
filter {
  # Route each event based on is_active: index active claims, delete inactive ones
  if [is_active] {
    mutate {
      add_field => { "[@metadata][elasticsearch_action]" => "index" }
    }
  } else {
    mutate {
      add_field => { "[@metadata][elasticsearch_action]" => "delete" }
    }
  }
  # Drop helper fields that should not end up in the index
  mutate {
    remove_field => [ "is_active", "@version", "@timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => "xxx"
    user => "xxx"
    password => "xxx"
    index => "duckcreek"
    # Use the action decided in the filter block: index or delete
    action => "%{[@metadata][elasticsearch_action]}"
    document_type => "_doc"
    document_id => "%{claimnumber}"
  }
  stdout { codec => rubydebug }
}
A screenshot is attached for reference.
How can I keep the field order the same as in the SELECT statement of my config? Also, how do I change the date format using Logstash? For the date question, is something like the sketch below the right approach?
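This is only a sketch: the field name Dob and the source pattern yyyy-MM-dd are assumptions for illustration, not taken from my actual data.

filter {
  # Parse the Dob string into a Logstash timestamp, writing it back into Dob
  date {
    match => [ "Dob", "yyyy-MM-dd" ]
    target => "Dob"
  }
  # Reformat the parsed timestamp as a plain string, e.g. dd/MM/yyyy
  # (assumes the date filter above parsed successfully, so Dob is a timestamp here)
  ruby {
    code => "event.set('Dob', event.get('Dob').time.strftime('%d/%m/%Y'))"
  }
}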
Could someone help me solve these two queries? Thanks in advance.