How can I keep field names case sensitive using the Logstash JDBC input?

Hi,

I want the field names to stay exactly the same when importing with Logstash, instead of the default lowercase conversion.

For example, a 'FirstName' column in SQL should be imported into Elasticsearch as 'FirstName', not 'firstname'.

I don't think Logstash is doing any downcasing of field names. Please show your configuration files.

input {
  jdbc {
    jdbc_driver_library => "/opt/logstash-2.3.2/SQLServerDriver/sqljdbc4-3.0.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://xxxxx"
    jdbc_user => "xxxxxxxx"
    jdbc_password => "xxxxxxx"
    jdbc_page_size => "50000"
    statement_filepath => "sqlqueries/xxx.sql"
  }
}
filter {

}
output {
  elasticsearch {
    hosts => ""
    index => "xxx"
    document_type => "xxxx"
    document_id => "%{serial}"
    #protocol => "http"
  }
  stdout {
    codec => rubydebug
  }
}

I was able to resolve this by adding the parameter below to the jdbc input:

lowercase_column_names => false
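For reference, here is a minimal sketch of the jdbc input from the configuration above with that option added; the connection details are still placeholders, and lowercase_column_names simply tells the JDBC input not to downcase the column names it gets back from SQL Server:

input {
  jdbc {
    jdbc_driver_library => "/opt/logstash-2.3.2/SQLServerDriver/sqljdbc4-3.0.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://xxxxx"
    jdbc_user => "xxxxxxxx"
    jdbc_password => "xxxxxxx"
    jdbc_page_size => "50000"
    statement_filepath => "sqlqueries/xxx.sql"
    # keep column names exactly as returned by the database, e.g. FirstName
    lowercase_column_names => false
  }
}

With this in place, a column like FirstName comes through as the field FirstName instead of firstname.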

Thanks
