Jdbc-streaming and jdbc-input get different numeric results

I'm using the jdbc input to get some sales data from a SQL Server database, and then I run a jdbc_streaming filter to complement the data.

One thing I noticed is that for the NUMERIC(18,4) data type, the jdbc input gives me a comma as the decimal separator, and Logstash shows the value as an array.

e.g.: {"prc_unitar":[4,1826],"@version":"1","val_liqite":[10,1715]}

But when I run another SQL statement in the jdbc_streaming filter, with the same jdbc_* configuration and against database fields of the same data type, I get the values with a dot as the decimal separator.

ex.: {"prc_unitar":21.13,"val_liqite":19.85,"cod_produt":22841}

I suppose this is not the expected behaviour, right?

For my situation I'd like to get a dot as the decimal separator for both the input and the filter, but I don't want to solve it at the SQL level just for the input.
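
For reference, the relevant parts of my pipeline look roughly like this (a simplified sketch; connection details redacted, the full settings are in the debug output below):

input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/vendor/jdbc-drivers/sqljdbc4.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://<ip-address>:<port>;databaseName=<db-name>"
    jdbc_user => "<user>"
    jdbc_password => "<password>"
    jdbc_default_timezone => "America/Maceio"
    schedule => "* * * * *"
    clean_run => true
    statement => "select * from sales"
  }
}

filter {
  jdbc_streaming {
    jdbc_driver_library => "/usr/share/logstash/vendor/jdbc-drivers/sqljdbc4.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://<ip-address>:<port>;databaseName=<db-name>"
    jdbc_user => "<user>"
    jdbc_password => "<password>"
    statement => "select * from sales_itens where sales_id = :id"
    parameters => { "id" => "id" }
    target => "itens"
  }
}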

I enabled debug logging and got this output for each plugin:

I can see some settings that are only available on the jdbc input, like codec and sequel_opts, but I can't see how they relate to the problem.

input-jdbc

[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_library = "/usr/share/logstash/vendor/jdbc-drivers/sqljdbc4.jar"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_driver_class = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_connection_string = "jdbc:sqlserver://<ip-address>:<port>;databaseName=<db-name>"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_user = "<user>"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_password = <password>
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_default_timezone = "America/Maceio"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@schedule = "* * * * *"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@clean_run = true
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@statement = "select * from sales"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@id = "281144412c87c6b9671e42c5a40ca86cbe330b87-1"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@enable_metric = true
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@codec = <LogStash::Codecs::Plain id=>"plain_8614bc66-509d-4a69-ad6d-6efa828e217c", enable_metric=>true, charset=>"UTF-8">
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@add_field = {}
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_paging_enabled = false
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_page_size = 100000
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validate_connection = false
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_validation_timeout = 3600
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@jdbc_pool_timeout = 5
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sequel_opts = {}
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@sql_log_level = "info"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts = 1
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@connection_retry_attempts_wait_time = 0.5
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@parameters = {}
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@last_run_metadata_path = "/usr/share/logstash/.logstash_jdbc_last_run"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@use_column_value = false
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@tracking_column_type = "numeric"
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@record_last_run = true
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@lowercase_column_names = true
[logstash.inputs.jdbc     ] config LogStash::Inputs::Jdbc/@columns_charset = {}

filter-jdbc_streaming

[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_driver_library = "/usr/share/logstash/vendor/jdbc-drivers/sqljdbc4.jar"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_driver_class = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_connection_string = "jdbc:sqlserver://<ip-address>:<port>;databaseName=<db-name>"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_user = "<user>"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_password = <password>
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@statement = "select * from sales_itens where sales_id = :id"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@parameters = {"id"=>"id"}
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@target = "itens"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@id = "281144412c87c6b9671e42c5a40ca86cbe330b87-3"
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@enable_metric = true
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@add_tag = []
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@remove_tag = []
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@add_field = {}
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@remove_field = []
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@periodic_flush = false
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_validate_connection = false
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@jdbc_validation_timeout = 3600
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@default_hash = {}
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@tag_on_failure = ["_jdbcstreamingfailure"]
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@tag_on_default_use = ["_jdbcstreamingdefaultsused"]
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@use_cache = true
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@cache_expiration = 5.0
[logstash.filters.jdbcstreaming] config LogStash::Filters::JdbcStreaming/@cache_size = 500

Hello, I want to ask if you have found a solution to your problem? Thank you.

@nano nothing yet.

I'll try to use a filter like ruby or mutate to fix it, but I think it should really be solved in the plugins.
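
Something along these lines is what I have in mind (just a sketch, assuming the affected fields really arrive as a two-element array of [integer part, fractional digits]; this would lose any leading zeros in the fractional part, so it's a workaround at best):

filter {
  ruby {
    code => '
      ["prc_unitar", "val_liqite"].each do |field|
        value = event.get(field)
        # a comma-decimal NUMERIC(18,4) shows up as a two-element array, e.g. [4, 1826]
        if value.is_a?(Array) && value.length == 2
          event.set(field, "#{value[0]}.#{value[1]}".to_f)
        end
      end
    '
  }
}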

Okay! Because I'm trying to use jdbc_streaming but I get nothing in the output; I only see some data in debug mode.

@nano I suggest you open a new thread with more details of your configuration; maybe I can help.

Already done: Problem with jdbc_streaming. I'd be glad if you could help me solve this.

@endersonmaia

This is a sort-of bug, but I'm not sure which plugin is buggy.

This [4,1826] is a debug representation of the datatype - what do you see in ES?
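
To see the difference, you could temporarily compare the debug representation with what actually gets serialized, e.g.:

output {
  stdout { codec => rubydebug }   # in-memory / debug view of the event
  stdout { codec => json_lines }  # what the value looks like once serialized to JSON
}

If json_lines also prints an array, the problem is in the converted value itself and not just in how the debug log renders it.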

I'm not using ES as the output but RabbitMQ instead, and the JSON output is an array there as well.

The value should be the numeric 4.1826, but the database returns "4,1826" and Logstash understands it as an array of two numbers, 4 and 1826.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.