Logstash with Docker (Unable to connect to database)

I've already changed the driver and checked the connection string for possible errors, but everything appears to be fine. I am trying to connect remotely to a SQL Server database through JDBC from Logstash running in Docker, and I'm getting the following error (I hid the real IP):

Unable to connect to database. Tried 3 times {:message=>"Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host IP_HIDDEN, port 1433 has failed. Error: \"Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.\".", :exception=>Sequel::DatabaseConnectionError, :cause=>#<Java::ComMicrosoftSqlserverJdbc::SQLServerException: The TCP/IP connection to the host IP_HIDDEN, port 1433 has failed. Error: "Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".>

Here my JDBC Config:

jdbc {
    tags => ["usuario"]
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_driver_library => "/usr/share/logstash/plugins/connectors-database/sqlserver/mssql-jdbc-9.2.1.jre8.jar"
    jdbc_connection_string => "jdbc:sqlserver://IP_HIDDEN:1433;databaseName=DATABASE;integratedSecurity=false;"
    jdbc_user => "user"
    jdbc_password => "password"
    sql_log_level => "debug"
    connection_retry_attempts => 3
    statement => "select * from usuario"
}

If I connect through SQL Server Management Studio with the same IP and credentials, it works correctly, but with JDBC it doesn't. Does anyone know what it could be?

Information that may be useful: the SQL Server instance is hosted on Azure.

Here are both connectors I've tried; each gives the same error.

[screenshot of the JDBC driver jars tried]

I have a similar problem, but mine occurs in the output:

output {
    jdbc {
        jdbc_connection_string => "jdbc:postgresql://10.0.00.00:5432/dwanaics"
        jdbc_driver_class => "org.postgresql.Driver"
        jdbc_driver_library => "/usr/share/logstash/vendor/jar/jdbc/postgresql-42.2.24.jar"
        statement => "INSERT INTO tms.nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)"
        jdbc_user => "postgres"
        jdbc_password => "D*r"
    }
}

It displays the following errors:
[2024-04-09T15:12:45,430][ERROR][logstash.outputs.jdbc ] Unknown setting 'jdbc_user' for jdbc
[2024-04-09T15:12:45,431][ERROR][logstash.outputs.jdbc ] Unknown setting 'jdbc_password' for jdbc
[2024-04-09T15:12:45,431][ERROR][logstash.outputs.jdbc ] Unknown setting 'jdbc_driver_library' for jdbc
[2024-04-09T15:12:45,431][ERROR][logstash.outputs.jdbc ] Unknown setting 'jdbc_connection_string' for jdbc
[2024-04-09T15:12:45,431][ERROR][logstash.outputs.jdbc ] Unknown setting 'jdbc_driver_class' for jdbc

The access credentials are correct.

The jdbc output is a third-party plugin. None of those options exist for it; they all have other names. See here.
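For illustration, the equivalent output config would look something like the sketch below. The option names (connection_string, driver_class, driver_jar_path, the array form of statement) are from memory of the logstash-output-jdbc README, so verify them against the plugin's documentation before relying on this:

```
output {
    jdbc {
        # logstash-output-jdbc uses different option names than the jdbc input;
        # credentials can go in the connection string itself.
        connection_string => "jdbc:postgresql://10.0.00.00:5432/dwanaics?user=postgres&password=D*r"
        driver_class => "org.postgresql.Driver"
        driver_jar_path => "/usr/share/logstash/vendor/jar/jdbc/postgresql-42.2.24.jar"
        # Array form: the SQL with ? placeholders, followed by the event
        # fields to bind to them, in order.
        statement => [ "INSERT INTO tms.nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)", "oid_conhecimento", "oid_nota_fiscal", "dt_gravacao", "usuario_gravacao" ]
    }
}
```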

Because your config is in output. It should be input:

input {
    jdbc {
        jdbc_connection_string => "jdbc:postgresql://10.0.00.00:5432/dwanaics"
        jdbc_driver_class => "org.postgresql.Driver"
        jdbc_driver_library => "/usr/share/logstash/vendor/jar/jdbc/postgresql-42.2.24.jar"
        statement => "INSERT INTO tms.nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)"
        jdbc_user => "postgres"
        jdbc_password => "D*r"
    }
}

My goal is to extract data from one PostgreSQL database and insert it into another PostgreSQL database.

I will post the complete code:

input {
    jdbc {
        jdbc_connection_string => "jdbc:postgresql://db.consulta.interno.transpofrete.com.br:7432/transpofrete"
        jdbc_user => "anals"
        jdbc_password => "HbBXEt"
        jdbc_driver_library => "/usr/share/logstash/vendor/jar/jdbc/postgresql-42.2.24.jar"
        jdbc_driver_class => "org.postgresql.Driver"
        schedule => "* * * * *"
        statement => "SELECT oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao FROM nota_fiscal_conhecimento"
        tracking_column => "oid_conhecimento"
        use_column_value => true
        tracking_column_type => "numeric"
        last_run_metadata_path => "/usr/share/logstash/.logstash_jdbc_last_run"
    }
}

output {
    jdbc {
        connection_string => "jdbc:postgresql://10.2.10.00:5432/transpofrete_tms?user=postgres&password=Linc2077"
        driver_class => "org.postgresql.Driver"
        statement => "INSERT INTO nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)"
    }
}

Now it shows the following error:

[2024-04-09T18:02:00,083][INFO ][logstash.inputs.jdbc ][main][34d2bcf272f4a1d0e9ce3d20017dd0d109ecf12eb90850ef891e5d9d19e77e6f] (0.017555s) SELECT oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao FROM nota_fiscal_conhecimento where oid_nota_fiscal = '489041'
[2024-04-09T18:02:00,218][ERROR][logstash.outputs.jdbc ][main][77220514e6e06f7e998cc005a4cd891671f42e01e6872d299d45f3d26b0c2250] JDBC - Exception. Not retrying {:exception=>#<Java::OrgPostgresqlUtil::PSQLException: No value specified for parameter 1.>, :statement=>"INSERT INTO nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)", :event=>"{"dt_gravacao":null,"@version":"1","@timestamp":"2024-04-09T18:02:00.087760016Z","usuario_gravacao":null,"oid_conhecimento":366304,"oid_nota_fiscal":489041}"}
[2024-04-09T18:03:00,296][INFO ][logstash.inputs.jdbc ][main][34d2bcf272f4a1d0e9ce3d20017dd0d109ecf12eb90850ef891e5d9d19e77e6f] (0.015347s) SELECT oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao FROM nota_fiscal_conhecimento where oid_nota_fiscal = '489041'
[2024-04-09T18:03:00,418][ERROR][logstash.outputs.jdbc ][main][77220514e6e06f7e998cc005a4cd891671f42e01e6872d299d45f3d26b0c2250] JDBC - Exception. Not retrying {:exception=>#<Java::OrgPostgresqlUtil::PSQLException: No value specified for parameter 1.>, :statement=>"INSERT INTO nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)", :event=>"{"dt_gravacao":null,"@version":"1","@timestamp":"2024-04-09T18:03:00.301779262Z","usuario_gravacao":null,"oid_conhecimento":366304,"oid_nota_fiscal":489041}"}

What should I do to insert the values of the other columns?

However, when trying to insert this data into the other table using the INSERT statement, some of the values are missing or null, which is causing the error.

Can you help me, please? This is the first time I've used Logstash.

You cannot assume that Logstash will fill in the oid_conhecimento column from the [oid_conhecimento] field; you need to tell it. I cannot test it, but perhaps something like:

statement => "INSERT INTO nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (%{oid_conhecimento}, %{oid_nota_fiscal}, %{dt_gravacao}, %{usuario_gravacao})"

input {
    jdbc {
        jdbc_connection_string => "jdbc:postgresql://db.consul.com.br:7432/trane"
        jdbc_user => "anaics"
        jdbc_password => "HXEt"
        jdbc_driver_library => "/usr/share/logstash/vendor/jar/jdbc/postgresql-42.2.24.jar"
        jdbc_driver_class => "org.postgresql.Driver"
        schedule => "* * * * *"
        statement => "SELECT oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao FROM nota_fiscal_conhecimento where oid_nota_fiscal = 489041"
        tracking_column => "oid_conhecimento"
        use_column_value => true
        tracking_column_type => "numeric"
        last_run_metadata_path => "/usr/share/logstash/.logstash_jdbc_last_run"
    }
}

output {
    jdbc {
        connection_string => "jdbc:postgresql://10.2.00.20:5432/transe_tms?user=analytics&password=Hbt"
        driver_class => "org.postgresql.Driver"
        statement => "INSERT INTO nota_fiscal_conhecimento1 (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (%{oid_conhecimento}, %{oid_nota_fiscal}, %{dt_gravacao}, %{usuario_gravacao})"
    }
}

But it appears that the field values are being interpreted incorrectly. The placeholders "%{oid_conhecimento}", "%{oid_nota_fiscal}", "%{dt_gravacao}", and "%{usuario_gravacao}" are not being replaced with the actual event values.

Having taken another look at the documentation, you could enable the unsafe_statement option to make the output sprintf that, but first try:

statement => [ "INSERT INTO nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)", "oid_conhecimento", "oid_nota_fiscal", "dt_gravacao", "usuario_gravacao" ]

You could use "%{oid_conhecimento}" etc in that, but it will inhibit automatic type conversion in the output.
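For completeness, the sprintf variant would look roughly like the sketch below. This assumes, per my reading of the logstash-output-jdbc README, that unsafe_statement => true makes the output run event.sprintf over the statement; it is called "unsafe" because values are interpolated into the SQL text rather than bound as prepared-statement parameters, so quoting and SQL injection become your problem:

```
output {
    jdbc {
        connection_string => "jdbc:postgresql://10.2.10.00:5432/transpofrete_tms?user=postgres&password=Linc2077"
        driver_class => "org.postgresql.Driver"
        # Interpolate %{field} references directly into the SQL (no parameter
        # binding, no automatic type conversion); string/timestamp values
        # must be quoted by hand.
        unsafe_statement => true
        statement => "INSERT INTO nota_fiscal_conhecimento (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (%{oid_conhecimento}, %{oid_nota_fiscal}, '%{dt_gravacao}', '%{usuario_gravacao}')"
    }
}
```

The parameterized array form of statement is the safer default; reach for unsafe_statement only when the SQL itself has to vary per event.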

The error indicates that the "oid_conhecimento" column is of type bigint, but the expression provided is character varying.

That seems likely if you use "%{oid_conhecimento}" etc., but for "oid_conhecimento" the code should be converting that value to an integer (not that it needs converting).

I managed to fix that part; now it's dt_gravacao that has the error.

filter {
    mutate {
        # convert => {
        convert => { "dt_gravacao" => "string" }
        #     "oid_conhecimento" => "integer"
        #     "oid_nota_fiscal" => "integer"
        # }
        remove_field => ["@timestamp", "@version"]
    }
}

output {
    jdbc {
        connection_string => "jdbc:postgresql://10.200.00:5002/tra_tms?user=anacs&password=HXEt"
        statement => [
            "INSERT INTO nota_fiscal_conhecimento1 (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, ?, ?)",
            "oid_conhecimento", "oid_nota_fiscal", "dt_gravacao", "usuario_gravacao"
        ]
    }
}

ERROR: column "dt_gravacao" is of type timestamp without time zone but expression is of type character varying

I solved the problem:

statement => [
    "INSERT INTO nota_fiscal_conhecimento1 (oid_conhecimento, oid_nota_fiscal, dt_gravacao, usuario_gravacao) VALUES (?, ?, TO_TIMESTAMP(?, 'YYYY-MM-DD HH24:MI:SS'), ?)",
    "oid_conhecimento", "oid_nota_fiscal", "dt_gravacao", "usuario_gravacao"
]

Thank you, master, it helped me a lot in finding the solution.
