Hello,
I am looking for some pointers on how to convert input data from the JDBC plugin into JSON, so I can send the content to an API.
Here is the logstash.conf:
input {
  jdbc {
    jdbc_driver_library => "/logstash-core/lib/jars/mssql-jdbc-11.2.0.jre11.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://servert:1433;databaseName=db;encrypt=false;trustServerCertificate=true;"
    jdbc_user => "admin"
    jdbc_password => "pass"
    statement => "SELECT * FROM Metadata WHERE CreatedDate > :sql_last_value" # Comparison based on the latest value it ingested
    schedule => "* * * * *" # Polling is just a cron job
    use_column_value => true # Use one of the row's columns as sql_last_value
    tracking_column => "createddate" # Column to track; must be lowercase regardless of your column casing (otherwise the last run time is used by default, which is not accurate enough)
    tracking_column_type => "timestamp" # Type of the tracking column
    type => "db-logs-access"
    last_run_metadata_path => "/usr/share/logstash/temporary/lastvalues_persistence/state_sql_last_value.yml" # Guarantees persistence of sql_last_value across service restarts
    target => "message"
  }
  stdin {
    codec => "json"
  }
}
filter {
  json {
    source => "message"
    target => "parsed"
  }
  http {
    url => "http://myapi.com/api/enrichment?site=siteA"
    verb => "POST"
    body => "%{[message]}"
    body_format => "json"
    target_body => "api_response"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["url/"]
    index => "xyz_index"
    document_id => "%{[itemid]}"
    user => "elastic"
    password => "pass"
  }
}
When I try to parse the input as JSON, the format is indeed different:
[WARN ] 2023-01-11 15:25:02.887 [[pipeline_1]>worker0] json - Error parsing json {:source=>"message", :raw=>{"docname"=>"THE UNITED KINGDOM", "title"=>nil"....}
Is there anything to do with the JDBC plugin in order to have the data as JSON?
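In case it helps, here is a rough workaround I have been considering (just a sketch: message_json is a field name I made up, and I am not certain LogStash::Json is the right way to serialize inside a ruby filter). Since [message] already seems to hold the row as a structured hash rather than a JSON string, the idea is to serialize it myself and post the resulting string:

filter {
  ruby {
    # [message] holds the DB row as a hash; dump it to a JSON string
    code => 'event.set("message_json", LogStash::Json.dump(event.get("message")))'
  }
  http {
    url => "http://myapi.com/api/enrichment?site=siteA"
    verb => "POST"
    body => "%{[message_json]}"
    body_format => "text"
    headers => { "Content-Type" => "application/json" }
    target_body => "api_response"
  }
}

If that is the right direction, I suppose the json filter could be dropped entirely, since the data coming from jdbc never was a JSON string in the first place.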
Thanks,
BR,
L