Logstash base64 encode/decode from JDBC into Elasticsearch does not produce the same results

Hi, I am trying to ingest BLOB data into Elasticsearch using the jdbc input plugin.
I use a base64 filter to encode the content before ingesting it into Elasticsearch, and to decode it when reading back.
Both Logstash configurations (Oracle -> Logstash -> Elasticsearch, and Elasticsearch -> Logstash -> file) apparently run without errors, but the binary content retrieved is different from the content that was ingested.

I am not an expert on the ELK stack, so I am probably doing something wrong.
Any feedback is highly appreciated.


Ingest config:
input {
  jdbc {
    jdbc_driver_library => "/mnt/data/ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@server:1521/XE"
    jdbc_user => ""
    jdbc_password => ""
    statement => "SELECT content from MYTABLE"
    # Declare BLOB fields as binary, to avoid text conversions
    columns_charset => {
      "content" => "BINARY"
    }
  }
}

filter {
  base64 {
    field => "content"
    action => "encode"
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch"
    index => "demo"
  }
}
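
For reference, one alternative I am considering is doing the encoding in a ruby filter instead of the base64 plugin, in case the plugin trips over the string's character encoding. This is only an untested sketch that forces the BLOB bytes to binary before encoding:

filter {
  ruby {
    init => "require 'base64'"
    code => "
      raw = event.get('content')
      # Treat the BLOB as raw bytes, not text, before Base64-encoding it
      event.set('content', Base64.strict_encode64(raw.to_s.force_encoding('BINARY')))
    "
  }
}

A symmetric Base64.strict_decode64 call would replace the base64 filter on the export side.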


Export config:
input {
  elasticsearch {
    hosts => "elasticsearch"
    index => "demo"
    query => '{ "query": { "query_string": { "query": "*" } } }'
    size => 500
    scroll => "5m"
    docinfo => true
  }
}

filter {
  base64 {
    field => "content"
    # Decode back to the original bytes
    action => "decode"
  }
}

output {
  file {
    codec => line { format => "%{content}" }
    path => "/mnt/data/output/%{[@metadata][_id]}"
  }
}
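
To narrow down where the bytes diverge, I was also thinking of hashing the field on both sides with the fingerprint filter and comparing the results: once on the raw content before encoding at ingest time, and once on the decoded content at export time. A minimal sketch (assuming the content field holds the bytes at that point in the pipeline; content_sha256 is just a name I chose):

filter {
  fingerprint {
    source => "content"
    target => "content_sha256"
    method => "SHA256"
  }
}

If the two hashes already differ before the file output is involved, the corruption would be happening in the jdbc input or the encode step rather than in the file codec.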
