which yields valid base64. However, when decoding the base64 string, the content differs from the BLOB content. It seems that an encoding conversion happens somewhere along the way, because bytes that can be represented as printable ASCII remain unchanged, while others are replaced by the replacement character U+FFFD � or a "zero literal" \u0000.
Is there a way to keep the column value as the original byte sequence, without any further encoding, in order to create a base64 string from it?
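For what it's worth, the symptom described above matches what happens when raw bytes are forced through a UTF-8 conversion before encoding. A minimal sketch (not from the original post; plain Ruby, no Logstash involved) contrasting the lossy conversion with encoding the raw bytes directly:

```ruby
require "base64"

# A string containing every possible byte value, treated as raw binary.
blob = (0..255).map(&:chr).join.force_encoding(Encoding::BINARY)

# Lossy path: converting binary data to UTF-8 replaces every byte that is
# not valid ASCII with the replacement character U+FFFD.
mangled = blob.encode("UTF-8", invalid: :replace, undef: :replace)

# Direct path: base64 operates on the raw bytes, so decoding round-trips.
encoded = Base64.strict_encode64(blob)
decoded = Base64.strict_decode64(encoded)

puts decoded == blob          # the original byte sequence survives
puts mangled.bytesize == 256  # false: replacement characters changed the size
```

This is why only printable ASCII survives the conversion: those bytes are valid in both encodings, while everything else gets substituted before base64 ever sees it.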
I found a solution that suits my needs. This applies to Logstash in Docker, image logstash:7.13.1.
I modified the following Ruby file /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.7/lib/logstash/plugin_mixins/jdbc/jdbc.rb
so that it creates a base64 string from any Sequel::SQL::Blob value.
***************
*** 256,271 ****
--- 256,273 ----
private
def decorate_value(value)
case value
when Time
# transform it to LogStash::Timestamp as required by LS
LogStash::Timestamp.new(value)
when Date, DateTime
LogStash::Timestamp.new(value.to_time)
+ when Sequel::SQL::Blob
+ Base64.strict_encode64(value.to_s)
else
value
end
end
end
end
end
end
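Outside Logstash, the effect of the added `when Sequel::SQL::Blob` branch can be sketched in plain Ruby. Sequel::SQL::Blob subclasses String and carries the raw bytes, so `to_s` hands the unmodified byte sequence to the base64 encoder; this sketch stubs a minimal Blob class rather than requiring the sequel gem:

```ruby
require "base64"

# Minimal stand-in so the sketch runs without the sequel gem or Logstash;
# in the real plugin this class comes from Sequel.
module Sequel; module SQL; class Blob < String; end; end; end

# Reduced version of the patched decorate_value: BLOBs become base64
# strings, everything else passes through untouched.
def decorate_value(value)
  case value
  when Sequel::SQL::Blob
    Base64.strict_encode64(value.to_s)
  else
    value
  end
end

raw = Sequel::SQL::Blob.new("\x00\xFF\x10binary".b)
encoded = decorate_value(raw)

puts Base64.strict_decode64(encoded) == raw  # round-trips byte-for-byte
puts decorate_value(42)                      # non-BLOB values are unchanged
```

Because the encoding happens inside the plugin before the event is built, no charset conversion is ever applied to the BLOB bytes.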
With this change, the filter in the pipeline's configuration is no longer needed.