Oracle BLOB column with jdbc input to base64

When fetching an Oracle BLOB column with Logstash's jdbc input, I try to convert it to base64 using the following filter:

ruby {
    code => 'event.set("b64", Base64.strict_encode64(event.get("blob_column")))'
}

which yields valid base64. However, when decoding the b64 string, the content differs from the BLOB content. It seems that an encoding conversion happens somewhere along the way, because bytes that can be represented as printable ASCII remain unchanged, while all other bytes are replaced by the Unicode replacement character U+FFFD (�) or a literal \u0000.
Is there a way to keep the column value as the original byte sequence, without any further encoding, so that I can create a base64 string from it?
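
To illustrate the effect, here is a plain-Ruby sketch of that kind of lossy conversion (the byte values are made up, not my actual BLOB content):

require "base64"

raw = "ABC\xFF\xD8".b                      # binary string (ASCII-8BIT)
# Force the bytes through a UTF-8 conversion with replacement,
# which is what seems to happen to the column value somewhere:
mangled = raw.encode("UTF-8", "BINARY", invalid: :replace, undef: :replace)

puts mangled                               # => "ABC��" - printable ASCII survives, the rest is replaced
puts Base64.strict_encode64(raw)           # base64 of the original bytes
puts Base64.strict_encode64(mangled)       # valid base64, but of different bytes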

Are you setting the charset or columns_charset options on the jdbc input?

I already tried

columns_charset => {
    "blob_column" => "BINARY"
}

but the behavior is the same.
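
For context, this is roughly how the input is configured; the connection details and the query are placeholders, not my real settings:

input {
  jdbc {
    jdbc_driver_library => "/path/to/ojdbc8.jar"                  # placeholder
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//db:1521/ORCL"  # placeholder
    jdbc_user => "user"                                           # placeholder
    jdbc_password => "password"                                   # placeholder
    statement => "SELECT id, blob_column FROM my_table"           # placeholder
    columns_charset => {
      "blob_column" => "BINARY"
    }
  }
}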

I found a solution that suits my needs. This applies to Logstash running in Docker, image logstash:7.13.1.

I modified the following Ruby file
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.0.7/lib/logstash/plugin_mixins/jdbc/jdbc.rb
so that it creates a base64 string from any Sequel::SQL::Blob value.

***************
*** 256,271 ****
--- 256,273 ----
      private
      def decorate_value(value)
        case value
        when Time
          # transform it to LogStash::Timestamp as required by LS
          LogStash::Timestamp.new(value)
        when Date, DateTime
          LogStash::Timestamp.new(value.to_time)
+       when Sequel::SQL::Blob
+         Base64.strict_encode64(value.to_s)
        else
          value
        end
      end
    end
  end end end

With this change, the ruby filter in the pipeline configuration is no longer needed.
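
As a quick sanity check (run in a plain JRuby console with the sequel gem on the load path; the byte values are made up), the new branch round-trips the raw bytes:

require "base64"
require "sequel"

# Simulate what the patched decorate_value receives and returns:
blob = Sequel.blob("\x00\xFF\xD8ABC".b)   # made-up BLOB bytes
b64  = Base64.strict_encode64(blob.to_s)  # what the patched branch returns

# Decoding yields exactly the original byte sequence again:
raise "mismatch" unless Base64.strict_decode64(b64) == blob.to_s
puts b64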
