Write base64 decoded field from JSON message to a file

Hi Team,
I have the following Logstash pipeline configuration.

input {
    tcp {
        port => 5102
        codec => json
    }
}
filter {
    json {
        source => "message"
        remove_field => [ "message" ]
    }
    ruby {
        code => '
            b = event.get("buffer")
            n = event.get("node")
            if b and n
                event.set("dec_data", Base64.decode64(event.get("data")))
                event.set("file_name", "#{n}/#{b}")
            end
        '
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
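The decode step in the ruby filter above can be tried standalone. A minimal sketch in plain Ruby (outside Logstash, with made-up field values):

```ruby
require "base64"

# Standalone sketch of the ruby filter's logic, using made-up values.
event = {
  "node"   => "node0",
  "buffer" => "my_daemon/dbg",
  "data"   => Base64.strict_encode64("hello")
}

if event["buffer"] && event["node"]
  dec_data  = Base64.decode64(event["data"])
  file_name = "#{event['node']}/#{event['buffer']}"
  puts dec_data   # prints "hello"
  puts file_name  # prints "node0/my_daemon/dbg"
end
```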

With the above config, I am able to decode the base64 data from the JSON message, but I am not able to write the decoded data to the file whose name is set by the "event.set" call above. Can someone suggest how to get this working?

Thanks,
Arinjay

@Arinjay_Jain if you want to write the parsed log to a file, shouldn't you be defining a file output in Logstash? Refer to the File output plugin page in the Logstash Reference [8.6].

If you would like to write to a file:

output {
    file {
        path => "/path/file_%{+YYYY-MM-dd}.txt"
    }
}

Thanks @Ayush_Mathur @Rios for the reply. I have the following JSON data coming from the remote host.

{
      "@version" => "1",
          "data" => "ZGIAAKgAAAALAAAAAAAAAAAAAAAvZGV2L3N5c2RiAAC7AAAAJgAAAAAAAAAAAAAAL2Rldi90bXBuYW1lL3N5c2RiLWVkbS9qaWRfNDI0L2NoYW5fMQAAALwAAAAmAAAAAAAAAAAAAAAvZGV2L3RtcG5hbWUvc3lzZGItZWRtL2ppZF80MjQvY2hhbl8xAAAAvQAAADkAAAAAAAAAAAAAAC9lZG0vbm9kZTBfUlAwX0NQVTAvZGV2L3RtcG5hbWUvc3lzZGItZWRtL2ppZF80MjQvY2hhbl8xAAAAAL4AAAA5AAAAAAAAAAAAAAAvZWRtL25vZGUwX1JQMF9DUFUwL2Rldi90bXBuYW1lL3N5c2RiLWVkbS9qaWRfNDI0L2NoYW5fMQAAAADGAAAACwAAAAAAAAAAAAAAL2Rldi9zeXNkYgAA2gAAAAsAAAAAAAAAAAAAAC9kZXYvc3lzZGIAAO0AAAAmAAAAAAAAAAAAAAAvZGV2L3RtcG5hbWUvc3lzZGItZWRtL2ppZF80MjQvY2hhbl8xAAAA7gAAADkAAAAAAAAAAAAAAC9lZG0vbm9kZTBfUlAwX0NQVTAvZGV2=",
        "buffer" => "my_daemon/dbg",
      "dec_data" => "db\x00\x00\xA8\x00\x00\x00\v\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xBB\x00\x00\x00&\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xBC\x00\x00\x00&\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1C\x00\x00\x00\f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x004\x00\x00\x00\x05\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00NULL\x00\x00\x00\x00c\x00\x00\x00\v\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x00\x00\x00\v\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00",
     "file_name" => "node0/my_daemon/dbg",
    "@timestamp" => 2023-02-02T22:17:43.165252311Z,
          "node" => "node0"
}
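The `\x00` escapes in "dec_data" above are raw bytes: `Base64.decode64` returns a binary (ASCII-8BIT) string, shown here with the first few characters of the "data" field:

```ruby
require "base64"

# "ZGIAAKg=" is the first few base64 characters of the "data" field above.
decoded = Base64.decode64("ZGIAAKg=")
p decoded           # prints "db\x00\x00\xA8"
p decoded.encoding  # binary (ASCII-8BIT) encoding
```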

I added the following file block to the output section of my Logstash pipeline.

    file {
        path => "%{node}/%{file_name}"
        codec => line
    }

With the above change, I can see a file getting created, i.e. "node0/node0/my_daemon/dbg". But I want to write the data from the event field "dec_data" into this file, which is not happening; the file is empty.

Can you suggest how to achieve this?

Thanks,
Arinjay

I think this will work:

    file {
        codec => line { format => "%{[dec_data]}" }
        path => "/%{node}/%{file_name}" # or you can hardcode the path: "/node0/%{file_name}"
        flush_interval => 5
        write_behavior => "append"
    }
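Without a `format` pointing at "dec_data", the line codec has nothing to emit per event, which is why the file stayed empty. As a rough standalone picture of what this output does per event (a sketch, not the plugin's actual code; `write_event` is a hypothetical helper):

```ruby
require "base64"
require "fileutils"
require "tmpdir"

# Hypothetical helper sketching the file output's behavior: take the
# event's "dec_data" (what codec => line { format => "%{[dec_data]}" }
# selects) and append it to a path built from event fields.
def write_event(event, base_dir)
  path = File.join(base_dir, event["node"], event["file_name"])
  FileUtils.mkdir_p(File.dirname(path))
  # "ab" mirrors write_behavior => "append"; binary mode keeps raw bytes intact
  File.open(path, "ab") { |f| f.write(event["dec_data"]) }
  path
end

Dir.mktmpdir do |dir|
  event = { "node"      => "node0",
            "file_name" => "node0/my_daemon/dbg",
            "dec_data"  => Base64.decode64("ZGIAAKg=") }
  path = write_event(event, dir)
  puts File.binread(path).bytes.inspect  # the five decoded bytes
end
```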

Thanks @Rios for the suggestion. I am able to see the decoded base64 data being written to the file specified by the path attribute.

Thanks,
Arinjay


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.