Hi Team,
I have the following logstash pipeline configuration.
input {
  tcp {
    port  => 5102
    codec => json
  }
}
filter {
  json {
    source       => "message"
    remove_field => [ "message" ]
  }
  ruby {
    # ensure the Base64 module is loaded before the per-event code runs
    init => 'require "base64"'
    code => '
      b = event.get("buffer")
      n = event.get("node")
      if b and n
        event.set("dec_data", Base64.decode64(event.get("data")))
        event.set("file_name", "#{n}/#{b}")
      end
    '
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
With the above config, I am able to decode the base64 data from the JSON message, but I am not able to write the decoded data to the file whose name is set by the "event.set" call above. Can someone suggest how to get this working?
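For reference, the logic inside the ruby filter can be exercised standalone with a plain Hash standing in for the Logstash event (the field names "node", "buffer", and "data" come from the pipeline above; the values here are made up for illustration):

```ruby
require "base64"

# Hypothetical event, mirroring the fields the ruby filter expects.
event = {
  "node"   => "node0",
  "buffer" => "my_daemon/dbg",
  "data"   => Base64.encode64("hello world")
}

b = event["buffer"]
n = event["node"]
if b && n
  event["dec_data"]  = Base64.decode64(event["data"])
  event["file_name"] = "#{n}/#{b}"
end

puts event["dec_data"]   # decoded payload: hello world
puts event["file_name"]  # node0/my_daemon/dbg
```

So the decode itself is fine; the remaining problem is only getting `dec_data` into the output file.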
I added the following file output to the output section of my Logstash pipeline:
file {
  path  => "%{node}/%{file_name}"
  codec => line
}
With the above change, I can see a file being created, i.e. "node0/node0/my_daemon/dbg", but I want to write the contents of the event field "dec_data" into this file, which is not happening: the file is empty.
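The file is empty because the line codec was never told which field to write: without an explicit `format` it does not emit `dec_data` (and `message` was removed in the filter). Setting `format => "%{[dec_data]}"` uses Logstash's sprintf-style field references; a rough standalone sketch of that substitution (the `interpolate` helper and the field values are made up for illustration, not Logstash API):

```ruby
# Hypothetical fields on an event after the ruby filter has run.
fields = { "dec_data" => "hello world", "node" => "node0" }

# Minimal stand-in for Logstash's event.sprintf: replaces %{field}
# or %{[field]} references with the corresponding field value.
def interpolate(format, fields)
  format.gsub(/%\{\[?(\w+)\]?\}/) { fields[Regexp.last_match(1)] }
end

puts interpolate("%{[dec_data]}", fields)  # writes the decoded payload
```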
file {
  codec => line { format => "%{[dec_data]}" }
  path  => "/%{node}/%{file_name}"   # or hardcode the path: "/node0/%{file_name}"
  flush_interval => 5
  write_behavior => "append"
}