Asking for help: does the Logstash s3 output plugin support receiving data from Kafka? Below is my configuration:

input {
  kafka {
    bootstrap_servers => "10.88.14.172:9092,10.88.6.9:9092,10.88.10.166:9092"
    #topics => ["k8s"]
    #group_id => "k8s1"
    topics => ["fluent-log"]
    #topics_pattern => ".*"
    group_id => "k8s2"
    # When metadata is used (decorate_events), the byte-array deserializers below must not be used, or an error is raised
    #key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    #value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    consumer_threads => 4
    # Defaults to false; metadata is only captured when this is set to true
    decorate_events => true
    auto_offset_reset => "latest"
    codec => "json"
  }
}
output {
  s3 {
    access_key_id => ""
    secret_access_key => ""
    region => ""
    bucket => ""
    prefix => "%{[container_name]}/%{+YYYY}/%{+MM}/%{+dd}"
    temporary_directory => "/tmp/logstash_s3"
    encoding => "gzip"
    codec => "plain"
    rotation_strategy => "time"
    time_file => 5
  }
}
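
One note on decorate_events in the input above: the kafka input places the topic, consumer group, partition, and offset under the event's [@metadata][kafka] fields, and @metadata fields are not written out by outputs. A minimal filter sketch, assuming you wanted that metadata visible to the s3 prefix (the kafka_topic field name is a hypothetical choice for this sketch):

filter {
  mutate {
    # Copy the Kafka topic out of @metadata so downstream outputs can see it;
    # "kafka_topic" is an arbitrary field name chosen for illustration
    add_field => { "kafka_topic" => "%{[@metadata][kafka][topic]}" }
  }
}

The prefix could then reference it, e.g. prefix => "%{kafka_topic}/%{+YYYY}/%{+MM}/%{+dd}".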

The kafka input plugin reads events from Kafka and places them on the pipeline's queue. The pipeline workers then pick up those events and process them as instructed, including routing them to output plugins such as the s3 output.

No plugin has knowledge of the other plugins in the pipeline.
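
Put differently, the answer is yes: any input can feed any output, because each stage only sees events on the queue. A minimal sketch for verifying the flow end to end, using the same brokers and topic as above but writing to stdout first (the rubydebug codec is just for inspection; the group_id here is a hypothetical separate group so offsets aren't shared with the real pipeline):

input {
  kafka {
    bootstrap_servers => "10.88.14.172:9092,10.88.6.9:9092,10.88.10.166:9092"
    topics => ["fluent-log"]
    group_id => "k8s2-debug"
    codec => "json"
  }
}
output {
  # Confirm events arrive before swapping this for (or adding) the s3 output
  stdout { codec => rubydebug }
}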
