I am trying to execute a workflow in PNDA using Logstash.
The config file has inline Ruby code, and I understand we need to have the ruby filter plugin.
Can you please tell me how to install it?
Are you sure you don't already have the ruby plugin? The page below describes how to list the plugins you have as well as how to install new ones.
https://www.elastic.co/guide/en/logstash/current/working-with-plugins.html
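In short, both steps are handled by the logstash-plugin tool that ships with Logstash (paths below assume you run it from the Logstash install directory):

```
# List the plugins currently installed
bin/logstash-plugin list

# Install the ruby filter plugin if it is missing
bin/logstash-plugin install logstash-filter-ruby
```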
Thanks Magnus for your kind help. I was able to install it successfully with bin/logstash-plugin install logstash-filter-ruby, and it is listed too.
However, I get this exception when I try to run Logstash. Please find the config file below.
[2018-03-26T12:56:29,980][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 14, column 18 (byte 392) after filter {\n mutate {\n add_field => {\n "host_ip" => "localhost" # change the IP and set something meaningful\n }\n rename => { "message" => "rawdata" } # Put the content of the message to the PNDA Avro 'rawdata' field\n ruby ", :backtrace=>["/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/opt/pnda/logstash-6.2.3/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/opt/pnda/logsta
input {
  tcp {
    port => 20518
    add_field => [ "src", "syslog" ]
  }
}
filter {
  mutate {
    add_field => {
      "host_ip" => "localhost" # change the IP and set something meaningful
    }
    rename => { "message" => "rawdata" } # Put the content of the message to the PNDA Avro 'rawdata' field
    ruby {
      # Convert the Logstash timestamp to a milliseconds timestamp integer
      # You can use whatever makes sense to you as long as the timestamp is a valid timestamp integer in milliseconds
      code => "event.set('timestamp', (event.get('@timestamp').to_f * 1000).to_i)"
    }
  }
}
output {
  kafka {
    codec => pnda-avro { schema_uri => "/opt/pnda/pnda.avsc" } # The PNDA Avro schema to use
    bootstrap_servers => "localhost:9092"
    topic_id => 'test'
    compression_type => "none" # "none", "gzip", "snappy", "lz4"
    value_serializer => 'org.apache.kafka.common.serialization.ByteArraySerializer'
  }
}
The ruby filter doesn't belong inside the mutate filter.
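For anyone hitting the same ConfigurationError: moving the ruby block out of mutate so it sits directly under filter should let the pipeline compile. A sketch of the corrected filter section (the host_ip value is a placeholder you should replace):

```
filter {
  mutate {
    add_field => {
      "host_ip" => "localhost" # placeholder: set something meaningful
    }
    rename => { "message" => "rawdata" }
  }
  ruby {
    # Convert the Logstash @timestamp to a milliseconds timestamp integer
    code => "event.set('timestamp', (event.get('@timestamp').to_f * 1000).to_i)"
  }
}
```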