Migrating nested data from MySQL to Elasticsearch

I got the following error when I ran my script.

ilsa@ILSA-LAPTOP-01:/usr/share/logstash$ sudo bin/logstash -f /etc/logstash/conf.d/mysql.conf

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-10-02 17:43:44.099 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-10-02 17:43:44.119 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.8.3"}
[ERROR] 2019-10-02 17:43:53.552 [Converge PipelineAction::Create] ruby - Invalid setting for ruby filter plugin:

filter {
ruby {
# This setting must be a path
# File does not exist or cannot be opened sampleRuby.rb
path => "sampleRuby.rb"
...
}
}
[ERROR] 2019-10-02 17:43:53.562 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:86:in `config_init'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:126:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-ruby-3.1.5/lib/logstash/filters/ruby.rb:48:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:78:in `filter_delegator'", "org/logstash/plugins/PluginFactoryExt.java:248:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:184:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):12:in `initialize'", "org/jruby/RubyKernel.java:1061:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:43:in `block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:96:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:165:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:96:in `exclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:39:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:334:in `block in converge_state'"]}
[INFO ] 2019-10-02 17:43:53.972 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2019-10-02 17:43:58.876 [LogStash::Runner] runner - Logstash shut down.

Here is my config file:


input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://172.17.0.3:3306/pro_fayvo_db"
    jdbc_user => "test"
    jdbc_password => "test"
    jdbc_driver_library => "/home/ilsa/mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "select u.email,u.username,u.created_at,up.text_content from users u join user_posts up ON up.user_id = u.id limit 0, 10"
  }
}

filter {
  ruby {
    path => 'sampleRuby.rb'
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "user_posts"
    document_type => "user_post"
  }
}

Thanks in advance.

I solved the issue by providing the full path to the Ruby script in the filter. It should be as follows:

filter {
  ruby {
    path => '/etc/logstash/conf.d/sampleRuby.rb'
  }
}
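For anyone hitting the same error: the file the `path` option points at must follow the ruby filter's script contract — define `register(params)` (called once at startup) and `filter(event)` (called per event, returning an array of events). The original sampleRuby.rb is not shown in this thread, so here is a hypothetical sketch that nests the post text under a `post` object, assuming the `text_content` column name from the SQL statement above:

```ruby
# Hypothetical sampleRuby.rb — a sketch, not the poster's actual script.
# Logstash calls register(params) once, then filter(event) per event.

def register(params)
  # Receives the filter's script_params hash; unused in this sketch.
end

def filter(event)
  # Move the flat text_content column under a nested [post] object,
  # assuming the column names produced by the jdbc statement.
  event.set('[post][text_content]', event.get('text_content'))
  event.remove('text_content')
  [event] # must return an array of events; an empty array drops the event
end
```

Returning `[event]` (rather than the event itself) is required by the script API; returning an empty array silently drops the event.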

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.