"Failed to execute action" error in Logstash

Hi All,

I have a logstash_sample.conf file which contains:

input {
    jdbc {
        jdbc_connection_string => "jdbc:oracle:thin:@host:port/XE"
        jdbc_user => "username"
        jdbc_password => "password"
        jdbc_driver_library => "/home/tomadm/JatinSandbox/KibanaDbTry/ojdbc7.jar"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
        #jdbc_validate_connection => true
        statement => "select * from metable"
    }
}

output {
    stdout { codec => json_lines }
    elasticsearch {
        index => "batchre"
        hosts => "http://my_ip:port"
        document_type => "Batchdoc"
    }
}

I used the same config with an already existing table and it worked perfectly fine.

Now I am trying to do the same with a different table (metable) that I created myself (it appears in the DB's table list, so I have write permissions). I have also inserted some dummy values.

But this time the Logstash logs show this error:

[2018-10-29T06:10:01,778][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.23.213.99:9201/"}
[2018-10-29T06:10:01,854][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-29T06:10:01,859][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-10-29T06:10:01,877][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-29T06:10:02,209][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"<LogStash::Inputs::Jdbc jdbc_connection_string=>\"jdbc:oracle:thin:@host:port/XE\", jdbc_user=>\"username\", jdbc_password=><password>, jdbc_driver_library=>\"/home/tomadm/JatinSandbox/KibanaDbTry/ojdbc7.jar\", jdbc_driver_class=>\"Java::oracle.jdbc.driver.OracleDriver\", statement=>\"select * from metable\", id=>\"c5910e3e089e1c55280c08602538771152e87dcf5105dcabaa30040841039404\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_28fa4420-990e-46ba-a0cd-f2e594038518\", enable_metric=>true, charset=>\"UTF-8\">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>\"info\", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, last_run_metadata_path=>\"/home/tomadm/.logstash_jdbc_last_run\", use_column_value=>false, tracking_column_type=>\"numeric\", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>", :error=>"can't dup Fixnum", :thread=>"#<Thread:0x2e77a42b run>"}
[2018-10-29T06:10:02,903][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<TypeError: can't dup Fixnum>, :backtrace=>["org/jruby/RubyKernel.java:1882:in `dup'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date/format.rb:838:in `_parse'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/date.rb:1830:in `parse'", "/apps/tomcat/elk/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:87:in `set_value'", "/apps/tomcat/elk/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:36:in `initialize'", "/apps/tomcat/elk/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/value_tracking.rb:29:in `build_last_value_tracker'", "/apps/tomcat/elk/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/inputs/jdbc.rb:216:in `register'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:342:in `register_plugin'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:353:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:353:in `register_plugins'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:500:in `start_inputs'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:394:in `start_workers'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:290:in `run'", "/apps/tomcat/elk/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:250:in `block in start'"], :thread=>"#<Thread:0x2e77a42b run>"}
[2018-10-29T06:10:02,946][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

I did some googling, and it says this is usually caused by a syntax error, but my syntax seems fine.
Moreover, the records I inserted into the table are also wiped from the database after repeated runs of the Logstash config.
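
Looking at the backtrace, the failure seems to come from value_tracking.rb while it reads the last-run state file (last_run_metadata_path => "/home/tomadm/.logstash_jdbc_last_run" in the plugin dump above). For what it's worth, this is the variant of the jdbc input I am considering trying next; clean_run and record_last_run are the option names that appear in the log, but I have not confirmed that the state file is actually the cause:

input {
    jdbc {
        jdbc_connection_string => "jdbc:oracle:thin:@host:port/XE"
        jdbc_user => "username"
        jdbc_password => "password"
        jdbc_driver_library => "/home/tomadm/JatinSandbox/KibanaDbTry/ojdbc7.jar"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
        statement => "select * from metable"
        # Ignore any previously saved last-run value instead of parsing
        # /home/tomadm/.logstash_jdbc_last_run, which is where the backtrace fails
        clean_run => true
        # Alternatively, stop persisting a last-run value at all:
        # record_last_run => false
    }
}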

Please help me create an index. My original goal is to create an index with the schedule parameter so that the data in the Kibana visualization is close to real time.
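
For context, this is roughly the scheduled version of the input I am aiming for; the schedule option takes a cron-like expression (here every minute, which is only an example interval):

input {
    jdbc {
        jdbc_connection_string => "jdbc:oracle:thin:@host:port/XE"
        jdbc_user => "username"
        jdbc_password => "password"
        jdbc_driver_library => "/home/tomadm/JatinSandbox/KibanaDbTry/ojdbc7.jar"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
        statement => "select * from metable"
        # Re-run the query every minute so the "batchre" index (and the
        # Kibana visualization on top of it) stays close to real time
        schedule => "* * * * *"
    }
}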

Thank you.

Note: Due to the character limit, I have deleted INFO-level outputs from the log.

I hope this information helps.

I can't even complete the first step of executing the logstash.conf file, let alone follow the template part to load the data.
Moreover, only two things differ between your config and mine: you used ojdbc14 whereas I used ojdbc7, and the pagination parameters. I don't think the driver would make such a difference, because a similar script worked fine with ojdbc7 before, and I tried changing the pagination parameters and still get the same error.
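
These are the pagination parameters I mean; the values are just the ones I experimented with, and the connection, driver, and statement settings are the same as in the config at the top:

input {
    jdbc {
        jdbc_connection_string => "jdbc:oracle:thin:@host:port/XE"
        jdbc_user => "username"
        jdbc_password => "password"
        jdbc_driver_library => "/home/tomadm/JatinSandbox/KibanaDbTry/ojdbc7.jar"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
        statement => "select * from metable"
        # Pagination settings I toggled while testing; the error is the
        # same with or without them
        jdbc_paging_enabled => true
        jdbc_page_size => 100000
    }
}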
