Logstash stuck at 'Fixnum is deprecated'

Hi all, I was trying to send data to Elasticsearch using a scheduler. I want it to run once a day and import new records, but it gets stuck every time after the output below.
Please suggest a solution, thanks.

Sending Logstash logs to C:/logstash-7.6.1/logs which is now configured via log4j2.properties
[2020-05-05T18:46:10,371][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-05-05T18:46:10,646][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.1"}
[2020-05-05T18:46:18,642][INFO ][org.reflections.Reflections] Reflections took 207 ms to scan 1 urls, producing 20 keys and 40 values
[2020-05-05T18:46:21,535][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://<ip address>:9200/]}}
[2020-05-05T18:46:22,149][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://<ip address>:9200/"}
[2020-05-05T18:46:22,297][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-05-05T18:46:22,335][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-05-05T18:46:22,455][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//<ip address>:9200"]}
[2020-05-05T18:46:22,634][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-05-05T18:46:22,721][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-05-05T18:46:22,739][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/logstash-7.6.1/log_server_db.conf"], :thread=>"#<Thread:0x4fd171c5 run>"}
[2020-05-05T18:46:22,808][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-05-05T18:46:25,628][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-05-05T18:46:25,764][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-05-05T18:46:27,199][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9601}
C:/logstash-7.6.1/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/cronline.rb:77: warning: constant ::Fixnum is deprecated

I've already tried adding the timezone and using different versions of Logstash, but I still can't get rid of it.

What do you mean by stuck? It looks like that warning shouldn't stop the processing of the data.

I mean nothing happens after that; the import of the data remains pending.

OK, well what does your config look like?

This is what the config file looks like:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\jpal1\Downloads\mysql-connector-java-5.1.39.jar"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/nsat_regression"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_user => "root"
    jdbc_password => "root123"
    tracking_column => "id"
    use_column_value => true
    statement => "SELECT * FROM tab4 WHERE id > :sql_last_value"
    schedule => "0 0 * * *"
    clean_run => true
  }
}

output {
  elasticsearch {
    hosts => ["<ip_address>:9200"]
    index => "log_server_db"
  }

  stdout {
    codec => json
  }
}

I'm using Logstash 7.6.1, though the issue persists on 7.6.2 as well.

@Jaishree_Pal - `schedule` is set to `0 0 * * *`, which means the statement will run every day at 12:00 AM (midnight). Logstash is probably not "stuck" - it is just waiting for the scheduled time to execute the SQL statement.
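For reference, the five cron fields in `schedule` are minute, hour, day-of-month, month, and day-of-week. If you want to confirm the pipeline actually imports data without waiting until midnight, you can temporarily switch to a more frequent schedule (example values only):

```
schedule => "* * * * *"     # every minute - handy for testing
schedule => "*/30 * * * *"  # every 30 minutes
schedule => "0 0 * * *"     # every day at midnight (your current setting)
```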


Thanks a lot @ropc! You're right, it wasn't stuck, it was just waiting for that time.
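One more note for future readers: with `clean_run => true`, the jdbc input resets the stored `:sql_last_value` every time Logstash restarts, so the `WHERE id > :sql_last_value` filter starts over and re-imports everything. If the goal is to only pick up new rows across restarts, a sketch like this is closer (the `last_run_metadata_path` value here is just an example path):

```
  jdbc {
    # jdbc_driver_library, jdbc_connection_string, jdbc_user, etc. as above
    use_column_value => true
    tracking_column => "id"
    statement => "SELECT * FROM tab4 WHERE id > :sql_last_value"
    schedule => "0 0 * * *"
    clean_run => false   # keep the persisted :sql_last_value between restarts
    last_run_metadata_path => "C:/logstash-7.6.1/.logstash_jdbc_last_run"
  }
```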

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.