Logstash-6.2.4 Pipeline has terminated

vadim@vadim-HP:~/www/FULL/Search/logstash-6.2.4$ bin/logstash -f logstash-simple.conf
Sending Logstash's logs to /home/vadim/www/FULL/Search/logstash-6.2.4/logs which is now configured via log4j2.properties
[2018-05-23T17:36:54,711][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/vadim/www/FULL/Search/logstash-6.2.4/modules/netflow/configuration"}
[2018-05-23T17:36:54,768][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/vadim/www/FULL/Search/logstash-6.2.4/modules/fb_apache/configuration"}
[2018-05-23T17:36:55,554][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-05-23T17:36:56,823][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-05-23T17:36:57,597][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-23T17:37:01,451][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::Elasticsearch index=>"products", document_type=>"products_description", document_id=>"%{products_id}", hosts=>[//127.0.0.1], id=>"fee76b4ebc3936f44487d37029b879b6a5bcea82e7731c9dddc5438cb7ef717c", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_ce825696-9529-4bac-a8f4-4d07ce9ba113", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-05-23T17:37:01,693][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-05-23T17:37:02,668][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2018-05-23T17:37:02,691][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2018-05-23T17:37:03,010][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2018-05-23T17:37:03,117][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-05-23T17:37:03,123][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-05-23T17:37:03,157][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-05-23T17:37:03,204][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-05-23T17:37:03,299][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//127.0.0.1"]}
[2018-05-23T17:37:03,791][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x41822b3a run>"}
[2018-05-23T17:37:03,957][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2018-05-23T17:37:09,185][INFO ][logstash.inputs.jdbc ] (3.187628s) SELECT products.products_id, products_description.products_name, categories_description.categories_name, products_description.products_keywords, products.products_page_url FROM products LEFT JOIN products_description ON products.products_id = products_description.products_id LEFT JOIN products_to_categories ON products_description.products_id = products_to_categories.products_id LEFT JOIN categories_description ON products_to_categories.categories_id = categories_description.categories_id WHERE products.products_status = 1;
[2018-05-23T17:39:03,383][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x41822b3a run>"}

If you only have one jdbc input and you're not using the schedule feature, Logstash will shut down once it has processed the results of a single query.
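For comparison, here is roughly what such a one-shot config looks like (a minimal sketch, not your actual logstash-simple.conf; the driver, connection string, and credentials are placeholders, and the statement is the query from your log, shortened). Without a schedule, the jdbc input runs the statement once and the pipeline terminates, which is exactly what the log above shows:

input {
  jdbc {
    # Placeholder connection settings -- substitute your own
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    # No schedule option: the statement runs once, then the pipeline terminates
    statement => "SELECT ... FROM products WHERE products.products_status = 1" # shortened; see the full query in the log above
  }
}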

Could you show me an example?

input {
  jdbc {
    schedule => "* * * * *"
    statement => "SELECT id, mycolumn1, mycolumn2 FROM my_table WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
    # ... other configuration bits
  }
}
The schedule above is in cron format; "* * * * *" runs the query every minute, on the minute. Because the statement compares id against :sql_last_value (together with use_column_value => true and tracking_column => "id"), each scheduled run only fetches rows whose id is greater than the value recorded from the previous run.
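Putting the pieces together with the output settings visible in your startup log (a sketch, not your exact config: the index, document_id, and host are taken from the log above; everything else is illustrative, and document_type is left out because the log warns it is deprecated in Elasticsearch 6.0):

input {
  jdbc {
    # ... placeholder connection settings, as in the sketch above
    schedule => "* * * * *"
    statement => "SELECT ... FROM products WHERE products.products_status = 1" # the full query from the log
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "products"
    document_id => "%{products_id}"
  }
}

Because document_id is set to %{products_id}, re-running the full query on each scheduled run overwrites the existing documents in place rather than creating duplicates.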
