Logstash shuts down after executing the query once when writing MS SQL data to Elasticsearch

The Logstash runner shuts down after executing the query once when reading data from MS SQL and writing it to Elasticsearch.

The conf file:

input {
  jdbc {
    jdbc_driver_library => "C:\MicrosoftSQLJDBCDriver\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost\\sqlexpress:1433;databaseName=CUSTOMER"
    jdbc_user => "sa"
    jdbc_password => "C0mplex@123"
    statement => "SELECT * from CUSTOMERMASTER"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    #schedule => "4 * * * *"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "customermaster"
  }
  stdout {
    codec => rubydebug
  }
}

Output:

C:\logstash-760>bin\logstash -f mssql.conf
Sending Logstash logs to C:/logstash-760/logs which is now configured via log4j2.properties
[2020-02-23T00:00:46,186][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-23T00:00:46,404][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-02-23T00:00:49,810][INFO ][org.reflections.Reflections] Reflections took 46 ms to scan 1 urls, producing 20 keys and 40 values
[2020-02-23T00:00:52,876][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-02-23T00:00:53,178][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-02-23T00:00:53,249][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-02-23T00:00:53,256][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2020-02-23T00:00:53,315][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2020-02-23T00:00:53,398][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-02-23T00:00:53,529][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-02-23T00:00:53,543][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-02-23T00:00:53,580][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["C:/logstash-760/mssql.conf"], :thread=>"#<Thread:0x20c65556 run>"}
[2020-02-23T00:00:55,445][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-02-23T00:00:55,621][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-02-23T00:00:56,490][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-02-23T00:00:57,413][INFO ][logstash.inputs.jdbc ][main] (0.057173s) SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar)
[2020-02-23T00:00:57,837][INFO ][logstash.inputs.jdbc ][main] (0.002497s) SELECT TOP (1) count(*) AS [COUNT] FROM (SELECT * from CUSTOMERMASTER) AS [T1]
[2020-02-23T00:00:57,961][INFO ][logstash.inputs.jdbc ][main] (0.000614s) SELECT * FROM (SELECT * from CUSTOMERMASTER) AS [T1] ORDER BY 1 OFFSET 0 ROWS FETCH NEXT 50000 ROWS ONLY
C:/logstash-760/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
"phone" => "9833745155",
"email" => "jinalcomplex@gmail.com",
"@version" => "1",
"custname" => "JINAL JAIN",
"@timestamp" => 2020-02-22T18:30:58.016Z,
"custid" => 2
}
{
"phone" => "9833745155",
"email" => "jinalcomplex@gmail.com",
"@version" => "1",
"custname" => "JINAL JAIN",
"@timestamp" => 2020-02-22T18:30:58.015Z,
"custid" => 1
}
{
"phone" => "9004799042",
"email" => "yashcomplex@gmail.com",
"@version" => "1",
"custname" => "YASH JAIN",
"@timestamp" => 2020-02-22T18:30:57.999Z,
"custid" => 1
}
{
"phone" => "9619391301",
"email" => "sagrita@gmail.com",
"@version" => "1",
"custname" => "SAGRITA JAIN",
"@timestamp" => 2020-02-22T18:30:58.020Z,
"custid" => 4
}
{
"phone" => "9761977131",
"email" => "jrishabh@gmail.com",
"@version" => "1",
"custname" => "RISHABH JAIN",
"@timestamp" => 2020-02-22T18:30:58.020Z,
"custid" => 5
}
{
"phone" => "8879979896",
"email" => "shaycomplex@gmail.com",
"@version" => "1",
"custname" => "SHAY JAIN",
"@timestamp" => 2020-02-22T18:30:58.018Z,
"custid" => 3
}
[2020-02-23T00:01:00,983][INFO ][logstash.runner ] Logstash shut down.

Can anyone assist me with what's wrong here?

What do you expect it to do? You have not set the schedule option, so it will execute the query once and then have nothing left to do.
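For illustration, a minimal sketch of the same jdbc input with the schedule option enabled. The driver, connection, and statement settings are elided here (use the ones from the config above); the cron-style value "* * * * *" is an assumption chosen so the query runs every minute:

input {
  jdbc {
    # ... same jdbc_driver_library, jdbc_driver_class, jdbc_connection_string,
    #     jdbc_user, jdbc_password, and statement settings as in the config above ...
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    # Cron-style schedule (minute hour day month weekday): run the query every minute.
    # The commented-out "4 * * * *" in the original config would fire only once per hour, at minute 4.
    schedule => "* * * * *"
  }
}

With a schedule set, the pipeline stays running and re-runs the statement on each tick instead of shutting down after a single execution.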

If I set the scheduler, it is not executing even once.
