Logstash not sending data to Elasticsearch only for JDBC input

Hi,

I am trying to get data from SQL Server and push it into Elasticsearch. Both Logstash and Elasticsearch are up and running.
I can send sample data with the configuration below, and I can see that the index is created:

input { stdin { } }
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

This makes it clear that there is no issue connecting Logstash to Elasticsearch.
However, when I change the input to JDBC, no data is sent to Elasticsearch, and nothing is printed to stdout either.

I ran Logstash in debug mode and there are no errors on the console. I also verified with SQL Server Profiler that the query is being executed.
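The JDBC input portion of request1.conf is along these lines. Treat it only as a sketch: the driver path, connection string, and credentials below are placeholders rather than my real values, while the statement is the same query that shows up in the debug log.

input {
  jdbc {
    # placeholder driver path, connection string and credentials; real values differ
    jdbc_driver_library => "C:/drivers/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "select top(10)id,perpname from Request"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}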

Below is the log extract:
C:\logstash-6.8.0>bin\logstash -f config\request1.conf --debug
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/logstash-6.8.0/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/logstash-6.8.0/logs which is now configured via log4j2.properties

[DEBUG][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[logstash.outputs.elasticsearch] Using default mapping template
[logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0xd662deb run>"}
[2019-06-13T23:34:44,732][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-13T23:34:44,770][DEBUG][logstash.agent ] Starting puma
[DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[DEBUG][logstash.api.service ] [api-service] start
[2019-06-13T23:34:45,154][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-06-13T23:34:45,626][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[INFO ][logstash.inputs.jdbc ] (0.113391s) SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar)
[INFO ][logstash.inputs.jdbc ] (0.000607s) SELECT TOP (1) count(*) AS [COUNT] FROM (select top(10)id,name from customer) AS [T1]
[DEBUG][logstash.inputs.jdbc ] Executing JDBC query {:statement=>"select top(10)id,perpname from Request", :parameters=>{:sql_last_value=>2019-06-13 13:33:29 UTC}, :count=>0}
[INFO ][logstash.inputs.jdbc ] (0.000781s) select top(10)id,perpname from Request
[DEBUG][logstash.inputs.jdbc ] Closing {:plugin=>"LogStash::Inputs::Jdbc"}
[DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0xd662deb sleep>"}
[DEBUG][logstash.pipeline ] Shutting down filter/output workers {:pipeline_id=>"main", :thread=>"#<Thread:0xd662deb run>"}
[DEBUG][logstash.pipeline ] Setting shutdown {:pipeline_id=>"main", :thread=>"#<Thread:0xd662deb run>"}
[2019-06-13T23:34:46,633][DEBUG][logstash.pipeline ] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x3dda03c2 run>"}
[2019-06-13T23:34:46,673][DEBUG][logstash.pipeline ] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x7fdfd396 dead>"}
[DEBUG][logstash.outputs.elasticsearch] Closing {:plugin=>"LogStash::Outputs::ElasticSearch"}
[DEBUG][logstash.outputs.elasticsearch] Stopping sniffer
[DEBUG][logstash.outputs.elasticsearch] Stopping resurrectionist
[DEBUG][logstash.outputs.elasticsearch] Waiting for in use manticore connections
[DEBUG][logstash.outputs.elasticsearch] Closing adapter LogStash::Outputs::ElasticSearch::HttpClient::ManticoreAdapter:0x413818d6
[DEBUG][logstash.outputs.stdout ] Closing {:plugin=>"LogStash::Outputs::Stdout"}
[2019-06-13T23:34:47,248][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0xd662deb run>"}
[DEBUG][logstash.instrument.periodicpoller.os] Stopping
[DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
[2019-06-13T23:34:47,437][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>0}
[INFO ][logstash.runner ] Logstash shut down.

Additional info:
Elasticsearch version: 7.1.1
Logstash version: 6.8.8
OS: Windows 10

I am 100% sure it is connecting to SQL Server and executing the query; I can tell by looking at Profiler.
I have also looked into the Elasticsearch log and there is nothing there: no request came in while Logstash was running.

Please help
