Index not found in ES


(Dhairya) #1

I have configured Elasticsearch, Logstash, and Kibana for my Redshift database. I can see that Logstash started and is working fine, but I don't see the index in Elasticsearch. What could be the reason here?

Index details:

root@ip-10-0-1-113:~# curl 'localhost:9200/_cat/indices?v'
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open logstash-2017.12.11 EXhGFTiNQOC5hEvhf0bJOQ 5 1 2 0 9.7kb 9.7kb
yellow open pg_stat_activity BrYAYp-jShSXt2-LM-zTSQ 5 1 40 0 386.7kb 386.7kb
yellow open .kibana jhE1s1hxTNq319Mu8BA3vQ 1 1 6 0 36.1kb 36.1kb
yellow open logstash-2017.12.13 LgEJJrS_TMuQwcMZruobgw 5 1 12 0 46.9kb 46.9kb
yellow open logstash-2017.12.12 CJuoyWY0QjyTTNNnTpTDlQ 5 1 2 0 9.7kb 9.7kb
root@ip-10-0-1-113:~#

Logstash output:

root@ip-10-0-1-113:/etc/logstash-6.0.1/conf.d# /etc/logstash-6.0.1/bin/logstash -f /etc/logstash-6.0.1/conf.d/conf_redshift.conf
Sending Logstash's logs to /etc/logstash-6.0.1/logs which is now configured via log4j2.properties
[2017-12-15T14:45:15,327][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/etc/logstash-6.0.1/modules/fb_apache/configuration"}
[2017-12-15T14:45:15,329][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/etc/logstash-6.0.1/modules/netflow/configuration"}
[2017-12-15T14:45:15,709][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-15T14:45:16,272][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-15T14:45:19,361][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"Redshift", document_type=>"Redshift", hosts=>["localhost:9200"], id=>"b6ac7f1bc9ff0cb43a6a03727b3a5bf9af47d9f61652fd1e08934877e4a33615">}
[2017-12-15T14:45:20,333][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-12-15T14:45:20,335][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-12-15T14:45:20,695][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-12-15T14:45:20,803][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2017-12-15T14:45:20,803][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2017-12-15T14:45:20,821][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-15T14:45:20,834][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-12-15T14:45:20,844][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-12-15T14:45:20,904][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x7e66d8e@/etc/logstash-6.0.1/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-15T14:45:21,584][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>"main"}
[2017-12-15T14:45:21,749][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2017-12-15T14:46:08,329][INFO ][logstash.inputs.jdbc ] (0.666344s) SELECT * from pg_stat_activity where query_start > '2017-12-15 14:43:01.755373'
[2017-12-15T14:47:01,832][INFO ][logstash.inputs.jdbc ] (0.274178s) SELECT * from pg_stat_activity where query_start > '2017-12-15 14:46:07.518632'
[2017-12-15T14:48:01,285][INFO ][logstash.inputs.jdbc ] (0.236496s) SELECT * from pg_stat_activity where query_start > '2017-12-15 14:47:01.557541'
[2017-12-15T14:49:01,116][INFO ][logstash.inputs.jdbc ] (0.234937s) SELECT * from pg_stat_activity where query_start > '2017-12-15 14:48:01.047752'


(Dhairya) #2

It seems Logstash is not able to connect to the database. I tried it with the output block below, but it doesn't display any output. Does the JDBC input support the Redshift database?

output {
  stdout { codec => json_lines }
}
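If `json_lines` prints nothing, the `rubydebug` codec is a readable alternative for checking whether any events reach the output stage at all. A minimal debugging sketch (not the original config):

```
output {
  # rubydebug pretty-prints each event to the console; if nothing
  # appears here, the jdbc input is returning no rows, rather than
  # the elasticsearch output failing.
  stdout { codec => rubydebug }
}
```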

conf.d file details:
input {
  jdbc {
    jdbc_connection_string => "jdbc:redshift://cluster-name.c6srgdetn1g4.us-west-2.redshift.amazonaws.com:8192/DB-NAME"
    jdbc_user => " ****** "
    jdbc_password => " ***** "
    jdbc_validate_connection => true
    jdbc_driver_library => "/tmp/RedshiftJDBC42-1.2.10.1009.jar"
    jdbc_driver_class => "com.amazon.redshift.jdbc42.Driver"
    statement => "SELECT * from pg_stat_activity where query_start > :sql_last_value"
    last_run_metadata_path => "/tmp/logstash-oradb.lastrun"
    record_last_run => true
    schedule => "* * * * *"
  }
}
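One detail worth checking, visible in the startup log in the first post: the elasticsearch output was configured with `index => "Redshift"`. Elasticsearch requires index names to be lowercase, so writes to an index named "Redshift" are rejected, which would explain why the index never appears. A sketch of a corrected output block (the hosts value and lowercase index name are assumptions based on that log, not the original config):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Elasticsearch index names must be lowercase; the startup log
    # showed index => "Redshift", which Elasticsearch rejects with
    # an invalid_index_name_exception at index time.
    index => "redshift"
  }
}
```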

root@ip-10-0-1-113:/etc/logstash-6.0.1/# /bin/logstash --config /etc/logstash-6.0.1/conf.d/rs_3.conf
Settings: Default pipeline workers: 4
Pipeline main started


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.