Using SQL Server queries with Elasticsearch 5.6.3

How can I use SQL Server queries in Elasticsearch 5.6.3?
I don't know anything about this process.
So far I have only managed to install Elasticsearch on Windows successfully.
Please help with more details.

Elasticsearch is not a relational database and does not have a SQL interface, so you cannot use SQL queries. You query Elasticsearch through its query DSL.
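For illustration, a minimal query DSL search could look like this (the index name "video" and field "subject" here are just placeholders):

curl -XGET 'http://localhost:9200/video/_search' -H 'Content-Type: application/json' -d '{
  "query": {
    "match": { "subject": "hello" }
  }
}'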

I mean, how can I connect Elasticsearch to SQL Server?
As in this link: http://hintdesk.com/how-to-connect-elasticsearch-to-ms-sql-server/
But the JDBC importer does not work with Elasticsearch 5.

You can use Logstash.

Note that you don’t really connect Elasticsearch to the database; instead you use an ETL process to extract data from the SQL database, transform it to JSON, and load it into Elasticsearch.

I shared my thoughts here: http://david.pilato.fr/blog/2015/05/09/advanced-search-for-your-legacy-application/

Then you can use Logstash and its jdbc input plugin.
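Roughly, such a pipeline has this shape (driver path, credentials, table, and index name below are placeholders):

input {
  jdbc {
    jdbc_driver_library => "/path/to/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    statement => "SELECT * FROM mytable"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mytable"
  }
}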

Thank you @Christian_Dahlqvist @dadoonet for the guidance.
When I try to install it this way:
bin/logstash-plugin install logstash-input-jdbc
I get the error below:

The system cannot find the path specified.
"could not find jruby in C:\Program Files\logstash-5.6.3\vendor\jruby"

But this path exists.
What is wrong?

I don’t know, but on Windows I would definitely avoid directories with spaces in their names.

Try moving it to another directory.
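For example, at a Windows command prompt (assuming D:\ is a suitable target; robocopy /E copies the whole tree):

robocopy "C:\Program Files\logstash-5.6.3" "D:\logstash-5.6.3" /E
cd /d D:\logstash-5.6.3
bin\logstash-plugin install logstash-input-jdbc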

Thank you so much. I moved the "logstash-5.6.3" folder to D:\ and then ran this command:

bin/logstash-plugin install logstash-input-jdbc

and it installed. Thank you.

I put the sqljdbc42.jar file, used as jdbc_driver_library, at this path:

D:\logstash-5.6.3\tools\ingest-converter\build\libs\sqljdbc42.jar

Following the Logstash jdbc input plugin documentation, I created the file below in the bin directory of Logstash:

file: simple-out.conf

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433/user=Elasticsearch;password=parsaparsa"
    jdbc_user => "Elasticsearch"
    jdbc_validate_connection => true
    jdbc_driver_library => "/tools/ingest-converter/build/libs/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * from Video"
  }
}
output {
  elasticsearch {
    protocol => http
    index => "Video"
    document_type => "video"
    document_id => "%{uid}"
    host => "localhost:9200"
  }
}

And I ran this command at the cmd prompt:

bin/logstash -f simple-out.conf Logstash startup

First, I get this error:

ERROR: Unknown command 'Logstash'

Second: is my conf file OK, or is something wrong with my setup?

The command you need to run is

bin/logstash -f simple-out.conf

Thank you. After running it, the output is:

Sending Logstash's logs to D:/logstash-5.6.3/logs which is now configured via log4j2.properties
[2017-10-18T22:38:53,550][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/logstash-5.6.3/modules/fb_apache/configuration"}
[2017-10-18T22:38:53,753][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/logstash-5.6.3/modules/netflow/configuration"}
[2017-10-18T22:38:53,769][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"D:/logstash-5.6.3/data/queue"}
[2017-10-18T22:38:53,800][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"D:/logstash-5.6.3/data/dead_letter_queue"}
[2017-10-18T22:38:53,862][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"ba9e9713-a8e0-4b31-aa55-5a1d955027b7", :path=>"D:/logstash-5.6.3/data/uuid"}
[2017-10-18T22:38:55,816][ERROR][logstash.outputs.elasticsearch] Unknown setting 'protocol' for elasticsearch
[2017-10-18T22:38:55,816][ERROR][logstash.outputs.elasticsearch] Unknown setting 'host' for elasticsearch
[2017-10-18T22:38:55,972][ERROR][logstash.agent           ] Cannot create pipeline {:reason=>"Something is wrong with your configuration."}

You've been following outdated documentation. Please consult the documentation and examples for your version of Logstash to see what options are valid for the elasticsearch output.
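For example, in Logstash 5.x the elasticsearch output has no protocol setting, and host has been replaced by hosts; a sketch of a valid output section, keeping your other settings, would be:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "video"            # note: index names must be lowercase
    document_type => "video"
    document_id => "%{uid}"
  }
}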

Thank you. Following jdbc-sqlserver-to-elastic-search, I found my problems and fixed them, and it now runs successfully.

I changed my .conf file as below:

file: simple-out.conf

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=myDB;user=Elasticsearch;password=parsaparsa"
    jdbc_user => "Elasticsearch"
    jdbc_validate_connection => true
    jdbc_driver_library => "D:\logstash-5.6.3\tools\ingest-converter\build\libs\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * from Video"
  }
}
output {
  elasticsearch {
    index => "%{Video}"
    document_type => "%{vid}"
    document_id => "%{Subject}"
    hosts => "localhost:9200"
  }
}

Then I called this URL:

http://localhost:9200/Video/vid/Hello

But I get this output:

{"error":{"root_cause":[{"type":"index_not_found_exception","reason":"no such index","resource.type":"index_expression","resource.id":"Video","index_uuid":"_na_","index":"Video"}],"type":"index_not_found_exception","reason":"no such index","resource.type":"index_expression","resource.id":"Video","index_uuid":"_na_","index":"Video"},"status":404}

And what does the query "SELECT * from Video" return?

It contains Id, Subject, Description, UploadDate, CategoryId, and other fields.
And the result of running this command:

logstash -f simple-out.conf

is:

Sending Logstash's logs to D:/logstash-5.6.3/logs which is now configured via log4j2.properties
[2017-10-19T16:27:26,976][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"D:/logstash-5.6.3/modules/fb_apache/configuration"}
[2017-10-19T16:27:26,976][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"D:/logstash-5.6.3/modules/netflow/configuration"}
[2017-10-19T16:27:28,236][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-10-19T16:27:28,236][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-10-19T16:27:28,392][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-10-19T16:27:28,439][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-10-19T16:27:28,439][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-10-19T16:27:28,455][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-10-19T16:27:28,455][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-10-19T16:27:29,142][INFO ][logstash.pipeline        ] Pipeline main started
[2017-10-19T16:27:29,408][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-19T16:27:30,392][INFO ][logstash.inputs.jdbc     ] (0.000000s) SELECT * from Video
[2017-10-19T16:27:32,158][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}

What must I do?
Please help me.

It contains Id, Subject, Description, UploadDate, CategoryId, and other fields.

Okay, but does it contain columns named "Video" and "vid"? With

index => "%{Video}"
document_type => "%{vid}"

you have configured Logstash to pick the index and type name from the fields named "Video" and "vid".

It does not contain columns named "Video" and "vid".
Suppose I want to use the column named "subject"; must I then do:

index => "%{subject}"
document_type => "%{subject}"
document_id => "%{subject}"

Is that right?
And how can I call it with a URL?

It does not contain columns named "Video" and "vid".

Then your configuration doesn't make sense. As I said in my previous post, your current configuration stores events in an index named after the contents of the Video field. If you want them stored in an index called Video, just say Video, not %{Video}.
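A sketch of such an output section (note that Elasticsearch index names must be lowercase, so "video" rather than "Video"; using your Id column as the document id is just an assumption):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "video"           # literal, lowercase index name
    document_type => "video"   # literal type name
    document_id => "%{Id}"     # assumes Id uniquely identifies each row
  }
}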