I completed an ELK installation with the latest 6.x version on Linux servers, but I am not able to access the web page.

All services are up and running; there is no issue with the services, and the tests below also look fine.

[root@xxx ~]# curl http://localhost:9200
{
  "name" : "_ZCxA3a",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "2N1OSkYjSMmP3Nh8nYZsuw",
  "version" : {
    "number" : "6.1.2",
    "build_hash" : "5b1fea5",
    "build_date" : "2018-01-10T02:35:59.208Z",
    "build_snapshot" : false,
    "lucene_version" : "7.1.0",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}
[root@xxx ~]# curl http://localhost:5601

var hashRoute = '/app/kibana'; var defaultRoute = '/app/kibana'; var hash = window.location.hash; if (hash.length) { window.location = hashRoute + hash; } else { window.location = defaultRoute;

What does it say?

The page cannot be displayed.

Can you share the Elasticsearch and Kibana logs?

FWIW - mine displays, and I get that same curl response, so that appears normal. Oddly, I've sometimes found I need to use the host's IP rather than "localhost" to get a response. In some cases that was my own fault, because I had edited the kibana.yml file and added the real IP to the server variable, but in other cases I had not changed it and saw the same result. Try your actual IP and see...

FYI, I'd guess it's your kibana.yml config:

# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.
# The default is 'localhost', which usually means remote machines will not be able to connect.
# To allow connections from remote users, set this parameter to a non-loopback address.
server.host: 0.0.0.0
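
After changing that, restart Kibana and test against the host's real IP or hostname rather than localhost. A quick check, assuming a systemd-based package install and an example host IP of 192.168.1.10 (substitute your own):

systemctl restart kibana
curl -I http://192.168.1.10:5601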

Hey Iauea,

Thanks for the help! After changing the kibana.yml file it's working fine :slight_smile: :smile:

Great!

How do I install the JDBC driver for Logstash?

Do you mean the JDBC plugin? If yes, you can use it directly; it was installed when you installed Logstash.
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#_description_19
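
You can double-check that the plugin is actually bundled before writing any config. A quick check, assuming the default package layout under /usr/share/logstash:

/usr/share/logstash/bin/logstash-plugin list | grep jdbc

If logstash-input-jdbc shows up in that list, you only need the JDBC driver JAR for your database, not a separate plugin install.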

I did the JDBC configuration and logstash.conf as well, but Kibana is not fetching data... Please suggest.

[root@tip jre8]# cat /etc/logstash/conf.d/logstash.conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://ill01\INST1:1433;databasename=threat;user=root;password=root"
    jdbc_user => 'root'
    jdbc_password => 'root'
    jdbc_driver_library => "/usr/share/logstash/sqljdbc_4.2/enu/jre8/sqljdbc42.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "select * from temp_table;"
  }
}

output {
  elasticsearch {
    hosts => ["tip:9200"]
    index => "scom_all_nodes_count_ci"
    document_type => "data"
    id => "AV6ZVU-gMiaMkuWGIl0a"
  }
}

You're missing a scheduler.

jdbc {
    # other configuration options here
    schedule => "* * * * *"
  }

Still the same issue.

I moved your question to #logstash

Sorry, we probably need more log information, or it will be hard to locate the issue.

Let me know which logs you need.

The Logstash log is preferred. :slightly_smiling_face:

@panakaj_patel

You can use Kibana to create a Pipeline using JDBC, but you still have some work to do at the filesystem level. Here is a working version of a pipeline or conf file using JDBC and MySQL:

'''

input {
  jdbc {
    jdbc_driver_library => "/usr/share/java/mysql-connector-java-5.1.45-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://server:3306/dbname"
    jdbc_user => "username"
    jdbc_password => "password"
    schedule => "*/5 * * * *"
    jdbc_page_size => "50000"
    jdbc_paging_enabled => "true"
    statement => "SELECT * blah blah blah"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "index-%{+YYYY.MM.dd}"
    document_id => "%{optional_column}"
  }
}

'''

Next, make sure you have downloaded the proper JDBC connector for your DB type and that the path in the top line above is correct. I went to the MySQL site and downloaded its JDBC connector, but you appear to need the Microsoft one at https://www.microsoft.com/en-us/download/details.aspx?id=55539. Put the JAR file in the location given in the input section.
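
Also note that the driver class has to match the JAR you point at. Since your connection string was jdbc:sqlserver://..., a rough sketch of what that input block might look like with the Microsoft driver (host, instance, database, credentials and statement are taken from your earlier post, so adjust as needed):

'''

input {
  jdbc {
    # Microsoft JDBC driver JAR and its matching class, not the MySQL one
    jdbc_driver_library => "/usr/share/logstash/sqljdbc_4.2/enu/jre8/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://ill01\INST1:1433;databaseName=threat"
    jdbc_user => "root"
    jdbc_password => "root"
    schedule => "*/5 * * * *"
    statement => "SELECT * FROM temp_table"
  }
}

'''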

Verify permissions and be sure the user specified has rights to that DB/Table and it is accessible at that IP.

Finally, I run this:

/usr/share/logstash/bin/logstash --verbose &

then

tail -f /var/log/logstash/logstash-plain.log

Any errors will show up and you can see the query run (every 5 minutes in my example). Also, in my example, I break the query into 50000 record chunks.
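
One extra note: if a manual run like that complains that it cannot find logstash.yml or log4j2.properties, you can point the binary at the packaged settings directory and at your pipeline file explicitly. A sketch, assuming the standard RPM/DEB layout:

/usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/logstash.conf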

If you see the query execute and no errors pop up, you should then be able to go into Kibana and create a new Index Pattern with the name you specified in the output section.

I then kill my manual run and launch the logstash service now that I know it's happy.
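
On a systemd-based install that hand-off is roughly the following, assuming the manual run is the job that was backgrounded with & above:

kill %1
systemctl start logstash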

Hope that helps!

How do I find the correct driver for MySQL?

Here is my Logstash configuration file and the error log.
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://itt-self-service;databasename=threats;user=root;password=Its@123"
    jdbc_user => "root"
    jdbc_password => "IT-threats@123"
    jdbc_driver_library => "/usr/share/logstash/sqljdbc_4.2/enu/jre8/sqljdbc42.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    schedule => "* * * * *"
    statement => "select * from select * from Test_DB;"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://tip:9200"]
    index => "scom_all_nodes_count_ci"
  }
}

The error is below:

[root@tip logstash]# bin/logstash -f /etc/logstash/conf.d/logstash.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
{ 2010 rufus-scheduler intercepted an error:
2010 job:
2010 Rufus::Scheduler::CronJob "* * * * *" {}
2010 error:
2010 2010
2010 LogStash::ConfigurationError
2010 com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.3/lib/logstash/plugin_mixins/jdbc.rb:159:in `open_jdbc_connection'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.3/lib/logstash/plugin_mixins/jdbc.rb:227:in `execute_statement'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.3/lib/logstash/inputs/jdbc.rb:271:in `execute_query'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.3/lib/logstash/inputs/jdbc.rb:250:in `block in run'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:234:in `do_call'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:258:in `do_trigger'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:300:in `block in start_work_thread'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:299:in `block in start_work_thread'
2010 org/jruby/RubyKernel.java:1292:in `loop'
2010 /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:289:in `block in start_work_thread'
2010 tz:
2010 ENV['TZ']:
2010 Time.now: 2018-02-01 13:26:00 +0200
2010 scheduler:
2010 object_id: 2008
2010 opts:
2010 {:max_work_threads=>1}
2010 frequency: 0.3
2010 scheduler_lock: #<Rufus::Scheduler::NullLock:0x4aef3fe1>
2010 trigger_lock: #<Rufus::Scheduler::NullLock:0x22f3383>
2010 uptime: 29.648361 (29s648)
2010 down?: false
2010 threads: 2
2010 thread: #<Thread:0x37518325>
2010 thread_key: rufus_scheduler_2008
2010 work_threads: 1
2010 active: 1
2010 vacant: 0
2010 max_work_threads: 1
2010 mutexes: {}
2010 jobs: 1
2010 at_jobs: 0
2010 in_jobs: 0
2010 every_jobs: 0
2010 interval_jobs: 0
2010 cron_jobs: 1
2010 running_jobs: 1
2010 work_queue: 0

I'm so sorry, I was away and did not see your question. I hope you have found it by now, but the MySQL driver is located at:

https://dev.mysql.com/downloads/connector/j/

Just download it, extract it and place the mysql-connector-java-5.1.45-bin.jar file in the path specified in jdbc_driver_library => "/usr/share/java/mysql-connector-java-5.1.45-bin.jar".
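
A rough sketch of the filesystem step, assuming you grab the platform-independent .tar.gz (the exact archive name depends on the version you download):

tar -xzf mysql-connector-java-5.1.45.tar.gz
cp mysql-connector-java-5.1.45/mysql-connector-java-5.1.45-bin.jar /usr/share/java/

The jdbc_driver_class that goes with that JAR is com.mysql.jdbc.Driver, and your connection string should start with jdbc:mysql:// rather than jdbc:sqlserver://.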