How to capture data from a DB server using a query and visualise it in Kibana

Hi All,

We have a requirement to capture data from a remote DB server using a DB query/table and visualise those metrics in a dashboard. Is there a step-by-step procedure available to get it done?

Thanks,
Chaitanya.

Have a look at https://www.elastic.co/blog/logstash-jdbc-input-plugin.
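In case the moving parts are unclear, a minimal pipeline along the lines of that blog post might look like this. The connection string, credentials, statement, and index name below are all placeholders, not values from this thread:

```conf
input {
  jdbc {
    jdbc_driver_library => "/path/to/driver.jar"      # the vendor's JDBC jar
    jdbc_driver_class => "com.example.jdbc.Driver"    # placeholder class name
    jdbc_connection_string => "jdbc:example://dbhost:1234/mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    statement => "SELECT * FROM contacts"
  }
}
output {
  elasticsearch {
    index => "contacts"
  }
}
```

Save something like this as simple-out.conf and point Logstash at it with -f.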

Hi Magnus,

Thanks for sharing. I have installed the logstash-input-jdbc plugin.

But I don't see any files created like simple-out.conf and contacts-index-logstash.conf. Is there a specific location where I can find them and alter the settings?

I also see that Logstash needs to be run with "simple-out.conf". Will that impact the data I am getting from Beats, because in the Logstash config file we have given Beats as the input?

Please advise.

Thanks for your help,
Chaitanya.

But I don't see any files created like simple-out.conf and contacts-index-logstash.conf. Is there a specific location where I can find them and alter the settings?

You are expected to create those files yourself.

I also see that Logstash needs to be run with "simple-out.conf". Will that impact the data I am getting from Beats, because in the Logstash config file we have given Beats as the input?

If you use simple-out.conf in the same Logstash instance as the beats input you'll most likely have to make adjustments to at least one of the files. Get things working when running in a separate instance first, then worry about doing it all in one instance (if that's even a requirement).

Thanks for the guidance.

So you want me to install Logstash on a separate server and integrate it with the same Kibana we are using for the Beats data.

Will that work?

Thanks,
Chaitanya.

Yes, but you don't have to install it in a different server. Just run a second instance on the same server.

Oh, that's interesting. I will take a look at that. Thank you very much for the help.

I have created the required file for this requirement and started the new Logstash instance, but it is giving the driver error below. Is there a specific location where I need to copy these drivers?

PS C:\logstash_DB\bin> .\logstash -f .\simple-out.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 2
Error: sun.jdbc.odbc.JdbcOdbcDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
You may be interested in the '--configtest' flag which you can
use to validate logstash's configuration before you choose
to restart a running system.

This is the configuration I used:

# The path to our downloaded jdbc driver
        jdbc_driver_library => "C:\logstash_DB\sqljdbc_6.0\jdbc\ojdbc5.jar"
        # The name of the driver class for Postgresql
        jdbc_driver_class => "sun.jdbc.odbc.JdbcOdbcDriver"

Thanks,
Chaitanya.

I assume you've double-checked that C:\logstash_DB\sqljdbc_6.0\jdbc\ojdbc5.jar exists and is readable to the user running Logstash? I'd try using forward slashes in the path instead of backslashes.
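For example, the same path with forward slashes would look like this:

```conf
jdbc_driver_library => "C:/logstash_DB/sqljdbc_6.0/jdbc/ojdbc5.jar"
```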

I was able to fix the driver issue; I needed to prefix the driver class with Java::.
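For anyone hitting the same error, a sketch of that fix, assuming the ojdbc5.jar mentioned earlier is the driver actually in use (the exact class name depends on your driver and is an assumption here):

```conf
jdbc_driver_library => "C:/logstash_DB/sqljdbc_6.0/jdbc/ojdbc5.jar"
# The Java:: prefix stops JRuby from resolving the class name as a Ruby constant
jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
```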

The query is giving results and I am able to see them in Kibana, but it is not updating; it has only shown one record since I restarted the Logstash DB instance.

How can I make this update every 10 minutes?

The jdbc input documentation contains a section on how to run the query periodically.
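The schedule option takes a cron-like expression, so a 10-minute interval would look something like this (connection settings omitted, same as in your existing config):

```conf
input {
  jdbc {
    # driver and connection settings as before
    schedule => "*/10 * * * *"  # cron-style: run the statement every 10 minutes
  }
}
```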

Yes, I have set it to run every minute.

schedule => "* * * * *"

I am using below command to start it
.\logstash -f .\simple-out.conf

As soon as I close PowerShell or terminate the command, it stops recording. If I leave the PowerShell window open, it records the log.

Is that how it is supposed to be? I need some automatic way to get the data, like how we get it from Filebeat or Topbeat.

Thanks.

Well, the schedule will only run if Logstash is running.

But I still see the Logstash service running without any issues. Is there any other way to automate this?

Sorry, I have no idea what you're asking. You say it works when you start Logstash with .\logstash -f .\simple-out.conf. So... what's the problem?

I am running this command in a PowerShell window. Logstash sends the data if I keep PowerShell open after executing the command. As soon as I try to terminate it, or if I close the PowerShell window, it stops recording anything.

I even tried answering N to Terminate batch job (Y/N)?, but it was still no use.

PS C:\logstash_DB\bin> .\logstash agent -f .\simple-out.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 2
Logstash startup completed
{
      "count(1)" => 13,
      "@version" => "1",
    "@timestamp" => "2016-02-16T10:08:03.330Z"
}
SIGINT received. Shutting down the pipeline. {:level=>:warn}
^CTerminate batch job (Y/N)? Logstash shutdown completed
n
PS C:\logstash_DB\bin>
PS C:\logstash_DB\bin> 

So how can I keep Logstash running always and have it execute this query every minute?

You need to run Logstash as a service. I'm not very familiar with that myself and I don't think there's an official way of doing it, but googling the topic results in lots of hits.
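One common community approach (not an official Elastic package) is to wrap Logstash with a service manager such as NSSM. A sketch, assuming NSSM is installed and using the paths from earlier in this thread and a hypothetical service name:

```
nssm install logstash "C:\logstash_DB\bin\logstash.bat" "-f C:\logstash_DB\bin\simple-out.conf"
nssm start logstash
```

Windows Task Scheduler with an at-startup trigger is another option.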