Configure logstash input from DB values

Hi Team, as I am new to the ELK stack, I need some help with this. I have a requirement where I need to fetch log file locations (definitely more than one) from a DB table and pass those values to the Logstash input so I can view the data in Kibana. Could you please share some examples that I can refer to?

Hello @Ryan2, welcome! :blush:

I understand your requirement. Please follow the steps below:

Step 01 -

You need the JDBC database driver referenced in the Logstash config below.

logstashfile.conf

input {
  jdbc {
    # path to the SQL Server JDBC driver jar
    jdbc_driver_library => "C:\Users\xxxx\Downloads\Elasticsearch\sqljdbc_4.2\enu\jre8\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # note the double slash in the JDBC connection URL
    jdbc_connection_string => "jdbc:sqlserver://XXXXXX;integratedSecurity=false;"
    jdbc_user => "user"
    jdbc_password => "pass"
    jdbc_validate_connection => true
    statement => "SELECT * FROM table1"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dbdataindex"
  }
}

Once the data is available as an index in Elasticsearch, you can view it in Kibana with the default configuration if you are running on localhost.

If you need more help, just reply.

Thanks,
HadoopHelp

Thanks, ramesh, for the reply, but my requirement is different from what you answered. There is a table in the database that stores the log path locations of different servers. I need to fetch those locations from the table first, then pass them into the input section of Logstash and show the logs in Kibana.
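A Logstash `file` input cannot take its `path` values dynamically from a JDBC query at runtime, so one common workaround is a small script that queries the table and generates the Logstash config before startup. Below is a minimal sketch in Python; the table name `log_locations`, its `path` column, and the sample paths are all hypothetical, and an in-memory SQLite database stands in for your real server database (you would use your actual DB driver instead):

```python
import sqlite3

def build_file_input(db_conn):
    """Query log path locations from the DB and render a Logstash file-input block."""
    paths = [row[0] for row in db_conn.execute("SELECT path FROM log_locations")]
    quoted = ", ".join('"{}"'.format(p) for p in paths)
    return (
        "input {\n"
        "  file {\n"
        f"    path => [{quoted}]\n"
        '    start_position => "beginning"\n'
        "  }\n"
        "}\n"
    )

# Demo with an in-memory SQLite DB standing in for the real table of log locations
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log_locations (path TEXT)")
conn.executemany(
    "INSERT INTO log_locations VALUES (?)",
    [("/var/log/server1/app.log",), ("/var/log/server2/app.log",)],
)
print(build_file_input(conn))
```

You would write the generated text to a `.conf` file that Logstash loads, and re-run the script (e.g., on a schedule) whenever the table changes. This is only one approach; depending on your setup, shipping the logs from each server with a Beats agent and keeping Logstash configs static may be a cleaner design.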