Integrating the ELK stack with Oracle DB

Hi Team,
I have a basic ELK use case: I consume syslog data from clients through Logstash, store it in Elasticsearch, and visualize it in Kibana.

In addition, the incoming syslog data contains an IP address field, but not the hostname for that IP. My requirement is to look up the hostname for a given IP address from my Oracle DB. I have a 'Device' table in my DB with 'IP Address' and 'Hostname' columns. What is the correct way to consume this info, and how?

  1. Is there a Logstash or Elasticsearch plugin that can query the DB at runtime to get this hostname info? If so, will there be any DB performance issue, since I may have many such requests in a production environment?

  2. Or should we write a script that exports the 'Device' table from the DB to a CSV file (in some folder/filepath), and then have Logstash/Elasticsearch look up the hostname info there?
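For question 1, Logstash's `jdbc_streaming` filter can do exactly this kind of per-event DB lookup, and its built-in cache limits the load on Oracle. A minimal sketch, assuming hypothetical table/column names (`device`, `ip_address`, `hostname`), event field (`client_ip`), and connection details — adjust these to your environment:

```
filter {
  jdbc_streaming {
    jdbc_driver_library   => "/path/to/ojdbc8.jar"
    jdbc_driver_class     => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@dbhost:1521/ORCL"
    jdbc_user             => "logstash"
    jdbc_password         => "secret"
    statement             => "SELECT hostname FROM device WHERE ip_address = :ip"
    parameters            => { "ip" => "client_ip" }
    target                => "device_hostname"
    # cache results so repeated IPs don't hit the DB on every event
    use_cache             => true
    cache_expiration      => 300.0
  }
}
```

With caching enabled, each distinct IP only queries Oracle once per expiration window, so DB load scales with the number of distinct devices rather than the event rate.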

Thanks

What problem are you facing with the IP?

Thanks
HadoopHelp

@rameshkr1994 See, we need to visualize our syslog data in Kibana. Recognizing each customer or site by IP address is difficult, but the hostname naming convention helps us get meaningful info about the device.
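If you prefer the CSV route from question 2 (periodically exporting the 'Device' table to a file), Logstash's `translate` filter can enrich each event from that file without ever touching the DB at event time. A sketch, assuming a hypothetical two-column CSV of `ip,hostname` pairs and a `client_ip` field on the event:

```
filter {
  translate {
    source          => "client_ip"
    target          => "device_hostname"
    # CSV exported from the Oracle 'Device' table, one "ip,hostname" row per device
    dictionary_path => "/etc/logstash/device_lookup.csv"
    # re-read the file periodically so new exports are picked up (seconds)
    refresh_interval => 300
    fallback        => "unknown"
  }
}
```

This decouples Logstash from Oracle entirely: the DB is only queried by your export script on a schedule, which is usually the safer option when production event volume is high.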

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.