Hi, I have a stream server and store its logs in Elasticsearch. The logs contain the URLs requested by clients and are similar to nginx logs, but with more fields. One of the query parameters in the URL is a product ID, and every product has a unique ID in a MySQL database.
My problem is that when I create charts in Kibana, I only have these product IDs, so I have to query MySQL separately to find out the names of the most-visited products.
Is there any way to use a text or CSV file, or even connect to that specific MySQL table, to add an extra field in Logstash?
I know there are plugins like ip2location that can use a local binary database, so maybe I should build a database like that... I have no idea, but the scenario is similar: the input is some field, and the plugin (ip2location) adds more fields based on that input field.
Has anyone had any experience with this?
Sorry for my English, and thanks a lot.
You can use the jdbc_streaming or jdbc_static filter for that. (The latter might fit your purpose better if your product names don't change frequently, since it keeps a local copy of the table instead of querying MySQL per event.)
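A minimal jdbc_static sketch for this use case. Everything specific here is an assumption: the MySQL table/column names (`products`, `id`, `name`), the event field holding the extracted ID (`[product_id]`), and the connection details all need to be adapted to your setup.

```
filter {
  jdbc_static {
    # Load the whole products table into a local lookup copy.
    loaders => [
      {
        id => "remote_products"
        query => "SELECT id, name FROM products"   # assumed table/columns
        local_table => "products"
      }
    ]
    local_db_objects => [
      {
        name => "products"
        index_columns => ["id"]
        columns => [
          ["id", "varchar(64)"],
          ["name", "varchar(255)"]
        ]
      }
    ]
    # Enrich each event by looking up its product ID locally.
    local_lookups => [
      {
        query => "SELECT name FROM products WHERE id = :pid"
        parameters => { "pid" => "[product_id]" }  # assumed field name
        target => "product"
      }
    ]
    # Refresh the local copy hourly; tune to how often names change.
    loader_schedule => "0 * * * *"
    jdbc_driver_library => "/path/to/mysql-connector-j.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/shop"
    jdbc_user => "logstash"
    jdbc_password => "secret"
  }
}
```

The lookup result lands in `[product][0][name]` (the target holds an array of matched rows), which you can then index and use directly in Kibana visualizations.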
If you have a CSV file, then a translate filter would also be an option.
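A hedged sketch of the translate approach, assuming a field called `product_id` on each event and a CSV file mapping IDs to names (both are placeholders for your actual field and path). Note that older translate versions use `field`/`destination` instead of `source`/`target`.

```
filter {
  translate {
    source => "[product_id]"           # assumed field extracted from the URL
    target => "[product_name]"
    # CSV file with one "id","name" pair per line, e.g.:
    #   "1001","Red Widget"
    #   "1002","Blue Widget"
    dictionary_path => "/etc/logstash/products.csv"  # assumed path
    refresh_interval => 300            # re-read the file every 5 minutes
    fallback => "unknown product"      # value used when the ID is not found
  }
}
```

This avoids any database connection at pipeline runtime; the trade-off is that you need a job to export the MySQL table to CSV whenever products change.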
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.