Architectural advice requested

Hi all,

I'm new to ESK and I could really use some help in figuring out the best way to set up what I'm trying to achieve. I have an SQL database that contains sales data, one row for each sold item. That table is a combination of sales from several stores. I want to combine that data with two other data sets: one is geolocation of the individual stores, which is just a fixed set of about 20 stores. The other is weather data, which obviously differs per store and even per timestamp.

I have this set-up in mind:
- Run ESK (duh!)
- Run Logstash
- Use the JDBC input plugin for Logstash to get the SQL data
- Run Kibana to visualise
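For the JDBC part I'm picturing something like this, just a rough sketch (the connection string, credentials, table and column names are all made up for my situation, and I haven't tested it yet):

```
input {
  jdbc {
    # Driver jar and class for my database (MySQL here, as an example)
    jdbc_driver_library => "/opt/logstash/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sales"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    # Pull each sold item as one event; poll every 5 minutes
    statement => "SELECT * FROM sales_items"
    schedule => "*/5 * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sales"
  }
}
```

In a real setup I'd add `use_column_value`/`tracking_column` so each run only picks up new rows instead of re-reading the whole table.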

So this is where I'm getting mixed up. For the weather data I could write my own input for Logstash and create an index of that. But that doesn't help me combine the sales data with the weather data. Would that be something Logstash can do automatically (by creating nested objects, for example), or should I write my own separate scripts to handle this?

Same goes for the geo data. The number of stores is fixed and small, so I might as well get the lat/long manually, but how would I then merge that with the sales index?

Hope you can help me in figuring this out. Thanks for any help.

What is the S in ESK?

What you can do is have a translation table (see the translate LS filter) for the geodata. Then it's easy to attach the store details to each sales record. Weather is harder, though: where do you expect to get that info from?
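As a rough sketch, something like this (the `store_id` field name and the coordinates are placeholders; note that older versions of the translate filter use `field`/`destination` instead of `source`/`target`):

```
filter {
  translate {
    # Look up the store's coordinates by its id (hypothetical field name)
    source => "store_id"
    target => "[store][location]"
    dictionary => {
      "1" => "52.3676,4.9041"
      "2" => "51.9244,4.4777"
    }
    fallback => "unknown"
  }
}
```

With 20 stores you can keep the dictionary inline like this, or point `dictionary_path` at a small YAML file. If you map `[store][location]` as a `geo_point` in your index template, Kibana's map visualisations will work on it directly.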


Hi Mark. Thanks for your reply! The S should be an L, newbie alert!

Will look into the translate LS filter, thanks for pointing me there. The weather data can be retrieved through an API like Darksky.net using lat/long (I've done this before). That should be parsable with Logstash, although that again leaves the question of merging it with the sales data.
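One option I'm considering is the logstash-filter-http plugin to enrich each event inline, something like the sketch below (the URL shape follows the Dark Sky API; the API key and the field names are placeholders from my own setup, untested):

```
filter {
  http {
    # One request per event: fetch the forecast for this store's
    # coordinates at the time of sale (field names are hypothetical)
    url => "https://api.darksky.net/forecast/MY_API_KEY/%{[store][lat]},%{[store][lon]},%{[sold_at_epoch]}"
    target_body => "[weather]"
  }
}
```

Firing one HTTP call per sales record would be slow and would burn through the API quota quickly, so in practice I'd probably pre-fetch the weather per store per hour into a lookup of its own and enrich from that instead.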

It'd be worth moving this thread to the Logstash category (just edit the subject)
