Is ELK the right tool for tracking business metrics from an RDBMS?

I am new to the ELK stack and I'm wondering if it is the right tool for tracking business metrics.

I understand it's great at DevOps, crunching logs, and monitoring application status. But what about tracking revenue over time, for example, or user growth?

This data would be stored in the application's Postgres database. Elasticsearch isn't relational, so how do you get the two to work together and keep Elasticsearch updated as new rows come in?

Another approach I could think of is to have the application itself generate separate "events" that I could send to logstash every time a certain thing occurs (new user signup, new sale...). I could serialize the data myself this way independently of what gets stored in the application's Postgres database.
A potential downside is that if I wanted to measure something I hadn't thought of earlier, I'd have to add the event generation to the code for all future events of that type, and backfill the historical data into Logstash myself. I don't think that's a huge deal, but it's not as flexible as being able to query every single column in the database right away.
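For the event-based approach, the application can serialize each business event as one JSON object per line and ship it to a Logstash TCP input. A minimal sketch, assuming a Logstash `tcp` input with a `json_lines` codec listening on port 5000; the event names, fields, and port here are all illustrative, not anything prescribed by Logstash:

```python
import json
import socket
from datetime import datetime, timezone

def business_event(event_type, **fields):
    """Build a flat JSON event; one object per line suits a json_lines codec."""
    event = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,
        **fields,
    }
    return json.dumps(event)

def send_to_logstash(line, host="localhost", port=5000):
    """Ship one newline-delimited JSON event over a plain TCP connection."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode("utf-8") + b"\n")

# Called wherever the business logic already handles the action:
signup = business_event("user_signup", user_id=42, plan="free")
sale = business_event("sale", order_id=1001, amount_usd=49.95)
```

Keeping the events flat (no nested relational structure) makes them easy to aggregate in Kibana later.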

Does this make any sense, or am I trying to use the ELK stack for something it's not meant for? If ELK isn't the right tool, then what would I need?

The Elastic Stack, formerly known as ELK, is a great tool for exploring and visualizing any type of data. Logs and DevOps are just two very popular use cases.

Logstash has a JDBC input plugin that could be used to pull data from Postgres on a scheduled basis. Each row from a Postgres table would then get inserted into Elasticsearch as a document.
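A minimal pipeline sketch, assuming a hypothetical `sales` table with an `id` and a `created_at` timestamp; the connection details, credentials, driver path, and index name would all need adjusting:

```conf
input {
  jdbc {
    jdbc_driver_library => "/path/to/postgresql-42.x.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/appdb"
    jdbc_user => "readonly_user"
    jdbc_password => "secret"
    schedule => "*/5 * * * *"   # poll every 5 minutes (cron syntax)
    statement => "SELECT id, amount, created_at FROM sales
                  WHERE created_at > :sql_last_value"
    use_column_value => true
    tracking_column => "created_at"
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "sales"
    document_id => "%{id}"      # keyed on the row id, so re-runs upsert instead of duplicating
  }
}
```

The `:sql_last_value` placeholder is how the plugin picks up only rows newer than the last run, which covers the "update as new rows come in" requirement.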

Good to know, thank you. It's not explicitly stated in the documentation, but I assume the JDBC statement can be any SQL, including one with joins?

The only difference between "devops" data and "business" data is the end user.

The structure of a log is fundamentally no different from that of a financial transaction. Both have a timestamp, associated parties (IPs or sender/receiver), values of some sort (bytes or dollars), and status codes. There is more to each, but I think that gives you an idea of what I mean.

With that in mind, there's absolutely no difference for the underlying Elastic Stack. It's just time-based data you can analyse however you want.

What confuses me is that DevOps data comes from Filebeat or from parsing logs.

Business data, on the other hand, is stored in a Postgres database, not in logs. ELK seems made to parse logs, hence my question.

It was made to be a search engine; it just happened to be good for logs and other time-based data analysis.

Yep, which is why the JDBC input exists.

It can be anything that provides a valid output (from the query).
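Since each result row becomes one Elasticsearch document, joins are actually the natural place to denormalize the relational data. A sketch assuming hypothetical `orders` and `users` tables:

```sql
-- Flatten an order and its customer into a single document per row.
SELECT o.id         AS order_id,
       o.amount,
       o.created_at,
       u.id         AS user_id,
       u.country
FROM orders o
JOIN users u ON u.id = o.user_id
WHERE o.created_at > :sql_last_value
```

Aliasing the columns matters here, since the column names in the result set become the field names of the indexed document.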
