Index / document design for API logs, plus some extra data

Hello,

I am struggling a bit with a business use case, and I am trying to see if Elastic is the right fit.

The use case is to be able to query our systems' API logs (calls to our APIs from the outside) to answer questions. The API logs contain timestamp info and the request / response payload. There are separate logs for requests and responses.

We did a V1 of this approach with Elasticsearch: two index patterns, "request*" and "response*" (each rolling over to a new index per day). There's a GUID that ties each response back to its request.
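To make that concrete, here is a minimal sketch of how I picture the V1 write path. It assumes the official elasticsearch-py 8.x client; the cluster URL, field names (`guid`, `@timestamp`, `payload`), and daily index suffix format are placeholders, and only the two-index-patterns-plus-GUID idea is from our real setup:

```python
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder cluster URL

now = datetime.now(timezone.utc)
suffix = now.strftime("%Y.%m.%d")                      # daily index, e.g. request-2024.05.01
guid = "7f3c9a2e-1b4d-4e58-9c3a-2d6f8b1a0c77"          # shared correlation id (example value)

# Request document, indexed as soon as the call comes in.
es.index(
    index=f"request-{suffix}",
    document={
        "guid": guid,
        "@timestamp": now.isoformat(),
        "api": "/orders/create",
        "payload": {"customer": 42, "items": 3},
    },
)

# Response document, indexed whenever the response is logged (possibly much later).
es.index(
    index=f"response-{suffix}",
    document={
        "guid": guid,
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "status": 200,
        "payload": {"order_id": "A-1001"},
    },
)
```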

I think I am struggling with the disparate, eventually-consistent nature of the data we have even now.

How does the V1 approach look? Given that a request will always be present and its response will eventually show up, is this the right index layout?

I am coming from the relational DB world, where I'd picture "Request" as the leading table and then eventually add rows to related tables such as "Response", etc.
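The closest thing to that join that I can see is querying both index patterns by the shared GUID at read time, so the request document never needs to be updated when the response arrives. A rough sketch, with the same placeholder client, URL, and field names as above, and assuming `guid` is mapped as a keyword:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder cluster URL

def fetch_call(guid: str) -> dict:
    """Return whatever has landed so far for one API call: the request, plus the response if it exists yet."""
    hits = es.search(
        index="request-*,response-*",        # hit both daily index patterns in one query
        query={"term": {"guid": guid}},      # assumes 'guid' is a keyword field
        size=2,
    )["hits"]["hits"]
    return {hit["_index"].split("-")[0]: hit["_source"] for hit in hits}

call = fetch_call("7f3c9a2e-1b4d-4e58-9c3a-2d6f8b1a0c77")
print(call.get("request"), call.get("response"))  # 'response' is missing until it arrives
```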

More data is expected to pile on, and I am definitely struggling with how to handle the "related" data that will eventually be added. It's just so hard to break relational DB thinking habits. The data volume is very large; almost any solution works if we keep the time horizon to 10 days, but that horizon is growing as more data comes in.
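One thing the per-day indexes do seem to buy us is cheap retention: the 10-day horizon can be enforced by dropping whole daily indices rather than deleting individual documents. A rough sketch under the same placeholder naming (in practice ILM would likely handle this instead):

```python
from datetime import datetime, timedelta, timezone
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder cluster URL

HORIZON_DAYS = 10
cutoff = datetime.now(timezone.utc) - timedelta(days=HORIZON_DAYS)

# Assumes daily indices named request-YYYY.MM.DD / response-YYYY.MM.DD as above.
for name in es.indices.get(index="request-*,response-*").body:
    day = datetime.strptime(name.split("-", 1)[1], "%Y.%m.%d").replace(tzinfo=timezone.utc)
    if day < cutoff:
        es.indices.delete(index=name)    # drop the whole day's index once it ages out
```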

My questions are:

  1. Given the case above, did we design our indexes well?
  2. Given that we want to add more related data, are we in the right neighborhood of tools with Elastic?

Thank you so much!
