Binary files decoded to C++ structs and streamed to Elasticsearch while the library is continuously updated: how to do all of that the fastest way


I am wondering whether I can use Filebeat, Logstash, or something else to stream data from binary files to Elasticsearch the way I described in the title.
I need to know the fastest way to do that.
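For context on the Filebeat option: Filebeat has no general-purpose decoder for custom binary formats, but it can tail the *output* of your own decoder. One common pattern is to have the decoder write one JSON object per line (NDJSON) and let Filebeat ship those lines to Elasticsearch. A minimal `filebeat.yml` sketch under that assumption (the path and host are placeholders):

```yaml
filebeat.inputs:
  - type: filestream
    id: decoded-structs
    paths:
      - /var/data/decoded/*.ndjson   # hypothetical output directory of your decoder
    parsers:
      - ndjson:
          target: ""                 # put the JSON fields at the document root

output.elasticsearch:
  hosts: ["localhost:9200"]          # placeholder; point at your cluster
```

With this setup, each NDJSON line becomes one Elasticsearch document, and Filebeat handles the "watch for new files" part for the decoded output.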

Here is what I am trying to do.
The files are binary-encoded to save space. After decoding each file I get human-readable C++ structs; a lot of regular C++ structs.
I want the folder to be checked for new files automatically, and each new file decoded into structs. I want every struct object to become a record in Elasticsearch, and I want every struct object to be moved to Elasticsearch automatically.
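To turn each decoded struct into an Elasticsearch record, one option is to serialize every struct as a single NDJSON line (one line per document). A minimal sketch, where the struct and its fields (`timestamp`, `sensor_id`, `value`) are made-up placeholders for whatever your real decoded structs contain:

```cpp
#include <cstdint>
#include <sstream>
#include <string>

// Hypothetical decoded record; substitute the fields of your real structs.
struct Record {
    std::uint64_t timestamp;  // e.g. epoch milliseconds
    std::uint32_t sensor_id;
    double        value;
};

// Serialize one struct as a single NDJSON line. Each line can then be
// indexed as one Elasticsearch document (e.g. via the _bulk API or Filebeat).
std::string to_ndjson(const Record& r) {
    std::ostringstream out;
    out << "{\"timestamp\":" << r.timestamp
        << ",\"sensor_id\":" << r.sensor_id
        << ",\"value\":" << r.value << "}";
    return out.str();
}
```

For real structs with strings or nested data, a JSON library (e.g. nlohmann/json or RapidJSON) is safer than hand-built strings, since it handles escaping for you.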

Please help me, it’s important.
I have googled many variations of my question and didn’t find an answer.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.