I have a general question about Elasticsearch / Kibana and whether it is the right tool for the job I am trying to do. Apologies in advance if this is the wrong place for this or if the post is too long.
I have a file that contains network device outputs such as version, network connections, power usage, etc. When I generate the file, my script places the name of the device and the command on every line of output. It looks something like this:
DeviceA, show version, v1.0
DeviceA, show connections, connected to deviceX
DeviceA, show connections, connected to deviceY
DeviceB, show version, v2.0
DeviceB, show connections, connected to deviceAlpha
DeviceB, show connections, connected to deviceBeta
What I end up with is a searchable file of state information for a large infrastructure. Currently I search this file using grep, sed, awk, uniq, wc, etc. For example, to check which devices have the most connections, I grep for "connected to", print the first column with awk, then count the unique entries. In this case I would end up with DeviceA: 2 and DeviceB: 2. Or I could just grep "show version" and see the version of every device.
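Concretely, the pipeline looks like this (state.txt is just a stand-in name for the file):

```
# Count connections per device: filter, take the device column, then tally.
grep "connected to" state.txt | awk -F',' '{print $1}' | sort | uniq -c
```

(The sort is there because uniq -c only counts adjacent duplicates.)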
With this tagged file and the standard Unix search tools and filters, I can search my entire infrastructure for an almost limitless combination of variables. My only problem is that nobody will use it, because it is command line and doesn't remotely resemble a product.
It would be great to hear opinions on whether Elasticsearch and Kibana would work for this. It would also be good to know whether Logstash is required, since these are static files. If this works, Elastic could make our entire infrastructure like a searchable pivot table. Servers, storage, and databases could all be easily added.
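My rough understanding is that each line could become a JSON document pushed straight to the _bulk API with no Logstash involved, something like the sketch below. The device-state index name, the field names, and the localhost endpoint are all placeholders I made up, not anything official:

```
# Hypothetical sketch: index name, field names, and endpoint are made up.
cat > bulk.ndjson <<'EOF'
{"index":{"_index":"device-state"}}
{"device":"DeviceA","command":"show version","output":"v1.0"}
{"index":{"_index":"device-state"}}
{"device":"DeviceA","command":"show connections","output":"connected to deviceX"}
EOF
curl -s -H 'Content-Type: application/x-ndjson' \
  http://localhost:9200/_bulk --data-binary @bulk.ndjson

# If that's right, the grep | awk | uniq -c pipeline becomes a terms
# aggregation (device.keyword should exist under the default dynamic mapping):
curl -s -H 'Content-Type: application/json' \
  http://localhost:9200/device-state/_search -d '{
    "size": 0,
    "query": { "match_phrase": { "output": "connected to" } },
    "aggs": { "connections_per_device": { "terms": { "field": "device.keyword" } } }
  }'
```

Is that the sensible way to load static files like this, or is there a better route?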
Finally, advice on how to get the required stack up and running as quickly as possible would be great. I could use a cloud provider for a proof of concept but would eventually probably host it on premises.
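For the proof of concept, I'm imagining a single-node setup with the official Docker images, roughly like this (the version tag is just an example, and I realise disabling security is only acceptable on a throwaway test box):

```
# Single-node PoC only: security and TLS disabled, example version tag.
docker network create elastic
docker run -d --name es01 --net elastic -p 9200:9200 \
  -e "discovery.type=single-node" \
  -e "xpack.security.enabled=false" \
  docker.elastic.co/elasticsearch/elasticsearch:8.13.4
docker run -d --name kib01 --net elastic -p 5601:5601 \
  -e "ELASTICSEARCH_HOSTS=http://es01:9200" \
  docker.elastic.co/kibana/kibana:8.13.4
```

Kibana would then be at http://localhost:5601. Does that sound like a reasonable starting point?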
Thanks!