Looking for some help implementing collection of disk and file system (file and directory) usage metrics with Logstash, storing the results in Elasticsearch.
Basically, we run data processing workflows and want to evaluate server performance alongside data access and data storage. We are using Topbeat with Logstash, Elasticsearch and Kibana, and it would be good to have a comprehensive time-series data set that can all be easily analysed. We will have application logs to store along with the disk and file system usage metrics (and/or logs).
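On the ingestion side, a minimal Logstash pipeline that tails a JSON-lines metrics file and indexes the events into Elasticsearch might look like the sketch below. The file path, hosts and index name are illustrative assumptions, not settings from our environment:

```
input {
  file {
    path  => "/var/log/metrics/disk_usage.json"   # hypothetical metrics log
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "disk-metrics-%{+YYYY.MM.dd}"        # daily index, illustrative
  }
}
```

With a layout like this, the disk metrics would land next to the application logs and Topbeat data in Elasticsearch, so Kibana can query them as one time series.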
We also have collectd (df and disk plugins) storing data in InfluxDB for redundancy.
Have looked at the 'df' and 'du' commands, and they seem to produce the needed information, which could be scripted, logged, parsed and collected. Not sure if this is the way to go.
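To make the scripted approach concrete, here is a minimal sketch in Python of what such a collection script could look like. Rather than parsing `df` output, it uses the standard library's `shutil.disk_usage` to read the same figures directly and emits one JSON line per mount point, which Logstash could then tail. The field names and the list of mount points are illustrative assumptions, not an established schema:

```python
#!/usr/bin/env python3
"""Emit file system usage as JSON lines, one event per mount point.

A sketch of the df-style scripted approach: run it from cron and append
the output to a log file for Logstash to pick up.
"""
import json
import shutil
from datetime import datetime, timezone


def disk_usage_event(path):
    """Return a dict of usage figures for the file system containing `path`."""
    usage = shutil.disk_usage(path)  # total/used/free, in bytes
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "path": path,
        "total_bytes": usage.total,
        "used_bytes": usage.used,
        "free_bytes": usage.free,
        "used_pct": round(100.0 * usage.used / usage.total, 2),
    }


if __name__ == "__main__":
    # Mount points to watch -- adjust for your servers (illustrative).
    for mount in ("/",):
        print(json.dumps(disk_usage_event(mount)))
```

`du`-style per-directory totals would need a separate walk (e.g. `os.scandir` recursion), since `shutil.disk_usage` only reports whole file systems.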
Hope there are plenty of operations people out there who have addressed this use case.
Grateful for assistance.