I'm thinking about how to monitor an LDAP server. For example, I'd like to record the state of the LDAP server every hour and be able to answer questions like:
- How many users are active on LDAP?
- How many groups are active on LDAP?
- How many users are blocked on LDAP?
I'm thinking of using a Python script that runs every hour, extracts the LDAP information, and saves the result to a CSV file. Then I'd like to use Logstash to load the file into Elasticsearch.
Do you know a plug-in or best practice for doing this?
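The hourly collection step could be sketched like this. This is a minimal sketch, not a full implementation: in practice the `entries` list would come from an LDAP search (e.g. with the `ldap3` library), and the `blocked` attribute name is a placeholder, since real directories mark locked accounts differently depending on the server.

```python
import csv
from datetime import datetime, timezone

def summarize_entries(entries):
    """Count active users, active groups, and blocked users from LDAP
    search results represented as plain dicts of attributes.

    The "blocked" attribute is hypothetical; substitute whatever your
    directory uses to mark a locked account.
    """
    active_users = sum(
        1 for e in entries
        if e.get("objectClass") == "person" and not e.get("blocked")
    )
    blocked_users = sum(
        1 for e in entries
        if e.get("objectClass") == "person" and e.get("blocked")
    )
    active_groups = sum(
        1 for e in entries if e.get("objectClass") == "groupOfNames"
    )
    return active_users, active_groups, blocked_users

def append_csv_row(path, entries):
    """Append one hourly snapshot row to the CSV file read by Logstash."""
    row = (datetime.now(timezone.utc).isoformat(),) + summarize_entries(entries)
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(row)
```

Run from cron (or a scheduler) once an hour, this appends one timestamped row per snapshot, which keeps the Logstash side simple.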
In order to give the most helpful answer about querying for the statistics you describe, it would be best if you presented exactly what information you are going to be indexing on an hourly basis.
As far as indexing from your CSV file with Logstash, you'll want to investigate the CSV filter plugin.
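A minimal Logstash pipeline using the CSV filter plugin might look like the following. The file path, column names, and index name are assumptions matching the hourly-snapshot idea above, not values from this thread:

```
input {
  file {
    path => "/var/log/ldap_stats.csv"   # hypothetical path to the hourly CSV
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["timestamp", "active_users", "active_groups", "blocked_users"]
    convert => {
      "active_users"  => "integer"
      "active_groups" => "integer"
      "blocked_users" => "integer"
    }
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ldap-stats-%{+YYYY.MM.dd}"
  }
}
```

The `date` filter sets `@timestamp` from the snapshot time rather than the ingest time, which is what you want for hourly trend dashboards.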
Thank you, Glen.
I think I could use a Python script to collect the information, transform it into JSON format, and then store it in Elasticsearch. What do you think of this method?
On the other hand, I could use a CSV file with Logstash, but I think JSON is better. Do you agree?
Either of these methods of indexing is fine. Which is preferable is probably most dependent on personal preference/situational criteria. The method of indexing, though, doesn't really affect the ability to subsequently use queries to extract the information you describe. That's all up to the mapping/document structure you choose.
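Since the queries depend on the mapping rather than on the ingest path, an explicit mapping for the snapshot documents might look like this (index and field names are assumptions carried over from the sketches above):

```
PUT ldap-stats
{
  "mappings": {
    "properties": {
      "@timestamp": { "type": "date" },
      "ldap": {
        "properties": {
          "active_users":  { "type": "integer" },
          "active_groups": { "type": "integer" },
          "blocked_users": { "type": "integer" }
        }
      }
    }
  }
}
```

With the counters mapped as integers and the timestamp as a date, the three questions at the top of the thread become simple date-histogram or latest-value queries.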
Thank you Glen.
In the end I used the Python script to create some reports, stored them in Elasticsearch with Logstash, and then created a dashboard from the different indices used to collect the information.