How to aggregate data from two different lines of an LDAP log


(Noureddine Brahmi) #1

Hello, I'm new to the ELK stack, but I'm really enjoying working with it so far.

I'm currently working on a solution that allows us to monitor LDAP logs, and I was wondering if I could display data from several documents in one line, knowing that they have a field in common.

Here's an example to make things clear:

LDAP log lines:

*[04/Aug/2018:22:34:15 +0200] conn=184214 op=-1 msgId=-1 - fd=52 slot=52 LDAP connection from 10.169.146.54:3625 to 10.68.27.65
*[04/Aug/2018:22:34:15 +0200] conn=184214 op=0 msgId=1 - BIND
dn="cn=azerty,ou=manager,o=s" method=128 version=3
*[04/Aug/2018:22:34:15 +0200] conn=184214 op=0 msgId=1 - RESULT err=0 tag=97 nentries=0 etime=0.000450 dn="cn=poiuyr,ou=manager,o=s"

Every field in these log lines is already parsed with a grok pattern. I want to display, for example, in one chart: the IP address of the connection (10.169.146.54), the BIND dn (dn="cn=azerty,ou=manager,o=s"), and the etime of the result (etime=0.000450). The field they all have in common is the connection number (conn=184214).
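For context, the connection line can be parsed with a grok pattern along these lines (a sketch only; the field names here are illustrative, not the ones our pipeline actually uses):

```conf
filter {
  grok {
    # Illustrative pattern for the "LDAP connection" line.
    # HTTPDATE matches timestamps like 04/Aug/2018:22:34:15 +0200,
    # and INT accepts negative values such as op=-1.
    match => {
      "message" => "\[%{HTTPDATE:timestamp}\] conn=%{INT:conn} op=%{INT:op} msgId=%{INT:msg_id} - fd=%{INT:fd} slot=%{INT:slot} LDAP connection from %{IP:client_ip}:%{INT:client_port} to %{IP:server_ip}"
    }
  }
}
```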

Currently, in the charts available from Discover, I only get the fields that are in the same document.

Thank you very much for the help !


(Chris Davies) #2

I can't think of a good way to do that. It's similar to a SQL join, but for performance reasons, Elasticsearch doesn't do joins.

Generally, the strategy is to store your data in a format that is search-friendly for your intended use cases. So, for example, instead of storing your LDAP logs as one document per log-file line, you'd store them as one document per IP address, maybe with a new entry per day / week / whatever time interval makes sense, and update the appropriate document as new log info comes in.

That may not be the best approach, but that's the general idea. Can you store your data in a way that makes search and retrieval easy and efficient?
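A minimal sketch of that idea, assuming Logstash is doing the indexing: use the shared key (conn here) as the document ID and upsert into an entity-centric index, so fields parsed from the connection, BIND and RESULT lines all accumulate in one document. The index name below is made up for illustration:

```conf
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "ldap-sessions"   # hypothetical entity-centric index
    document_id   => "%{conn}"         # the key shared by all three log lines
    action        => "update"
    doc_as_upsert => true              # create the doc on first sight, merge fields afterwards
  }
}
```

Each new line for the same conn then merges its parsed fields into the same document, so a single chart can show the client IP, the BIND dn and the etime together.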


(Noureddine Brahmi) #3

Hello Chris, and thank you for your response.

Unfortunately, I can't store the data in another way at the moment, because within my company another team is in charge of the Logstash parsing of the logs, and we would need a budget to make any change (we don't have one right now). So I've been looking for another solution using Kibana 6.3.


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.