Hello!
I'm trying to add Squid and MS Exchange logs to my ELK cluster. We need to store information about where users go on the Internet, along with user mail forwarding logs, for the last 180 days.
The problem with Squid is that a separate document is created for each line of the log, so the volume of logs becomes too large: more than 60 GB per day, which is too much for us. Is it possible to reduce the size of the indexes by combining each user's data into a single document?
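I was wondering whether something like the Logstash aggregate filter is the right approach here. Below is a rough sketch of what I have in mind; the "user" and "url" field names are placeholders coming from my own parsing stage, and the timeout value is a guess:

```
filter {
  aggregate {
    # collect all Squid lines for one user into a single map;
    # "user" and "url" are placeholder field names from my parsing stage
    task_id => "%{user}"
    code => "
      map['urls'] ||= []
      map['urls'] << event.get('url')
      map['requests'] ||= 0
      map['requests'] += 1
      event.cancel()
    "
    # emit one combined document per user when the timeout expires
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "user"
    timeout => 300
  }
}
```

As far as I understand, the aggregate filter only works correctly with a single pipeline worker (-w 1), so I'm not sure it can keep up with our volume.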
The problem with Exchange is a little different. I thought I could filter the Exchange log to cut off excess fields, keeping only Timestamp, Sender, Recipients, and MessageSubject, and also drop entries generated by system accounts. All remaining data would then be written to one daily index as a single document. But I'm not sure this is technically possible, and I don't know whether it would still be possible to search in such indexes. For example, in Exchange I can get all the necessary logs with one command:
Get-MessageTrackingLog -Sender user@domain.com | ft Timestamp, Sender, Recipients, MessageSubject -AutoSize

Is this possible in Elasticsearch?
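To make the Exchange idea concrete, this is roughly the kind of Logstash filter I imagine; the HealthMailbox pattern is just an example of a system account, and the field names assume my own parsing stage:

```
filter {
  # drop tracking entries generated by system accounts (pattern is only an example)
  if [sender] =~ /^HealthMailbox/ {
    drop { }
  }
  # keep only the fields we actually need
  prune {
    whitelist_names => ["^@timestamp$", "^sender$", "^recipients$", "^message_subject$"]
  }
}
```

And I assume the Get-MessageTrackingLog search above would translate into a query along these lines, provided sender is mapped as a keyword field (the exchange-* index name is just an example):

```
GET exchange-*/_search
{
  "_source": ["@timestamp", "sender", "recipients", "message_subject"],
  "query": {
    "term": { "sender": "user@domain.com" }
  }
}
```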
I saw some threads about Squid and Exchange logging in ELK, but it looks like they didn't have a problem with index volume.
I'm not sure whether my question relates to Logstash, Filebeat, or Elasticsearch. If a moderator can, please move this topic to the appropriate section.