Can't seem to find a solution for this. Here's the situation:
I am setting up a PoC environment with two Azure event hubs configured to receive Azure AD data (each fed from a separate environment).
I have a Filebeat instance I'm trying to pull the data in with, which passes it off to a Logstash instance that eventually sends to ES Cloud.
Filebeat works fine if I set up a single azure module definition pointing at a specific event hub. It doesn't appear that I can duplicate the block to point at a second event hub.
What would be the recommended setup to achieve this goal? I'm kind of new to filebeat modules, so I'm not entirely sure if in this instance I even NEED the module, or if plain inputs are fine?
Yeah, so for the moment assume they are both activitylogs. (The point about the different log types is good, but the setup isn't that far yet...and likely we'd have different ones for the different log types).
I've kind of figured out that with the Azure module, if you put two entries in like so:
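For illustration, a sketch of what I mean in filebeat.yml (event hub names, connection strings, and storage account details are placeholders, not my real values):

```yaml
filebeat.modules:
  - module: azure
    activitylogs:
      enabled: true
      var:
        eventhub: "insights-activity-logs-env1"     # placeholder
        consumer_group: "$Default"
        connection_string: "<connection-string-1>"  # placeholder
        storage_account: "examplestorage1"          # placeholder
        storage_account_key: "<key-1>"              # placeholder
  - module: azure
    activitylogs:
      enabled: true
      var:
        eventhub: "insights-activity-logs-env2"
        consumer_group: "$Default"
        connection_string: "<connection-string-2>"
        storage_account: "examplestorage2"
        storage_account_key: "<key-2>"
```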
The second one will be "active" and the first one ignored. If I flip them around, it's the other entry that works.
You might wonder why we're doing this. It's a bit of a multi-tenant environment, and the separate event hubs are part of how this particular client separates data... not something I can directly change.
What we've done as a band-aid for the moment is run two Filebeat agents on the same box we're pulling these logs from (using different filebeat.yml files and path.data paths). That does work, but it feels a bit clunky, and I don't know if it's the recommended approach.
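For reference, the workaround looks roughly like this (config file names and data paths are made up for illustration):

```shell
# Two independent Filebeat instances on one box,
# each with its own config file and its own registry/data directory
filebeat -c /etc/filebeat/filebeat-hub1.yml --path.data /var/lib/filebeat-hub1 &
filebeat -c /etc/filebeat/filebeat-hub2.yml --path.data /var/lib/filebeat-hub2 &
```

Keeping path.data separate matters, since each instance needs its own registry to track its event hub checkpoint independently.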
That won't work as-is. What version of Beats are you using? There was a bug with how the filebeat.modules section was parsed, which has since been fixed. If you copy the config you have into modules.d/azure.yml, it will work as expected.
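In other words, something along these lines in modules.d/azure.yml (all values here are placeholders):

```yaml
# modules.d/azure.yml -- two entries, one per event hub
- module: azure
  activitylogs:
    enabled: true
    var:
      eventhub: "insights-activity-logs-env1"
      consumer_group: "$Default"
      connection_string: "<connection-string-1>"
      storage_account: "examplestorage1"
      storage_account_key: "<key-1>"
- module: azure
  activitylogs:
    enabled: true
    var:
      eventhub: "insights-activity-logs-env2"
      consumer_group: "$Default"
      connection_string: "<connection-string-2>"
      storage_account: "examplestorage2"
      storage_account_key: "<key-2>"
```

Remember to enable the module (filebeat modules enable azure) so the file in modules.d is actually loaded.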