Hi @jasonwomack, welcome to our Discuss forum, and thanks for the post!
An optimal data ingestion architecture for your SIEM can depend on several factors that may be specific to your environment, such as:

- the number of distinct data sources from which you want to collect events,
- the scale of events you expect the SIEM to process,
- your organization's existing enterprise information architecture,
- the availability of parsers/shippers that format data into Elastic Common Schema (ECS) format, and
- your team's ability and willingness to create and maintain new parsers/shippers if necessary.
Elastic SIEM requires data (logs, events, alerts, etc.) to be normalized into ECS format. You can learn about ECS on our ECS documentation page, and there's a handy SIEM reference page that tells you which ECS fields the SIEM app relies upon.
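For illustration, here's roughly what a log event normalized to ECS might look like once it lands in Elasticsearch (the field names come from the ECS spec; the values here are hypothetical):

```json
{
  "@timestamp": "2019-10-01T12:00:00.000Z",
  "event": {
    "category": "authentication",
    "outcome": "failure"
  },
  "source": { "ip": "10.0.0.5" },
  "user": { "name": "alice" },
  "host": { "name": "web-01" }
}
```

The key point is that every source, whatever its native format, ends up using the same field names (`source.ip`, `user.name`, etc.), which is what lets the SIEM app correlate events across sources.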
The easiest way to accomplish ECS normalization is to use the Elastic-supplied Beats modules, which convert data from its original format into ECS while shipping it to Elasticsearch, where the SIEM app can work with it. Beats can run directly on end systems, or you can designate a dedicated system in your environment to run Beats (e.g., Filebeat) and send logs from your other systems to it for ECS conversion and shipment to Elasticsearch. You can find a list of currently available data source integrations on our Security Integrations web page.
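As a rough sketch of that setup, a minimal `filebeat.yml` looks something like this (the hostname and credentials below are placeholders for your environment):

```yaml
# filebeat.yml — minimal sketch; replace hosts/credentials with your own
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:9200"]
  username: "filebeat_writer"
  password: "${ES_PWD}"
```

You'd then enable the module for a given data source (e.g., `filebeat modules enable system`) and run `filebeat setup` once to load the index templates and dashboards.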
If you don't find a Beats module for one of your data sources, you could also consider creating one yourself. Here's a good blog post that describes how one user created a new Filebeat module for their endpoint security product, so they could use it with Elastic SIEM.
Logstash is not required to use the SIEM app, and currently it may require more effort on your team's part to find or create Logstash configurations that parse your data and convert it to ECS.
Logstash might be appropriate if your enterprise information architecture requires sending data streams to multiple analytics systems, or already uses message queueing, or has special requirements for ingest-time processing that only Logstash can perform.
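If you do go the Logstash route, a pipeline typically sits between Beats and Elasticsearch, and any ECS mapping you need is done in the filter stage. A minimal sketch (the `src_ip` field and the host below are hypothetical examples, not from any particular product):

```conf
# Logstash pipeline sketch — hosts and field names are placeholders
input {
  beats {
    port => 5044
  }
}

filter {
  # Rename source-specific fields to their ECS equivalents
  mutate {
    rename => { "src_ip" => "[source][ip]" }
  }
}

output {
  elasticsearch {
    hosts => ["https://elasticsearch.example.com:9200"]
  }
}
```

This is also the stage where you'd fan the stream out to other outputs (message queues, additional analytics systems) if your architecture calls for it.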
Hope this is helpful, and please let us know how you're doing.