I'm working on a project to integrate IBM QRadar SIEM with Elastic Security. The goal is to use Elastic specifically for its SOAR capabilities (Case Management, Automation, Response Actions) while keeping QRadar as the primary log collector.
The setup:
All logs are currently ingested by QRadar.
I want to forward these logs to Elastic without redeploying agents (like Elastic Agent) to the endpoints.
Elastic should act as the incident management and orchestration layer.
My questions:
Ingestion Strategy: What is the best practice for forwarding data from QRadar? Should I use a standard Syslog Destination (LEEF format) or poll the QRadar API for Offenses/Events via Logstash?
ECS Mapping: Does anyone have experience mapping QRadar LEEF fields to Elastic Common Schema (ECS)? I'm looking for Logstash configurations or Ingest Pipelines to ensure the "Security" app in Elastic correctly recognizes the data.
SOAR Efficiency: Since I won't have Elastic Agents on the hosts for "Response Actions" (like host isolation), how far can I get with Webhook/Rest API connectors for automated response?
Any advice, architecture diagrams, or common pitfalls would be greatly appreciated. Thanks!
I think this scenario is pretty rare and I'm not sure if this makes much sense to implement.
The features you mentioned basically require that your data is indexed in Elasticsearch, so you would need to duplicate the data you have in QRadar into Elastic and create the Security rules in Elasticsearch, not QRadar.
So to use the SIEM features you would need to have your data duplicated across two different tools.
This depends a lot on your data sources: what kind of data are you indexing in QRadar? You would need to send the raw messages to Elasticsearch and write the parsers yourself, or, if a source has a native integration, you could reuse that integration's ingest pipelines to parse the data. Keep in mind that those pipelines expect the raw data without any changes to its format.
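If you end up writing parsers yourself, the core task is turning LEEF key=value payloads into ECS field names. A minimal sketch of that idea is below; the key-to-ECS table is illustrative and far from complete, and the `qradar.*` namespace for unmapped keys is my own convention, not anything standard:

```python
# Illustrative mapping of a few common LEEF attribute keys to ECS fields.
LEEF_TO_ECS = {
    "src": "source.ip",
    "dst": "destination.ip",
    "srcPort": "source.port",
    "dstPort": "destination.port",
    "usrName": "user.name",
}

def leef_to_ecs(line: str) -> dict:
    """Parse 'LEEF:1.0|Vendor|Product|Version|EventID|k=v<tab>k=v'
    into a flat dict keyed by ECS-style field names."""
    parts = line.split("|", 5)
    if len(parts) < 6 or not parts[0].startswith("LEEF"):
        raise ValueError("not a LEEF 1.0 record")
    _, vendor, product, version, event_id, attrs = parts
    doc = {
        "event.code": event_id,
        "observer.vendor": vendor,
        "observer.product": product,
    }
    # LEEF 1.0 attributes are tab-delimited key=value pairs.
    for pair in attrs.split("\t"):
        key, _, value = pair.partition("=")
        ecs_key = LEEF_TO_ECS.get(key)
        if ecs_key:
            doc[ecs_key] = value
        else:
            # Keep unmapped keys under a custom namespace so nothing is lost.
            doc[f"qradar.{key}"] = value
    return doc
```

The same logic translates fairly directly into a Logstash `kv` filter plus a `mutate { rename => ... }` block, or into an ingest pipeline with `kv` and `rename` processors.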
This can be done using Logstash as a proxy to receive the data and route it to the correct ingest pipeline, but since you are not using Elastic Agent to collect the data, this may require a lot of work.
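As a rough sketch of that proxy idea, a Logstash pipeline could listen for QRadar's syslog forwarding and set the `pipeline` option on the Elasticsearch output; the host, port, index, and pipeline name below are placeholders you would replace with your own:

```
input {
  tcp {
    port => 5514          # QRadar syslog destination points here
    type => "qradar"
  }
}
output {
  elasticsearch {
    hosts    => ["https://elastic.example.com:9200"]
    index    => "logs-qradar-default"
    pipeline => "logs-your_integration.your_dataset"  # the integration's ingest pipeline
  }
}
```

This only works cleanly if the forwarded message still looks like the raw event the integration's pipeline expects, which is exactly the caveat above.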
This depends on the license: to use webhooks you need a paid license, Platinum or Enterprise. Without it, the only actions Kibana can take are writing the alert to an index or writing it to the Kibana log file.
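Assuming you do have a Platinum or Enterprise license, a webhook connector is just a small definition you create in Kibana (via Stack Management or the connectors API). A hedged sketch of the connector configuration, with placeholder URL and credentials:

```json
{
  "name": "qradar-soar-webhook",
  "connector_type_id": ".webhook",
  "config": {
    "url": "https://soar.example.com/hooks/elastic",
    "method": "post"
  },
  "secrets": {
    "user": "svc-elastic",
    "password": "changeme"
  }
}
```

Since you have no Elastic Agents on the hosts, actions like host isolation are off the table; webhook connectors can only call out to whatever REST APIs your firewalls, EDR, or QRadar itself expose.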
You could use Logstash to read the alerts index and make requests to a webhook, but again, you would need to write those pipelines yourself.
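A sketch of that workaround, assuming the default Security alerts index naming; the alert index pattern, hosts, schedule, and webhook URL are all placeholders to adjust:

```
input {
  elasticsearch {
    hosts    => ["https://elastic.example.com:9200"]
    index    => ".alerts-security.alerts-default"
    query    => '{ "query": { "range": { "@timestamp": { "gte": "now-5m" } } } }'
    schedule => "*/5 * * * *"   # poll every five minutes
  }
}
output {
  http {
    url         => "https://soar.example.com/hooks/new-alert"
    http_method => "post"
    format      => "json"
  }
}
```

Note this polling approach adds latency and gives you none of the retry/dedup handling a native connector provides, so it is a workaround rather than a replacement.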
What exactly do you want to send from QRadar to Elastic? Just the alerts, or the raw data?