I'm trying to use Logstash to send data directly to a Splunk HTTP Event Collector (HEC). I can post events to the HEC endpoint directly and they are accepted correctly, but I'm having trouble translating that into the appropriate HTTP output config for Logstash.
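For reference, a direct submission to HEC looks roughly like this (the host, port, and token below are placeholders, not my actual values):

    curl -k https://splunk.example.com:8088/services/collector/event \
      -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
      -d '{"event": "hello from curl", "sourcetype": "manual"}'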
There may not be a lot of folks here who use Logstash with Splunk, so I can't tell you much about the HTTP output. Splunk did recently announce improved support for ingesting from Kafka (about time!), so perhaps you could go...
logstash -> kafka -> splunk
Of course many of us here would encourage you to just store the data in Elasticsearch instead of Splunk.
I was able to successfully send data directly to the HTTP Event Collector (HEC) using the Logstash HTTP output. The keys were to use the 'raw' input and to have a valid certificate for the destination; I'm sure a JKS truststore would work as well. HEC expects JSON, and make sure acknowledgement is off on the HEC side.
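Something along these lines should work for the output section (a sketch only; the hostname, port, and token are placeholders, and the /services/collector/raw path assumes you are using the 'raw' endpoint):

    output {
      http {
        # Splunk HEC 'raw' endpoint; 8088 is the default HEC port
        url => "https://splunk.example.com:8088/services/collector/raw"
        http_method => "post"
        # Send each event as JSON, which is what HEC expects
        format => "json"
        content_type => "application/json"
        # HEC authenticates with a token passed in the Authorization header
        headers => {
          "Authorization" => "Splunk 00000000-0000-0000-0000-000000000000"
        }
        # If the server certificate isn't signed by a publicly trusted CA,
        # point the plugin at your CA bundle or a JKS truststore instead:
        # cacert => "/path/to/ca.pem"
        # truststore => "/path/to/truststore.jks"
        # truststore_password => "changeit"
      }
    }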