I am delivering logs from a Pivotal web service (based on Cloud Foundry) to a remote ELK stack.
On Pivotal I've created a log drain service which delivers logs to Logstash over a syslog TCP connection. Everything works fine, but apparently anyone could connect to my remote box and write data.
Pivotal allows delivering a custom key/value pair which could serve as an auth mechanism, but I didn't see a simple password option available in the Logstash TCP input plugin.
I am curious whether anyone has had a similar scenario and can describe how they solved it?
Current config:
input {
  tcp {
    port => 5000
    type => syslog
  }
}
On Pivotal:
Syslog Drain URL: syslog://example.com:5000
Good question.
Now that I think about it, those key/value pairs are only for internal authorization (when an internal service is bound to an internal app).
I don't know much about syslog as a protocol, but in the example above my naive thinking would be that I should be able to provide some kind of auth token, mockup:
syslog://example.com:5000?token=TOKEN_VALUE
which I should somehow be able to validate on the Logstash end.
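For what it's worth, here is a minimal sketch of what that validation could look like on the Logstash side, assuming the token actually ends up embedded in each log line (SHARED_TOKEN is a hypothetical placeholder value, not anything Pivotal provides):

filter {
  # Drop any event whose message does not carry the shared token
  # (SHARED_TOKEN is a made-up placeholder).
  if [message] !~ /SHARED_TOKEN/ {
    drop { }
  }
  # Strip the token so it doesn't end up stored in Elasticsearch.
  mutate {
    gsub => [ "message", "SHARED_TOKEN", "" ]
  }
}

Keep in mind the token would travel in plaintext over TCP, so anyone who can sniff the traffic could replay it; this only keeps out casual writers, not a determined attacker.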
Yes, I was searching on Google but couldn't find any descriptive how-to for the Logstash TCP input plugin with SSL. I would be able to put a certificate and key on my ELK box and set up the Logstash input, but I don't have a clue how the Pivotal web service would then authenticate itself.
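For reference, the TCP input does support TLS. A minimal sketch could look like the following (option names vary between plugin versions, and the certificate paths are placeholders); on the Pivotal side the drain would switch to the syslog-tls:// scheme, e.g. syslog-tls://example.com:5000:

input {
  tcp {
    port => 5000
    type => syslog
    # Enable TLS on the listener (newer plugin versions call this ssl_enabled)
    ssl_enable => true
    # Server certificate and private key (placeholder paths)
    ssl_cert => "/etc/logstash/certs/logstash.crt"
    ssl_key  => "/etc/logstash/certs/logstash.key"
    # Require and verify a client certificate so only trusted
    # senders can connect (mutual TLS)
    ssl_verify => true
    ssl_certificate_authorities => [ "/etc/logstash/certs/ca.crt" ]
  }
}

Whether the Cloud Foundry drain can present a client certificate is exactly the open question here; if it can't, ssl_verify => true would reject its connections, and TLS alone would give you encryption but not sender authentication.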
Option 2: I could probably set up iptables and drop anything on port 5000 that isn't coming from *.cf-apps.io.
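If iptables isn't available, roughly the same effect can be approximated inside Logstash with the cidr filter plugin, keyed on the host field that the TCP input adds to each event. A sketch (the network range below is a made-up placeholder; *.cf-apps.io won't resolve to a single stable range, so you would need the platform's actual egress IPs):

filter {
  # Tag events whose source address falls inside the trusted range
  # (203.0.113.0/24 is a placeholder; substitute the real egress IPs).
  cidr {
    address => [ "%{host}" ]
    network => [ "203.0.113.0/24" ]
    add_tag => [ "trusted_source" ]
  }
  # Drop everything that didn't get tagged.
  if "trusted_source" not in [tags] {
    drop { }
  }
}

Filtering in iptables is still preferable when possible, since the connection is refused before Logstash does any work.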
Option 3 (the token) seems easiest to achieve, but it would require sending the token with each log entry, along the lines of the filter sketched in the reply above.