I am delivering logs from a Pivotal web service (based on Cloud Foundry) to a remote ELK stack.
On Pivotal I've created a log drain service that delivers logs to Logstash over a syslog TCP connection. Everything works fine, but apparently anyone could connect to my remote box and write data.
Pivotal allows delivering a custom key/value pair that could serve as an auth mechanism, but I didn't see a simple password option in the Logstash TCP input plugin.
Has anyone had a similar scenario, and if so, how did you solve it?
```
tcp {
  port => 5000
  type => "syslog"
}
```
Syslog Drain URL
Now that I think about it, these key/value pairs are probably only for internal authorization (when an internal service is bound to an internal app).
I don't know much about syslog as a protocol. In the example above, my naive thinking is that I should be able to provide some kind of auth token; mockup:
which I should then somehow be able to validate on the Logstash end.
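For illustration, a drain is registered as a user-provided service with a syslog URL; the idea would be to smuggle a shared secret into it. This is only a sketch — the `token` query parameter and the host name are hypothetical placeholders, and Cloud Foundry does not necessarily forward URL query parameters into the syslog payload:

```
cf create-user-provided-service my-log-drain -l "syslog://logs.example.com:5000?token=SECRET"
cf bind-service my-app my-log-drain
```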
No, that's not possible. The only means of authentication for a tcp input are
- SSL client certificate validation,
- network firewalling, or
- custom filtering of the payload (e.g. requiring clients to prefix the payload with an authentication token and dropping all events that don't include it).
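For the first option, the TCP input can terminate TLS and require a client certificate. A minimal sketch, assuming you have generated a server certificate/key and a CA that signs your clients' certificates (all file paths are placeholders):

```
input {
  tcp {
    port => 5000
    type => "syslog"
    ssl_enable => true
    ssl_cert   => "/etc/logstash/certs/server.crt"
    ssl_key    => "/etc/logstash/certs/server.key"
    # require and validate a certificate from every connecting client
    ssl_verify => true
  }
}
```

With `ssl_verify => true`, clients that cannot present a certificate trusted by the server are rejected at the TLS handshake, before any events reach the pipeline.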
Yes, I tried searching on Google but couldn't find any descriptive how-to for the Logstash TCP input plugin with SSL. I could put a certificate and password on my ELK box and set up the Logstash input, but I have no clue how the Pivotal web service would then authenticate itself.
For option 2, I could probably set up iptables and drop anything on port 5000 that isn't coming from *.cf-apps.io.
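One caveat with the firewall route: iptables matches IP addresses, not host names, so `*.cf-apps.io` would have to be translated into the IP range the platform egresses from. A sketch, assuming that range is known (the CIDR below is a documentation placeholder):

```
# allow syslog traffic only from the platform's egress range (placeholder CIDR)
iptables -A INPUT -p tcp --dport 5000 -s 203.0.113.0/24 -j ACCEPT
# drop everything else arriving on the drain port
iptables -A INPUT -p tcp --dport 5000 -j DROP
```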
Option 3 seems the easiest to achieve, but it would require sending a token with each log entry.
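On the Logstash side, the token check for option 3 could be a conditional in the filter stage — drop any event whose payload doesn't carry the shared secret, and strip the secret before indexing. A sketch with a placeholder token:

```
filter {
  if [message] !~ "SECRET_TOKEN" {
    # reject events from unauthenticated senders
    drop { }
  } else {
    # remove the token so it is not stored in Elasticsearch
    mutate { gsub => ["message", "SECRET_TOKEN", ""] }
  }
}
```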
A creative poor man's idea would be to use the Pivotal app GUID, which could serve as an app secret; they confirmed it's unique and persists across instances:
```
cf app <YOUR_APP> --guid
```
and then expect it in the Logstash filter, where it appears as:
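Since Cloud Foundry drains emit RFC 5424 syslog lines with the app GUID in the APP-NAME field, one way to act on that idea is to parse the line with the stock `SYSLOG5424LINE` grok pattern and compare the extracted app name against the known GUID. A sketch (the GUID below is a placeholder for the value `cf app --guid` returns):

```
filter {
  # parse the RFC 5424 line; yields syslog5424_host, syslog5424_app, etc.
  grok {
    match => { "message" => "%{SYSLOG5424LINE}" }
  }
  # keep only events whose APP-NAME matches our app's GUID
  if [syslog5424_app] != "f47ac10b-58cc-4372-a567-0e02b2c3d479" {
    drop { }
  }
}
```

Note that the GUID is not really a secret — it travels in clear text with every log line — so this only raises the bar slightly compared to a proper shared token or TLS client certificates.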
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.