I even opened the stderr and stdout files in vim, and I still don't see the logs in Kibana. Is it my path definition?
But Logstash does seem to read the path, going by its logs.
It could read the path /var/lib but not /var/lib/mesos, both owned by root. Also, I'm using Logstash 2.3.4, and Elasticsearch and Kibana version 5.5.1.
@guyboertje As of now, only that version of the Logstash package is available on DCOS, and yes, Logstash is running as a service. By default the Logstash package runs as the root user.
There is not really a workaround for this. If Dir.glob('path') in the Logstash IRB shell does not return an array of files, then the file input will not discover any files either, because Dir.glob is exactly what the file input does under the hood.
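A minimal sketch of that check, run in the Ruby shell that ships with Logstash, as the same user Logstash runs as (the glob pattern below is only an example guess at your layout, not your actual config):

files = Dir.glob('/var/lib/mesos/**/*.log')  # example pattern; substitute your real path setting
puts files.inspect  # an empty array here means the file input will not discover anything either

If this prints [] while the files really exist, the problem is almost certainly that the user running the glob cannot traverse one of the directories in the path.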
I think this is a DCOS permissions issue. How do you know that files exist at that deep level of nesting?
This link has some output mentioning a path that resembles what you are trying to glob.
@Suman_Reddy1 - wildcard characters are supported in the path setting of the file input.
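For example, a file input along these lines should work (the path here is an illustrative assumption; point it at wherever your logs actually live):

input {
  file {
    # Glob patterns such as * and ** are allowed in path.
    path => "/var/lib/mesos/**/*.log"
    start_position => "beginning"
  }
}

Whatever pattern you use, it still has to be readable by the user Logstash runs as.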
I don't know DCOS. Is there a community forum where you can ask?
If ls -l /var/lib/mesos gives permission denied, then you should investigate what you need to do to get read access to the files you want Logstash to read.
We do not control it. This may help: https://github.com/mesos/logstash. You may have to build a more up-to-date LS version yourself, though only after you have sorted out the permissions issue.