The HDFS files are getting opened and data is being written to them. However, the files stay open with 0 bytes of data until I stop Logstash. I think it might be expecting more output. How can I roll the file after the event?
The pipe output's ttl option, at its default value, should close the pipe after 10 seconds without events. Can you try enabling debug logging by starting Logstash with --debug? The plugin will then log extra messages when it closes the pipes.
Are you sure the pipe isn't receiving events constantly? What if you replace the beats input with stdin or something else where you control exactly which events are emitted?
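For reference, a minimal test configuration along those lines might look like this. The HDFS command shown is a hypothetical placeholder; substitute whatever command your pipe output currently runs, and note that ttl => 10 simply spells out the default:

```
input {
  stdin {}  # type events by hand so you control exactly when they stop
}
output {
  pipe {
    # hypothetical command -- replace with your actual HDFS write command
    command => "hdfs dfs -put - /tmp/logstash-test.log"
    ttl => 10  # seconds of inactivity before the pipe is closed (the default)
  }
}
```

If the file is finalized about 10 seconds after you stop typing events, the ttl mechanism is working and the original pipe was most likely still receiving events.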