I have used Logstash to build CSV files, but I noticed I needed to exit Logstash to be certain the output file is closed. While I am OK with this, it seems silly to have to shut down Logstash. The enclosed example is one that succeeded, so the answer isn't time-critical.
I am running Logstash 6.1 on an Ubuntu 16.04 LTS Linux server.
The task: read in a list of IP addresses from a file, and write them out with their locations.
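For reference, the pipeline is along these lines (a minimal sketch; the file paths and CSV field names are placeholders, and it assumes the geoip filter and csv output plugins are installed):

```
input {
  # Read one IP address per line; path and sincedb settings are placeholders
  file {
    path => "/tmp/ips.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  # Look up the location of the IP held in the message field
  geoip {
    source => "message"
  }
}

output {
  # Write each IP with its resolved location as a CSV row
  csv {
    path => "/tmp/ips_located.csv"
    fields => ["message", "[geoip][country_name]", "[geoip][city_name]"]
  }
}
```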
Is there a way to have Logstash complete (close) a CSV output file when the input reaches EOF, or is exiting Logstash the only option?
Would this change if the input were Elasticsearch?
It matters because I don't know how often Logstash flushes its output buffers. If I can be certain the output buffers are empty, it should be fine as-is. Historically (Unix/Linux-wise) it's considered a BAD THING (tm) to assume a file is complete before it has been closed.
> It matters because I don't know how often buffers in logstash are flushed.
The flush interval is configurable in the file output. I don't think there's any additional buffering in the JVM layers, but double-checking wouldn't hurt.
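As I understand it, the csv output inherits the file output's `flush_interval` option (worth verifying against the plugin docs for your version); setting it to 0 flushes after every event rather than on the default 2-second interval:

```
output {
  csv {
    path => "/tmp/ips_located.csv"   # placeholder path
    fields => ["message", "[geoip][country_name]"]
    flush_interval => 0              # flush to disk after every event
  }
}
```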
> Historically (unix/linux-wise) it's considered a BAD THING (tm) to expect a file is complete even if it's not been closed.
Well, there are a lot of historical things that are no longer true.