It looks like there is currently no way to live-reload the config (please correct me if I am wrong), and the only option is to shut Logstash down and start it again.
In that case, my question is: how many log messages will I lose if I have a pipeline that reads from a file and ships to Elasticsearch? My understanding is:
1. The file pointer (sincedb) is persisted, so messages that were already read are neither duplicated nor lost.
   1.a If the log file is rotated while Logstash is down, the messages remaining in the rotated file are lost.
   1.b In case of rotation, will the saved file-position pointer conflict with the new file?
2. A small number of messages is probably lost from Logstash's internal in-memory queue.
3. Messages that are queued for shipping, or in the middle of being shipped to Elasticsearch, are also lost.
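For reference, a minimal sketch of the kind of pipeline I mean (the paths and index name are made up):

```
input {
  file {
    path => "/var/log/app/*.log"                  # hypothetical path
    sincedb_path => "/var/lib/logstash/sincedb"   # where the file pointer is persisted
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs"                           # hypothetical index name
  }
}
```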
Am I correct?
What are the options to prevent log loss and to allow adding new files to the config at runtime?
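For context on the internal-queue part: I have seen the persistent queue settings in `logstash.yml` mentioned in the docs. Would enabling something like the sketch below (values are just examples) cover the loss from the in-memory queue during a restart?

```
# logstash.yml -- sketch, not verified against the scenarios above
queue.type: persisted    # buffer in-flight events on disk instead of in memory
queue.max_bytes: 1gb     # example cap on the on-disk queue size
```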