I've got what appears to be a unique situation. I'm using Kibana to dashboard test pass data after a dev commits code, so my log data isn't received in real time. I basically have a file with all my logging data in it, and now I want to get that data into Elasticsearch so I can see it in Kibana. All the material I've read about Logstash seems geared towards a run-forever daemon paradigm. I've been looking at command-line options to see if I can just get Logstash to upload my file and then exit, but now I'm questioning whether I'm doing the right thing. Should I even be using Logstash for this? Or is there a better way of dealing with my needs?
In short, what I want is to automate the upload of a completed log file for dashboarding in Kibana. What is the recommended way to accomplish this?
You could do it using a stdin input, which causes Logstash to exit when it reaches EOF. That said, Logstash can burn a minute of CPU time during startup, so running it once per file is fairly expensive. It may be possible with Filebeat, which is lighter weight, but I believe it is also architected to run forever as a service.
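For reference, a minimal sketch of that one-shot approach might look like this. The host, index name, and log file name are placeholders for your environment:

```
# stdin-pipeline.conf -- reads events from stdin and indexes them.
# Logstash shuts down once stdin reaches EOF.
input {
  stdin { }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # assumed Elasticsearch address
    index => "test-results"             # assumed index name
  }
}
```

Then run it as a one-shot job, redirecting the completed log file into stdin:

```
bin/logstash -f stdin-pipeline.conf < results.log
```

When the file is exhausted, the stdin input closes and Logstash exits, so you pay the startup cost on every run.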
On the other hand, if you have the resources to let Logstash run as a service, there are a number of ways you could feed files to it from time to time: a file input, curl into an http input, and so on.
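As a sketch of the http-input variant (the port is an assumption; the index name matches the earlier example), the long-running pipeline would be:

```
# http-pipeline.conf -- a persistent pipeline that accepts events over HTTP.
input {
  http {
    port => 8080  # assumed listening port
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "test-results"
  }
}
```

Then, whenever a test run finishes, a script could push the completed file to it. The http input creates one event per request by default, so posting line by line keeps one log line per event:

```
# results.log is a placeholder; each line becomes its own event
while IFS= read -r line; do
  curl -s -XPOST http://localhost:8080 -d "$line"
done < results.log
```

This way Logstash stays warm and you only pay the startup cost once.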