For a variety of reasons, I'm running a series of Logstash jobs, each handling a large amount of data for one of a series of buckets.
Under 2.4 I used something like:
logstash -f my-index.conf &> my-index-bucket-100-12000-14000.log
This allowed me to see the outcome of each bucket, and rerun buckets that failed.
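To make the naming concrete, here is a minimal sketch of the per-bucket scheme (the bucket ranges are illustrative, not my real values, and the logstash call is shown as a comment so the snippet stands alone):

```shell
index="my-index"
for bucket in "100-12000-14000" "100-14000-16000"; do
  logfile="${index}-bucket-${bucket}.log"
  # Under 2.4 each job was: logstash -f "${index}.conf" &> "$logfile"
  echo "$logfile"
done
```

Each bucket gets its own log file, so a failed bucket is easy to spot and rerun.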
Under v5, if I use something like:
logstash --path.logs . -f my-index.conf &> my-index-bucket-100-12000-14000.log
then the redirected file my-index-bucket-100-12000-14000.log contains only a message saying the log is being written elsewhere, while the interesting output lands in a file named logstash-plain.log under the --path.logs directory.
I can't see how to set that logfile name so I get separate files for each invocation, rather than all jobs appending to the same file. I tried to fully qualify --path.logs with a file name, but that merely creates a directory with that name, and creates logstash-plain.log within the directory. While it's something I could use in a pinch, it's not as useful as having a single directory with multiple logfiles.
I'm passing --path.logs on the command line because the path.logs setting in logstash.yml points to a directory I can't write to, and I don't want these logs going there anyway; Logstash complains that the directory is unwritable unless I override it. But using --path.logs appears to make Logstash log to logstash-plain.log instead of stdout, and stdout is what I really want (and what I had under 2.4).
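Poking around, the fixed file name seems to come from the log4j2.properties that ships with 5.x (this is my reading of the defaults, so treat the exact keys as an assumption): the rolling appender hard-codes logstash-plain.log under the ls.logs system property, e.g.

appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-plain.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-plain-%d{yyyy-MM-dd}.log

If that's right, it looks like each invocation would need its own --path.logs directory, or its own modified log4j2.properties, rather than a simple stdout redirect, unless there's a setting I'm missing.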
Is there a way I can do this?