Sure. A few caveats first. I used a preview release of Logstash v6.3.2 (it's being released as I type this).
In the Logstash settings file config/logstash.yml, set:
config.support_escapes: true
pipeline.batch.size: 1
pipeline.workers: 1
One worker and one event per batch, because a file with 20 000 QSO lines will explode into 20 000 events after the split (see the sketch below), and there will be a lot of duplicated data per event. You might want to do some surgery on the very big files: copy each one, then delete the top half of the QSO lines from the original and the bottom half from the copy.
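For context, here is a minimal sketch of the kind of split filter the batch-size warning refers to, assuming the whole file arrives as one event and the QSO records are newline-separated ("message" is the split filter's default field; adjust if yours differs):

filter {
  split {
    # splits the "message" field on "\n" (the default terminator),
    # emitting one new event per line; a file with 20 000 QSO lines
    # turns one whole-file event into 20 000 events, which is why
    # pipeline.batch.size is set to 1 above
    field => "message"
  }
}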
You will need to install the latest version of the file input:
bin/logstash-plugin install logstash-input-file --version 4.1.5
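You can confirm what got installed afterwards (this is the standard plugin-list command, nothing specific to this setup):

bin/logstash-plugin list --verbose logstash-input-file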
We should also only read one file at a time.
File input:
file {
  path => "/path/to/radio/sample.txt" # replace
  sincedb_path => "/dev/null" # replace with real path to a sincedb file when ready
  delimiter => "§¶¶§" # improbable delimiter, all data is accumulated until EOF
  mode => "read"
  max_open_files => 1
  file_completed_action => "log"
  file_completed_log_path => "/path/to/radio/completed.txt" # replace
}
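If you want to sanity-check this before wiring up any filters, a minimal end-to-end pipeline would look like the sketch below (same placeholder paths as above; the stdout output is just my suggestion for testing):

input {
  file {
    path => "/path/to/radio/sample.txt" # replace
    sincedb_path => "/dev/null"
    delimiter => "§¶¶§"
    mode => "read"
    max_open_files => 1
    file_completed_action => "log"
    file_completed_log_path => "/path/to/radio/completed.txt" # replace
  }
}
output {
  # rubydebug prints each event in full, so you can confirm that the
  # whole file arrived as a single event before adding the split
  stdout { codec => rubydebug }
}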
Good luck.