Yes, you can use the "Upload File" integration in Kibana (it supports files up to 100 MB by default), but note that this is a manual process: you would need to upload the file each day.
You can definitely use Logstash and set it up to parse the files automatically.
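As a rough sketch, a minimal Logstash pipeline for daily CSV files could look like the following. The file paths, column names, and index name are all assumptions you would replace with your own:

```conf
# Sketch of a Logstash pipeline (paths, columns, and index are assumptions)
input {
  file {
    path => "/var/data/daily/*.csv"                     # hypothetical drop directory
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_daily"   # avoids re-reading files on restart
  }
}

filter {
  csv {
    separator => ","
    columns => ["timestamp", "user", "value"]           # hypothetical column names
  }
  date {
    match => ["timestamp", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "daily-data-%{+YYYY.MM.dd}"
  }
}
```

With this in place, any new file dropped into the watched directory is picked up and indexed without manual intervention.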
If you are looking to fully automate the process, I'd go with option 2. Depending on your use case, you could also use Filebeat to ship the data to the cluster and then use an ingest pipeline; both work in a similar fashion.
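For the Filebeat route, a minimal `filebeat.yml` sketch might look like this. The paths and pipeline name are assumptions, and the ingest pipeline itself would be created separately in Elasticsearch (via the Create pipeline API or Kibana's Ingest Pipelines UI):

```yaml
# filebeat.yml sketch (paths and pipeline name are assumptions)
filebeat.inputs:
  - type: filestream
    id: daily-files
    paths:
      - /var/data/daily/*.csv

output.elasticsearch:
  hosts: ["http://localhost:9200"]
  pipeline: "daily-csv-pipeline"   # hypothetical ingest pipeline that parses the CSV lines
```

The main trade-off versus Logstash is that the parsing logic lives in the Elasticsearch ingest pipeline rather than on the shipping host, so Filebeat itself stays very lightweight.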