I am new to the ELK stack (Elasticsearch, Logstash, Kibana). I would like to understand how Logstash parses log files and loads them into Elasticsearch.

For example, suppose I want to load around 250 GB of log data from S3 storage into Elasticsearch using Logstash. How are those log files parsed and sent to Elasticsearch? Does Logstash download all of the S3 log data locally first, hold it in memory, and then parse it and push it to Elasticsearch? Or does it read and parse the files line by line, streaming from S3 without downloading everything first?

Also, how can I increase read and write throughput in Logstash? Does it support multithreaded processing?
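For reference, this is roughly the kind of pipeline I have in mind, using the S3 input plugin and the Elasticsearch output plugin (the bucket name, prefix, region, and index pattern are placeholders, and I'm assuming Apache-style access logs for the grok filter):

```conf
input {
  s3 {
    bucket => "my-log-bucket"   # placeholder bucket name
    region => "us-east-1"       # placeholder region
    prefix => "logs/"           # placeholder key prefix
  }
}

filter {
  # assuming Apache combined access-log format
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"   # placeholder index pattern
  }
}
```

My question is about what happens between the `input` and `output` stages of a pipeline like this when the source data is 250 GB.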