How can I send a large JSON file (6 GB) to Elasticsearch using the bulk API?

I finally succeeded in sending the whole JSON file to Elasticsearch using Logstash; there was no need to split the file into chunks. Many thanks to the people in this discuss.elastic.co post, which inspired me a lot; their logs are very similar to mine. I am posting my config file here in order to help other people who may run into similar problems:

input {
  file {
    path => ["/home/...../file.json"]
    start_position => "beginning"           # read the file from the beginning, not only new lines
    sincedb_path => "/home/...../sincedb"   # where Logstash records how far it has already read
    codec => "json"                         # each line is parsed as a JSON document
  }
}

filter {
  mutate {
    # the exported documents already contain the Elasticsearch metadata fields
    # _id, _index, _type and _source; rename them so they do not clash with the
    # metadata of the documents created in the new index
    rename => {
      "_id"     => "idoriginal"
      "_index"  => "indexoriginal"
      "_type"   => "typeoriginal"
      "_source" => "sourceoriginal"
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "jsonlogs-%{+YYYY.MM.dd}"   # one index per day, based on the event timestamp
  }

  stdout {
    codec => rubydebug   # also print every event to the console, handy for debugging
  }
}
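
For comparison, if you would rather use the bulk API directly instead of Logstash, the official Python client can stream the file without ever holding all 6 GB in memory. This is only a rough sketch, assuming one JSON document per line; the host, index name and chunk size are placeholders you would adapt to your own setup:

# Rough sketch: stream an NDJSON export to Elasticsearch with the bulk helper.
# Assumes one JSON document per line; host, index name and path are placeholders.
import json

from elasticsearch import Elasticsearch
from elasticsearch.helpers import streaming_bulk

es = Elasticsearch("http://localhost:9200")

def generate_actions(path):
    # yield one bulk action per line, so the whole file is never loaded at once
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.strip():
                yield {"_index": "jsonlogs-2020.04.20", "_source": json.loads(line)}

for ok, result in streaming_bulk(es, generate_actions("/home/...../file.json"), chunk_size=1000):
    if not ok:
        print("failed to index a document:", result)

Logstash was simpler for me, but the bulk helper chunks the requests in much the same way, so either route avoids splitting the file by hand.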

If, at some point after you start Logstash, your disk becomes more than 90% full (because of the large amount of data you are sending to Elasticsearch) and you get an error like the one in this discuss.elastic.co post, go to Dev Tools/Console and run the request below, replacing jsonlogs-2020.04.20 with your own index name:

PUT /jsonlogs-2020.04.20/_settings
{
  "index.blocks.read_only_allow_delete": null
}
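
If you prefer not to use Kibana, the same setting can be reset over plain HTTP; here is a minimal sketch with the Python requests library (the host and index name are placeholders). Make sure you free some disk space first, otherwise Elasticsearch may simply put the block back:

# Minimal sketch: clear the read-only-allow-delete block over plain HTTP.
# The host and index name are placeholders; adjust them to your setup.
import requests

resp = requests.put(
    "http://localhost:9200/jsonlogs-2020.04.20/_settings",
    json={"index.blocks.read_only_allow_delete": None},
)
resp.raise_for_status()
print(resp.json())   # expect {"acknowledged": true}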

I hope this will be useful to everyone who has had, or will have, similar problems. Best wishes!