Hello
I have a CSV file whose fields are separated by ":", and I send it to Elasticsearch with this output configuration:
elasticsearch {
  hosts => [""]
  sniffing => true
  manage_template => false
  index => "csv_indice"
  user => ""
  password => ""
}
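For context, a minimal sketch of the rest of the pipeline as I would expect it to look, assuming a file input and the colon separator; the path and column names are placeholders, not my real values:

```
input {
  file {
    path => "/path/to/file.csv"     # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"     # forget read position, reread whole file (testing only)
  }
}

filter {
  csv {
    separator => ":"                # matches the colon-separated file
    # columns => ["col1", "col2"]   # placeholder column names
  }
}
```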
The data is arriving in Elasticsearch, but when I review the created index it is incomplete.
The CSV file is 60 MB in size; I do not know whether logstash.yml needs extra configuration to handle a large file.
The CSV has more than fifteen hundred records, but the created index ends up with almost fifteen hundred, and it is always the most recent records that are kept.