Can't import data after reaching a certain amount of storage capacity

Hello guys,

I am having an issue.
My present scenario is that I have 2 indices in my Elasticsearch instance:
index 1 ==> used to store logs (86,000 documents/lines)
index 2 ==> used to store data from a SQL Server database (trying to import 200 million rows)
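
For reference, a quick way to see how many documents and how much disk space each index uses is the cat indices API (this assumes Elasticsearch is listening on the default localhost:9200):

# show document count and on-disk size per index
curl -s 'http://localhost:9200/_cat/indices?v&h=index,docs.count,store.size'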

I encountered the error below after importing nearly 5 million rows from SQL Server:

[screenshot of the error message]

One of the answers I found on the forum suggested deleting data, but that has not resolved my problem. Even after deleting some data, the storage used on the Linux machine where Elasticsearch is installed does not appear to go down.

[screenshots of disk usage on the machine]
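
In case it is useful, this is roughly how I am comparing what the OS reports with what Elasticsearch itself reports (again assuming the default data path and localhost:9200):

# disk usage as seen by the operating system
df -h /var/lib/elasticsearch
du -sh /var/lib/elasticsearch

# disk usage as seen by Elasticsearch
curl -s 'http://localhost:9200/_cat/allocation?v'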

As a result, I am unable to ingest any more data into Elasticsearch.
So my question is as follows:

  1. When I delete the data in one of the indices (index 2, about 5 million rows) using the command DELETE index_name, does it actually delete the data from the physical drive?
    Or is there some cache that needs to be flushed? (A rough sketch of the command I mean is below.)
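
To be clear about the delete command I mean, here is a rough sketch ("index2" is just a placeholder for my real index name):

# delete the whole index
curl -X DELETE 'http://localhost:9200/index2'

# then check whether the space is released
curl -s 'http://localhost:9200/_cat/indices?v'
du -sh /var/lib/elasticsearch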

The location of the data files: /var/lib/elasticsearch

[screenshot of the /var/lib/elasticsearch directory]
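
If it helps, the per-index directories can also be sized directly on disk (on my install the data lives under nodes/0/indices, though the layout may differ by version):

# size of each index directory, smallest to largest
du -sh /var/lib/elasticsearch/nodes/*/indices/* 2>/dev/null | sort -h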

Note:
I have not found any accumulation of large error logs that could be causing this rise in storage used:

Logstash logs:

Kibana logs:

[screenshot of the Kibana logs]

Elasticsearch logs:

root@AZEMEAPCRMDOS02:/var/log/elasticsearch# ls -sh
total 885M
4.0K elasticsearch-2018-03-28-1.log.gz  304K elasticsearch-2018-05-25-1.log.gz  508K elasticsearch-2018-07-19-1.log.gz   1.1M elasticsearch-2018-08-09-36.log.gz
4.0K elasticsearch-2018-03-29-1.log.gz  508K elasticsearch-2018-05-26-1.log.gz  504K elasticsearch-2018-07-20-1.log.gz   1.1M elasticsearch-2018-08-09-37.log.gz
4.0K elasticsearch-2018-03-30-1.log.gz  260K elasticsearch-2018-05-27-1.log.gz  504K elasticsearch-2018-07-21-1.log.gz   1.1M elasticsearch-2018-08-09-38.log.gz
4.0K elasticsearch-2018-03-31-1.log.gz  400K elasticsearch-2018-05-28-1.log.gz  512K elasticsearch-2018-07-22-1.log.gz   1.1M elasticsearch-2018-08-09-39.log.gz
4.0K elasticsearch-2018-04-01-1.log.gz  516K elasticsearch-2018-05-29-1.log.gz  508K elasticsearch-2018-07-23-1.log.gz   1.1M elasticsearch-2018-08-09-4.log.gz
4.0K elasticsearch-2018-04-02-1.log.gz  520K elasticsearch-2018-05-30-1.log.gz  508K elasticsearch-2018-07-24-1.log.gz   456K elasticsearch-2018-08-09-40.log.gz
4.0K elasticsearch-2018-04-03-1.log.gz  508K elasticsearch-2018-05-31-1.log.gz  508K elasticsearch-2018-07-25-1.log.gz   1.1M elasticsearch-2018-08-09-5.log.gz
 12K elasticsearch-2018-04-04-1.log.gz  504K elasticsearch-2018-06-01-1.log.gz  504K elasticsearch-2018-07-26-1.log.gz   1.1M elasticsearch-2018-08-09-6.log.gz
4.0K elasticsearch-2018-04-05-1.log.gz  504K elasticsearch-2018-06-02-1.log.gz  508K elasticsearch-2018-07-27-1.log.gz   1.1M elasticsearch-2018-08-09-7.log.gz
4.0K elasticsearch-2018-04-06-1.log.gz  508K elasticsearch-2018-06-03-1.log.gz  508K elasticsearch-2018-07-28-1.log.gz   1.1M elasticsearch-2018-08-09-8.log.gz
4.0K elasticsearch-2018-04-07-1.log.gz  508K elasticsearch-2018-06-04-1.log.gz  508K elasticsearch-2018-07-29-1.log.gz   1.1M elasticsearch-2018-08-09-9.log.gz
4.0K elasticsearch-2018-04-08-1.log.gz  508K elasticsearch-2018-06-05-1.log.gz  504K elasticsearch-2018-07-30-1.log.gz   1.1M elasticsearch-2018-08-10-1.log.gz
4.0K elasticsearch-2018-04-09-1.log.gz  500K elasticsearch-2018-06-06-1.log.gz  508K elasticsearch-2018-07-31-1.log.gz   904K elasticsearch-2018-08-10-10.log.gz
4.0K elasticsearch-2018-04-10-1.log.gz  504K elasticsearch-2018-06-07-1.log.gz  224K elasticsearch-2018-08-01-1.log.gz   1.1M elasticsearch-2018-08-10-11.log.gz
4.0K elasticsearch-2018-04-11-1.log.gz  508K elasticsearch-2018-06-08-1.log.gz  4.0K elasticsearch-2018-08-02-1.log.gz   1.1M elasticsearch-2018-08-10-12.log.gz
8.0K elasticsearch-2018-04-12-1.log.gz  508K elasticsearch-2018-06-09-1.log.gz  4.0K elasticsearch-2018-08-03-1.log.gz   1.1M elasticsearch-2018-08-10-13.log.gz
4.0K elasticsearch-2018-04-13-1.log.gz  504K elasticsearch-2018-06-10-1.log.gz  4.0K elasticsearch-2018-08-04-1.log.gz    82M elasticsearch-2018-08-10-14.log

Kind regards,
Ganessen

Yes.
