I have exported Elasticsearch indices using Logstash with the following pipeline configuration:
- pipeline.id: export-process
  pipeline.workers: 4
  config.string: |
    input {
      elasticsearch {
        hosts => "http://elastic:80/elasticsearch/"
        user => "elastic"
        password => ""
        ssl => false
        index => "metricbeat-*"
        docinfo => true
        query => '{
          "query": {
            "bool": {
              "filter": {
                "range": {
                  "@timestamp": {
                    "gte": "now-35m",
                    "lte": "now",
                    "format": "strict_date_optional_time||epoch_millis"
                  }
                }
              }
            }
          }
        }'
      }
    }
    output {
      file {
        gzip => true
        path => "/usr/share/logstash/export/export_%{[@metadata][_index]}.json.gz"
      }
    }
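For context, the file output uses the json_lines codec by default, so after unzipping, each line of the export is one document serialized as a JSON event. A line looks roughly like this (fields abbreviated and values invented purely for illustration):

{"@timestamp":"2021-06-01T10:15:00.000Z","host":{"name":"node-1"},"system":{"cpu":{"total":{"pct":0.42}}}}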
Now I am trying to import it back into another Elasticsearch instance. I have gunzipped the JSON file, and for each line in the file I run:
curl -s -XPOST http://1.2.3.4:9000/metricbeat/_doc/ -H "Content-Type: application/json" -d "$1"
where $1 is a single line from the JSON file (the full loop is sketched below). This method is very slow: I started the import of one index, which is 1.7 GB, and it is still running after 90 minutes. Is there a better way of doing this?
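For reference, the import loop is essentially the following; the filename export_metricbeat.json is just a placeholder for the unzipped export, and the loop variable stands in for the $1 that my script receives:

# export_metricbeat.json is a placeholder name for the unzipped export file
while IFS= read -r line; do
  # Index one document per HTTP request (this is the slow part)
  curl -s -XPOST "http://1.2.3.4:9000/metricbeat/_doc/" \
    -H "Content-Type: application/json" \
    -d "$line"
done < export_metricbeat.json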