What is the best design approach to load data from large Parquet files (~500M records) into Elasticsearch?
Just use Logstash with the CSV filter.
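Note that Logstash cannot read Parquet directly (it is a binary columnar format), so this approach assumes the Parquet files are first exported to CSV (for example with Spark or a pyarrow script). A minimal sketch of the Logstash pipeline for the CSV stage might look like this; the file path, column names, and index name are placeholders, not values from the thread:

```
input {
  file {
    # Hypothetical path to the CSV files exported from Parquet
    path => "/data/export/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    # Replace with the actual column names from your schema
    columns => ["id", "timestamp", "value"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-index"
  }
}
```

For very large loads, the Elasticsearch output batches documents via the bulk API, so throughput is mostly governed by pipeline workers and batch size on the Logstash side and by refresh/replica settings on the index.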
Thank you. Are there any ideal design or configuration considerations to get maximum performance?
© 2020. All Rights Reserved - Elasticsearch