Is there a way to export an index's schema and crawler configuration on Elastic Cloud? We are looking to create some workflows in Git and would like to be able to store a JSON schema to deploy an index with a crawler onto other Elastic Cloud deployments. Is there an easy way to do this? We essentially want to copy all config only, no indexed data.
Unfortunately, there isn't currently a mechanism to export or import the Elastic Crawler's configuration. And since the Elastic Crawler doesn't have a REST API, you can't really build one yourself, I'm afraid.
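That said, the index schema half of your question is doable today: mappings and settings can be exported and replayed through the standard Elasticsearch REST API. Below is a minimal sketch of a config-only copy between deployments; the deployment URLs, API keys, and index name are placeholders you'd swap for your own.

```python
import json
import requests

# Hypothetical values -- replace with your own deployment endpoints and API keys.
SOURCE = "https://source-deployment.es.us-central1.gcp.cloud.es.io"
TARGET = "https://target-deployment.es.us-central1.gcp.cloud.es.io"
SOURCE_KEY = "<source-api-key>"
TARGET_KEY = "<target-api-key>"
INDEX = "search-my-site"  # hypothetical crawler-managed index name

def es_get(base, key, path):
    resp = requests.get(f"{base}/{path}", headers={"Authorization": f"ApiKey {key}"})
    resp.raise_for_status()
    return resp.json()

# Export the index's mappings and settings from the source deployment.
mappings = es_get(SOURCE, SOURCE_KEY, f"{INDEX}/_mapping")[INDEX]["mappings"]
settings = es_get(SOURCE, SOURCE_KEY, f"{INDEX}/_settings")[INDEX]["settings"]["index"]

# Strip settings that Elasticsearch generates itself and rejects on index creation.
for generated in ("uuid", "creation_date", "provided_name", "version"):
    settings.pop(generated, None)

body = {"mappings": mappings, "settings": {"index": settings}}

# Store the schema as JSON so it can live in Git...
with open(f"{INDEX}.schema.json", "w") as f:
    json.dump(body, f, indent=2)

# ...and recreate the index (config only, no documents) on the target deployment.
resp = requests.put(
    f"{TARGET}/{INDEX}",
    headers={"Authorization": f"ApiKey {TARGET_KEY}"},
    json=body,
)
resp.raise_for_status()
```

Note this only captures the Elasticsearch-side schema; the crawler's domains, entry points, and schedules live in Enterprise Search internals and would still need to be recreated by hand in Kibana on the target deployment.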
However, we're addressing exactly this type of problem with our new Open Crawler. It's still in beta, but since it's driven by YAML configuration files rather than the Kibana UI, it fits a config-in-Git workflow much better; if you're looking for better programmatic crawl controls, you may want to check it out.