I have a Python script that synchronizes data between a relational database and Elasticsearch, i.e. on every new insert into the relational database, the data is inserted into the ES index too. But whenever I create a new column or a new table in the relational database, I have to manually update the mapping for the ES index. So I was wondering if there is a way to automate this? Whenever I create a new table in the relational database, I'd like a new, let's say, nested field to be created in the ES index, and for each new column in the relational database, a new field to be added to the ES index mapping.
Any ideas and suggestions on how to do this are appreciated.
The only way to handle that would be to build more code; there's nothing native to Elasticsearch that can be leveraged here, sorry to say.
As a rough comparison, a SQL table maps to an index and a SQL table column maps to a field in Elasticsearch.
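To illustrate the "build more code" route, here is a minimal sketch (assuming SQLAlchemy for schema introspection and the 8.x elasticsearch-py client; the connection strings, type translation, and `db-sync-*` index naming are all made up for the example) that reads the relational schema and pushes a corresponding mapping:

```python
from elasticsearch import Elasticsearch
from sqlalchemy import create_engine, inspect, types as sqltypes

# Hypothetical connection details; adjust to your environment.
engine = create_engine("postgresql://user:pass@localhost/mydb")
es = Elasticsearch("http://localhost:9200")

def es_type(column_type):
    """Rough SQL type -> Elasticsearch field type translation."""
    if isinstance(column_type, sqltypes.Integer):
        return {"type": "long"}
    if isinstance(column_type, sqltypes.Numeric):
        return {"type": "double"}
    if isinstance(column_type, (sqltypes.DateTime, sqltypes.Date)):
        return {"type": "date"}
    if isinstance(column_type, sqltypes.Boolean):
        return {"type": "boolean"}
    return {"type": "text", "fields": {"keyword": {"type": "keyword"}}}

inspector = inspect(engine)
for table in inspector.get_table_names():
    properties = {
        col["name"]: es_type(col["type"])
        for col in inspector.get_columns(table)
    }
    index = f"db-sync-{table}"  # hypothetical naming convention
    if not es.indices.exists(index=index):
        es.indices.create(index=index, mappings={"properties": properties})
    else:
        # Adding new fields to an existing mapping is allowed;
        # changing the type of existing fields is not.
        es.indices.put_mapping(index=index, properties=properties)
```

Running something like this on a schedule, or right after your migrations, would keep the index mappings in step with new tables and columns.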
Whenever you post a document to an index, Elasticsearch's default behavior is to create the index if it does not exist. If you define index templates, you can control the settings/mappings of the new index. Upfront index creation is not mandatory: Index API | Elasticsearch Guide [8.11] | Elastic
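For example, a hedged sketch with the 8.x Python client (template name, index pattern, and fields are illustrative) that registers an index template so that implicitly created indices pick up the desired settings and mappings:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster address

# Any index whose name matches the pattern (e.g. one created implicitly by
# the first insert) automatically gets these settings and mappings.
es.indices.put_index_template(
    name="db-sync-template",        # hypothetical template name
    index_patterns=["db-sync-*"],   # hypothetical index naming convention
    template={
        "settings": {"number_of_shards": 1},
        "mappings": {
            "properties": {
                "id": {"type": "long"},
                "created_at": {"type": "date"},
            }
        },
    },
)
```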
Similarly, Elasticsearch will update the index mapping whenever it sees a new field in a document. The data type is inferred from the value. You can control this by defining dynamic templates. In short, upfront field definition is not mandatory either: Dynamic mapping | Elasticsearch Guide [8.11] | Elastic
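As a sketch (index and template names are illustrative, again assuming the 8.x Python client), a dynamic template can be added to an existing index's mapping so that newly seen object-valued fields are mapped as `nested` rather than the default `object`, and new strings as `keyword`:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Dynamic templates decide how fields that appear for the first time are mapped.
es.indices.put_mapping(
    index="db-sync-orders",  # hypothetical index name
    dynamic_templates=[
        {
            "objects_as_nested": {
                "match_mapping_type": "object",
                "mapping": {"type": "nested"},
            }
        },
        {
            "strings_as_keywords": {
                "match_mapping_type": "string",
                "mapping": {"type": "keyword"},
            }
        },
    ],
)
```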
@Vinayak_Sapre Thank you for the answer. But this way, 'nested' type fields won't be created, which I'll need later for better and easier querying of the index. Or am I wrong?