Index data to Elasticsearch from SQL Server

I have a SQL Server database with multiple tables, and the database is around 25 GB in size.
I want to index this data into Elasticsearch 5.6 (as it supports mapping multiple types). However, the column names in some of the tables are invalid, e.g. they contain a [.] between words.
I'd like to convert all the tables into a single JSON file, fix the field names, and then index the data into Elasticsearch. I'm having a hard time figuring out how to do this. I've read some links such as this and this, but I need to fix the column names before indexing.

How do I convert the (multiple) SQL tables into JSON and, after fixing the names, index the data into Elasticsearch? Are there other ways to solve this?

Hello,

We can load the data by writing a query, and the columns with improper names can be renamed by giving them aliases, e.g.

    SELECT id_value AS id, [name.keyword] AS name FROM sample
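
If the query already returns clean column names like that, the rows can go straight into Elasticsearch without an intermediate file. A rough sketch (assuming pyodbc for the SQL Server connection and the official elasticsearch Python client; the connection string, index and type names are placeholders, not from your setup):

    # Stream the aliased query results into Elasticsearch 5.6
    import pyodbc
    from elasticsearch import Elasticsearch, helpers

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=mydb;UID=user;PWD=secret"
    )
    es = Elasticsearch(["http://localhost:9200"])

    cursor = conn.cursor()
    cursor.execute("SELECT id_value AS id, [name.keyword] AS name FROM sample")
    columns = [col[0] for col in cursor.description]  # aliased (clean) names

    def actions():
        for row in cursor:
            yield {
                "_index": "mydb",
                "_type": "sample",  # Elasticsearch 5.6 still uses mapping types
                "_source": dict(zip(columns, row)),
            }

    helpers.bulk(es, actions())

Depending on the column types, date or decimal values may need converting to strings before indexing.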

There are hundreds of such columns, so I was thinking of doing a search-and-replace after converting the tables into JSON, as that feels easier (something like the sketch below).
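
Just a rough sketch of the idea (the file names and the dot-to-underscore rule are placeholders, and it assumes each table has already been exported as a JSON array of row objects):

    import json

    def clean_keys(row):
        # Replace the invalid "." in field names with "_"
        return {key.replace(".", "_"): value for key, value in row.items()}

    with open("sample_table.json") as f:
        rows = json.load(f)  # a list of {"column": value} objects

    cleaned = [clean_keys(row) for row in rows]

    with open("sample_table.cleaned.json", "w") as f:
        json.dump(cleaned, f)  # ready for the bulk API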

Yup, as far as I know, you can either create a new table with proper columns and load the data from that, or use alias names in the query as above.
