Load index with data from a DocumentDB collection and an Azure SQL database

I'd appreciate it if anyone could point me to information or guidance on how to load data into an index from DocumentDB and from an Azure SQL database.

I have to create a couple of indexes; both will have a geo-location field, and I have to search on that field. For the first index the data source is a DocumentDB collection. On this index I will search a given latitude/longitude against stored points.

For the second index the data source is an Azure SQL database. The column's data type is Geometry and it holds multi-polygon data. I have to populate this data into the index so I can search against a given polygon. At present I use STIntersects in a stored procedure: if the given polygon intersects a record's geometry, that record is included in the result.

At present I am using Azure Search; with the help of a data source and an indexer I am able to populate the index and keep it in sync with data changes using a change detection policy. How can I keep the data in sync with Elasticsearch?

https://www.elastic.co/guide/en/logstash/5.1/plugins-inputs-jdbc.html perhaps?

Otherwise, search through the posts here; this has been asked many times before :slight_smile:
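For context, here is a hedged sketch of what a Logstash 5.1 pipeline using the jdbc input above might look like for pulling rows from Azure SQL on a schedule. The server, database, credentials, table, and the `ModifiedDate` tracking column are all placeholders, not values from this thread; `:sql_last_value` is the plugin's built-in marker for the previous run.

```conf
input {
  jdbc {
    # Placeholder driver and connection details for Azure SQL
    jdbc_driver_library => "/path/to/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://myserver.database.windows.net:1433;databaseName=mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    # Run every five minutes; :sql_last_value tracks the last run time
    schedule => "*/5 * * * *"
    statement => "SELECT * FROM Neighborhoods WHERE ModifiedDate > :sql_last_value"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "neighborhood-index"
    document_type => "neighborhood"
    document_id => "%{neighborhoodid}"
  }
}
```

Using the source row's key as `document_id` makes repeated runs update documents in place rather than duplicate them.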

Thanks Mark. To test, at present I am using the .NET client NEST with the bulk API to populate the index with data.

The challenge I am facing at present is loading data into a geo_shape field. By applying the GeoShape attribute and auto-mapping I am able to create the index.

I am able to add data to the index in the Kibana Console using the API call below, and I am trying to achieve the same thing via NEST with the same data. With the NEST bulk method I am not able to add data to the index; I get the error "Points of LinearRing do not form a closed linestring", even though the start and end points are the same.

Note: this is not the complete API request.

POST /neighborhood-index/neighborhood/_bulk
{ "index": { "_id": 313 } }
{ "neighborhoodId": 313, "name": "Test", "state": "Test", "boundaryPolygon": {"type":"polygon","coordinates":[[[-111.823899, 33.242376], [-111.8239, 33.242737], [-111.823916, 33.243526], [-111.823934, 33.244196], [-111.822165, 33.244218], [-111.82216, 33.244083], [-111.82202, 33.244085], [-111.821976, 33.244114], [-111.820824, 33.244141], [-111.819695, 33.244151], [-111.81969, 33.2432], [-111.819673, 33.241017], [-111.819668, 33.240624], [-111.820629, 33.240609], [-111.822488, 33.240581], [-111.823912, 33.240548], [-111.823909, 33.241226], [-111.823906, 33.241468], [-111.823898, 33.242], [-111.823899, 33.242376]]]}}
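That error message comes down to a GeoJSON rule: a polygon's linear ring must be closed, i.e. its first and last coordinate pairs must be identical, and the ring needs at least four points. A small hypothetical helper (not part of NEST) that checks the closure rule before indexing:

```csharp
using System;

// Hypothetical helper: checks the GeoJSON rule that a polygon's linear
// ring is closed (first point == last point) and has at least four
// points. Each point is a double[2], in the same order as the arrays
// passed to NEST.
public static class RingValidator
{
    public static bool IsClosed(double[][] ring)
    {
        if (ring == null || ring.Length < 4)
            return false;
        var first = ring[0];
        var last = ring[ring.Length - 1];
        return first[0] == last[0] && first[1] == last[1];
    }
}
```

Note this only checks closure; a ring that repeats an intermediate point can still be rejected by Elasticsearch, so duplicates are worth checking separately.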

Partial routine code that I am using to create sample data:

private static IEnumerable<Neighborhood> GetNeighborhoodData()
{
    var neighborhoods = new List<Neighborhood>();

    // Each double[] is implicitly converted to a NEST GeoCoordinate as
    // (latitude, longitude).
    // NOTE: the first two points are repeated at the start of this ring —
    // these duplicated points are what triggered the "closed linestring"
    // error (see the resolution below).
    IEnumerable<IEnumerable<GeoCoordinate>> coordinates = new[]
    {
        new GeoCoordinate[]
        {
            new [] {33.242376, -111.823899}, new [] {33.242737, -111.8239}, new [] {33.242376, -111.823899},
            new [] {33.242737, -111.8239}, new [] {33.243526, -111.823916}, new [] {33.244196, -111.823934},
            new [] {33.244218, -111.822165}, new [] {33.244083, -111.82216}, new [] {33.244085, -111.82202},
            new [] {33.244114, -111.821976}, new [] {33.244141, -111.820824}, new [] {33.244151, -111.819695},
            new [] {33.2432, -111.81969}, new [] {33.241017, -111.819673}, new [] {33.240624, -111.819668},
            new [] {33.240609, -111.820629}, new [] {33.240581, -111.822488}, new [] {33.240548, -111.823912},
            new [] {33.241226, -111.823909}, new [] {33.241468, -111.823906}, new [] {33.242, -111.823898},
            new [] {33.242376, -111.823899}
        }
    };

    neighborhoods.Add(new Neighborhood
    {
        NeighborhoodId = 313,
        Name = "Test",
        State = "Test",
        BoundaryPolygon = new PolygonGeoShape { Coordinates = coordinates }
    });

    return neighborhoods;
}
I am looking for sample code showing how to pass a polygon value to a geo_shape field using the NEST bulk API.
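For reference, a minimal sketch of bulk indexing with NEST 5.x, under the assumption of a local cluster at http://localhost:9200 and the `Neighborhood` type and `GetNeighborhoodData()` routine above; the index and type names follow the Kibana example:

```csharp
using System;
using Nest;

class Program
{
    static void Main()
    {
        // Assumed local cluster; adjust the URI for your environment.
        var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
            .DefaultIndex("neighborhood-index");
        var client = new ElasticClient(settings);

        // IndexMany builds one bulk "index" action per document;
        // the document id is taken from NeighborhoodId, matching the
        // _id in the raw bulk request above.
        var bulkResponse = client.Bulk(b => b
            .IndexMany(GetNeighborhoodData(), (descriptor, doc) => descriptor
                .Index("neighborhood-index")
                .Type("neighborhood")
                .Id(doc.NeighborhoodId)));

        if (!bulkResponse.IsValid)
            Console.WriteLine(bulkResponse.DebugInformation);
    }
}
```

Checking `IsValid` and printing `DebugInformation` on failure surfaces per-item bulk errors such as the "closed linestring" message.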

Hi Mark, I have resolved the above issue; it was caused by incorrect points in the points array. Thanks :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.