Convert lat, long coming as two attributes to geo_point

Hi,

The input data coming from my application looks like this:

{
  "timestamp": "2020-07-02",
  "name": "location",
  "latitude": 39.01289,
  "longtitude": -76.979
}

How do I update the Elasticsearch index to convert these attributes to a geo_point/GeoJSON field, so that it is available in the Kibana dashboard?

I tried this update by query request:

POST /trucks/_update_by_query
{
  "script": {
    "lang": "painless",
    "source": """
      ctx._source.area.location = [ctx._source.area.longtitude, ctx._source.area.latitude]
    """
  }
}

But when I run this query I get a JSON parse exception. Any help is appreciated.

I'd probably use an ingest pipeline and reindex the whole dataset with the reindex API.
That way you end up with a correct mapping instead of keeping old fields you don't need.

Sorry, how do I add the ingest pipeline? I can delete the old data and create a new index; that isn't an issue.

In the pipeline definition, you can probably do something like:

{
  "rename": {
    "field": "area.longtitude",
    "target_field": "area.location.lon"
  }
},
{
  "rename": {
    "field": "area.latitude",
    "target_field": "area.location.lat"
  }
}

See https://www.elastic.co/guide/en/elasticsearch/reference/7.8/rename-processor.html
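
Put together, a minimal sketch of the full pipeline creation request could look like this (the pipeline id mypipeline and the description are placeholders):

PUT _ingest/pipeline/mypipeline
{
  "description": "Move latitude/longtitude into a geo_point-compatible object",
  "processors": [
    {
      "rename": {
        "field": "area.longtitude",
        "target_field": "area.location.lon"
      }
    },
    {
      "rename": {
        "field": "area.latitude",
        "target_field": "area.location.lat"
      }
    }
  ]
}

You can then reference the pipeline when indexing (with the ?pipeline=mypipeline query parameter) or from the reindex request shown further down.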

It may also be easier to look at https://www.elastic.co/guide/en/elasticsearch/reference/7.8/docs-reindex.html#docs-reindex-change-name

And maybe write something like:

POST _reindex
{
  "source": {
    "index": "test"
  },
  "dest": {
    "index": "test2"
  },
  "script": {
    "source": "ctx._source.area.location.lat = ctx._source.remove(\"area.latitude\"); ctx._source.area.location.lon = ctx._source.remove(\"area.longitude\");"
  }
}

I'm not a Painless user, so I'm unsure whether this will work :slight_smile:
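
If you go with the ingest pipeline rather than the inline script, it can be attached to the reindex destination instead, something like this (index and pipeline names are placeholders):

POST _reindex
{
  "source": {
    "index": "test"
  },
  "dest": {
    "index": "test2",
    "pipeline": "mypipeline"
  }
}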

I created a pipeline and verified it using the _ingest/pipeline/mypipeline/_simulate API. It's working fine.
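
The simulate call was roughly like this (the sample document is only an illustration of my data, not the real payload):

POST _ingest/pipeline/mypipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "timestamp": "2020-07-02",
        "name": "location",
        "area": {
          "latitude": 39.01289,
          "longtitude": -76.979
        }
      }
    }
  ]
}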

However, in the actual setup it just rejects the documents and nothing gets indexed.
If it isn't indexing, where can I see the rejected/failed documents?

Besides this, I also need to create some dynamic fields (from a known set of names). Is there a way to convert these static attributes, i.e.

"name": "parameter1"
"value": "value1"

to

"parameter1": "value1"

I need this transformation so that I can use the Kibana dashboard. I appreciate your support.
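
Something along these lines with a script processor in the pipeline is what I have in mind, but it is just a guess on my part and untested:

{
  "script": {
    "source": "ctx[ctx.name] = ctx.value"
  }
}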

Please share the exact steps you ran. Anything other than that is just guessing.

Thanks.

Sorry, I modified the data source so that there is no need for pre-processing in Elasticsearch.

Now my data comes as separate messages. How should the index mapping be created for me to display all the vehicle locations in a Kibana visualisation?

An index was created automatically in Elasticsearch and the documents appear as below:

{
  "_index": "vehicles",
  "_type": "vehicle",
  "_id": "a22519d1-5f24-4ba2-9494-93a6818e96fa",
  "_version": 1,
  "_score": 0,
  "_source": {
    "timestamp": "2020-07-05 08:26:48.924000000",
    "trip_id": "1cda4a17-25aa-497c-9f5e-b61b9db58d52",
    "vehicle_id": "VEHICLE_01",
    "name": "location",
    "latitude": 38.88248,
    "longitude": -77.02767,
    "location": "-77.02767, 38.88248"
  }
}

{
  "_index": "vehicles",
  "_type": "vehicle",
  "_id": "2c94ca4e-254c-4243-a385-0add55f73c77",
  "_score": 1,
  "_source": {
    "timestamp": "2020-07-05 08:18:52.616000000",
    "trip_id": "cc2708bf-9b6f-40b8-b20c-dc4fe7e66199",
    "vehicle_id": "VEHICLE_01",
    "name": "gear_pos",
    "value": "third",
    "transmission_gear_position": "third"
  }
}

I tried to put all the parameters into a mapping and create the index, but it doesn't seem to work. Is it because every message has only one parameter (i.e. domain attribute), e.g. gear_pos, location? Any suggestion on the mapping to be created for visualisation in Kibana?

When I create the visualisation, it says that the index pattern doesn't have any compatible geo_point field.

I can't tell. I don't know what commands you ran exactly.

I created the Elasticsearch index with the mappings below via Postman.
After issuing this command I send the data, but the data isn't getting indexed in Elasticsearch.

HTTP PUT https://elasticsearch-url/vehicles/?include_type_name=false

{
  "mappings": {
    "properties": {
      "acceleration": {
        "type": "long"
      },
      "accelerator_pedal_position": {
        "type": "float"
      },
      "latitude": {
        "type": "float"
      },
      "location": {
        "type": "geo_point"
      },
      "longitude": {
        "type": "float"
      },
      "name": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "timestamp": {
        "type": "long",
        "copy_to": "datetime"
      },
      "datetime": {
        "type": "date",
        "store": true
      },
      "trip_id": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "value": {
        "type": "float"
      },
      "vehicle_id": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      }
    }
  }
}

What did you do exactly?

I meant that the data source was started, so Elasticsearch started receiving data. However, it isn't getting indexed.

However, if I don't create the index manually (as described above) prior to sending the data, an index is created automatically by Elasticsearch and the location field is mapped as "text" instead of geo_point.

If I manually reindex, I can get the data indexed.

What else could be going wrong here?

I can't help, I'm afraid, unless you share exactly what you are doing. Try to create a minimal script which reproduces the problem.

I'm afraid we can't help more.

The issue was that the timestamp field was coming in a different format and got auto-mapped as "string". After I resolved that, the data is getting indexed.
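
For reference, the timestamp can be mapped as a date with an explicit format matching the samples above, roughly like this (the format string here is just an illustration):

"timestamp": {
  "type": "date",
  "format": "yyyy-MM-dd HH:mm:ss.SSSSSSSSS"
}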

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.