Jdbc river and geo_point from longitude and latitude

I am getting an error while creating a geo_point from longitude and latitude. It does not recognize the location field.

PUT /_river/fault/_meta
{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:postgresql://localhost:5432/gis",
        "user" : "test",
        "password" : "test",
        "sql" : [
             {
                "statement" : "select location_id as _id, latitude as \"location.lat\",longitude \"location.lon\",zone from alarms"
            }
        ],   
        "index" : "fault",
        "type" : "alert_status",
        "schedule": "0 0/1 * * * ?"
    },
    "type_mapping" : {
              "alert_status" : {
                  "properties" : {
                        "location" : { 
                            "type" : "geo_point",
                            "lat_lon" : true
                       }
                  }
              }
        }
}

How can I create the geo_point without this error? I am using PostgreSQL and Elasticsearch 1.5.1.

[2015-05-10 10:20:00,063][ERROR][river.jdbc.RiverPipeline ] java.lang.IllegalArgumentException: illegal head: location
java.io.IOException: java.lang.IllegalArgumentException: illegal head: location
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.fetch(SimpleRiverSource.java:353)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverFlow.fetch(SimpleRiverFlow.java:220)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverFlow.execute(SimpleRiverFlow.java:149)
        at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.request(RiverPipeline.java:88)
        at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.call(RiverPipeline.java:66)
        at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.call(RiverPipeline.java:30)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: illegal head: location
        at org.xbib.elasticsearch.plugin.jdbc.util.PlainKeyValueStreamListener.merge(PlainKeyValueStreamListener.java:294)
        at org.xbib.elasticsearch.plugin.jdbc.util.PlainKeyValueStreamListener.values(PlainKeyValueStreamListener.java:153)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.processRow(SimpleRiverSource.java:824)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.nextRow(SimpleRiverSource.java:777)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.merge(SimpleRiverSource.java:510)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.execute(SimpleRiverSource.java:405)
        at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.fetch(SimpleRiverSource.java:332)
        ... 9 more

You must assign IDs to the ES docs you create from the DB by using the "_id" column name.

Otherwise you will merge all rows in the result set into one ES doc, which is probably not what you want.

Also, the "type_mapping" must be moved inside the "jdbc" object and must be accompanied by an "index_settings" parameter, or it won't be used.
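
For example, a minimal sketch of that layout, with the values elided and only the nesting shown (reusing the index and type names from the request above):

{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:postgresql://localhost:5432/gis",
        "user" : "...",
        "password" : "...",
        "sql" : [ ... ],
        "index" : "fault",
        "type" : "alert_status",
        "index_settings" : {
            "index" : { ... }
        },
        "type_mapping" : {
            "alert_status" : { ... }
        }
    }
}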

When I concatenate latitude and longitude in the query, I can use the geo_point mapping, but not in the current situation,
e.g. latitude || ',' || longitude as "geopoint"

Although location does appear as a geo_point field, I cannot create a map from it, because the location field (the geo_point alias) column is blank.

My new PUT request is as follows:

PUT /_river/fault/_meta
{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:postgresql://localhost:5432/gis",
        "user" : "test",
        "password" : "test",
        "sql" : [
             {
                "statement" : "select location_id as _id, latitude as \"location.lat\",longitude \"location.lon\",latitude||','||longitude as geopoint ,zone from alarms"
            }
        ],   
        "index" : "fault",
        "type" : "alert_status",
        "schedule": "0 0/1 * * * ?",
        "index_settings" : {
            "index" : {
                "number_of_shards" : 1
            }
        },
        "type_mapping" : {
              "alert_status" : {
                  "properties" : {                       
                        "location" : { 
                            "type" : "geo_point",
                            "lat_lon" : true
                       }
                  }
              }
        }
    }    
}

I have also tried following this tutorial, but I get the same error.

There is no need to concatenate.

You can try to adapt this working example from the new upcoming "noriver" branch; it is very close to the actual river-based releases.

@jprante That solved the issue. But when I add 100 other columns, it suddenly stops working. Thanks.

Yes, the order of columns is significant for correct JSON construction.
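
A sketch of what that can look like in practice, reusing the columns from the question; the exact ordering rules here are an assumption drawn from the comment above, and other_column_1/other_column_2 are hypothetical placeholders for the additional columns:

select
    location_id as _id,               -- ID column kept first
    latitude    as "location.lat",    -- keep the two location.* columns adjacent so they
    longitude   as "location.lon",    -- merge into a single structured "location" object
    zone,
    other_column_1,                   -- hypothetical: any further columns go after the
    other_column_2                    -- structured-object pair, not between its parts
from alarms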