Hi,

I'm new to Elasticsearch. My use case is to load a CSV file containing agencies with geo locations; each line looks like:

id;label;address;zipcode;city;region;latitude;longitude;(and some other fields)

I'd like to index the location's .lat and .lon values from the latitude and longitude fields. I tried the copy_to feature with no success:

"latitude": {
  "type": "double",
  "copy_to": "location.lat"
},
"longitude": {
  "type": "double",
  "copy_to": "location.lon"
},

Is there any way to feed the "location" property from the latitude and longitude fields at indexing time? My point is that I don't want to modify the input CSV file to adapt it to the GeoJSON format (i.e. concatenate lat and lon into one field in the CSV file).

Thank you for any hints.
Pascal.
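For reference: copy_to copies a field's values into another indexed field, but it does not rewrite the document source, so it cannot assemble the {lat, lon} structure a geo_point needs. A minimal mapping sketch (the "agency" type name is an assumption) would declare location as a geo_point and leave it to the indexing pipeline to supply the coordinates:

"mappings": {
  "agency": {
    "properties": {
      "latitude":  { "type": "double" },
      "longitude": { "type": "double" },
      "location":  { "type": "geo_point" }
    }
  }
}

A document indexed with "location": { "lat": 48.85, "lon": 2.35 } would then be searchable with geo queries; the open question, which the replies below address, is how to build that object from the two CSV columns.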
On Mon, Apr 7, 2014 at 9:38 AM, Alexander Reelsen <alr@spinscale.de> wrote:

Hey,

I don't know about your stack, but maybe Logstash would be a good idea to add in there. It is more flexible than the CSV river and features a CSV input as well. You can easily change the structure of the data you want to index. This is how the Logstash filter would look:

if [latitude] and [longitude] {
  mutate {
    rename => [ "latitude", "[location][lat]", "longitude", "[location][lon]" ]
  }
}

I am currently working on a blog post on how to utilize Elasticsearch, Logstash and Kibana on CSV-based data, and hope to release it soonish on the .org blog - it covers exactly this. Stay tuned!

--Alex
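For context, a complete pipeline built around that filter might look like the sketch below (Logstash 1.4-era syntax; the file path, column list, and index name are assumptions for illustration). The convert step matters because the csv filter emits every column as a string, while geo_point coordinates must be numeric:

input {
  file {
    path => "/data/agencies.csv"   # hypothetical input file
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ";"
    columns => [ "id", "label", "address", "zipcode", "city", "region", "latitude", "longitude" ]
  }
  if [latitude] and [longitude] {
    mutate {
      # make the coordinates numeric before restructuring them
      convert => [ "latitude", "float", "longitude", "float" ]
      rename  => [ "latitude", "[location][lat]", "longitude", "[location][lon]" ]
    }
  }
}

output {
  elasticsearch {
    host  => "localhost"
    index => "agencies"            # hypothetical index name
  }
}

Renaming only restructures the event, though; the target index still needs location mapped as geo_point (see the template sketch at the end of the thread), otherwise Elasticsearch will dynamically map it as a plain object of doubles.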
On Thursday, April 10, 2014 2:42:22 AM UTC+5:30, Pascal VINCENT wrote:

Hi,

I have included Logstash in my stack and started to play with it. I'm sure it can do the trick I was looking for, and much more.

Thank you ...
[waiting for your blog post :)]
Pascal.
But when I check the location field type, it is not created as geo_point, and when I try to run a geo_point search I get the error below:

QueryParsingException[[logstash-2014.09.11] failed to find geo_point field [location1]];

Can you help me resolve this?
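The usual cause: renaming fields in Logstash only changes the shape of the documents, not the index mapping, and dynamic mapping types a {lat, lon} object as two plain doubles rather than a geo_point. A sketch of an index template that would declare the field up front (the template name is an assumption, and it presumes daily logstash-* indices and a field named location):

curl -XPUT 'localhost:9200/_template/logstash_geo' -d '{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}'

A template only applies to indices created after it is installed, so existing daily indices would need to be reindexed or dropped. Note also that the error names [location1], not [location]: the query has to reference the field that actually exists in the mapping.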