I have existing logs which are being added to Elasticsearch as a single field called 'log'. A sample of the log content looks something like this:

    [2018-03-28T14:20:16.994Z] "GET /places/v1/radius?maxresults=25&offset=0&locale=eng&lat=47.164444&long=13.571355&radius=700 HTTP/1.1" 200 - 0 12 560 557 "10.240.0.4" "Apache-HttpClient/4.5.3 (Java/1.8.0_111)"
Is there any way I can extract the lat and long from this string in the existing Elasticsearch data, convert them into a new geo-point field, and visualize it in Kibana?
You could create a new pipeline that reads from Elasticsearch using the Elasticsearch Input, extracts the location using either the Grok Filter or the KV Filter, and then updates the existing documents in Elasticsearch using the Elasticsearch Output.
If the query for the input only selects documents that do not already have the lat/lon present as a geo-point, each time the pipeline runs it will avoid repeating work.
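Roughly, a pipeline along these lines could be a starting point. The index pattern my-logs-*, the target field name location, and the hosts are all placeholders for whatever your setup uses:

    input {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "my-logs-*"
        # only fetch documents that don't already have the geo-point,
        # so re-running the pipeline doesn't redo work
        query => '{ "query": { "bool": { "must_not": { "exists": { "field": "location" } } } } }'
        # keep the original index and id around so the output can update in place
        docinfo => true
      }
    }

    filter {
      # grok / kv extraction of lat and long goes here (sketched below)
    }

    output {
      elasticsearch {
        # index and document_id directives go here (sketched below)
      }
    }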
You'll need to tell Elasticsearch that the field is a geo-point: in the mapping for the existing indices, and also in the index template, so that when a new index is created it knows how to declare the type.
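For example, assuming the field ends up being called location and the indices match my-logs-* (both placeholders), something like the following from the Dev Tools console. The mapping type name (doc below) and the template API differ between versions, so adjust to yours:

    PUT my-logs-2018.03.28/_mapping/doc
    {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }

    PUT _template/my-logs
    {
      "index_patterns": ["my-logs-*"],
      "mappings": {
        "doc": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }

Adding a brand-new field to an existing index's mapping like this is allowed; the existing documents simply won't have a value for it until the pipeline writes one.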
There are several acceptable forms for the geo-point to take, but the object-form ({"lat":12.34, "lon":56.78}) may be easiest to get to from where you're at: by giving the kv filter a target, you can have it place the extracted bits together.
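Here is a sketch of that extraction, assuming the raw line lives in the log field and the new field is called location. The grok pattern is written against the sample line above, so treat it as a starting point rather than a battle-tested pattern (lines without a query string would get a _grokparsefailure tag and no location):

    filter {
      # isolate the query string from the request line
      grok {
        match => { "log" => '"%{WORD:verb} %{URIPATH:request_path}\?%{NOTSPACE:query_string} HTTP' }
      }
      # split the query string on "&", keep only lat/long,
      # and nest the extracted pair under a single object
      kv {
        source       => "query_string"
        field_split  => "&"
        include_keys => ["lat", "long"]
        target       => "location"
      }
      # geo-point expects "lon" rather than "long", and numeric values
      mutate {
        rename  => { "[location][long]" => "[location][lon]" }
        convert => { "[location][lat]" => "float", "[location][lon]" => "float" }
      }
    }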
You may need to add directives to tell the Elasticsearch output which index to put the document in (there should be an existing @metadata field carrying the original index and id when docinfo is enabled on the input) and what the document's id should be (be careful here -- you'll need to send the whole document to avoid replacing a complete document with a subset).
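For example, with docinfo enabled on the input as above (depending on the Logstash version the metadata may land under a different @metadata subfield, so check what the input actually produces before relying on these references):

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        # write each event back to the index it came from...
        index => "%{[@metadata][_index]}"
        # ...under its original id, so the document is overwritten rather than duplicated
        document_id => "%{[@metadata][_id]}"
      }
    }

Because the Elasticsearch input loads the full _source into the event, the event already carries the whole original document plus the new location field, so re-indexing it like this keeps all the existing fields intact.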