Geo_point in logstash

Hello,

Currently, to define a field as geo_point, I use the following command in the Kibana Dev Tools:

PUT t5
{
  "mappings": {
    "dataa": {
      "properties": {
        "G": {
          "type": "geo_point"
        }
      }
    }
  }
}

But I need to define this mapping for every index, and I am creating a new index every day.

My sample .csv:

A,B,C,D,E,F,G
A1,2017/09/16 00:00:00,3U0604,2017/09/16 19:30:00,2017/09/16 19:30:00,1,"26,54"
A2,2017/09/16 00:00:00,3U0604,2017/09/16 00:00:00,2017/09/16 19:30:00,2,"27,54"

And my config file:

input {
	file {
		# on Windows, the file input needs forward slashes in paths
		path => "E:/Local_Elasticsearch/logstashv5/everydaylog/*.csv"
		start_position => "beginning"
	}
}

filter {
  csv {
      separator => ","
      columns => ["A","B","C","D","E","F","G"]
  }
  mutate { convert => ["F", "integer"] }

  # the sample data uses slashes, so match yyyy/MM/dd rather than YYYY-MM-dd
  date {
      match => [ "B", "ISO8601", "yyyy/MM/dd HH:mm:ss" ]
      target => "B"
      locale => "en"
    }

  date {
      match => [ "D", "ISO8601", "yyyy/MM/dd HH:mm:ss" ]
      target => "D"
      locale => "en"
    }

  date {
      match => [ "E", "ISO8601", "yyyy/MM/dd HH:mm:ss" ]
      target => "E"
      locale => "en"
    }
  }

output {
	elasticsearch {
		hosts => "localhost"
		index => "testindex-%{+YYYYMMdd}"
	}
	stdout {}
}

In this case, how can I automatically define field "G" as geo_point for every day's data file?

Thanks.

Create a template :slight_smile:

GeoIP in the Elastic Stack - Elasticsearch, Logstash, Ingest API | Elastic Blog has some guidance on this, in particular the Custom Index Names section. But let us know if you get stuck!
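
For example, a minimal sketch of such a template (the template name geo_template is a placeholder, and the index pattern testindex-* is an assumption based on your config — adjust both to your naming; on Elasticsearch 6.x use "index_patterns" instead of "template"):

PUT _template/geo_template
{
  "template": "testindex-*",
  "mappings": {
    "dataa": {
      "properties": {
        "G": { "type": "geo_point" }
      }
    }
  }
}

Every new testindex-YYYYMMdd index that matches the pattern will then get the geo_point mapping for "G" automatically, with no per-index PUT.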

Hi, I am not able to set the field's data type to geo_point from the Logstash config file.

Following one of the threads, I also tried changing the Elasticsearch JSON templates. I have 3 JSON files:
elasticsearch-template-es2x
elasticsearch-template-es5x
elasticsearch-template-es6x

This is the updated part of the template file:

 "properties" : {
        "@timestamp": { "type": "date", "include_in_all": false },
        "@version": { "type": "keyword", "include_in_all": false },
        "geoip"  : {
          "dynamic": true,
          "properties" : {
            "ip": { "type": "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "half_float" },
            "longitude" : { "type" : "half_float" }
          }
        },
		"location" : { "latlong" : "geo_point" }
      }

But the value still shows up as a string in Kibana.

My data file:

dilip,30,"34,3"
d2,31,"35,3"
d4,32,"44,3"
d5,35,"34,23"

And my config file:

input {
	file {
		# forward slashes for the path, and "NUL" instead of "/dev/null" on Windows
		path => "E:/Local_Elasticsearch/logstashv5/latllongtest/*.csv"
		start_position => "beginning"
		sincedb_path => "NUL"
	}
}

filter {
	csv {
		separator => ","
		columns => [ "name","amount","latlong" ]
	}
	mutate { convert => ["amount", "integer"] }
}

output {
	elasticsearch {
		hosts => "localhost"
		index => "lattest2"
	}
	stdout {}
}
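
As an aside, instead of editing the bundled elasticsearch-template-es*x files, the elasticsearch output plugin can install a custom template for you via its template options — a sketch, assuming you have saved a template JSON somewhere on disk (the path and template name below are placeholders):

output {
	elasticsearch {
		hosts => "localhost"
		index => "lattest2"
		template => "E:/Local_Elasticsearch/templates/latlong-template.json"
		template_name => "latlong"
		template_overwrite => true
	}
}

This way the template is (re)installed whenever Logstash starts, and the bundled template files stay untouched.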

What does your full template look like? Did you create a new index? What does an example document from that index look like? What do the actual mappings of that index look like?

Hi, I was able to solve it with this:

PUT /_template/templlatlong
{
  "order": 0,
  "template": "*",
  "settings": {},
  "mappings": {
    "_default_": {
      "properties": {
        "latlong": {
          "type": "geo_point"
        }
      }
    }
  }
}

This creates the latlong column as a geo_point at index-creation time. Thanks!
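
You can confirm the mapping took effect on a freshly created index (index name taken from the config above):

GET lattest2/_mapping

The response should show "latlong": { "type": "geo_point" }. Note that "template": "*" applies this mapping to every new index; narrowing the pattern (e.g. "lattest*") keeps it from leaking into unrelated indices.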

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.