How to update the type of a field


#1

I am having the same problem as listed at this link

I first attempted to update the logstash template as described, but that replaced the entire template with just the location field, and writes from Logstash to Elasticsearch began to fail.

After being unable to recover it, I deleted my entire Elasticsearch data directory.

I then re-read the last post more closely; it points out that the geoip.location field is already mapped as a geo_point type in the default logstash template.

So, I updated Logstash to write to logstash-netflow-date and logstash-syslog-date indices, and updated my geoip filters so that the location field would fall under a parent named geoip:

if [type] == "netflow" {
    geoip {
        source => "[netflow][ipv4_src_addr]"
        target => "[netflow][src][geoip]"
    }
    geoip {
        source => "[netflow][ipv4_dst_addr]"
        target => "[netflow][dst][geoip]"
    }
}

But the location field still doesn't have a type. It just has two float children, lat and lon.

How do I get the location field to be recognized as geo_point?

            "location": {
              "properties": {
                "lat": {
                  "type": "float"
                },
                "lon": {
                  "type": "float"
                }
              }
            },

#2

Take a look at this thread, which shows a couple of ways of doing the template. And possibly this one.


Logstash Geoip filter with Packetbeat
#3

Okay. My takeaway from that thread is that there are differences between the default template that I'm using and the template used by the person/people in the initial thread I linked.

But which differences matter?

Attempting this (with the obvious change of the index name) wiped the template again.

Is it supposed to be adding it to the existing template? Or should I be copying my existing template, editing this part in, and then posting it?


#4

Sorry, I didn't save my edit. See the second link in my first post. That shows how to add a second template that contains the mapping you want whilst retaining the default template.


#5

So I understand that if I PUT a template, the existing template of that name is overwritten entirely. Is that correct?

I also have the option of creating a second template and structuring my index names so that they match one or more templates, which will be merged for that index.
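For reference, the templates that will be merged for a given index can be inspected before writing a new one (a sketch; the default Logstash template is usually installed as logstash, but the name on your cluster may differ):

GET _template/logstash
GET _template/logstash-netflow

Templates whose index_patterns match a newly created index are merged, with higher order values taking precedence where settings conflict.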

With that information, I tried the following:

PUT _template/logstash-netflow
{
  "order":0,
  "index_patterns":"logstash-netflow*",
  "mappings": {
    "calls": {
      "dynamic_templates": [
        {
          "location_as_geopoint": {
            "match": "*location",
            "mapping": {
              "type": "geo_point"
            }
          }
        }
      ]
    }
  }
}

However, this failed with the following error in the Logstash logs:

"reason"=>"Rejecting mapping update to [logstash-netflow-2018.08.01] as the final mapping would have more than 1 type: [calls, doc]"

This hinted that the mapping type name in my second template had to match the type name (doc) used by the existing template:
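To confirm which type name the default template uses, its mappings can be inspected (a sketch, assuming the default Logstash template is installed under the name logstash; filter_path just trims the response):

GET _template/logstash?filter_path=*.mappings

With Logstash 6.x the default template defines a single doc type, and since an index can only have one mapping type, any additional template matching the same index must use that same type name.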

PUT _template/logstash-netflow
{
  "order":0,
  "index_patterns":"logstash-netflow*",
  "mappings": {
    "doc": {
      "dynamic_templates": [
        {
          "location_as_geopoint": {
            "match": "*location",
            "mapping": {
              "type": "geo_point"
            }
          }
        }
      ]
    }
  }
}

After deleting my existing index, this worked: all fields ending in "location" were mapped as the type geo_point, as expected.
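The result can be verified with the field-mapping API (a sketch; wildcards in both the index and field names should work on 6.x):

GET logstash-netflow-*/_mapping/field/*location

Each matching field should now report "type": "geo_point" instead of an object with float children.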

Thank you!


(Robert Cowart) #6

@jmoffitt, I don't want to deter you from trying to build this all yourself, however you might want to consider looking at ElastiFlow (https://github.com/robcowart/elastiflow). It handles the lookups you are trying to get working, and A LOT more. It will at least give you examples of almost anything you would ever want to try to do with Logstash.


#7

I believe I initially used that solution, and it was great at pointing me in the right direction. However, I recall that it used the Netflow module rather than a netflow pipeline, which prevented me from using that instance of Logstash for syslogs or anything else.

In any case this is as much an exercise in learning Logstash and Elasticsearch as anything else.

Thanks!


(Robert Cowart) #8

ElastiFlow definitely doesn't use the Netflow module. The Netflow module was based on v1.0.0 of ElastiFlow and is actually quite dated at this point. The current ElastiFlow release is 3.2.1.

Since you are in a learning phase, you might also want to take a look at...

And to see a good example of using grok patterns to handle a bunch of the weird variety of syslog headers that vendors can throw at you...


(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.