Problem with geoip.location

Hi all

I'm trying to load an SSH.cef file into Elasticsearch.

I used this configuration file for Logstash:

input {
  tcp {
    # The delimiter config is needed for TCP interpretation
    codec => cef { delimiter => "\r\n" }
    port => 5000
    type => syslog
  }
}

filter {

  # Map the attacker's geo IP if possible
  geoip {
    source => "sourceAddress"
  }

  # Map the target's geo IP if possible
  geoip {
    source => "destinationAddress"
  }

  # Map the log-producing device's geo IP if possible
  geoip {
    source => "deviceAddress"
  }

  # Map startTime to the @timestamp field
  date {
    match => ["startTime", "MMM dd yyyy HH:mm:ss"]
  }
}

output {
  elasticsearch {
    # ELK server host
    hosts => ["server:9200"]
    index => "cef-ssh-%{+YYYY.MM.dd}"
  }
}

Then I started Logstash and saw no errors. I loaded the data with the command

cat SSH.cef | nc localhost 5000

and the Elasticsearch log was OK.

Then I started Kibana and created a new index pattern. If I look at the data in Discover I see this:

[screenshot]

And I cannot use the Coordinate Map because of this error message.

Could you help me understand where the error is?
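If it is useful, the field types that Elasticsearch actually assigned can be inspected from Dev Tools with a request like this (index name taken from my output config above; as far as I know the geoip filter puts its results under a geoip field by default):

```
GET cef-ssh-*/_mapping/field/geoip.location
```

If location comes back as something other than geo_point, that would explain why the Coordinate Map refuses the field.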

A sample line:

CEF:0|Unix|Unix|5.0|cowrie.session.connect|New connection: 192.168.1.105:60740 (192.168.1.105:2222) [session: 6e99ac86]|Unknown|externalId=1 startTime=Nov 15 2016 19:18:21 destinationHostName=elastic_honeypot destinationAddress=192.168.20.2 deviceReceiptTime=Nov 15 2016 19:18:21 deviceTimeZone=Z transportProtocol=TCP applicationProtocol=SSHv2 destinationServiceName=sshd devicePayloadId=1 message=New connection: 192.168.1.105:60740 (192.168.1.105:2222) [session: 6e99ac86] destinationAddress=192.168.1.105 destinationTranslatedAddress=192.168.1.105 deviceTranslatedAddress=192.168.1.105 deviceAddress=192.168.1.105 destinationTranslatedPort=2222 destinationPort=2222 categoryOutcome=None categoryBehaviour=cowrie.session.connect sourceTranslatedAddress=192.168.1.105 sourceAddress=192.168.1.105 sourceTranslatedPort=60740 sourcePort=60740 deviceDirection=1 cs1=0 cs1Label=isError cs2=cowrie.ssh.factory.CowrieSSHFactory cs2Label=system cs4=6e99ac86 cs4Label=session

Thank you
Franco

I think the problem is with the template that your index is using. If you look at this geoip mapping you will see that ip and location have types different from yours...

AA1

Correct Badger

I did the same check, but I don't use a template for indexing, so I don't understand how I could configure geoip to be mapped differently.

I set this in the Logstash configuration file:

geoip {
  source => "destinationAddress"
}

where destinationAddress is the field that contains the IP address.
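As far as I understand, with default settings the geoip filter stores its lookup results under a geoip field on the event, so an enriched document should look roughly like this (values are illustrative only, not from my data):

```
{
  "destinationAddress": "8.8.8.8",
  "geoip": {
    "ip": "8.8.8.8",
    "country_name": "United States",
    "latitude": 37.751,
    "longitude": -97.822,
    "location": { "lat": 37.751, "lon": -97.822 }
  }
}
```

Without a template, I suppose Elasticsearch guesses the types of these fields dynamically.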

I don't set any template in the output:

output {
  elasticsearch {
    hosts => ["server:9200"]
    index => "cef-ssh-%{+YYYY.MM.dd}"
  }
}

OK, if you don't have a template then I would not expect it to work. Try writing some of those events to an index that starts with logstash-, e.g. logstash-justtesting (so that you can delete it). I would then expect that the default template would match and that template defines geoip.
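For example, a throwaway output along these lines (the host is yours; the index name is just an example you can delete afterwards):

```
output {
  elasticsearch {
    hosts => ["server:9200"]
    index => "logstash-justtesting"
  }
}
```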

You will need to install a template that matches your index name. Something like this...

PUT _template/ssh-cef
{
  "order": 1,
  "version": 10,
  "index_patterns": [
    "cef-ssh-*"
  ],
  "mappings": {
    "_default_": {
      "properties": {
        "geoip": {
          "dynamic": true,
          "properties": {
            "ip": {
              "type": "ip"
            },
            "location": {
              "type": "geo_point"
            },
            "latitude": {
              "type": "half_float"
            },
            "longitude": {
              "type": "half_float"
            }
          }
        }
      }
    }
  }
}
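After the PUT you can confirm the template was stored. Note that a template only affects indices created after it is installed, so you will need to delete and re-create your index for it to take effect:

```
GET _template/ssh-cef
```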

Thanks Badger.
I tried to create the template like this:

{
"order": 100,
"template": "test-*",
"settings": {
"index": {
"number_of_shards": "3",
"number_of_replicas": "0"
}
},
"doc": {
"mappings": {
"properties": {
"destinationPort": {
"type": "integer"
},
"flexDate1": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"sourcePort": {
"type": "integer"
},
"baseEventCount": {
"type": "integer"
},
"destinationAddress": {
"type": "ip"
},
"destinationProcessId": {
"type": "integer"
},
"oldFileSize": {
"type": "integer"
},
"destination": {
"dynamic": true,
"properties": {
"city_name": {
"type": "keyword"
},
"country_name": {
"type": "keyword"
},
"ip": {
"type": "ip"
},
"location": {
"type": "geo_point"
},
"latitude": {
"type": "half_float"
},
"longitude": {
"type": "half_float"
},
"region_name": {
"type": "keyword"
}
}
},
"source": {
"dynamic": true,
"properties": {
"city_name": {
"type": "keyword"
},
"country_name": {
"type": "keyword"
},
"ip": {
"type": "ip"
},
"location": {
"type": "geo_point"
},
"latitude": {
"type": "half_float"
},
"longitude": {
"type": "half_float"
},
"region_name": {
"type": "keyword"
}
}
},
"deviceReceiptTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"destinationTranslatedPort": {
"type": "integer"
},
"deviceTranslatedAddress": {
"type": "ip"
},
"deviceAddress": {
"type": "ip"
},
"agentReceiptTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"startTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"sourceProcessId": {
"type": "integer"
},
"bytesIn": {
"type": "integer"
},
"bytesOut": {
"type": "integer"
},
"severity": {
"omit_norms": true,
"type": "text"
},
"deviceProcessId": {
"type": "integer"
},
"agentAddress": {
"type": "ip"
},
"sourceAddress": {
"type": "ip"
},
"sourceTranslatedPort": {
"type": "integer"
},
"deviceCustomDate2": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"deviceCustomDate1": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"flexNumber1": {
"type": "long"
},
"deviceCustomFloatingPoint1": {
"type": "float"
},
"oldFileModificationTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"deviceCustomFloatingPoint2": {
"type": "float"
},
"oldFileCreateTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"deviceCustomFloatingPoint3": {
"type": "float"
},
"sourceTranslatedAddress": {
"type": "ip"
},
"deviceCustomFloatingPoint4": {
"type": "float"
},
"flexNumber2": {
"type": "long"
},
"fileCreateTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"fileModificationTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"fileSize": {
"type": "integer"
},
"destinationTranslatedAddress": {
"type": "ip"
},
"endTime": {
"format": "epoch_millis||epoch_second||date_time||MMM dd yyyy HH:mm:ss",
"type": "date"
},
"deviceCustomNumber1": {
"type": "long"
},
"deviceDirection": {
"type": "integer"
},
"device": {
"dynamic": true,
"properties": {
"city_name": {
"type": "keyword"
},
"country_name": {
"type": "keyword"
},
"ip": {
"type": "ip"
},
"location": {
"type": "geo_point"
},
"latitude": {
"type": "half_float"
},
"longitude": {
"type": "half_float"
},
"region_name": {
"type": "keyword"
}
}
},
"deviceCustomNumber3": {
"type": "long"
},
"deviceCustomNumber2": {
"type": "long"
},
"categoryOutcome": {
"type": "keyword"
},
"destinationHostName": {
"type": "keyword"
},
"destinationAddress": {
"type":"ip"
}
}
}
},
"aliases": {}
}

I included this template in the Logstash configuration file, but I get the same result.

Thank you
Franco

Does the third line of the template match the name of the index you are writing to?

My new index name is test-ssh-*. The Logstash output configuration is:

output {
  elasticsearch {
    hosts => ["server:9200"]
    template_name => "test"
    template => "./test_template.json"
    template_overwrite => true
    index => "test-ssh-%{+YYYY.MM.dd}"
  }
}

I cannot solve this issue, but ... If you go to Dev Tools and GET _template/* you will see that what actually got installed is

  "test": {
    "order": 100,
    "index_patterns": [
      "test-*"
    ],
    "settings": {
      "index": {
        "number_of_shards": "3",
        "number_of_replicas": "0"
      }
    },
    "mappings": {},
    "aliases": {}
  }
Your template has a doc containing a mappings containing properties. It should have a mappings containing some named objects, e.g. doc, which then contain properties. However, I cannot get the location fields to be geo_points using that mappings / doc / properties / device / properties / location structure either. What does work is having a second mapping like this, but it does not solve the general problem.
"mappings": {
  "_default_": {
    "dynamic_templates": [
      {
        "locations": {
          "match": "location",
          "mapping": {
            "type": "geo_point"
          }
        }
      }
    ]
  },
  "doc": {
    "properties": {

Hi Badger,
thank you for the suggestion.

I added the "_default_" field to the JSON file like this:

"mappings": {

 "_default_": {
    "dynamic_templates": [
      {
        "message_field": {
          "path_match": "message",
          "match_mapping_type": "string",
          "mapping": {
            "norms": false,
            "type": "text"
          }
        }
      },
      {
        "string_fields": {
          "match": "*",
          "match_mapping_type": "string",
          "mapping": {
            "fields": {
              "keyword": {
                "ignore_above": 256,
                "type": "keyword"
              }
            },
            "norms": false,
            "type": "text"
          }
        }
      }
    ],

I obtained this file by looking at the mappings of other Logstash indices. Then I deleted the index pattern from Kibana and the index from Elasticsearch too.
I ran Logstash again with the modified configuration file and it works!
[screenshot]
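For anyone hitting the same issue, the result can be double-checked in Dev Tools; with the index names used above, a field-mapping request like the following should now report geo_point for the location fields:

```
GET test-ssh-*/_mapping/field/*.location
```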
Thank you for the exchange of information
Franco

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.