GeoIP based on custom field source.ip

Hi all,

I am trying to parse a log message. The original log looks like this:

Jul 10 08:51:10 prometheus sshd[19074]: Accepted password for my_user from 1.1.1.1 port 1111 ssh2

My Filebeat config is below:

---
filebeat.inputs:
  - type: filestream
    id: default-filestream
    paths:
      - ingest_data/*.log
      - /var/log/auth.log
filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true
processors:
  - add_docker_metadata: null
  - drop_event:
      when:
        not.contains:
          message: Accepted
  - dissect:
      tokenizer: "%{} %{} %{} prometheus sshd[%{pid|integer}]: Accepted password for %{service.user} from %{source.ip} port %{source.port} ssh2"
      field: "message"
      target_prefix: ""
      overwrite_keys: true

setup.kibana:
  host: ${KIBANA_HOSTS}
  username: ${ELASTIC_USER}
  password: ${ELASTIC_PASSWORD}
output.elasticsearch:
  hosts: ${ELASTIC_HOSTS}
  username: ${ELASTIC_USER}
  password: ${ELASTIC_PASSWORD}
  ssl.enabled: true
  ssl.certificate_authorities: certs/ca/ca.crt
  pipeline: geoip-info
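
For reference, a rough sketch of the fields the dissect processor should produce for the sample line above (the empty %{} keys are discarded; the exact nesting may differ slightly, and source.port stays a string because it has no |integer modifier):

{
  "pid": 19074,
  "service": { "user": "my_user" },
  "source": { "ip": "1.1.1.1", "port": "1111" }
}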

I would like to have all the fields from the GeoIP enrichment.
The enrichment comes from the default geoip ingest processor.

Basically, I plan to do the geolocation based on the custom source.ip field dissected from the log message.

What am I doing wrong?

The error seems to be:

{\\\\\\\"type\\\\\\\":\\\\\\\"illegal_argument_exception\\\\\\\",\\\\\\\"reason\\\\\\\":\\\\\\\"pipeline with id [geoip-info] does not exist\

You need to create a "geoip-info" pipeline:

PUT _ingest/pipeline/geoip-info
{
  "description": "Add geoip info",
  "processors": [
    {
      "geoip": {
        "field": "client.ip",
        "target_field": "client.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "database_file": "GeoLite2-ASN.mmdb",
        "field": "client.ip",
        "target_field": "client.as",
        "properties": ["asn", "organization_name"],
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "source.ip",
        "target_field": "source.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "database_file": "GeoLite2-ASN.mmdb",
        "field": "source.ip",
        "target_field": "source.as",
        "properties": ["asn", "organization_name"],
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "destination.ip",
        "target_field": "destination.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "database_file": "GeoLite2-ASN.mmdb",
        "field": "destination.ip",
        "target_field": "destination.as",
        "properties": ["asn", "organization_name"],
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "server.ip",
        "target_field": "server.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "database_file": "GeoLite2-ASN.mmdb",
        "field": "server.ip",
        "target_field": "server.as",
        "properties": ["asn", "organization_name"],
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "host.ip",
        "target_field": "host.geo",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "server.as.asn",
        "target_field": "server.as.number",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "server.as.organization_name",
        "target_field": "server.as.organization.name",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "client.as.asn",
        "target_field": "client.as.number",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "client.as.organization_name",
        "target_field": "client.as.organization.name",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "source.as.asn",
        "target_field": "source.as.number",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "source.as.organization_name",
        "target_field": "source.as.organization.name",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "destination.as.asn",
        "target_field": "destination.as.number",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "destination.as.organization_name",
        "target_field": "destination.as.organization.name",
        "ignore_missing": true
      }
    }
  ]
}
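
Once the pipeline exists, you can verify the enrichment in Kibana Dev Tools with the simulate API before pointing Filebeat at it (a minimal sketch; 8.8.8.8 is just an arbitrary public IP used for illustration, since private addresses return no geo data):

POST _ingest/pipeline/geoip-info/_simulate
{
  "docs": [
    {
      "_source": {
        "source": { "ip": "8.8.8.8" }
      }
    }
  ]
}

If the response contains source.geo (and source.as), the pipeline itself works and any remaining problem is on the Filebeat side.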

Thank you, I did! I added the pipeline, but even though it exists, the Filebeat logs still say it doesn't, as shown in the error above. So the pipeline is in Kibana/Elasticsearch but is not seen by Filebeat.
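
One thing worth ruling out is that Filebeat is writing to a different cluster than the one where the pipeline was created. A quick check from the Filebeat host (a sketch, assuming curl is available there; replace https://es01:9200 with one of the entries from ELASTIC_HOSTS and use the same CA and credentials Filebeat is configured with):

# hypothetical host es01:9200 stands in for one of the ELASTIC_HOSTS entries
curl --cacert certs/ca/ca.crt \
  -u "$ELASTIC_USER:$ELASTIC_PASSWORD" \
  "https://es01:9200/_ingest/pipeline/geoip-info"

If this returns the pipeline definition, the id matches on the cluster Filebeat writes to; a 404 means the pipeline is not on that cluster (or the id differs).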

I am retrieving the source.ip from the log file. Could the order be the problem?

In my case I am:

  1. taking the log
  2. retrieving source.ip
  3. sending it for geolocation

Should the order be different for the pipeline?

  1. take the log
  2. do geolocation
  3. retrieve source.ip

Is the geolocation skipped because the field doesn't exist yet at the moment the log is taken?
