Elasticsearch Error: java.lang.IllegalArgumentException

Hi,

Several days ago, my filebeat agent started hiccuping with the following error:

2018-03-30T14:02:55.027-0400    DEBUG   [elasticsearch] elasticsearch/client.go:507     Bulk item insert failed (i=0, status=500): {"type":"exception","reason":"java.lang.IllegalArgumentException: java.lang.IllegalArgumentException: field [destination-ip] not present as part of path [json.destination-ip]","caused_by":{"type":"illegal_argument_exception","reason":"java.lang.IllegalArgumentException: field [destination-ip] not present as part of path [json.destination-ip]","caused_by":{"type":"illegal_argument_exception","reason":"field [destination-ip] not present as part of path [json.destination-ip]"}},"header":{"processor_type":"geoip"}}

It had been working fine for about a month until then. I don't remember changing anything either, so I'm at a loss as to what would cause such an error.

I've tried going into /etc/filebeat/filebeat.yml and lowering the bulk_max_size to 10.
I've also tried setting json.keys_under_root: true.
I've also tried disabling the pipeline: geoip setting.

None of the above worked, unfortunately.

Does anyone happen to have any insight as to what causes this error and perhaps some troubleshooting methods to fix it?

I'd like to keep all of the data shipped during the past month in the index, but if I have to create a new index or scrap the data, so be it.
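
In case it helps anyone troubleshoot, the pipeline definition that filebeat is targeting can be dumped straight from the cluster (using my Elasticsearch host from the output config below):

# prints the installed geoip ingest pipeline as JSON
curl -XGET 'http://10.10.10.5:9200/_ingest/pipeline/geoip?pretty'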

My /etc/filebeat/filebeat.yml is below:

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

- type: log

  enabled: true
  paths:
    - /var/log/data/data-out/*
  json.add_error_key: true
  json.message_key: log
  #json.keys_under_root: true

  pipeline: geoip

#============================= Filebeat modules ===============================

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

#==================== Elasticsearch template setting ==========================

setup.template.enabled: false
setup.template.name: "my-filebeat"
setup.template.pattern: "my-filebeat-*"

#================================ General =====================================

name: my-filebeat

#================================ Outputs =====================================

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  hosts: ["10.10.10.5:9200"]

  protocol: "http"

  index: "u2elk"

  pipeline: geoip

  bulk_max_size: 10

#================================ Logging =====================================

logging.level: debug

Example data being shipped:
{"dport-icode": 1000, "vlan-id": null, "destination-ip": "10.255.12.21", "source-ip": "8.8.4.4", "sensor-id": "5, "event-second": 1519907325, "data": "TEST DATA", "data-printable": "test data", "msg": "Firewall Client long host entry", "event-microsecond": 341757}

Example event being published (from the debug log):

  "prospector": {
    "type": "log"
  }
}
2018-03-30T14:20:51.457-0400    DEBUG   [event] common/event.go:55      Dropped nil value from event where key=json.vlan-id
2018-03-30T14:20:51.457-0400    DEBUG   [publish]       pipeline/processor.go:275       Publish event: {
  "@timestamp": "2018-03-30T18:20:51.455Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.2.3",
    "pipeline": "geoip"
  },
  "beat": {
    "name": "my-filebeat",
    "hostname": "test-VM",
    "version": "6.2.3"
  },
  "source": "/var/log/data/data-out/events.json-20183827",
  "offset": 1612599,
  "json": {
    "classification": "Attempted User Privilege Gain",
    "event-microsecond": 341757,
    "dport-icode": 1000,
    "error": {
      "type": "json",
      "message": "Key 'log' not found"
    },
    "msg": "Firewall Client long host entry",
    "source-ip": "8.8.4.4",
    "data": "TEST DATA",
    "sensor-id": "5",
    "log": "",
    "destination-ip": "10.255.15.27",
    "event-second": 1519907325,
    "data-printable": "test data"
  },
  "prospector": {
    "type": "log"
  }
}

Hello @vopsec, I think you are using a custom ingest pipeline. From what I can see, this error is coming from the ingest node and the geoip processor: some of your data probably doesn't have the destination-ip field, and by default the processor will hard-fail when the field is missing.

You can either make sure your data always has that field, if possible, or ignore the error by setting the ignore_missing option to true.
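
If you want to reproduce this without shipping real events, the simulate API should show the same failure. Something like the following (a test document that deliberately omits json.destination-ip, modelled on your sample data) will hard-fail against your current pipeline, and should succeed once ignore_missing is set:

# run the geoip pipeline against an inline test document
curl -H "Content-Type: application/json" -XPOST 'http://10.10.10.5:9200/_ingest/pipeline/geoip/_simulate?pretty' -d '
{
  "docs": [
    { "_source": { "json": { "source-ip": "8.8.4.4" } } }
  ]
}'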

Bless you, pierhugues! That did the trick! Thank you for the quick reply! I appreciate it!

I'm still trying to learn the jargon of Elasticsearch and its inner workings. Would you happen to have a theory as to why this would happen all of a sudden after working fine for a month? All the data now flooding in contains json.destination-ip.

Do plugins such as ingest-geoip update automatically?

For posterity, I did:
curl -H "Content-Type: application/json" -XPUT 'http://10.10.10.5:9200/_ingest/pipeline/geoip?pretty' -d @new-geoip-plugin.json
with new-geoip-plugin.json containing:

{
  "description" : "Add geoip info",
  "processors" : [
    {
      "geoip" : {
        "field" : "json.destination-ip",
        "target_field" : "geoip.destination-ip",
        "ignore_missing": true
      }
    },
    {
      "geoip" : {
        "field" : "json.source-ip",
        "target_field" : "geoip.source-ip",
        "ignore_missing": true
      }
    }
  ]
}

Good!

I would guess the producers changed their format, if nothing was changed on Elasticsearch or Filebeat. That's the only thing I can think of.

Plugins have to be updated when you update Elasticsearch; new versions of the plugins are released at the same time as ES.
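
If you want to double-check what is installed, the cat plugins API lists every node's plugins with their versions:

# shows plugin name and version per node, e.g. ingest-geoip
curl -XGET 'http://10.10.10.5:9200/_cat/plugins?v'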

How bizarre...

Did you do an update on Filebeat before the problems started?

I did not, at least not manually. Hmmm...
