Migrating log sources from Elasticsearch to Logstash

I have 12 log sources running Packetbeat and Winlogbeat; some additionally run Filebeat and Metricbeat. They currently feed directly into a single Elasticsearch instance whose processor is running at ~95%.

I am thinking of offloading all log ingestion and enrichment (such as the geoip pipeline running on the Elasticsearch node) to a single Raspberry Pi running a Logstash pipeline; the Pi is currently sitting at much lower compute utilization.

I could not find a pipeline configuration for routing Winlogbeat and Packetbeat data through Logstash. Could someone guide me to example configurations?
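From the Beats documentation my understanding is that each shipper only needs its output repointed from Elasticsearch to Logstash, along the lines of the snippet below; the hostname and port are placeholders for my setup, not tested:

# winlogbeat.yml -- the same output block would go into packetbeat.yml,
# filebeat.yml and metricbeat.yml. Beats allow only one output at a time,
# so the existing output.elasticsearch section has to be disabled:
#output.elasticsearch:
#  hosts: ["http://my-es-node:9200"]

output.logstash:
  hosts: ["my-raspberry-pi:5044"]   # placeholder host; 5044 is the conventional Beats port

What I am missing is the Logstash pipeline on the receiving end.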

Here is the configuration of the geoip pipeline in Elasticsearch:

{
  "description": "Add geoip info",
  "processors": [
    {
      "geoip": {
        "field": "client.ip",
        "target_field": "client.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "source.ip",
        "target_field": "source.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "destination.ip",
        "target_field": "destination.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "server.ip",
        "target_field": "server.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "host.ip",
        "target_field": "host.geo",
        "ignore_missing": true
      }
    }
  ]
}

How do I create a similar configuration in Logstash?

I have already copied the pipeline configuration from Elasticsearch above. How do I configure batching so that logs are forwarded to Elasticsearch in fewer, larger requests, to help reduce the compute load on the Elasticsearch node?

If anyone here feels there will be no net compute savings in directing logs to Logstash first and later moving them to Elasticsearch in batches, please do let me know.

Thank you for the guidance.

The Logstash equivalent of that would be:

geoip {
    source => "[server][ip]"    # Logstash uses [bracketed] field references instead of dotted names
    target => "[server][geo]"
    tag_on_failure => []        # don't tag events whose field is missing -- the counterpart of ignore_missing
}
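
The same block is repeated once per field (client.ip, source.ip, destination.ip, server.ip, host.ip). Pulled together with a Beats input and an Elasticsearch output, a minimal end-to-end pipeline could look like the sketch below; the port, the Elasticsearch address, and the index pattern are assumptions you will need to adapt:

input {
  beats {
    port => 5044                 # Winlogbeat, Packetbeat, Filebeat and Metricbeat all ship here
  }
}

filter {
  # One geoip filter per field, mirroring the ingest pipeline.
  geoip { source => "[client][ip]"      target => "[client][geo]"      tag_on_failure => [] }
  geoip { source => "[source][ip]"      target => "[source][geo]"      tag_on_failure => [] }
  geoip { source => "[destination][ip]" target => "[destination][geo]" tag_on_failure => [] }
  geoip { source => "[server][ip]"      target => "[server][geo]"      tag_on_failure => [] }
  geoip { source => "[host][ip]"        target => "[host][geo]"        tag_on_failure => [] }
}

output {
  elasticsearch {
    hosts => ["http://my-es-node:9200"]                     # placeholder address
    index => "%{[@metadata][beat]}-%{[@metadata][version]}" # keep the per-Beat index naming
  }
}

If the ingest pipeline is attached to your indices as a default pipeline, detach it once Logstash takes over, otherwise the geoip lookups will run twice.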

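As for batching: Logstash already buffers events and sends them to Elasticsearch in bulk requests, sized by the pipeline batch settings, so there is nothing to add to the pipeline configuration itself. To make the bulks larger and less frequent you can raise the batch settings in logstash.yml; the values below are illustrative, not recommendations:

# logstash.yml
pipeline.batch.size: 512    # events per worker per batch (default 125)
pipeline.batch.delay: 500   # ms to wait for a full batch before flushing (default 50)
pipeline.workers: 2         # roughly match the Pi's cores; more workers mean more in-flight batches

Larger batches trade memory and latency for fewer requests, so keep an eye on the Pi's heap usage when raising them.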