Split a field in Elasticsearch

Heya @Tim_Rice, my simulated pipeline is indeed wrong :). Looking at how the data is structured, it seems to be of the form

{"event_data":{"ruleName":"technique_id=T1130,technique_name=Install Root Certificate"}}

So, I wrote a new simulation and it works just fine with data in that format:

POST _ingest/pipeline/_simulate
{
  "pipeline" : {
    "processors": [
      {
        "kv": {
          "field": "event_data.ruleName",
          "field_split": ",",
          "value_split": "="
        }
      }
    ]
  },
  "docs" : [
    { "_source": {"event_data":{"ruleName":"technique_id=T1130,technique_name=Install Root Certificate"}} }
  ]
}
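
By default the kv processor writes the extracted keys to the root of the document, so the relevant part of the simulate response should look roughly like this (trimmed):

{
  "docs": [
    {
      "doc": {
        "_source": {
          "event_data": {
            "ruleName": "technique_id=T1130,technique_name=Install Root Certificate"
          },
          "technique_id": "T1130",
          "technique_name": "Install Root Certificate"
        }
      }
    }
  ]
}

If you would rather keep the new keys nested, you can add a "target_field" (e.g. "target_field": "event_data") to the kv processor.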

As for updating the settings on the existing indices (ephemeral in the sense that newly created indices won't pick this up):

PUT winlogbeat-*/_settings
{
  "index.default_pipeline": "my-pipeline"
}

CAUTION: Winlogbeat may already define a default_pipeline of its own; before overwriting it, check whether one is set by looking at the index settings.
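
One way to check (filter_path just trims the response down to the setting we care about):

GET winlogbeat-*/_settings?filter_path=*.settings.index.default_pipeline

An empty response means no default_pipeline is set on any matching index.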

To make sure that the pipeline is also set on new winlogbeat-* indices as they are created, you need to update the existing index template (to list templates, call GET _cat/templates).
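
Note that PUT _template replaces the whole template rather than merging, so fetch the current body first, add the setting, and put the merged body back. A sketch, where the template name winlogbeat-7.9.2 is just an example (use whatever GET _cat/templates shows for your version):

GET _template/winlogbeat-7.9.2

# copy the response body, add the default_pipeline under "settings", then:

PUT _template/winlogbeat-7.9.2
{
  "index_patterns": ["winlogbeat-*"],
  "settings": {
    "index.default_pipeline": "my-pipeline"
  }
}

The real Winlogbeat template carries mappings and other settings as well, so make sure those stay in the body you PUT back.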