How to test a pipeline with a CSV processor

I want to ingest this CSV file:

text,value
The triangle is blue.,10
The square is red.,20
There are two red circles.,30
The triangle is green.,40

I created this ingest pipeline:

[
  {
    "csv": {
      "field": "message",
      "target_fields": [
        "text",
        "value"
      ]
    }
  },
  {
    "remove": {
      "field": "message"
    }
  }
]
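
For reference, the same processor list can also be registered as a named pipeline through the REST API; the pipeline name csv-test below is just a placeholder I picked for this example:

PUT _ingest/pipeline/csv-test
{
  "description": "Parse the message field as CSV into text and value",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": ["text", "value"]
      }
    },
    {
      "remove": {
        "field": "message"
      }
    }
  ]
}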

Now I want to test it. In the Kibana Test Documents tool, I add a single document to process.

[
  {
    "_source": {
      "message": "text,value\\nThe triangle is blue.,10\\nThe square is red.,20\\nThere are two red circles.,30\\nThe triangle is green.,40"
    }
  }
]
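
As far as I know, the Test Documents tool is a front end for the simulate pipeline API, so this should be equivalent to a request along these lines:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "csv": { "field": "message", "target_fields": ["text", "value"] } },
      { "remove": { "field": "message" } }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "text,value\nThe triangle is blue.,10\nThe square is red.,20\nThere are two red circles.,30\nThe triangle is green.,40"
      }
    }
  ]
}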

I want the pipeline to produce the following:

{
  "docs": [
    {
      "doc": {
        "_source": {
          "text": "The triangle is blue.",
          "value": 10
        }
      }
    },
    {
      "doc": {
        "_source": {
          "text": "The square is red.",
          "value": 20
        }
      }
    },
    {
      "doc": {
        "_source": {
          "text": "There are two red circles.",
          "value": 30
        }
      }
    },
    {
      "doc": {
        "_source": {
          "text": "The triangle is green.",
          "value": 40
        }
      }
    }
  ]
}

Instead, I get this:

{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_version": "-3",
        "_id": "_id",
        "_source": {
          "value": "value\\nThe triangle is blue.",
          "text": "text"
        },
        "_ingest": {
          "timestamp": "2025-03-20T18:56:49.224817335Z"
        }
      }
    }
  ]
}

What am I doing wrong?

I don't think you are doing anything wrong; what you want to do is simply not possible with ingest pipelines. An ingest pipeline transforms a single document; it cannot split one incoming document into several.

If your file has multiple lines, each line needs to be sent as a separate event (this is what Beats or Logstash would do).
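
For example, here is a minimal sketch with the simulate pipeline API where each data line arrives as its own document (the header line is simply never sent). Note that the csv processor extracts strings, so I have added a convert processor to make value numeric, matching your desired output:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "csv": { "field": "message", "target_fields": ["text", "value"] } },
      { "convert": { "field": "value", "type": "integer" } },
      { "remove": { "field": "message" } }
    ]
  },
  "docs": [
    { "_source": { "message": "The triangle is blue.,10" } },
    { "_source": { "message": "The square is red.,20" } },
    { "_source": { "message": "There are two red circles.,30" } },
    { "_source": { "message": "The triangle is green.,40" } }
  ]
}

This returns one result per document, each with its own text and value fields.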

Of course. I was thinking of CSVs as being ingested all at once because that's what the Upload integration does, but one document per line makes more sense.