Parse CSV log file using Filebeat

I have a log file in CSV format, and I need to parse it into Elasticsearch using Filebeat, with fields like IP, Client.OS, url, datafield, etc. taken from each line of the CSV file.
I have read that I have to use Logstash to parse and enrich the fields, but I have not been able to get that working.

Could you please tell me the steps to achieve this?

Thanks and Regards,

Hello @Tayyab, I've listed the possibilities in a previous thread: Question on filebeat multiline pattern.


Hi @Tayyab,

I had a similar situation. I sorted it out for the time being using a combination of the split and script processors in an Elasticsearch ingest pipeline.

Here is a sample ingest pipeline you can use as a starting point. It is written with a three-field CSV in mind and stores the values in three fields called ApplicationId, Level, and Error.

PUT _ingest/pipeline/test_pipeline
{
  "processors": [
    {
      "split": {
        "field": "message",
        "target_field": "test",
        "separator": ","
      }
    },
    {
      "script": {
        "lang": "painless",
        "source": "ctx.ApplicationId = ctx.test[0];ctx.Level = ctx.test[1];ctx.Error = ctx.test[2]"
      }
    },
    {
      "remove": {
        "field": ["test"]
      }
    }
  ]
}
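
Before wiring it into Filebeat, you can try the pipeline against a sample line with the simulate API. A minimal sketch, assuming a hypothetical three-field message:

POST _ingest/pipeline/test_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "App-123,ERROR,Connection timed out"
      }
    }
  ]
}

The response should show ApplicationId, Level and Error populated on the document, with the temporary test field removed.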

There are some edge cases you have to be careful of (for example, separators inside quoted values, which a plain split will break on). I wrote a small blog entry on the same topic. I am waiting for an official csv processor to be released for ingest pipelines; once that happens, we can avoid all this hacking around.
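
On the Filebeat side, you only need to ship the raw lines and tell the Elasticsearch output to run them through the pipeline. A minimal sketch, assuming Elasticsearch on localhost and a hypothetical log path; adjust both for your setup:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.csv    # hypothetical path to your CSV log files

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: test_pipeline       # run each event through the ingest pipeline above

With this in place, each CSV line arrives in the message field, and the pipeline splits it into the named fields at ingest time.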


Thank you.

Thank you
