Let CSV columns be fields

Hello, I'm trying to get the names of the columns from my CSV file to become field names. However, the values all seem to be staying in the message field. Here is my config:

input {
    beats {
        port => 5044
    }
}

filter {
    csv {
        columns => ["Package","Class","Test","10"]
        separator => ","
    }
}

output {
    stdout {
        codec => rubydebug
    }

    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "csv-data-2019"
    }
}

Hi

Is the CSV coming in on the message field?
If not, you need to specify the source field with the csv filter's source option.

Could you post an example of the stdout output without the csv filter?

How do I go about specifying a source field?

I got rid of the CSV filter and the output stayed the same.
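
For reference, a minimal sketch of pointing the csv filter at a different source field; the field name [csv_line] here is hypothetical, and by default the filter parses the message field:

filter {
    csv {
        # Parse the hypothetical [csv_line] field instead of
        # the default "message" field
        source => "csv_line"
        columns => ["Package","Class","Test","10"]
        separator => ","
    }
}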

I would not expect the message field to change just because you are using a csv filter. However, with the filter you show, I would expect the event to have [Package], [Class], [Test], and [10] fields added.
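
If the filter were running, a parsed event would look roughly like this in the rubydebug output (sample values made up for illustration):

{
       "message" => "com.example,LoginTest,testLogin,10",
       "Package" => "com.example",
         "Class" => "LoginTest",
          "Test" => "testLogin",
            "10" => "10",
      "@version" => "1",
    "@timestamp" => 2019-07-01T12:00:00.000Z
}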

If that did not happen, and you did not get a parse failure, that strongly suggests you were not running with the configuration you thought you had in place. How are you starting Logstash?

Can you add '--log.level debug --config.debug' to your command line? That will cause it to log the actual configuration it is running with.
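
For example, a sketch of the full command line, assuming an archive install and a hypothetical pipeline file name:

bin/logstash -f pipeline.conf --log.level debug --config.debug

With --config.debug enabled at debug log level, Logstash logs the compiled configuration it actually loaded, which makes it easy to spot a stale or unused config file.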

It was my mistake: I was going straight to Elasticsearch from Filebeat instead of sending the data through Logstash. I have fixed that now in the filebeat.yml file.
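
For anyone hitting the same problem, a minimal sketch of the relevant filebeat.yml change, assuming Logstash runs on the same host and listens on the beats port 5044 from the config above:

# Disable the direct Elasticsearch output...
#output.elasticsearch:
#  hosts: ["localhost:9200"]

# ...and ship events to Logstash instead, so the csv filter actually runs
output.logstash:
  hosts: ["localhost:5044"]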
