Extract data from CSV column

Hey, I have logs in CSV format, and one column is a string containing multiple values that need to be extracted and stored as new fields.
The column name is comment and its value is:

VPN token auth failed. Destination was: 62.105.17.111, login name: user2, desc: Authentication failed: Invalid username or password , Auth type: profile"

From this column I need to extract:
ip_address == 62.105.17.111
user_name == user2
description == Authentication failed: Invalid username or password

I have the following solution, but it's not optimal.

We can use another grok over that column, like this:

grok {
    match => { "comment" => "%{GREEDYDATA:junk} auth %{WORD:action}. %{GREEDYDATA:temp2} %{IPV4:ip_address}, login name: %{WORD:user_name}, %{GREEDYDATA:desc} ," }
}

How can I extract all the required values from the CSV column in an optimized way?
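
For context, the surrounding pipeline looks roughly like this; the file path and column names below are placeholders (only the comment column matters here), and the extra grok would slot in after the csv filter:

input {
    file {
        # placeholder path; the real input is the CSV log file
        path => "/var/log/vpn/events.csv"
        start_position => "beginning"
    }
}

filter {
    csv {
        separator => ","
        # column names are a guess; "comment" is the one that needs further parsing
        columns => ["timestamp", "event_type", "comment"]
    }
    # the extra grok over the comment field goes here
}

output {
    stdout { codec => rubydebug }
}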

Personally I would do it using

grok {
    break_on_match => false
    match => {
        "comment" => [
            "%{IPV4:ip_address}",
            "login name: %{WORD:user_name},",
            "desc: (?<description>[^,]*),"
        ]
    }
}
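
If you want to check this against your sample line before wiring it into the real pipeline, a throwaway config along these lines (the generator input, mutate rename, and stdout output are only there for testing) should emit an event with ip_address, user_name and description populated:

input {
    generator {
        count => 1
        message => "VPN token auth failed. Destination was: 62.105.17.111, login name: user2, desc: Authentication failed: Invalid username or password , Auth type: profile"
    }
}

filter {
    # rename the generated message so it looks like the csv column
    mutate { rename => { "message" => "comment" } }
    grok {
        break_on_match => false
        match => {
            "comment" => [
                "%{IPV4:ip_address}",
                "login name: %{WORD:user_name},",
                "desc: (?<description>[^,]*),"
            ]
        }
    }
}

output {
    stdout { codec => rubydebug }
}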

Hey @Badger, thanks again. I thought there was a missing parenthesis at the end of the regex, but even after appending it, the whole comment is getting extracted as description. Did you mean to write something else?

I updated it to add the closing parenthesis.
