Not able to parse fields with logstash

Hello everyone,

I'm a new Logstash user. I have a log file which I need to parse to extract some meaningful fields like timestamp, user, domain, message, response code, error code, etc.
I use a grok pattern to parse the log file, and the filter plugin of my 'test' configuration file looks like the below.

filter {
  grok {
    match => { "message" => "%{USERNAME}@%{HOSTNAME}" }
  }
}

Here in this filter, I just tried to parse only the user@domain field from the log line. The sample log file I created is given below. It has some rough logs, just for testing purposes, to check whether my Logstash setup is working.

2016-10-23 18:57:00 firstuser@elk.com
2016-10-23 18:58:17 seconduser@elk.co.in
2016-10-23 18:58:17 thirduser@elasticsearch.com

Here the user@domain part (i.e., firstuser@elk.com) should be parsed like below:
{
  "USERNAME": [
    [
      "firstuser"
    ]
  ],
  "HOSTNAME": [
    [
      "elk.com"
    ]
  ]
}

But the actual result I got after stashing is like this:

{
  "message": "2016-10-23 18:57:00 firstuser@elk.com"
}

I also tried the following filter to parse all possible fields, but it does not seem to work either.

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

Note: None of the configuration file segments given above have syntax errors, as "--configtest" passes fine. If you find any errors in them, they must be typos.

match => { "message" => "%{USERNAME}@%{HOSTNAME}" }

This matches the message field of each event against a grok expression, but no fields are extracted into the events. You'll want something like this:

match => { "message" => "%{USERNAME:username}@%{HOSTNAME:hostname}" }
match => { "message" => "%{COMBINEDAPACHELOG}" }

Your log isn't in Apache combined format so that pattern won't help you.
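For the simple sample lines posted above (timestamp followed by user@domain), a filter along these lines should work; the capture names timestamp, username, and hostname are just examples and can be anything you like:

```
filter {
  grok {
    # TIMESTAMP_ISO8601 matches "2016-10-23 18:57:00"-style timestamps;
    # the :name suffix stores each match in a field of that name
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{USERNAME:username}@%{HOSTNAME:hostname}" }
  }
}
```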

Thanks for the reply.

And yeah, my log isn't in Apache combined format.

I also found out that the fields in the log line are separated by a specific number of spaces. So when configuring how the fields are to be parsed, we must give that exact number of spaces in the config file. The fields in the configuration must also appear in the same order as in the log line.

For ex:
2016-10-20 16:33:30 9 Query firstuser@elk.com

The above log line has a timestamp, a number (consider this an id), a word (consider this a command), and a user@domain field.
Each of these fields is separated from the next by exactly three spaces.
So the filter configuration to parse this file must follow the same format (i.e., the same order of occurrence and the same number of spaces between fields).

So the configuration for the filter should be like this:

filter {
  grok {
    match => { 'message' => '%{TIMESTAMP_ISO8601:LogTime} %{INT:ID} %{WORD:Command} %{USERNAME:User}@%{HOSTNAME:Domain}' }
  }
}

Here, between the timestamp, int, word, and username/hostname patterns there are exactly three spaces, and the order of occurrence corresponds to the log line.
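For example, with the sample log line above, that pattern would extract fields roughly like this (shown in the same style as the earlier output; grok captures are strings unless you convert them):

```
{
  "LogTime": "2016-10-20 16:33:30",
  "ID": "9",
  "Command": "Query",
  "User": "firstuser",
  "Domain": "elk.com"
}
```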

Correct me if I'm wrong.

Yeah, this should work.