Extracting the message and adding new fields dynamically

Hi,

I have the log events shown below. From the last part, slog_message (the "GREEDYDATA" in the grok), I would like to match again and extract various data out of it, for example device IDs, usernames, etc. How can that be achieved? I want to create Kibana dashboards grouped by device ID, username, and so on.

==========Log events=================
2018/02/09 09:21:08.800 CONSOLE 7aab3115-03e8-42d4-94ed-a7af754ffa6e [0000000-0000000] (12) Info a.b.c.d.e.listener 0 Agent Profiles To Install For Device ID 2
2018/03/30 09:47:35.081 CONSOLE 95e3c58a-07bc-4354-8f71-0f3d133daa43 [0000000-0000000] (8) Info a.b.c.d.e.LoginController Invalid user for console login: adminafaff
2018/03/30 09:47:35.081 CONSOLE 597bd229-3f41-49ac-9e44-6559dcb63a81 [0000000-0000000] (10) Info a.b.c.Web.d.e.authenticate User is not logged in, redirecting to login page
2018/03/30 09:47:35.128 CONSOLE 95e3c58a-07bc-4354-8f71-0f3d133daa43 [0000000-0000000] (8) Info a.b.c.Web.d.e.Controllers.LoginController New invalid user: adminafaff

==========Configuration=================
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "(?<slog_date>^\d{4}/\d{2}/\d{2})\s+%{TIME:slog_time_stamp}\s+%{USERNAME:slog_host}\s+%{UUID:slog_uuid}\s+\[(?<slog_thread_id>[0-9-]+)\]\s+\((?<slog_pid>[0-9]+)\)\s+%{LOGLEVEL:slog_level}\s+%{GREEDYDATA:slog_message}" }
  }

  # Something of this sort.
  if [slog_message] =~ "New invalid user:" {
    # extract the username from slog_message
    match => { "slog_message" => "^[a-zA-Z.]+\s+New invalid user:\s+(?<invalid_user>[a-zA-Z]+)\s*$" }
  }
  if [slog_message] =~ "Device ID" {
    # extract device id "2"
  }
  if [slog_message] =~ "Invalid user for console login:" {
    # extract the username
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}


You're close:

# Something of this sort.
if [slog_message] =~ "New invalid user:" {
  grok {
    # extract the username from slog_message
    match => { "slog_message" => "^[a-zA-Z.]+\s+New invalid user:\s+(?<invalid_user>[a-zA-Z]+)\s*$" }
  }
}
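The other two cases from the original question can be handled the same way with additional grok filters inside conditionals. The patterns below are illustrative sketches based on the sample log lines above, not tested against your full data:

```
if [slog_message] =~ "Device ID" {
  grok {
    # extract the trailing device id, e.g. "2"
    match => { "slog_message" => "Device ID\s+(?<device_id>[0-9]+)\s*$" }
  }
}
if [slog_message] =~ "Invalid user for console login:" {
  grok {
    # extract the username after the colon
    match => { "slog_message" => "Invalid user for console login:\s+(?<invalid_user>[a-zA-Z]+)\s*$" }
  }
}
```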



Thank you @magnusbaeck for the quick response. That worked.

Was my approach efficient, or is there a better way to do this?

You could've just skipped the conditional and tried the grok filter with tag_on_failure set to an empty array.
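In other words, something like this sketch: without the surrounding conditional, the grok simply fails silently on messages it doesn't match, and tag_on_failure => [] keeps it from tagging every non-matching event with _grokparsefailure:

```
grok {
  match => { "slog_message" => "New invalid user:\s+(?<invalid_user>[a-zA-Z]+)\s*$" }
  tag_on_failure => []
}
```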


@magnusbaeck can this step be done in Elasticsearch instead of in the Logstash phase? If so, how?
Which is more efficient: grokking in Logstash or doing it in Elasticsearch?

You should be able to set up an Elasticsearch ingest pipeline to do the same thing. I don't know which option performs best.
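For example, a minimal ingest pipeline with a grok processor might look like the sketch below (the pipeline name "slog" and the pattern are illustrative). You would register it once and then set the pipeline option on the Elasticsearch output (or on the Beats side) so events pass through it at index time:

```
PUT _ingest/pipeline/slog
{
  "description": "extract fields from slog_message",
  "processors": [
    {
      "grok": {
        "field": "slog_message",
        "patterns": ["New invalid user:\\s+(?<invalid_user>[a-zA-Z]+)\\s*$"],
        "ignore_failure": true
      }
    }
  ]
}
```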


Thank you for a quick response.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.