Can't get log level in filebeat


(Xdholy) #1

Hi, in our project we store all our logs in one log file with the log pattern:
timestamp-server-id-loglevel-program-module-...

I set up a Filebeat-Logstash-Elasticsearch pipeline, and I want to apply different grok patterns to different log levels.
The problem is that, since I can't (and shouldn't) define multiple prospectors over the same file, I need another way to get the log level before sending to Logstash.
My first idea was to use processors to detect whether a log line contains a level keyword. But I can't find a supported processor that allows me to add additional fields, which is very easy to do in the prospectors config using:
fields:
  level: log

fields:
  level: error

...etc.
(The include_fields processor can't add fields, and the rename processor can't change a field's value.)
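
Roughly, what I was hoping for is something like this sketch (assuming a processor such as add_fields that supports a when condition; whether that is available depends on the Filebeat version):

processors:
  # sketch: add a level field when the raw line contains "ERROR:"
  - add_fields:
      when:
        contains:
          message: "ERROR:"
      fields:
        level: error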

So my questions are:

  1. Can I define more than one prospector over one file?
  2. If not, how can I get the log level field before sending it to Logstash? Is there a supported processor that allows me to add fields when the message contains a level keyword?

(Xdholy) #2

Here is a sample of our logs.
There are four different log levels, [DEBUG], [ERROR], [UNIQ], and [TRACE], in the same file.


Log Text:

2018-05-30 19:04:33.605 sceneserver[2102] ERROR: [LUA][Random] RandomSucc player:20000032 XD level:17 group:1 zone:20 VIP:10 source:23 type:67
2018-06-01 16:02:42.646 sceneserver[2102] DEBUG: [LUA][Charge] PlayerCharge player:20000032 XD level:17 group:1 zone:20 VIP:8 source:6 gear:4999 diamond:9040
2018-06-02 12:02:42.646 sceneserver[2102] DEBUG: [LUA][Charge] PlayerCharge player:20000021 HOLLY level:17 group:1 zone:20 VIP:8 source:6 gear:4999 diamond:10040
2018-06-02 19:03:06.691 sceneserver[2102] DEBUG: [LUA][Charge] PlayerCharge player:20000006 Korry level:17 group:1 zone:20 VIP:10 source:7 gear:9999 diamond:21040
2018-06-04 20:06:32.956 sceneserver[2102] UNIQ: [LUA][LEVEL] LEVELUP player:20000021 HOLLY level:17 group:1 zone:20 VIP:10 source:35 score:220 area:4
2018-06-03 19:04:33.605 sceneserver[2102] DEBUG: [LUA][Random] RandomSucc player:20000006 Korry level:17 group:1 zone:20 VIP:10 source:23 type:67
2018-06-04 19:04:33.605 sceneserver[2102] ERROR: [LUA][Active] Player Active Config Error!
2018-06-04 19:04:33.605 sceneserver[2102] ERROR: [LUA][Active] Player Active Config Error!
2018-06-04 19:04:33.605 sceneserver[2102] ERROR: [LUA][Active] Player Active Config Error!
2018-06-04 19:04:33.605 sceneserver[2102] TRACE: Player 20000001 login
2018-06-04 20:06:32.956 sceneserver[2102] DEBUG: [LUA][Game] Fish player:20000006 Korry level:17 group:1 zone:20 VIP:10 source:35 AddExp:400
2018-06-04 20:06:32.956 sceneserver[2102] UNIQ: [LUA][LEVEL] LEVELUP player:20000006 Korry level:17 group:1 zone:20 VIP:10 source:35 score:0 area:3

The usual structure is:
timestamp-server-id-loglevel-prog-module-...

Right now, I can only get the log level after sending the line to Logstash and running it through grok.
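
On the Logstash side, the extraction looks roughly like this (a simplified sketch; the field names are just illustrative):

filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:ts} %{WORD:server}\[%{INT:pid}\] %{WORD:loglevel}: %{GREEDYDATA:msg}"
    }
  }
}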


(Xdholy) #3

And here is my current Filebeat config:

- type: log
  enabled: true
  paths:
    - /home/dev/log/sceneserver*.log.*
  fields:
    level: scene

- type: log
  enabled: true
  paths:
    - /home/dev/log/sessionserver*.log.*
  fields:
    level: session

- type: log
  enabled: true
  paths:
    - /home/dev/log/*.log.*
  multiline.pattern: '^[2][0]'
  multiline.negate: true
  multiline.match: after
  fields:
    level: error


(Tom Callahan) #4

You should spend the time parsing the logs in Logstash instead of trying to pre-parse in Filebeat. Filebeat is really designed more for shipping logs upstream.

Use a multi-stage grok or dissect pattern match. For example, with dissect:

filter {
  dissect {
    mapping => {
      "message" => "%{ts} %{+ts} %{src}[%{pid}] %{loglevel}: %{msg}"
    }
  }
  grok {
    match => { "msg" => "^\[%{WORD:type}\]\[%{WORD:something}\]\s?%{WORD:function}\s?player:%{INT:player_id}\s?%{WORD:something2}\s?level:%{INT:level}\s?group:%{INT:group}\s?zone:%{INT:zone}\s?VIP:%{INT:vip}\s?source:%{INT:source}\s?type:%{INT:type}" }
  }
}

This is just an example; you should tweak it for your actual environment.
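
Once dissect has pulled out loglevel, you can also branch on it to apply a different grok per level, which is what you originally asked about. A rough sketch; the per-level patterns are placeholders you would replace with your own:

filter {
  dissect {
    mapping => { "message" => "%{ts} %{+ts} %{src}[%{pid}] %{loglevel}: %{msg}" }
  }
  # route each level to its own pattern
  if [loglevel] == "ERROR" {
    grok { match => { "msg" => "^\[%{WORD:type}\]\[%{WORD:module}\] %{GREEDYDATA:error_msg}" } }
  } else if [loglevel] == "TRACE" {
    grok { match => { "msg" => "^Player %{INT:player_id} %{WORD:action}" } }
  }
}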


(Xdholy) #5

Wow, I didn't know I could do a multi-stage grok.
It works perfectly.
Thanks a lot.


(Tom Callahan) #6

There's quite a bit you can do with Logstash; the tradeoff is how much CPU/memory is consumed by the filters in your pipelines. Glad to see that worked.


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.