How to split log messages into fields when the logs are dynamic

I have created some log patterns in Logstash, but when a new log pattern arrives it shows up in Kibana only as the message field and is not split into other fields. Do I need to add a log pattern for every type of log, or is there some other way Logstash or Elasticsearch can split all types of logs into the required fields?

Here is some sample data:

Dec 27 06:25:05 desk.outlet19 MD: Sending desk is closed to UI
Dec 27 06:25:03 desk.outlet17 MD: Sending desk is closed to UI
Dec 27 06:25:03 desk.outlet17 MD: State change done!
Dec 27 06:25:03 desk.outlet17 MD: Request state change enter standby to UI
Dec 27 06:25:03 desk.outlet12 MD: Sending desk is closed to UI
Dec 27 06:25:06 desk.outlet19 MD: Sending desk is closed to UI
Dec 27 06:25:06 desk.outlet19 MD: State change done!

My Logstash configuration for this is:

input {
    beats {
        port => "5044"
    }
}
filter {
    grok {
        match => { "message" => '(?<timestamp>[\w\s\d\:]+)\s(?<outlet>[\w\d\.]+)\s(?<ID>[\w\:]+)\s(?<Status>[\w\s\!]+)' }
    }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

But when I received new log data like this:

Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] ABSTRACT: SyncMode=Speed Speed=0.100 Delay=0.000 HasSG=0 StartRamp=0.030 StopRamp=0.030 MasterAxisId=0
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] ABSTRACT.AxisId=312 AxisName=Center:v PlannedStart=4.161 Target=4.324 UseCurrPos=1
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] EXPLICIT: DuraNoRamp=1.627 MaxMinDuraNoRamp=0.163 MaxSpeed=1.000 MinActP=0.000000 MaxActP=0.000000 ActP=0.000000
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] EXPLICIT.AxisId=312 AxisName=Center:v PlannedStart=4.161 Start=4.161 ActIsPos=4.161 Target=4.324 Speed=0.100 StartRamp=0.030 StopRamp=0.030 LimRampScaling=1.000
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [ApiUiJob] PlanJob(): return Speed=0.100000 Duration=4.658009 MinDuration=4.658009 MaxSpeed=1.000000

How can I handle this data in my Logstash configuration? Is there any setup that can extract only the required data from the log files, even when the logs are dynamic and difficult for a single pattern to recognize?

You might want to share sample input and your current Logstash configuration to allow others to help you.

Hi @ptamba,

I have added the data to my post. Please take a look and share your views.

If you add

kv {}

to your filter, it will parse the key/value pairs, resulting in events like this:

{
       "AxisName" => "Center:v",
   "PlannedStart" => "4.161",
       "ActIsPos" => "4.161", ...
          "Start" => "4.161",
 "LimRampScaling" => "1.000",
     "@timestamp" => 2020-06-04T13:46:28.181Z,
         "outlet" => "27",
         "Target" => "4.324",
      "StartRamp" => "0.030",
       "StopRamp" => "0.030",
      "timestamp" => "Dec",
"EXPLICIT.AxisId" => "312",
        "message" => "Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] EXPLICIT.AxisId=312 AxisName=Center:v PlannedStart=4.161 Start=4.161 ActIsPos=4.161 Target=4.324 Speed=0.100 StartRamp=0.030 StopRamp=0.030 LimRampScaling=1.000",
             "ID" => "10:38:29",
         "Status" => "ff402",
          "Speed" => "0.100"
}
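
Spelled out, the whole filter block then looks like this (a sketch, keeping your existing grok and simply appending kv; with no options, kv reads the message field and extracts whitespace-separated key=value pairs):

filter {
    grok {
        match => { "message" => '(?<timestamp>[\w\s\d\:]+)\s(?<outlet>[\w\d\.]+)\s(?<ID>[\w\:]+)\s(?<Status>[\w\s\!]+)' }
    }
    # kv defaults to source => "message" and splits the line on key=value pairs
    kv {}
}

Note that the grok fields are still misparsed for the new log format, which is why timestamp ends up as "Dec" and Status as "ff402" in the event above.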

Personally I would use dissect rather than grok for this

        dissect { mapping => { "message" => "%{timestamp} %{+timestamp} %{+timestamp} %{outlet} %{ID} %{Status} %{restOfLine}" } }
        kv { source => "restOfLine" }
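
Put together, and with a date filter added on top (my own assumption, not part of the suggestion above; it turns the dissected timestamp into @timestamp instead of the ingestion time shown in the event earlier), the filter could look like:

filter {
    dissect {
        # %{+timestamp} appends to the timestamp field, keeping the space
        # delimiter, so "Dec 27 10:38:29" is captured as a single field
        mapping => { "message" => "%{timestamp} %{+timestamp} %{+timestamp} %{outlet} %{ID} %{Status} %{restOfLine}" }
    }
    kv { source => "restOfLine" }
    # Assumption: syslog-style timestamps carry no year, so the date filter
    # falls back to the current year when parsing
    date { match => [ "timestamp", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ] }
}

dissect is cheaper than grok here because the first six tokens are position-based rather than pattern-based, and kv handles the variable tail of the line.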

Hi @Badger, thanks for sharing this. I will try both approaches and will update the thread once I get everything running.
