I have created some grok patterns in Logstash, but when a new log pattern arrives it shows up in Kibana only as the message field and is not split into other fields. Do I need to add a grok pattern for every type of log, or is there another way for Logstash or Elasticsearch to split all types of logs into the required fields?
Here is some sample data:
Dec 27 06:25:05 desk.outlet19 MD: Sending desk is closed to UI
Dec 27 06:25:03 desk.outlet17 MD: Sending desk is closed to UI
Dec 27 06:25:03 desk.outlet17 MD: State change done!
Dec 27 06:25:03 desk.outlet17 MD: Request state change enter standby to UI
Dec 27 06:25:03 desk.outlet12 MD: Sending desk is closed to UI
Dec 27 06:25:06 desk.outlet19 MD: Sending desk is closed to UI
Dec 27 06:25:06 desk.outlet19 MD: State change done!
My Logstash configuration for this is:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => '(?<timestamp>[\w\s\d\:]+)\s(?<outlet>[\w\d\.]+)\s(?<ID>[\w\:]+)\s(?<Status>[\w\s\!]+)' }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
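For reference, on the first sample line this grok yields roughly the following fields (my reading of the pattern, not verified output):

  timestamp: Dec 27 06:25:05
  outlet:    desk.outlet19
  ID:        MD:
  Status:    Sending desk is closed to UI

This works fine for the desk/outlet logs above.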
But when I receive new log data like this:
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] ABSTRACT: SyncMode=Speed Speed=0.100 Delay=0.000 HasSG=0 StartRamp=0.030 StopRamp=0.030 MasterAxisId=0
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] ABSTRACT.AxisId=312 AxisName=Center:v PlannedStart=4.161 Target=4.324 UseCurrPos=1
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] EXPLICIT: DuraNoRamp=1.627 MaxMinDuraNoRamp=0.163 MaxSpeed=1.000 MinActP=0.000000 MaxActP=0.000000 ActP=0.000000
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [PlanJob] EXPLICIT.AxisId=312 AxisName=Center:v PlannedStart=4.161 Start=4.161 ActIsPos=4.161 Target=4.324 Speed=0.100 StartRamp=0.030 StopRamp=0.030 LimRampScaling=1.000
Dec 27 10:38:29 ff402-srv1 MC: DEBUG {2151} [ApiUiJob] PlanJob(): return Speed=0.100000 Duration=4.658009 MinDuration=4.658009 MaxSpeed=1.000000
How can I handle this data in my Logstash configuration? In other words, is there a configuration that can fetch only the required data from the log files, even when the logs are dynamic and hard for a single grok pattern to recognize?
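For example, since the new MC: DEBUG lines are mostly key=value pairs, I wondered whether a generic header grok followed by the kv filter could work instead of a pattern per log type. A minimal sketch of what I mean (assuming every line shares the "<timestamp> <host> <program>: " prefix; the body field name is mine):

filter {
  # Parse only the common syslog-style header; keep everything after
  # the program name in a "body" field.
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{NOTSPACE:outlet} %{WORD:ID}: %{GREEDYDATA:body}" }
  }
  # Split any key=value pairs in the body into their own fields;
  # tokens without "=" (like "Sending desk is closed to UI") are skipped.
  kv {
    source => "body"
    field_split => " "
    value_split => "="
  }
}

Would something along these lines be the usual approach, or is a separate grok pattern for each log type unavoidable?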