Hello,
We are trying to make sense of our VoIP logs using ELK. We are running ELK 6.0, feeding Logstash via Filebeat on our syslog server. The VoIP logs are written to a separate .log file in /var/log and are formatted as shown below. Using the grokdebug.herokuapp site, the filter shown below parses the log lines correctly. However, when we add additional "match =>" lines to our grok filter with different renditions of the grokdebug pattern, Logstash fails to start.
Are there docs or suggestions on how to incorporate this grokdebug pattern into the logstash.conf? A rough sketch of what we are aiming for is included after the conf below.
VoIP output:
11/30 12:49:05 0000:00:54 T3001 *0018 9999999999 3000 3014 001 9999999999 9999999999 W0013550 A S0018485 1
Herokuapp filter (works):
%{SPACE}(?[0-9]++/[0-9]++ ++)%{TIME:CallOriginTime}%{SPACE}(?[0-9]{4}:[0-9]{2}:[0-9]{2} ++)(?[A-Z0-9* ]{8})%{SPACE}(?[0-9]{4})%{SPACE}(?[0-9A-Z])%{SPACE}(?[0-9A-Z ] 001)%{SPACE}(?[0-9A-Z]++)%{SPACE}(?[0-9A-Z]++)
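With the named captures spelled out, the pattern reads along these lines (field1 through field8 are placeholder names we are using here, not necessarily the right labels for each column):
%{SPACE}(?<field1>[0-9]++/[0-9]++ ++)%{TIME:CallOriginTime}%{SPACE}(?<field2>[0-9]{4}:[0-9]{2}:[0-9]{2} ++)(?<field3>[A-Z0-9* ]{8})%{SPACE}(?<field4>[0-9]{4})%{SPACE}(?<field5>[0-9A-Z])%{SPACE}(?<field6>[0-9A-Z ] 001)%{SPACE}(?<field7>[0-9A-Z]++)%{SPACE}(?<field8>[0-9A-Z]++)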
logstash.conf (without the VoIP filter):
input {
  beats {
    port => 5044
  }
}
filter {
  if [type] == "log" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
      match => { "message" => "%{COMBINEDAPACHELOG}" }
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
  }
}
output {
  elasticsearch {
    hosts => [ "xxx.xxx.xxx.xxx:9200" ]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "doc"
  }
}
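In case it helps to show what we are after, here is a rough, untested sketch of how we understood the grokdebug pattern would be folded in (field1 through field8 are placeholder capture names, and the input/output sections stay as above). The grok filter docs show multiple patterns for one field being passed as a single array rather than repeated "match =>" lines, so the filter block would become something like:
filter {
  if [type] == "log" {
    grok {
      # patterns are tried in order until one matches;
      # field1..field8 below are placeholder names for the VoIP captures
      match => { "message" => [
        "%{COMMONAPACHELOG}",
        "%{COMBINEDAPACHELOG}",
        "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}",
        "%{SPACE}(?<field1>[0-9]++/[0-9]++ ++)%{TIME:CallOriginTime}%{SPACE}(?<field2>[0-9]{4}:[0-9]{2}:[0-9]{2} ++)(?<field3>[A-Z0-9* ]{8})%{SPACE}(?<field4>[0-9]{4})%{SPACE}(?<field5>[0-9A-Z])%{SPACE}(?<field6>[0-9A-Z ] 001)%{SPACE}(?<field7>[0-9A-Z]++)%{SPACE}(?<field8>[0-9A-Z]++)"
      ] }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
  }
}
Since grok stops at the first matching pattern by default, the existing Apache and syslog lines should keep parsing as before. We were planning to sanity-check the file with bin/logstash --config.test_and_exit -f <path-to-conf> before restarting, but we are not sure this is the right way to fold the pattern in.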