Grokking Voip logs - grokdebug to logstash.conf

Hello,

We are trying to make sense of our VoIP logs using ELK. We are currently running ELK 6.0, feeding Logstash via Filebeat running on a syslog server. The VoIP logs are deposited in a separate .log file in /var/log and are formatted as shown below. Using the grokdebug.herokuapp.com site, the filter shown below parses the log lines correctly. We have tried adding additional `match =>` lines with different renditions of the grokdebug filter, but then Logstash fails to start.

Are there docs/suggestions on how to incorporate this grokdebug filter into the Logstash config?

Voip output:
11/30 12:49:05 0000:00:54 T3001 *0018 9999999999 3000 3014 001 9999999999 9999999999 W0013550 A S0018485 1

Herokuapp filter (works):

%{SPACE}(?[0-9]++/[0-9]++ ++)%{TIME:CallOriginTime}%{SPACE}(?[0-9]{4}:[0-9]{2}:[0-9]{2} ++)(?[A-Z0-9* ]{8})%{SPACE}(?[0-9]{4})%{SPACE}(?[0-9A-Z])%{SPACE}(?[0-9A-Z ] 001)%{SPACE}(?[0-9A-Z]++)%{SPACE}(?[0-9A-Z]++)
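One way to fold a working grokdebug pattern into logstash.conf is to give a single `match` option an array of patterns instead of repeating the `match` line. This is a sketch only; the VoIP pattern itself is left as a placeholder to be pasted in from the filter above:

```
filter {
  grok {
    # One match option can take an array of patterns;
    # grok tries each in order and uses the first that matches.
    match => {
      "message" => [
        "%{COMMONAPACHELOG}",
        "%{COMBINEDAPACHELOG}",
        "<voip pattern from grokdebug goes here>"
      ]
    }
  }
}
```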

logstash.conf (without voip filter):
input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "log" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
      match => { "message" => "%{COMBINEDAPACHELOG}" }
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
  }
}

output {
  elasticsearch {
    hosts => [ "xxx.xxx.xxx.xxx:9200" ]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "doc"
  }
}

Hi, what do the failure and the subsequent Logstash logs look like?

Hello,
I have gotten Logstash to start. There must have been an errant typo.

Parsing does not seem to be happening, though. After refreshing the field list in Kibana, these entries still show up as unparsed blob messages. I am going to try a separate Logstash instance, feeding stdin to stdout, and see what the results are. Will post results if necessary.
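For reference, that stdin-to-stdout test can be run against a minimal config like the one below (a sketch; the filename is made up and the grok pattern is left as a placeholder). Paste a raw VoIP log line at the prompt and inspect the parsed fields in the rubydebug output:

```
# test.conf -- run with: bin/logstash -f test.conf
input {
  stdin { }
}
filter {
  grok {
    match => { "message" => "<voip pattern from grokdebug goes here>" }
  }
}
output {
  stdout { codec => rubydebug }
}
```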

Thanks,
W

Hello,

A separate Logstash instance doing stdin to stdout shows the grok filter working: the fields are parsed and output.

Now the question is how to troubleshoot the parsed fields not showing up in ES.

The log lines do get to ES, but they are not parsed. So Logstash is parsing correctly and sending to ES, but in Kibana/ES the parsed fields do not appear, only the bulk message. What would be the next troubleshooting steps?

Thank you,
W

UPDATE
This is now working. The issue was the `if [type]` statement. After removing it, new log lines show up and are parsed correctly.
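For anyone hitting the same thing: in Filebeat 6.x the `document_type` setting was removed, so events generally no longer carry the `type` field the conditional was testing, and `if [type] == "log"` never matches, silently skipping the grok block. If a conditional is still wanted, one option (a sketch; the field name `logtype` and the path are made up) is to set a custom field in filebeat.yml and test that field in Logstash:

```
# filebeat.yml (hypothetical custom field "logtype")
filebeat.prospectors:
  - type: log
    paths:
      - /var/log/voip.log
    fields:
      logtype: voip

# logstash.conf -- custom fields land under [fields] by default
filter {
  if [fields][logtype] == "voip" {
    grok {
      # voip grok pattern here
    }
  }
}
```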


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.