Elastic Parsing

I'm using Security Onion 2, which uses Logstash to ingest logs and send them to Elasticsearch. The documentation says Elasticsearch parses the logs and that the parsers are located under elasticsearch/ingest/. In that directory there is a syslog parser; its configuration is below. I'm sending logs to my box and they show up in Kibana under syslog, but they are not being parsed. I assume that's because the parser is fairly generic and doesn't match my syslog. My question is: do I just need to take the existing syslog parser and add processors and a grok pattern to match my packet below? I'm not sure if I'm looking in the right place.

{
  "description" : "syslog",
  "processors" : [
    {
      "dissect": {
        "field": "message",
        "pattern" : "%{message}",
        "on_failure": [ { "drop" : { } } ]
      }
    },
    {
      "dissect": {
        "field": "msg:",
        "pattern" : "%{message}",
        "on_failure": [ { "drop" : { } } ]
      }
    },
    {
      "remove": {
        "field": [ "type", "agent" ],
        "ignore_failure": true
      }
    },
    {
      "grok": {
        "field": "message",
        "patterns": [
          "^<%{INT:syslog.priority}>%{DATA:syslog.timestamp} %{WORD:source.application}: %{GREEDYDATA:real_message}",
          "^%{SYSLOGTIMESTAMP:syslog.timestamp} %{SYSLOGHOST:syslog.host} %{SYSLOGPROG:syslog.program}: CEF:0\\|%{DATA:vendor}\\|%{DATA:product}\\|%{GREEDYDATA:message2}"
        ],
        "ignore_failure": true
      }
    },
    { "set": { "if": "ctx.source?.application == 'filterlog'", "field": "dataset", "value": "firewall", "ignore_failure": true } },
    { "set": { "if": "ctx.vendor != null", "field": "module", "value": "{{ vendor }}", "ignore_failure": true } },
    { "set": { "if": "ctx.product != null", "field": "dataset", "value": "{{ product }}", "ignore_failure": true } },
    { "set": { "field": "ingest.timestamp", "value": "{{ @timestamp }}" } },
    { "date": { "if": "ctx.syslog?.timestamp != null", "field": "syslog.timestamp", "target_field": "@timestamp", "formats": ["MMM d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601", "UNIX"], "ignore_failure": true } },
    { "remove": { "field": ["pid", "program"], "ignore_missing": true, "ignore_failure": true } },
    { "pipeline": { "if": "ctx.vendor != null && ctx.product != null", "name": "{{ vendor }}.{{ product }}", "ignore_failure": true } },
    { "pipeline": { "if": "ctx.dataset == 'firewall'", "name": "filterlog", "ignore_failure": true } },
    { "pipeline": { "name": "common" } }
  ]
}

19:32:41.831733 IP (tos 0x0, ttl 63, id 44489, offset 0, flags [DF], proto UDP (17), length 768)
10.10.10.73.57725 > 10.10.10.18.syslog: [udp sum ok] SYSLOG, length: 740
Facility user (1), Severity info (6)
Msg: 2020-12-09T19:32:41.813Z 10.10.10.73 CEF:0|McAfee|ESM|10.3.4|47-4000141|Suspicious - Internal Host Logon without Logoff|0|start=1607508014000 end=1607511661000 rt=1607542288000 cnt=3 eventId=47037728 nitroUniqueId=47037728 deviceExternalId=Correlation Engine deviceTranslatedAddress=127.0.0.1 externalId=47308390 cat=Login nitroNormID=408944640 act=success deviceDirection=0 dst=10.10.10.16 src=10.10.11.130 spt=50894 nitroTrust=2 nitroAppID=microsoft-windows-security-auditing sntdom=mydomain shost=dc1.mydomain.home.com nitroObjectID=kerberos suser=superuser$ nitroSecurity_ID=S-1-5-21-369414021-3906933990-4067070837-19988 nitroLogon_Type=3 - Network nitroSource_Logon_ID=0x0 nitroDestination_Logon_ID=0xa0df8dd0
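For reference, my Msg line starts with an ISO8601 timestamp and the source IP directly before CEF:0, so neither of the existing grok patterns looks like it would match. Here's a rough, untested sketch of the extra pattern I was thinking of adding (TIMESTAMP_ISO8601 and IPORHOST are standard grok patterns; the field names just mirror the existing ones):

```json
{
  "grok": {
    "field": "message",
    "patterns": [
      "^%{TIMESTAMP_ISO8601:syslog.timestamp} %{IPORHOST:syslog.host} CEF:0\\|%{DATA:vendor}\\|%{DATA:product}\\|%{GREEDYDATA:message2}"
    ],
    "ignore_failure": true
  }
}
```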

Welcome to our community! :smiley:

Yes, your approach makes sense. I would use the Grok Debugger in Kibana to help with your pattern matching.
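If it helps, you can also sanity-check a pattern offline before touching the pipeline. Here's a rough Python sketch using a plain-regex approximation of the grok pattern above (simplified: IPORHOST is matched as a bare IPv4 address here, and the field names are just illustrative):

```python
import re

# Plain-regex approximation of a grok pattern like:
#   ^%{TIMESTAMP_ISO8601:syslog.timestamp} %{IPORHOST:syslog.host}
#     CEF:0\|%{DATA:vendor}\|%{DATA:product}\|%{GREEDYDATA:message2}
pattern = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d+)?Z?) "
    r"(?P<host>\d{1,3}(?:\.\d{1,3}){3}) "
    r"CEF:0\|(?P<vendor>[^|]*)\|(?P<product>[^|]*)\|(?P<rest>.*)$"
)

# Shortened version of the sample Msg line from the packet capture
sample = ("2020-12-09T19:32:41.813Z 10.10.10.73 "
          "CEF:0|McAfee|ESM|10.3.4|47-4000141|start=1607508014000")

m = pattern.match(sample)
if m:
    # Confirms the header fields extract as expected
    print(m.group("vendor"), m.group("product"))
```

The real Grok Debugger is still the better check, since grok's DATA/GREEDYDATA semantics differ slightly from a hand-written regex.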

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.