Syslog input to Elasticsearch using Logstash

Hi Guys,

I'm pretty new to the ELK stack. I'm currently fetching syslog logs into Logstash through the syslog input plugin and trying to push them to Elasticsearch. Since Elasticsearch needs JSON as input, I'm trying to convert the syslog format below to JSON. I'm not sure whether my implementation is correct; I hope one of the experts here can help.

Extracted data from syslog with the Logstash conf below:

{
    "message" => "2/21/2020 19:29.34;Appliance Warning:Cluster-AutoUpdate:Exception while updating cluster-status: (<class 'ksa.backend.client.ClientException'>,..;Type:System;Product:ccc Appliance;Host:xxxxxxxx;Severity:Critical;User:-;Description:Cluster-AutoUpdate:Exception while updating cluster-status: (<class 'ksa.backend.client.ClientException'>, ClientException(), <traceback object at 0x7f0ece93ca70>);--",
    "severity_label" => "Emergency",
    "facility" => 0,
    "timestamp" => "2/21/2020 19:29.34",
    "host" => "xxxxxxxx",
    "severity" => 0,
    "facility_label" => "kernel",
    "@timestamp" => 2020-02-21T11:29:35.308Z,
    "logmessage" => "Appliance Warning:Cluster-AutoUpdate:Exception while updating cluster-status: (<class 'ksa.backend.client.ClientException'>,..;Type:System;Product:ccc Appliance;Host:xxxxxxxxx;Severity:Critical;User:-;Description:Cluster-AutoUpdate:Exception while updating cluster-status: (<class 'ksa.backend.client.ClientException'>, ClientException(), <traceback object at 0x7f0ece93ca70>);--",
    "priority" => 0,
    "@version" => "1",
    "tags" => [
        [0] "_dateparsefailure",
        [1] "_jsonparsefailure"
    ]
}

Conf file used:

input {
  syslog {
    port => 514
    codec => plain
    syslog_field => "message"
    grok_pattern => "%{DATA:timestamp};%{GREEDYDATA:logmessage}"
  }
}

output {
  elasticsearch {
    hosts => ["xxxxxxxxxxxx:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

I need to convert the "logmessage" field, which is separated by ";", into JSON and store it in Elasticsearch.
I even tried adding a filter block to the config, with no luck.

Use a kv filter.

Thank you, @Badger.

In case it's useful for others, below are the changes that made it work.

input {
  syslog {
    port => 514
    codec => plain
    syslog_field => "message"
    grok_pattern => "%{DATA:timestamp};%{GREEDYDATA:logmessage}"
  }
}

# Split logmessage into key/value pairs for JSON output
filter {
  kv {
    source => "logmessage"
    field_split => ";"
    value_split => ":"
    trim_key => " "
  }

  mutate {
    remove_field => ["logmessage","message","tags","timestamp"]
  }
}
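One further tweak (my own suggestion, not from the thread): the `_dateparsefailure` tag in the output above comes from the syslog timestamp not matching the input plugin's expected format. Instead of removing the `timestamp` field, a `date` filter could map it onto `@timestamp`. A sketch, assuming the `M/d/yyyy HH:mm.ss` layout seen in the sample line — verify the pattern against your actual data:

```
filter {
  date {
    # "2/21/2020 19:29.34" -- assumed pattern, check against your logs
    match => [ "timestamp", "M/d/yyyy HH:mm.ss" ]
  }
}
```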


output {
  elasticsearch {
    hosts => ["XXXXXXXXXXXX:9200"]
    index => "XXXXX-syslog-%{+YYYY.MM.dd}"
  }
  stdout { codec => json }
}
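For anyone unsure what the kv filter is doing here, this is a rough Python illustration (mine, not Logstash code) of how `field_split => ";"`, `value_split => ":"`, and `trim_key => " "` carve up a line like the sample logmessage:

```python
# Rough sketch of the kv filter's split logic -- an illustration only,
# not how Logstash implements it internally.
def kv_parse(logmessage, field_split=";", value_split=":"):
    result = {}
    for pair in logmessage.split(field_split):
        if value_split not in pair:
            continue  # skip fragments with no separator, e.g. the trailing "--"
        key, _, value = pair.partition(value_split)
        result[key.strip()] = value  # trim_key => " " strips spaces around keys
    return result

sample = "Type:System;Product:ccc Appliance;Host:xxxxxxxx;Severity:Critical;User:-;--"
print(kv_parse(sample))
# {'Type': 'System', 'Product': 'ccc Appliance', 'Host': 'xxxxxxxx',
#  'Severity': 'Critical', 'User': '-'}
```

Each resulting key becomes a top-level field on the event, which is why the mutate block then drops the raw `logmessage` and `message` fields.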
