Parse syslog from Syslog

(Digitalism) #1

Hi all,

I am new to the ELK stack :smiley: and currently I am working with Logstash -> Elasticsearch -> Kibana.
I send syslog from a Checkpoint to Logstash, but I would like to parse the syslog correctly with Logstash.

I don't think I fully understand how to parse characters within a string field. Currently in Kibana I get this result:

There is a single field, "message", which contains:
<134>1 2019-02-12T21:47:03Z info CheckPoint 2105 - [action:"Drop"; flags:"393216"; ifdir:"inbound"; ifname:"bond2"; loguid:"{0x5c633gd8,0x6,0x24bv703e,0x47f4985}"; origin:"9.1.0.1"; originsicname:"CN=name-1,O=name.domain.com.com"; sequencenum:"201"; time:"1550008023"; version:"5"; __policy_id_tag:"test vpn & FireWall-1[db_tag={E53339C-8456-1C43-AF36-F12340F57FDA};mgmt=name1;date=1549371733;policy_name=office\]"; dst:"3.2.1.2"; message_info:"Missing OS route"; product:"VPN-1 & FireWall-1"; proto:"17"; s_port:"1633"; service:"1633"; src:"192.168.2.3"; ]

You can see the other parameters; I would like to parse these parameters too. Is that possible?
Can you help me on that?

Thank you very much for your help and your time!

#2

Try this

    dissect { mapping => { "message" => "<%{level}>%{} %{ts} %{loglevel} %{something} - [%{kvString}" } }
    date { match => [ "ts", "ISO8601" ] }
    kv { source => "kvString" field_split => ";" value_split => ":" trim_key => " " }
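
A convenient way to experiment with these filters (a sketch; it assumes you can run Logstash against an ad-hoc config file) is to read a sample line from stdin and print the parsed event with the rubydebug codec:

    input { stdin {} }
    filter {
        dissect { mapping => { "message" => "<%{level}>%{} %{ts} %{loglevel} %{something} - [%{kvString}" } }
        date { match => [ "ts", "ISO8601" ] }
        kv { source => "kvString" field_split => ";" value_split => ":" trim_key => " " }
    }
    output { stdout { codec => rubydebug } }

Run Logstash with that file (e.g. `bin/logstash -f test.conf`), paste the sample message from the first post, and you will see every field the filters produce.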
(Digitalism) #3

Hi Badger,

Thank you for your reply. However, do you have any documentation on that? I would like to understand how it works :stuck_out_tongue:
I will try your solution next Monday.

#4

dissect, kv and date are all standard filters.

(Digitalism) #5

Thank you for your help. However, I added your suggestion to my "grok" filter, and when I run Logstash I receive an error (failed to execute action (grok...)).

Maybe my configuration file on my logstash is not correct?! :confused:

    input {
      tcp {
        port => 5140
        type => syslog
      }
      udp {
        port => 5140
        type => syslog
      }
    }

    filter {
      kv {
      }
      grok {
        match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }

        dissect { mapping => { "message" => "<%{level}>%{} %{ts} %{loglevel} %{something} - [%{kvString}" } }
        date { match => [ "ts", "ISO8601" ] }
        kv { source => "kvString" field_split => ";" value_split => ":" trim_key => " " }
      }
      mutate {
        gsub => ["timestamp"," "," "]
      }
      date {
        match => [ "timestamp", "ddMMMYYYY HH:mm:ss" ]
      }
    }

    output {
      elasticsearch {
        hosts => "127.0.0.1"
        index => "syslog-%{+YYYY.MM.dd}"
      }
    }

Could you help me with the "grok" filter?

Thank you very much!

#6

Get rid of grok and also the kv{} filter with no options. Just use

    filter {
        dissect { mapping => { "message" => "<%{level}>%{} %{ts} %{loglevel} %{something} - [%{kvString}" } }
        date { match => [ "ts", "ISO8601" ] }
        kv { source => "kvString" field_split => ";" value_split => ":" trim_key => " " }
    }
(Digitalism) #7

Ah, OK. That's strange, it's not necessary to specify the syslog format? Now I receive:
    Feb 18 18:15:44 bcb-chp-mgmt2.localdomain logstash[13078]: [2019-02-18T18:15:44,241][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"syslog-2019.02.18", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x6db15994>], :response=>{"index"=>{"_index"=>"syslog-2019.02.18", "_type"=>"doc", "_id"=>"lYabAWkBSNN2StILLB_S", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [syslog-2019.02.18] has been exceeded"}}}}

#8

If you go to Kibana and look at the index mapping, do the field names look reasonable? You may need to update the mapping so as not to autoindex, or you may just need to start over with a new daily index.
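
If the field names do look reasonable and the index legitimately needs that many, the per-index limit can also be raised (a sketch using the Elasticsearch index settings API; the value 2000 is arbitrary, and note that a kv{} filter with no options can generate runaway field names, so check the mapping first):

    PUT syslog-2019.02.18/_settings
    {
      "index.mapping.total_fields.limit": 2000
    }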

(Digitalism) #9

Hi Badger,

Thank you very much for your support, your solution is working now!
However, I am a little lost on the filter xD

    filter {
        dissect { mapping => { "message" => "<%{level}>%{} %{ts} %{loglevel} %{something} - [%{kvString}" } }
        date { match => [ "ts", "ISO8601" ] }
        kv { source => "kvString" field_split => ";" value_split => ":" trim_key => " " }
    }

Specifically, on this step:

    "message" => "<%{level}>%{} %{ts} %{loglevel} %{something} - [%{kvString}"

#10

The dissect filter parses a field using delimiters. In that case the delimiters are <, >, space, space, space, and " - [".
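
Applied to the sample message from the first post, that mapping would split the line roughly like this (a sketch; the empty `%{}` field discards the syslog version "1" that follows the priority):

    # message:
    #   <134>1 2019-02-12T21:47:03Z info CheckPoint 2105 - [action:"Drop"; flags:"393216"; ...
    #
    # fields produced by dissect:
    level     => "134"
    ts        => "2019-02-12T21:47:03Z"
    loglevel  => "info"
    something => "CheckPoint 2105"
    kvString  => "action:\"Drop\"; flags:\"393216\"; ifdir:\"inbound\"; ..."

The date filter then parses `ts` into `@timestamp`, and the kv filter splits `kvString` on ";" into pairs and on ":" into key and value, yielding fields such as `action`, `ifdir`, and `src`.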

(Digitalism) #11

Is it possible to use a field present in the syslog in the output section?
I would like to sort the syslog events before sending everything to the DB.

For example:

    output {
        If (origin = 192.168.1.1){
            hosts => "127.0.0.1"
            index => "index1"
        }
        If (origin = 192.168.1.2){
            hosts => "127.0.0.1"
            index => "index2"
        }
        If (origin = 192.168.1.3){
            hosts => "127.0.0.1"
            index => "index3"
        }
    }
#12

Try

    if [origin] == "192.168.1.1" {
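
A complete output section along those lines might look like this (a sketch; note the double `==`, the field reference in square brackets, and that the elasticsearch plugin block goes inside each conditional):

    output {
        if [origin] == "192.168.1.1" {
            elasticsearch { hosts => "127.0.0.1" index => "index1" }
        } else if [origin] == "192.168.1.2" {
            elasticsearch { hosts => "127.0.0.1" index => "index2" }
        } else if [origin] == "192.168.1.3" {
            elasticsearch { hosts => "127.0.0.1" index => "index3" }
        }
    }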
(system) closed #13

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.