Logstash for Cyberoam filtering

Dear All,
I have an ELK stack used with Cyberoam, and I want to parse this message with Logstash. Can you help me, please:
"<30>date=2017-02-19 time=21:59:15 timezone="IST" device_name="CR200iNG" device_id=C20313272882-BQ2EUG log_id=010302602002 log_type="Firewall" log_component="Appliance Access" log_subtype="Denied" status="Deny" priority=Information duration=0 fw_rule_id=0 user_name="" user_gp="" iap=0 ips_policy_id=0 appfilter_policy_id=0 application="" application_risk=0 application_technology="" application_category="" in_interface="PortF" out_interface="" src_mac=dd:dd:dd:02:1c:e4 src_ip= src_country_code= dst_ip= dst_country_code= protocol="UDP" src_port=32771 dst_port=7423 sent_pkts=0 recv_pkts=0 sent_bytes=0 recv_bytes=0 tran_src_ip= tran_src_port=0 tran_dst_ip= tran_dst_port=0 srczonetype="" srczone="" dstzonetype="" dstzone="" dir_disp="" connid="" vconnid=""",

Also, can you kindly tell me how to parse captured packets using Logstash? Cyberoam has the ability to capture packets on the network, and I sent that data to Logstash, but Logstash is not showing it in Kibana.

Best regards

Have a look at the KV filter :slight_smile:
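For example, a minimal kv filter for a message like the one above might look like this (the source field name `syslog_message` is an assumption; adjust it to whatever field holds the raw key=value text in your pipeline):

```conf
filter {
  kv {
    # Field containing the key=value text (assumed name)
    source => "syslog_message"
    # Pairs are separated by spaces, keys from values by "=" (the kv defaults)
    field_split => " "
    value_split => "="
  }
}
```

Each `key="value"` pair in the Cyberoam line then becomes its own field on the event.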

Thanks for your reply.
I am using the following:

kv {
  source => "syslog_message"
}
mutate {
  replace => ["type", "%{syslog_program}"]
  remove_field => ["syslog_message", "syslog_timestamp"]
}

but I'm not getting anything into my Elasticsearch or Kibana.

What does the rest of your config look like?
Have you tried a stdout?
What version are you on?

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "utm-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

This is my output

Is there anything in stdout then?

Yes, there is this output when I do the configuration check:
"<14>date" => "2017-02-20",
"time" => "12:10:52",
"timezone" => "IST",
"device_name" => "CR200iNG",
"device_id" => "C20313272882-BQ2EUG",
"log_id" => "050901616001",
"log_type" => "Content Filtering",
"log_component" => "HTTP",
"log_subtype" => "Allowed",
"status" => """",
"priority" => "Information",
"fw_rule_id" => "0",
"user_name" => "bag21",
"user_gp" => "Bag_Emp",
"iap" => "13",
"category" => "Bag",
"category_type" => "Productive",
"url" => "kh.google.com/flatfile?f1c-0201211130221313231-t.383",
"contenttype" => "application/octet-stream",
"override_token" => """",
"httpresponsecode" => """",
"src_ip" => "",
"dst_ip" => "",
"protocol" => "TCP",
"src_port" => "49864",
"dst_port" => "80",
"sent_bytes" => "0",
"recv_bytes" => "110",
"domain" => "kh.google.com"

It'd be useful if you provided your entire config as a single post.

Thanks a lot. I don't know what happened, but after a couple of restarts it worked. However, I still have this tag on the message:

"tags" => [
        [0] "_grokparsefailure",

Do you know why?
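The `_grokparsefailure` tag usually means a grok filter somewhere in the config did not match the incoming line; it is unrelated to the kv filter. The `"<14>date"` key in the output above also suggests the syslog priority prefix (`<14>`) is not being stripped before kv runs. A sketch of a grok filter that would strip it first (the `message` and `syslog_message` field names here are assumptions, not necessarily the poster's actual config):

```conf
filter {
  grok {
    # Split off the "<14>" syslog priority and keep the rest for the kv filter
    match => { "message" => "<%{NONNEGINT:syslog_pri}>%{GREEDYDATA:syslog_message}" }
  }
  kv {
    source => "syslog_message"
  }
}
```

With the prefix stripped, the first pair parses as a clean `date` field instead of `<14>date`.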

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.