Parsing logs from Fortinet WAF

Hello all,

I'm new to ELK, so I have some doubts about a few things I'm trying to do. I have configured the ELK Stack and I'm trying to ingest some logs into Elasticsearch. One source is logs from a Fortinet Web Application Firewall.

The logs are built as follows:

v009xxxxdate=2020-02-15 time=01:55:48 log_id=000001 msg_id=000001 device_id=111111
vd="root" timezone="(GMT+1:00)Brussels,Copenhagen,Madrid,Paris" timezone_dayst="GMTc-1" type=attack pri=alert main_type="Signature Detection" sub_type="Information Disclosure"
trigger_policy="Syslog_WAF" severity_level=Low proto=tcp service=http backend_service=http action=Alert policy="SYSLOG_GENERATOR" src=00000 src_port=0000 dst=00000 dst_port=00000
http_method=get http_url="http://google.es" http_host="user" http_agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko)
Chrome/79.0.3945.130 Safari/537.36" http_session_id=9999 msg="HTTP Header triggered signature ID 1 of Signatures policy SYSLOG_GENERATOR" signature_subclass="HTTP Header Leakage"
signature_id="000001" signature_cve_id="N/A" srccountry="Reserved" content_switch_name="none" server_pool_name="SYSLOG_GENERATOR" false_positive_mitigation="none" user_name="Unknown" monitor_status="Enabled"
http_refer="google" http_version="1.x" dev_id="none" es=0 threat_weight=5 history_threat_weight=0 threat_level=Low ftp_mode="N/A" ftp_cmd="N/A" cipher_suite="none" ml_log_hmm_probability=0.000000 ml_log_sample_prob_mean=0.000000
ml_log_sample_arglen_mean=0.000000 ml_log_arglen=0 ml_svm_log_main_types=0 ml_svm_log_match_types="none" ml_svm_accuracy="none" ml_domain_index=0 ml_url_dbid=0 ml_arg_dbid=0 ml_allow_method="none" owasp_top10="A3:2017-Sensitive Data Exposure" bot_info="none"

I'm trying to use Logstash to parse them and then send them to Elasticsearch. I tried it with the following configuration file, but it does not work:

input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}

filter {
  # Split off the syslog timestamp (the first three space-separated tokens)
  # and keep the rest of the line for key=value parsing
  dissect { mapping => { "message" => "%{[@metadata][ts]} %{+[@metadata][ts]} %{+[@metadata][ts]} %{[@metadata][restOfLine]}" } }
  # Parse the key=value pairs into fields
  kv { source => "[@metadata][restOfLine]" }
  # Use the syslog timestamp as the event's @timestamp
  date { match => [ "[@metadata][ts]", "MMM dd HH:mm:ss" ] }
}

output {
  elasticsearch {
    index => "fortinet"
    hosts => ["localhost:9200"]
  }
}

It's not working: Logstash is not inserting the logs into Elasticsearch. When I run it, I don't see any errors, but I don't know why it isn't working.
What should I do?

Thank you very much.

Regards

You do not need the dissect for that format, just kv the message field.
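For example, a minimal sketch:

filter {
  kv { source => "message" }
}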

Thank you, @Badger, for answering.

So you mean to do it this way?

input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
  }
}

filter {
  kv {
    source => "message"
    value_split => "="
  }
}

output {
  elasticsearch {
    index => "fortinet"
    hosts => ["localhost:9200"]
  }
}

Thank you very much for the help!

Regards,

Yes, if your messages have that format then that should work.
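If you want to verify it quickly, run a minimal pipeline and paste a sample line into stdin (the stdin/stdout plugins here are just for testing):

input { stdin {} }
filter { kv { source => "message" } }
output { stdout { codec => rubydebug } }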

@Badger, I tried the code I showed you above.
It did not work, and I don't know why. After using it, I can see data in Elasticsearch, but the data is not properly formatted:

  • If I use v009xxxxdate as the time field, no data is found in Kibana after creating the index pattern.
  • If I use timestamp, the data I see in Kibana has the following format after creating the index pattern:

http_session_id:none message:v009xxxxdate=2020-02-07 time=15:30:24 log_id=00001 msg_id=0001 device_id=device vd="root" timezone="(GMT+1:00)Brussels,Copenhagen,Madrid,Paris" timezone_dayst="GMTc-1" type=attack pri=alert main_type="Signature Detection" sub_type="SQL Injection" trigger_policy="Syslog" severity_level=High proto=tcp service=http backend_service=http action=Alert_Deny policy="xxdd" src=4ip src_port=000 dst=ip dst_port=0001 http_method=get http_url="google' ORDER BY 1005--

I also saw this output in the console:
[2020-03-10T20:49:35,718][WARN ][logstash.codecs.plain ][main] Received an event that has a different character encoding than you configured. {:text=>"v009xxxxdate=2020-02-05 time=09:16:52 log_id=0001 msg_id=00002 device_id=device vd=\\\"root\\\" timezone=\\\"(GMT+1:00)Brussels,Copenhagen,Madrid,Paris\\\" timezone_dayst=\\\"GMTc-1\\\" type=attack pri=alert
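
I wonder if I need to set the character encoding on the input. Maybe something like this (the charset is just a guess)?

input {
  file {
    path => "/var/log/messages"
    start_position => "beginning"
    codec => plain { charset => "ISO-8859-1" }
  }
}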

Any idea what could be happening?
Thank you very much

Could the fields be tab separated rather than space separated?
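If they are, you can tell kv to split fields on tab instead of whitespace. Something like this (note that "\t" in a quoted string is only interpreted as a tab if config.support_escapes is enabled in logstash.yml):

filter {
  kv {
    source => "message"
    field_split => "\t"
  }
}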

@Badger, the fields are separated by a single space, and as with most logs, each log entry is one line of the file.
So I don't know what could be happening...

OK, I found the solution! It worked without the value_split => "=" option.

Thank you very much for the help!!

Anyway, @Badger, how would it be if the log were similar to this one, but with the fields separated by commas instead of spaces?
For example:

"2015-01-16 17:48:41","ActiveDirectoryUserName",
"ActiveDirectoryUserName,ADSite,Network",
"10.10.1.100","24.123.132.133","Allowed","1 (A)",
"NOERROR","domain-visited.com.",
"Chat,Photo Sharing,Social Networking,Allow List"

Could it be something like this?

filter {
  csv {
    separator => ","
    columns => [ "Timestamp", "Most Granular Identity", "Identities", "InternalIp", "ExternalIp", "Action", "QueryType", "ResponseCode", "Domain", "Categories" ]
  }
}

Thank you!!!

That looks reasonable.
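
You will probably also want a date filter to set @timestamp from the Timestamp column. Something like this (guessing the date format from your sample):

filter {
  date { match => [ "Timestamp", "yyyy-MM-dd HH:mm:ss" ] }
}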

For this example, is it also necessary to use the source option, as follows?

filter {
  csv {
    separator => ","
    source => "message"
    columns => [ "Timestamp", "Most Granular Identity", "Identities", "InternalIp", "ExternalIp", "Action", "QueryType", "ResponseCode", "Domain", "Categories" ]
  }
}

Thank you.

"message" is the default value for the source option, so that should not be required.

@Badger, I wrote the log wrong, sorry about that. The log file actually has the following format:

"{"sourceFile":"log.csv.gz","EventType":"DNSLog","Timestamp":"2020-03-03 20:26:20","MostGranularIdentity":"DNS","Identities":"DNS","InternalIp":"IP","ExternalIp":"IP","Action":"Allowed","QueryType":"1 (A)","ResponseCode":"NOERROR","Domain":"google.","Categories":"Infrastructure"}{"sourceFile":"logs.csv.gz","EventType":"DNSLog","Timestamp":"2020-03-03 20:26:22","MostGranularIdentity":"DNS","Identities":"DNS","InternalIp":"ip","ExternalIp":"ip","Action":"Allowed","QueryType":"28 (AAAA)","ResponseCode":"NOERROR","Domain":"google.","Categories":"Infrastructure"}{"sourceFile":"log.csv.gz","EventType":"DNSLog"........................

I tried the above code and it does not ingest data into Elasticsearch. I also tried the kv filter (same code as before), but that does not work either. Maybe it's caused by the format of the log.
Any idea?

Thank you

That appears to be JSON, so I would try a json filter.

I'm trying with this code:

filter {
  json {
    source => "message"
  }
}

output {
  stdout {}
}

I don't see any output, so it's not working properly... I mean, the data is not getting parsed.
I even tried to ingest it into Elasticsearch, but no index was created.

Solved, it was a format problem of the log file.

Thank you very much for the help, @Badger!

@Badger, do you know how to remap fields to the ECS standard?
I'm having the problem that I don't see fields in Elastic SIEM even though I can see the data in Elasticsearch. It's because the data is not in the ECS standard, but how do I do the remapping?
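
Would something like this be the right direction? Just a rough sketch with mutate, renaming a few of the fields the kv filter produced (src, dst, etc. come from my logs; the ECS names I took from the ECS field reference):

filter {
  mutate {
    rename => {
      "src"         => "[source][ip]"
      "src_port"    => "[source][port]"
      "dst"         => "[destination][ip]"
      "dst_port"    => "[destination][port]"
      "http_method" => "[http][request][method]"
      "http_agent"  => "[user_agent][original]"
    }
  }
}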

Thank you very much!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.