Logstash custom grok filter for F5 BIG-IP

Dear All,

I installed ELK to collect logs from an F5 BIG-IP. It is receiving logs successfully and I can search them in Kibana; search is very fast compared to any syslog server I have worked with.
However, I cannot parse the logs with grok to get any statistics from them. I need statistics such as which IP addresses are visiting our website and the distribution of HTTP response codes.
Any help is highly appreciated.

Kindly find my logstash.conf:

input {
  tcp {
    type => "f5-access"
    port => 5045
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Kindly find a sample of the logs:

<134>Mar 20 13:07:30 F5.aast.edu ASM:unit_hostname="F5.aast.edu",management_ip_address="10.0.0.1",http_class_name="/Common/www.aast.edu",web_application_name="/Common/www.aast.edu",policy_name="/Common/www.aast.edu",policy_apply_date="2017-02-20 07:52:02",violations="",support_id="1017117902917082439",request_status="passed",response_code="200",ip_client="185.144.40.100",route_domain="0",method="GET",protocol="HTTP",query_string="",x_forwarded_for_header_value="185.144.40.100",sig_ids="",sig_names="",date_time="2017-03-20 13:07:29",severity="Informational",attack_type="",geo_location="NL",ip_address_intelligence="N/A",username="N/A",session_id="8af3fbb94d9848bd",src_port="56374",dest_port="80",dest_ip="172.20.38.11",sub_violations="",virus_name="N/A",violation_rating="0",websocket_direction="N/A",websocket_message_type="N/A",device_id="N/A",uri="/en/images/home/5.jpg",request="GET /en/images/home/5.jpg HTTP/1.1\r\nHost: www.aast.edu\r\nUser-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0\r\nAccept: /\r\nAccept-Language: en-US\r\nAccept-Encoding: gzip, deflate\r\nReferer: http://www.aast.edu/en/\r\nCookie: f5avrbbbbbbbbbbbbbbbb=DBFEAJOPBJELFDAKEBJCCBFOELKCMNMEHKMOJEPJPEAFGAJFMOFBHLGLHIANLLEMHOIDAFJBHHDMOFBAHHBANMDBILPBEEECHLGAPGJOKJDAKMCKOOJGJABBIBPEMDPA; f5_cspm=1234; __utma=59762365.1779799469.1409206989.1433332267.1444386902.3; _ga=GA1.2.1779799469.1409206989; f5avrbbbbbbbbbbbbbbbb=MOPPNHHBFNHONHICDCABIHGFPJBCPMHJKLPNBCEEBNLNJMLMLNBKKEELMNEKNKFDHDIDANOFEHMPGDGPINLABDKAILPIHBKBGIEHKJLMGGJPCENOIIPCMDNIBCDONJAI; TS01ff0351=01ab6d262aeba8dba7ba167b17936a7e2c634b34d4dfd3f1b1da96c4ba2fbd6814f09180ecfee14cc7836f48e96db51a1f383e648e697c004c712ca5d2114010b0128da8c5561b238333f31b9938133941c73fd109; _gat=1\r\nConnection: keep-alive\r\nX-Forwarded-For: 185.144.40.100\r\n\r\n"

See the syslog example in the documentation to extract timestamp and a few other things from the events. Then use a kv filter on the key/value pairs in the rest of the message to extract them all into discrete fields.
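A minimal sketch of that approach, assuming the events look like the sample above (the intermediate field name asm_message and the exact grok pattern are illustrative, not a tested config):

```
filter {
  # Split the syslog envelope from the ASM key/value payload.
  grok {
    match => {
      "message" => "<%{NONNEGINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_host} ASM:%{GREEDYDATA:asm_message}"
    }
  }
  # Parse the comma-separated key="value" pairs into discrete fields.
  # The kv filter strips the surrounding double quotes by default.
  kv {
    source => "asm_message"
    field_split => ","
    value_split => "="
  }
}
```

With this in place, fields like ip_client, response_code, and support_id become individually searchable in Kibana.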


Thanks for your reply.
I already followed the documentation and I have the timestamp (kindly check the attached image).

Can you please elaborate, for example, on how I can use a filter in the Logstash configuration file to index the client IP address the way the timestamp is indexed? I will then use that example to index other values (URI, support_id, ...). I tried to extract the key/value pairs using grok but it failed. In the log message the client IP appears as ip_client="185.144.40.100".
I used the following filter in logstash.conf:

filter {
  if [type] == "f5-access" {
    grok {
      match => { "message" => %{IP:ip_client} }
  }
}
Logging stopped when I applied this filter.
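As a note for anyone hitting the same problem: the filter above is not valid Logstash syntax (the grok pattern is not quoted, and the filter block is missing a closing brace), which is why Logstash refused to start. A corrected sketch would be:

```
filter {
  if [type] == "f5-access" {
    grok {
      # The pattern must be a quoted string. Matching a bare %{IP} would
      # grab the first IP in the message, so anchoring on the key name
      # is safer:
      match => { "message" => "ip_client=\"%{IP:ip_client}\"" }
    }
  }
}
```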

I already followed documentation and I have the timestamp (kindly check attached image).

You only have the timestamp that Logstash populates for you. As you can see, it doesn't match the timestamp in your log entry.

Again, start with the syslog example from the documentation (https://www.elastic.co/guide/en/logstash/current/config-examples.html#_processing_syslog_messages). If it doesn't work, post your full configuration and an example log message from a stdout { codec => rubydebug } output. Do not post screenshots.
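A minimal sketch of such a debugging output, added alongside the existing elasticsearch output from the configuration above:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Print each event to the console in a readable, field-by-field form,
  # so the result of the grok/kv parsing can be inspected and posted.
  stdout {
    codec => rubydebug
  }
}
```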

Dear magnusbaeck,
We followed the syslog example and used the following filter:

filter {
  grok {
    match => { "message" => "<%{BASE10NUM:elastic_code}>%{SYSLOGTIMESTAMP:elastic_time} F5.aast.edu ASM:%{TIMESTAMP_ISO8601:f5_time} %{IPV4:client_ip} %{IPV4:server_ip} %{DATA:geo_location} %{BASE10NUM:response_code} %{DATA:username} %{BASE10NUM:support_id} %{DATA:application_policy} %{DATA:requested_uri}" }
  }
}

and we got the following output:

{
"message" => "<131>Mar 29 12:59:34 F5.aast.edu ASM:2017-03-29 12:59:32 131.161.10.248 172.20.38.11 BR 200 N/A 1017117902927571753 /Common/www.aast.edu /ar/index.php/colleges/sites/SpryAssets/css/highlight/css/js/news/css/css/services/3\r",
"@version" => "1",
"@timestamp" => "2017-03-29T11:26:35.150Z",
"host" => "172.20.39.101",
"port" => 6033,
"type" => "f5-access",
"elastic_code" => "131",
"elastic_time" => "Mar 29 12:59:34",
"f5_time" => "2017-03-29 12:59:32",
"client_ip" => "131.161.10.248",
"server_ip" => "172.20.38.11",
"geo_location" => "BR",
"response_code" => "200",
"username" => "N/A",
"support_id" => "1017117902927571753",
"application_policy" => "/Common/www.aast.edu"
}

Currently, we need to build reports from this output, for example a graph showing the percentage of each response code, or the top IP addresses accessing a given page.

That looks pretty good, except that you should add a date filter to parse f5_time or elastic_time into @timestamp. If you send this data to Elasticsearch, it should be easy to have Kibana plot the graphs you're looking for. Have you looked at the Kibana documentation and videos?
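A sketch of that date filter, assuming f5_time keeps the "2017-03-29 12:59:32" format shown in the output above (the timezone value is an illustrative placeholder; set it only if the F5 clock is not in the Logstash host's timezone):

```
filter {
  date {
    # Parse the timestamp extracted by grok into @timestamp.
    match => ["f5_time", "yyyy-MM-dd HH:mm:ss"]
    # Example only; uncomment and adjust if the F5's clock uses a
    # different timezone than the Logstash host:
    # timezone => "Africa/Cairo"
  }
}
```

Once @timestamp reflects the actual event time, Kibana's time picker and date histograms will line up with the log entries rather than with their arrival time.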

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.