Hi,
I discovered Logstash, Elasticsearch, and Kibana a few days ago, and I'm now trying to build a Kibana dashboard from my Squid logs on pfSense, but I've run into some issues...
The Squid logs cover the web traffic of my LAN.
A default log entry looks like this:
Nov 17 21:01:10 192.168.1.1 (squid-1): 1510952470.370 233176 192.168.1.103 TCP_TUNNEL/200 5931 CONNECT etherpad.fr:443 - ORIGINAL_DST/195.154.57.241 -
The Squid logs are sent to a NAS server with syslog-ng, to /var/log/pfsense/pfSense.log.
On this remote server, I installed Logstash, Elasticsearch, and Kibana.
My grok pattern for these logs is the following, and it works fine in an online grok debugger:
%{WORD:month} %{NUMBER:date} %{TIME:timestamp} %{IP:proxy} \(%{WORD:squid_version}-%{INT:http_status_code}\): %{NUMBER:header1} %{NUMBER:header2} %{IP:ip_source} %{WORD:protocole}/%{INT:code} %{NUMBER:header3} %{WORD:status} %{USER:fqdn_destination}:%{NUMBER:port_destination} - %{WORD:label_destination}/%{IP:ip_destination} -
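(Side note: to test this pattern outside the online debugger, a throwaway stdin pipeline like the one below should work; paste the sample line and check the parsed fields. This is just a test sketch, not part of my setup.)

input { stdin { } }
filter {
  grok {
    match => { "message" => "%{WORD:month} %{NUMBER:date} %{TIME:timestamp} %{IP:proxy} \(%{WORD:squid_version}-%{INT:http_status_code}\): %{NUMBER:header1} %{NUMBER:header2} %{IP:ip_source} %{WORD:protocole}/%{INT:code} %{NUMBER:header3} %{WORD:status} %{USER:fqdn_destination}:%{NUMBER:port_destination} - %{WORD:label_destination}/%{IP:ip_destination} -" }
  }
}
output { stdout { codec => rubydebug } }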
I created a pfsense_log.conf in the Logstash directory, with the following options:
input {
  file {
    path => "/var/log/pfsense/pfSense.log"
    start_position => "beginning"
    # re-read the file from scratch on every run (no sincedb bookkeeping)
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => {
      "message" => "%{WORD:month} %{NUMBER:date} %{TIME:timestamp} %{IP:proxy} \(%{WORD:squid_version}-%{INT:http_status_code}\): %{NUMBER:header1} %{NUMBER:header2} %{IP:ip_source} %{WORD:protocole}/%{INT:code} %{NUMBER:header3} %{WORD:status} %{USER:fqdn_destination}:%{NUMBER:port_destination} - %{WORD:label_destination}/%{IP:ip_destination} -"
    }
  }
}

output {
  stdout {
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "pfsense_log"
    template => "/home/user/ELK/logstash-2.2.2/bin/pfsense_template.json"
    template_name => "pfsense_log"
    template_overwrite => true
  }
}
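One thing I'm unsure about: the grok only captures the day of month into "date" and the time into "timestamp", both as plain strings. If that turns out to be part of the problem, my plan is to add a date filter roughly like this to build a real @timestamp (just a sketch for now; the syslog_time field name is made up):

filter {
  # glue the syslog month/day/time back into one string
  mutate {
    add_field => { "syslog_time" => "%{month} %{date} %{timestamp}" }
  }
  # parse it into @timestamp (second pattern covers single-digit days)
  date {
    match => [ "syslog_time", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
  }
}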
And a pfsense_template.json:
{
  "template": "pfsense_log",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "_default_": {
      "_all": {
        "enabled": true
      },
      "dynamic_templates": [
        {
          "message_field": {
            "mapping": {
              "index": "analyzed",
              "omit_norms": true,
              "type": "string"
            },
            "match_mapping_type": "string",
            "match": "message"
          }
        },
        {
          "string_fields": {
            "mapping": {
              "index": "analyzed",
              "omit_norms": true,
              "type": "string",
              "fields": {
                "raw": {
                  "index": "not_analyzed",
                  "ignore_above": 256,
                  "type": "string"
                }
              }
            },
            "match_mapping_type": "string",
            "match": "*"
          }
        }
      ],
      "properties": {
        "date": {
          "type": "date",
          "format": "date_time_no_millis"
        },
        "proxy": {
          "type": "string",
          "index": "not_analyzed"
        },
        "ip_source": {
          "type": "string",
          "index": "not_analyzed"
        },
        "header1": {
          "type": "integer",
          "index": "not_analyzed"
        },
        "protocole": {
          "type": "string",
          "index": "not_analyzed"
        },
        "http_status_code": {
          "type": "integer",
          "index": "not_analyzed"
        },
        "fqdn_destination": {
          "type": "string",
          "index": "not_analyzed"
        },
        "status": {
          "type": "string",
          "index": "not_analyzed"
        },
        "label_destination": {
          "type": "string",
          "index": "not_analyzed"
        },
        "squid_version": {
          "type": "string",
          "index": "not_analyzed"
        },
        "ip_destination": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
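To check whether Logstash actually pushed this template to Elasticsearch, I figure I can query the template API (assuming Elasticsearch on the default port):

curl '127.0.0.1:9200/_template/pfsense_log?pretty'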
When I run Logstash (logstash -f pfsense_log.conf), I can see the data from my log file being loaded:
"path"=>"/var/log/pfsense/pfSense.log", "host"=>"user-virtual-machine", "month"=>"Nov", "date"=>"17", "timestamp"=>"22:12:41", "proxy"=>"192.168.1.1", "squid_version"=>"squid", "http_status_code"=>"1", "header1"=>"1510956761.029", "header2"=>"181845", "ip_source"=>"192.168.1.103", "protocole"=>"TCP_TUNNEL", "code"=>"200", "header3"=>"16464", "status"=>"CONNECT", "fqdn_destination"=>"etherpad.fr", "port_destination"=>"443", "label_destination"=>"ORIGINAL_DST", "ip_destination"=>"195.154.57.241"}, @lut={"path"=>[{"message"=>"Nov 17 22:12:41 192.168.1.1 (squid-1): 1510956761.029 181845 192.168.1.103 TCP_TUNNEL/200 16464 CONNECT etherpad.fr:443 - ORIGINAL_DST/195.154.57.241 -", "@version"=>"1", "@timestamp"=>"2017-11-18T00:42:38.562Z"
So I know that Logstash reads my log file and applies the filter to it. But when I start Elasticsearch and try to reach 127.0.0.1:9200/pfsense_log (the name of my index), Elasticsearch can't find it, and I get an "index_not_found_exception" error...
I'm not really sure whether I need to manually create "pfsense_log", and if so, where to put it (elasticsearch/conf?).
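In case it helps, this is how I list the indices that actually exist (assuming a default local Elasticsearch):

curl '127.0.0.1:9200/_cat/indices?v'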
Or maybe I'm missing something else?
Any ideas?
Thanks