Elasticsearch - Logstash communication problem

Hi,

I discovered Logstash, Elasticsearch and Kibana a few days ago, and I'm now trying to build a Kibana dashboard of my Squid logs from pfSense, but I've run into some issues ...

The Squid logs cover the web traffic of my LAN.
A typical log entry looks like this:
Nov 17 21:01:10 192.168.1.1 (squid-1): 1510952470.370 233176 192.168.1.103 TCP_TUNNEL/200 5931 CONNECT etherpad.fr:443 - ORIGINAL_DST/195.154.57.241 -

The Squid logs are sent to a NAS server with syslog-ng, landing in /var/log/pfsense/pfSense.log.
On this remote server, I installed Logstash, Elasticsearch and Kibana.
My grok pattern for these logs is the following, and it works fine in an online grok debugger:

%{WORD:month} %{NUMBER:date} %{TIME:timestamp} %{IP:proxy} \(%{WORD:squid_version}-%{INT:http_status_code}\): %{NUMBER:header1} %{NUMBER:header2} %{IP:ip_source} %{WORD:protocole}/%{INT:code} %{NUMBER:header3} %{WORD:status} %{USER:fqdn_destination}:%{NUMBER:port_destination} - %{WORD:label_destination}/%{IP:ip_destination} -

I made a pfsense_log.conf in the Logstash directory, with the following options:
input {
  file {
    path => "/var/log/pfsense/pfSense.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => {
      "message" => "%{WORD:month} %{NUMBER:date} %{TIME:timestamp} %{IP:proxy} \(%{WORD:squid_version}-%{INT:http_status_code}\): %{NUMBER:header1} %{NUMBER:header2} %{IP:ip_source} %{WORD:protocole}/%{INT:code} %{NUMBER:header3} %{WORD:status} %{USER:fqdn_destination}:%{NUMBER:port_destination} - %{WORD:label_destination}/%{IP:ip_destination} -"
    }
  }
}

output {
  stdout {
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "pfsense_log"
    template => "/home/user/ELK/logstash-2.2.2/bin/pfsense_template.json"
    template_name => "pfsense_log"
    template_overwrite => true
  }
}
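
As an aside, you can sanity-check a config file like this before starting Logstash. On Logstash 2.x the flag is --configtest (newer versions spell it -t):

bin/logstash -f pfsense_log.conf --configtest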

And a pfsense_template.json:
{
  "template": "pfsense_log",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "_default_": {
      "_all": {
        "enabled": true
      },
      "dynamic_templates": [
        {
          "message_field": {
            "mapping": {
              "index": "analyzed",
              "omit_norms": true,
              "type": "string"
            },
            "match_mapping_type": "string",
            "match": "message"
          }
        },
        {
          "string_fields": {
            "mapping": {
              "index": "analyzed",
              "omit_norms": true,
              "type": "string",
              "fields": {
                "raw": {
                  "index": "not_analyzed",
                  "ignore_above": 256,
                  "type": "string"
                }
              }
            },
            "match_mapping_type": "string",
            "match": "*"
          }
        }
      ],
      "properties": {
        "date": {
          "type": "date",
          "format": "date_time_no_millis"
        },
        "proxy": {
          "type": "string",
          "index": "not_analyzed"
        },
        "ip_source": {
          "type": "string",
          "index": "not_analyzed"
        },
        "header1": {
          "type": "integer"
        },
        "protocole": {
          "type": "string",
          "index": "not_analyzed"
        },
        "http_status_code": {
          "type": "integer"
        },
        "fqdn_destination": {
          "type": "string",
          "index": "not_analyzed"
        },
        "status": {
          "type": "string",
          "index": "not_analyzed"
        },
        "label_destination": {
          "type": "string",
          "index": "not_analyzed"
        },
        "squid": {
          "type": "string",
          "index": "not_analyzed"
        },
        "ip_destination": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
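
If you ever want to rule Logstash out, the same template can also be pushed to Elasticsearch by hand. A minimal sketch, assuming Elasticsearch 2.x listening on 127.0.0.1:9200 and that you run it from the directory containing the file:

curl -XPUT '127.0.0.1:9200/_template/pfsense_log' -d @pfsense_template.json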

When I run Logstash (logstash -f pfsense_log.conf), I can see that the data from my log file is loaded:
"path"=>"/var/log/pfsense/pfSense.log", "host"=>"user-virtual-machine", "month"=>"Nov", "date"=>"17", "timestamp"=>"22:12:41", "proxy"=>"192.168.1.1", "squid_version"=>"squid", "http_status_code"=>"1", "header1"=>"1510956761.029", "header2"=>"181845", "ip_source"=>"192.168.1.103", "protocole"=>"TCP_TUNNEL", "code"=>"200", "header3"=>"16464", "status"=>"CONNECT", "fqdn_destination"=>"etherpad.fr", "port_destination"=>"443", "label_destination"=>"ORIGINAL_DST", "ip_destination"=>"195.154.57.241"}, @lut={"path"=>[{"message"=>"Nov 17 22:12:41 192.168.1.1 (squid-1): 1510956761.029 181845 192.168.1.103 TCP_TUNNEL/200 16464 CONNECT etherpad.fr:443 - ORIGINAL_DST/195.154.57.241 -", "@version"=>"1", "@timestamp"=>"2017-11-18T00:42:38.562Z"

So I know that Logstash reads my log file and applies the filter to it. But when I start Elasticsearch and try to reach 127.0.0.1:9200/pfsense_log (the name of my index), Elasticsearch can't find it, and I get an "index_not_found_exception" error ...
I'm not really sure whether I need to manually create the "pfsense_log" index, and where to do that :confused: (elasticsearch/conf?)
Or maybe I'm missing something else?
Any idea?
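
(A quick way to see which indices actually exist, assuming Elasticsearch is on 127.0.0.1:9200:

curl '127.0.0.1:9200/_cat/indices?v'
curl '127.0.0.1:9200/pfsense_log?pretty'

If _cat/indices doesn't list pfsense_log, the elasticsearch output in Logstash never created it.)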

Thanks :smile:

Which version are you using? Is there anything in the Elasticsearch and/or Logstash logs?

Hi,
I am using Logstash 2.2.2, Elasticsearch 2.2.1 and Kibana 4.4.2.
And yes, I see some very strange things in my logs. From Logstash:

[screenshot of the Logstash log]

And for Elasticsearch:

[screenshot of the Elasticsearch log]

If you are just starting to use the stack, why are you not using the latest versions??

That's because I read a lot of posts about problems related to different versions of Logstash and Elasticsearch trying to communicate together ... So I searched for which versions of the bundle work 100% fine together and used those :blush:

Let me try again with the latest version :slight_smile:

It used to be a bit confusing, as the components had different versioning schemes. From version 5.0, however, we changed to unified releases, where all components of the stack are expected to be at the same version. This is a lot easier, and it also makes the support matrix considerably more compact. :slight_smile:

Oh ok, i see :smile:

I just tried with version 6.0.0 of the stack, but I can't run Logstash and Elasticsearch at the same time; the error is about insufficient Java runtime memory allocation ...
But my virtual machine has something like 4 GB of RAM and 4 processors of 1 core each :sweat_smile:

Should I try version 5.0.0 instead?

4 GB of RAM should be fine for version 6.0, as I believe both use a 1 GB heap by default.
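
(If the two heaps together still don't fit, you can shrink them; a minimal sketch, assuming the stock config/jvm.options file that each of Elasticsearch and Logstash 6.x ships with:

# config/jvm.options -- halve the default 1 GB heap
-Xms512m
-Xmx512m

The same pair of flags applies in both products' jvm.options files.)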

I played a little bit with the jvm.options file and it's all good now!
And I can reach my index:

[screenshot of the index response]

Regarding my output, I don't know yet if the grok filter is working, or if the data is parsed ...
Anyway, thanks for your help :smile:
I'm going to dig around and see if it works!

Well, it's working! I have my logs from pfSense in my Kibana dashboard!!
Thank you again, I'm really happy :smile:

I just need to find a way to parse my incoming logs with multiple grok filters, as I have different syntaxes depending on the log (see the sketch after the screenshot):

[screenshot of the other log formats]
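
For the multiple-syntax case, grok's match accepts an array of patterns and tries each in order until one matches. A minimal sketch, where the second pattern is only a hypothetical placeholder for whatever the other log syntax looks like:

filter {
  grok {
    match => {
      "message" => [
        "%{WORD:month} %{NUMBER:date} %{TIME:timestamp} %{IP:proxy} \(%{WORD:squid_version}-%{INT:http_status_code}\): %{GREEDYDATA:squid_entry}",
        "%{SYSLOGTIMESTAMP:syslog_ts} %{GREEDYDATA:other_entry}"
      ]
    }
  }
}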

Problem solved!

Just a quick tip: it's always better not to post pictures of text. Some people block images, and sometimes they are really hard to read. But in all cases, pictures mean we cannot copy/paste your data to try and help :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.