Hello,
This might be a newbish question, but I haven't been able to find what I'm looking for elsewhere online. I am trying to learn how to ship logs that don't have pre-built modules to Logstash, have Logstash do some formatting, and create indexes in Elasticsearch. Specifically, I am trying to collect squid logs with Filebeat, mark them as squid logs, and send them to Logstash for that formatting.
My issue is that Filebeat on the squid server seems to be running fine, but I'm not seeing anything pass through Logstash to Elasticsearch. I'm assuming I am tagging the logs wrong and/or my Logstash filter isn't picking them up.
Here are my configuration files:
filebeat.yml on the squid server:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - '/var/log/squid/access.log'
  exclude_files: ['.gz$']
  fields:
    type: 'squid'

output.logstash:
  hosts: ["10.4.4.15:5044"]
On the ELK server:
02-beats-input.conf
input {
  beats {
    port => 5044
  }
}
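Side note: I think the listener side is fine, but my plan is to double-check from the squid server with `filebeat test output`, which should confirm Filebeat can actually reach 10.4.4.15:5044.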
12-squid-filter.conf
filter {
  if [type] == "log" {
    if [type] == "squid" {
      mutate {
        add_field => { "hey_this_works" => "yay" }
      }
    }
  }
}
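Writing this up, I also noticed that the outer if [type] == "log" and the inner if [type] == "squid" can never both be true, so the mutate would never fire even if the field were top-level. Combined with the fields nesting above, I suspect the filter I actually want (assuming the default nesting, i.e. no fields_under_root) is something like:

filter {
  if [fields][type] == "squid" {
    mutate {
      add_field => { "hey_this_works" => "yay" }
    }
  }
}

Does that look right, or am I misunderstanding how conditionals on custom fields work?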
And for output: I'm troubleshooting by writing locally to a file, but I'll switch the output to Elasticsearch once I get it working.
30-elasticsearch-output.conf
output {
  file {
    codec => "plain"
    path => "/var/logs/logs-%{+YYYY-MM-dd}.txt"
  }
}
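To see where the type field actually ends up on the Logstash side, I'm also planning to temporarily dump raw events to the console with the rubydebug codec, something like:

output {
  stdout { codec => rubydebug }
}

That should print the full event structure, so I can confirm whether my conditional should be matching [type] or [fields][type].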