I have a web server and a two-server Elasticsearch cluster. I want to ship the web server's access logs to the ES cluster using Logstash.
Is the configuration below correct for the web server? And how do I configure the ES cluster to accept the data?
Web server Logstash config:
    input {
      file {
        path => "/var/log/httpd/miss*log"
        type => "apache"
      }
    }
    filter {
      if [type] == "apache" {
        grok {
          match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        geoip {
          source => "clientip"
          target => "geoip"
          database => "/etc/logstash/GeoLiteCity.dat"
          add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
          add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
        }
        mutate {
          convert => [ "[geoip][coordinates]", "float" ]
        }
        if [clientip] in ["10.1.88.11", "10.1.88.12", "10.1.88.13", "10.1.88.14",
                          "10.1.88.15", "10.1.88.16", "10.1.42.117", "10.1.42.118",
                          "10.1.42.119", "10.1.88.21", "10.1.88.22", "10.1.88.23",
                          "10.1.88.24", "10.1.88.25", "10.1.88.26", "10.1.42.127",
                          "10.1.42.128", "10.1.42.129"] {
          drop {}
        }
      }
    }
    output {
      elasticsearch {
        cluster => "muostats"
        host => "10.210.2.98:9300"
        protocol => "node"
        index_type => "apache"
        workers => 1
      }
    }
- The output `host =>` is a single IP from my current configuration while I troubleshoot.
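For the ES side, my understanding is that with `protocol => "node"` Logstash joins the cluster as a client node over transport port 9300, so each ES server mainly needs a matching `cluster.name` and discovery settings in `elasticsearch.yml`. A minimal sketch of what I think each node needs (the `node.name` values and the second node's address `10.210.2.99` are assumptions; only `muostats` and `10.210.2.98` come from my Logstash config):

    # /etc/elasticsearch/elasticsearch.yml on each ES node
    cluster.name: muostats                  # must match the Logstash output's cluster setting
    node.name: es-node-1                    # unique per node (assumed name)
    network.host: 10.210.2.98               # this node's own address
    discovery.zen.ping.unicast.hosts: ["10.210.2.98", "10.210.2.99"]  # second IP assumed

Once both nodes are up, I'd expect `curl 'http://10.210.2.98:9200/_cluster/health?pretty'` to show `number_of_nodes: 2` (one more while the Logstash node client is connected).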