I'm fairly new to ELK, but not completely hopeless, I hope. I'm running Filebeat on a Windows server to collect Tomcat 8 access logs (eClinicalWorks on Windows). I got the grok pattern correct as far as any pattern tester goes, but I can't ship straight from Filebeat to ES because of escape-character issues: I have to add an extra \ to every escape for Filebeat to accept the pattern, and the doubled escapes then fail when they hit ES. So I reconfigured to pipe through Logstash, and everything is great except that the Filebeat data now gets indexed under logstash instead of filebeat, which is not what I want.
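In case the exact escaping problem matters, here is my understanding of it (I may have the mechanics wrong): the module's ingest pipeline is defined in JSON, so each grok escape has to be doubled for the JSON parser, while the same pattern in a tester uses single backslashes. Roughly, using a fragment of my pattern:

  "grok": {
    "field": "message",
    "patterns": [ "\\[%{HTTPDATE:timestamp}\\]" ]
  }

Config files follow: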
filebeat.yml
filebeat.inputs:
- type: log
  enabled: false   # default input disabled; the apache2 module does the collecting
  paths:

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: true
  reload.period: 15s

setup.template.settings:
  index.number_of_shards: 1
  index.number_of_replicas: 1

setup.template.enabled: true
setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"
setup.template.fields: "${path.config}/fields.yml"

tags: ["eCW", "tomcat", "NLB"]

setup.kibana:
  host: "kibana.domain.int:5601"

output.logstash:
  hosts: ["pmr-ls1.domain.int:5045", "pmr-ls2.domain.int:5045"]
  loadbalance: true
  index: "filebeat-%{YYYY.MM.dd}"

processors:
- add_host_metadata: ~
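One thing I'm only guessing at from the docs: with output.logstash, the index option is just a root name that gets passed along in [@metadata][beat]; the date pattern there probably does nothing, and the actual index name is decided on the Logstash side. So maybe this is all Filebeat needs:

output.logstash:
  hosts: ["pmr-ls1.domain.int:5045", "pmr-ls2.domain.int:5045"]
  loadbalance: true
  index: "filebeat"   # ends up in [@metadata][beat], if I read the docs right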
apache2.yml
- module: apache2
  access:
    enabled: true
    var.paths: ["D:/eClinicalWorks/tomcat8/logs/jasper*.log"]
  error:
    enabled: false
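Worth noting, though I may have this wrong: when the module's output goes through Logstash instead of straight to ES, the module's ingest pipeline isn't applied automatically, which is why my grok lives in Logstash below. If I ever go back to shipping direct, I believe the pipelines can be loaded with:

filebeat setup --pipelines --modules apache2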
logstash:
input {
  beats {
    port => 5045
    type => file
    tags => ["neweCW"]   # my marker for events from this pipeline revision
  }
}
filter {
  grok {
    match => {
      "message" => "%{IPORHOST:apache2.access.remote_ip} - %{NUMBER:apache2.access.time} %{DATA:apache2.access.user_name} \[%{HTTPDATE:timestamp}\] %{WORD:apache2.access.request} %{DATA:apache2.access.url} HTTP/%{NUMBER:apache2.access.http_version} %{NUMBER:apache2.access.response_code} (?:%{NUMBER:apache2.access.body_sent.bytes}|-) %{DATA:apache2.access.referrer}"
    }
    overwrite => [ "message" ]
    remove_field => [ "ident", "auth" ]
  }
  geoip { source => "apache2.access.remote_ip" }
  mutate {
    gsub => [
      "request", "\?.+", "",
      "proxiedip", "(^\"|\"$)", "",
      "loginame", "(^\"|\"$)", "",
      "referrer", "(^\"|\"$)", ""
    ]
  }
  mutate {
    convert => {
      "bytes" => "integer"
      "elapsed_millis" => "integer"
      "serverport" => "integer"
    }
  }
  mutate {
    remove_field => "host"
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
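# An aside I'm not sure about: dotted field names like apache2.access.remote_ip
# reportedly become literal dotted keys rather than nested fields in newer
# Logstash versions; the bracket syntax is supposedly preferred in grok, e.g.:
#   %{IPORHOST:[apache2][access][remote_ip]}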
output {
  # clean events go to stdout for eyeballing; grok failures are skipped here
  if "_grokparsefailure" not in [tags] {
    stdout {
      codec => rubydebug
    }
  }
  # note: this output sits outside the conditional, so grok failures still get indexed
  elasticsearch {
    manage_template => true
    hosts => ["pmr-es1.domain.int:9200", "phy-es1.domain.int:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"
  }
  # stdout { codec => rubydebug }
}
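For reference, the beats input docs show the index name being derived from the Beat's metadata; I assume this is the documented way to keep the filebeat-* naming (the [@metadata][version] field needs Beats 6.x or later, as far as I know), but I haven't gotten it working yet:

elasticsearch {
  hosts => ["pmr-es1.domain.int:9200", "phy-es1.domain.int:9200"]
  index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
}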
Some of the tags are just for my own tracking of changes to the filters (neweCW), so yes, I will have a bunch of junk tags.
Is there anything else I can provide? I could use a bit of help.
Thanks,
Rich