I use Filebeat to send nginx access logs to Logstash, which parses them and forwards them to Elasticsearch. The folder includes access logs from previous days, but when I deploy the ELK stack, only the newly arriving nginx access logs are saved into Elasticsearch. Is it possible to save the old ones as well?
Are you using the nginx module here?
Here is my filebeat.yml:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/access.log
    - /var/log/spring/geo/*.log

output.logstash:
  enabled: true
  hosts: ["logstash:5035"]
And here is my nginx.conf from the Logstash pipelines:
input {
  beats {
    port => 5035
  }
}
filter {
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:http_x_forwarded_for}" ]
  }
  grok {
    match => [ "http_x_forwarded_for", "%{IP:real_client_ip}" ]
  }
  mutate {
    convert => ["response", "integer"]
    convert => ["bytes", "integer"]
    convert => ["responsetime", "float"]
  }
  geoip {
    source => "clientip"
    target => "geoip"
    add_tag => [ "nginx-geoip" ]
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
  }
  useragent {
    source => "agent"
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "weblogs-%{+YYYY.MM.dd}"
    document_type => "nginx_logs"
    user => "elastic"
    password => "changeme"
  }
  stdout { codec => rubydebug }
}
I am not using the nginx module. Should I use it?
It'd be easier, yeah. But it looks like you have another non-nginx source in use that makes it a bit harder.
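If you do go the module route, the rough shape is below. The var.paths glob is just my assumption here; point it at wherever your files actually live.

filebeat modules enable nginx

Then in modules.d/nginx.yml:

- module: nginx
  access:
    enabled: true
    # Assumed glob; adjust to your actual access log locations
    var.paths: ["/var/log/nginx/access.log*"]

One thing to keep in mind: the module's parsing normally runs as an Elasticsearch ingest pipeline, so with a Logstash output you'd pass events through to that pipeline (e.g. pipeline => "%{[@metadata][pipeline]}" in Logstash's elasticsearch output) rather than grok-ing them yourself.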
In your log input you can set things like ignore_older to get it to capture those older ones.
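As a sketch, something like this; the access.log* glob and the 72h value are illustrations, not required settings:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    # Match rotated files as well as the live one (Filebeat can't read
    # gzipped rotations, so only uncompressed ones get picked up)
    - /var/log/nginx/access.log*
    - /var/log/spring/geo/*.log
  # Files modified longer ago than this are skipped; leave it unset
  # (the default, disabled) or set it generously so old logs still ship
  ignore_older: 72h

One other thing to check: Filebeat records how far it has read each file in its registry, so if it has already seen those older files it won't resend them unless the registry entry is cleared.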