Hi,
I am reading a .log file with Filebeat and I need the data shipped to Elasticsearch (via Logstash).
This is my Filebeat configuration:
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - ~/Documents/development/filebeat/*.log

#================================ Outputs =====================================
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]
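One thing I am not sure about: as far as I know, Filebeat's glob paths are not shell-expanded, so the ~ may not resolve. An absolute-path variant would look like this (/home/me is just a placeholder for my real home directory):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    # Absolute path instead of ~; /home/me stands in for the actual home directory.
    - /home/me/Documents/development/filebeat/*.log
```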
I have a log file placed in the ~/Documents/development/filebeat directory.
This is my Logstash configuration:
input {
  beats {
    host => "localhost"
    port => 5044
  }
}

filter {
  # some filtering applied here
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "transactions-%{+YYYY.MM.dd}"
  }
}
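If it helps, my understanding of the index naming (a minimal Python sketch, not Logstash itself) is that %{+YYYY.MM.dd} is rendered from each event's @timestamp in UTC, so an event produces an index name like this:

```python
from datetime import datetime, timezone

def index_for(event_time):
    """Mimic the transactions-%{+YYYY.MM.dd} naming from the Logstash output:
    the date portion comes from the event's @timestamp, rendered in UTC."""
    return "transactions-" + event_time.astimezone(timezone.utc).strftime("%Y.%m.%d")

print(index_for(datetime(2020, 3, 5, 12, 0, tzinfo=timezone.utc)))
# transactions-2020.03.05
```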
I ran the pipeline once and the log file entries were indexed in Elasticsearch. Then I deleted that index and am now trying to recreate it by restarting all the services (Elasticsearch, Logstash, and Filebeat).
But the index is not being created again.
I even deleted the Filebeat registry file and tried again, but the registry file gets recreated with the old state. I have even renamed the log files.
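To see why the old state keeps coming back, I wrote a small sketch that lists which files and offsets the registry is tracking. It assumes the newer directory-based registry layout (data/registry/filebeat/log.json, one JSON document per line); the sample entries below are made up for illustration, not copied from my real registry:

```python
import json

# Illustrative sample of registry log entries (one JSON document per line),
# mimicking what Filebeat writes under data/registry/filebeat/log.json.
# Both lines are invented for this example.
sample = "\n".join([
    '{"op":"set","id":1}',
    '{"k":"filebeat::logs::native::123456-2049",'
    '"v":{"source":"/home/me/Documents/development/filebeat/app.log","offset":2048}}',
])

def tracked_files(lines):
    """Return {source_path: offset} for every state entry in a registry log."""
    states = {}
    for line in lines:
        doc = json.loads(line)
        value = doc.get("v")
        # State entries carry a "v" object with the file path and read offset;
        # operation entries (e.g. {"op":"set",...}) are skipped.
        if isinstance(value, dict) and "source" in value:
            states[value["source"]] = value.get("offset", 0)
    return states

print(tracked_files(sample.splitlines()))
# {'/home/me/Documents/development/filebeat/app.log': 2048}
```

If a file still shows up here with a non-zero offset after I delete the registry, Filebeat will resume from that offset instead of re-reading the file from the start.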
What am I doing wrong here?