Filebeat not re-reading an already processed log file

Hi,

I am reading a .log file with Filebeat and I need the data to end up in Elasticsearch.

This is my Filebeat configuration:

#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - ~/Documents/development/filebeat/*.log

#================================ Outputs =====================================
output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

I have a log file placed in the ~/Documents/development/filebeat directory.

This is my Logstash configuration:

input {
    beats {
      host => "localhost"
      port => 5044
    }
}

filter {
  # some filtering is applied here
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "transactions-%{+YYYY.MM.dd}"
  }
}
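A pipeline file like this one can also be syntax-checked before starting the service; the config file name here is just an assumption:

bin/logstash -f logstash.conf --config.test_and_exit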

I ran the pipeline once and the log file entries were indexed into Elasticsearch. Then I deleted the index, and now I am trying to recreate it by restarting all the services (Elasticsearch, Logstash and Filebeat), but the index is not being created again.
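For context, deleting an index is an ordinary DELETE call against Elasticsearch; the concrete index name below is only an example:

curl -X DELETE "http://localhost:9200/transactions-2019.01.01"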

I even deleted the Filebeat registry file and tried again, but the registry file keeps getting recreated with the old data. I have even renamed the log files.
What am I doing wrong here?

This seems odd. When you restart Filebeat after deleting the registry file, can you post the first 50 or so lines from the Filebeat log? Please be sure to redact any sensitive information before posting.

I made a silly mistake there. I kept deleting the registry file while the Filebeat service was still running, so Filebeat rewrote it from memory on shutdown and it was never really gone. Stopping the service first and then deleting the file actually works. My bad.
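For anyone hitting the same thing, the sequence that works is roughly as follows (paths assume a default Linux package install; adjust for your setup):

# stop Filebeat first, so it cannot rewrite the registry from memory on shutdown
sudo systemctl stop filebeat

# remove the registry so Filebeat forgets which files it has already read
# (older versions keep a single registry file, newer ones a registry directory)
sudo rm -rf /var/lib/filebeat/registry

# start Filebeat again; it will re-read the matching log files from the beginning
sudo systemctl start filebeat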
