Filebeat needs a restart to send logs, and the logs it sends are duplicated

Hello Elastic community, I have a problem with Filebeat. I have read many threads here, but I still have two problems: I have to restart Filebeat before it will send logs, and when it does, it sends duplicates, that is, it resends the complete log instead of only the new lines. I'm trying to ship Suricata IDS logs.

My Filebeat configuration (comments removed):

filebeat:
  prospectors:
    -
      paths:
        - /var/log/suricata/fast.log
      input_type: log
      scan_frequency: 10s
  registry_file: /var/lib/filebeat/registry
output:
  elasticsearch:
    hosts: ["localhost:9200"]
  logstash:
    hosts: ["localhost:5044"]
    bulk_max_size: 1024
shipper:
logging:
  to_files: false
  files:
    path: /var/log/mybeat
    rotateeverybytes: 10485760 # = 10MB

My Logstash configuration:

input {
  file {
    path => ["/var/log/suricata/eve.json"]
    sincedb_path => "/var/lib/logstash/"
    codec =>   json
    type => "SuricataIDPS"
  }

}

filter {
  if [type] == "SuricataIDPS" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    ruby {
      code => "if event['event_type'] == 'fileinfo'; event['fileinfo']['type']=event['fileinfo']['magic'].to_s.split(',')[0]; end;"
    }
  }

  if [src_ip]  {
    geoip {
      source => "src_ip"
      target => "geoip"
      #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
    if ![geoip][ip] {
      if [dest_ip]  {
        geoip {
          source => "dest_ip"
          target => "geoip"
          #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
          add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
          add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
        }
        mutate {
          convert => [ "[geoip][coordinates]", "float" ]
        }
      }
    }
  }
}

output {
  elasticsearch {
    host => "localhost"
    #protocol => http
  }
}

I have tried to solve these problems, but I am still lacking knowledge. If you could give me some advice, I would appreciate it. Thank you, and if you need more information, let me know. :slight_smile:

You have both outputs configured: Elasticsearch and Logstash. If one output is blocked or unavailable, publishing in Filebeat will block -> neither output will receive any new events. The registry file tracks all ACKed events; with two outputs configured, both outputs must ACK an event before the registry file is updated. Most likely you have to restart Filebeat because one output is unavailable or unresponsive. You get the duplicates because Filebeat employs at-least-once semantics, resending all events that have not yet been ACKed.
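For illustration, the registry is a small JSON file mapping each harvested file to the last ACKed read offset. On Filebeat 1.x it looks roughly like this (trimmed sketch; exact fields vary by version, and the offset value here is made up):

{
  "/var/log/suricata/fast.log": {
    "source": "/var/log/suricata/fast.log",
    "offset": 482133
  }
}

On restart, Filebeat resumes each file from the stored offset, so any events sent but never ACKed by both outputs are read and published again.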

Given you have a Logstash configuration, I'd say remove the elasticsearch output settings from your filebeat configuration.
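For example, a minimal sketch of the output section with only Logstash left in (keeping your other settings as they are; this is an assumption about your intent, not a drop-in file):

output:
  # elasticsearch output removed; Logstash forwards events to Elasticsearch itself
  logstash:
    hosts: ["localhost:5044"]
    bulk_max_size: 1024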

Also check the filebeat logs for errors.
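If nothing obvious shows up there, you can also run Filebeat in the foreground with debug output for the publisher enabled, something like this (adjust the config path to yours):

filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"

That should show whether events are actually being ACKed by the configured outputs.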

Hello @steffens, my problem has been solved. I checked the logs and found several errors; I fixed them one by one and it is now working correctly. Greetings and thanks for the help.
