Filebeat doesn't update the output file and limits it to 2048 lines

Hi,

I'm new to this.

I want Filebeat to update the output file each time the log gets updated.
What's happening now is that Filebeat creates a file containing only the existing lines that match the `include_lines` pattern I've entered.

Another problem is that the new file Filebeat creates is limited to 2048 lines.
For example, if I have a log file with 5k lines, only the first 2048 are written to the new file.

############################# Filebeat ######################################
filebeat:

  # List of prospectors to fetch data.
  prospectors:
    # Each - is a prospector. Below are the prospector specific configurations
    -
      paths:
        - C:\nexperience\logs\handsets\syslog.004562.02.log
        #- c:\programdata\elasticsearch\logs\*

      input_type: log

      include_lines: ["DEBUG"]

      scan_frequency: 5s

  registry_file: "C:/ProgramData/filebeat/registry"

output:

  ### Elasticsearch as output
  elasticsearch:
    # Array of hosts to connect to.
    # Scheme and port can be left out and will be set to the default (http and 9200)
    # In case you specify an additional path, the scheme is required: http://localhost:9200/path
    # IPv6 addresses should always be defined as: https://[2001:db8::1]:9200
    hosts: ["localhost:9200"]

    template:
      # Template name. By default the template name is filebeat.
      #name: "filebeat"

      # Path to template file
      path: "filebeat.template.json"

  ### File as output
  file:
    # Path to the directory where to save the generated files. The option is mandatory.
    path: "/tmp/filebeat"

    # Name of the generated files. The default is `filebeat` and it generates files: `filebeat`, `filebeat.1`, `filebeat.2`, etc.
    filename: filebeat

    # Maximum size in kilobytes of each file. When this size is reached, the files are
    # rotated. The default value is 10 MB.
    #rotate_every_kb: 10000

    # Maximum number of files under path. When this number of files is reached, the
    # oldest file is deleted and the rest are shifted from last to first. The default
    # is 7 files.
    number_of_files: 2

  ### Console output
  #console:
    # Pretty print json event
    #pretty: false

############################# Shipper #########################################

shipper:

############################# Logging #########################################

logging:

  files:

    rotateeverybytes: 10485760 # = 10MB

I've moved this to the appropriate category.

Thanks, but I've already solved this.
You can close the thread.

It would be nice to write down the solutions for others running into similar issues.

SOLUTION:
For the problem of the output file containing only 2048 lines:
I changed the `spool_size` value to 10000.
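
In case it helps others, this is roughly where that setting lives in a Filebeat 1.x `filebeat.yml` (10000 is just the value I picked for my log volume, not a recommended number):

```yaml
filebeat:
  # Number of events the spooler buffers before flushing to the outputs.
  # The default is 2048, which matched the cut-off I was seeing.
  spool_size: 10000
```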

For the file not updating:
In the output section I had both `elasticsearch` and `file` uncommented, so I commented out `elasticsearch` with all its fields.
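
The output section of my edited config now looks roughly like this (a sketch, trimmed to the relevant lines):

```yaml
output:
  # Elasticsearch output commented out - with it enabled but not reachable,
  # the pipeline stalled waiting for it to acknowledge events.
  #elasticsearch:
  #  hosts: ["localhost:9200"]

  # File output stays enabled.
  file:
    path: "/tmp/filebeat"
    filename: filebeat
    number_of_files: 2
```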

It should not be required to increase the spool size. Once all outputs have confirmed a batch as written, the next lines are processed and pushed. Getting exactly 2048 events sounds like the file output succeeded, but the output pipeline was still hanging on the elasticsearch output because it never received an ACK (or could not connect).
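
For context, 2048 is the default spooler size in Filebeat 1.x, which is why the file stopped at exactly that count: one full spool was flushed to the file output, and the pipeline then blocked waiting on elasticsearch. A sketch of the relevant defaults (shown commented out, since they normally don't need changing):

```yaml
filebeat:
  # Default spooler settings in Filebeat 1.x:
  #spool_size: 2048    # events buffered before a flush to the outputs
  #idle_timeout: 5s    # flush even if the spooler is not yet full
```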