Filebeat - stalls and effectively hangs when max request size exceeded

xpack.monitoring.enabled: true
logging.level: info

cloud.id: "xx:yy"
cloud.auth: "zz"

filebeat.registry_file: D:/home/site/filebeatregistry

filebeat.prospectors:
- type: log
  enabled: true
  json.keys_under_root: true
  json.overwrite_keys: true
  paths:
  - D:/local/temp/jobs/continuous/worker/**/safflog_*.txt
  - D:/home/site/wwwroot/**/safflog_*.txt
  
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml    

setup.template:
  name: "filebeat-rr-logs"
  pattern: "filebeat-rr-logs-*"
  settings:
    index:
      number_of_shards: 1
      number_of_replicas: 0

output.elasticsearch:
  bulk_max_size: 2000
  index: "filebeat-rr-logs-%{[beat.version]}-%{+yyyy.ww}"
  compression_level: 2

In the Filebeat logs, I see:

2019-04-26T08:31:41.672Z ERROR elasticsearch/client.go:317 Failed to perform any bulk index operations: 413 Request Entity Too Large:

(this line just repeats; presumably Filebeat keeps retrying the same oversized batch, which would explain the stall)
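My working theory is that bulk_max_size caps the number of events per bulk request, not its size in bytes, so 2000 of our larger JSON events can still push a single request past the server's http.max_content_length (100mb by default, and I assume the hosted proxy may enforce its own limit). A minimal sketch of the output change I'm considering, where 200 is an illustrative guess rather than a tested value:

output.elasticsearch:
  # Fewer events per bulk request should keep each request body
  # below the server-side max content length; 200 is a guess to tune.
  bulk_max_size: 200
  index: "filebeat-rr-logs-%{[beat.version]}-%{+yyyy.ww}"
  compression_level: 2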

Hosted cluster is 8GB RAM / 192GB storage, single zone; 1 shard, 0 replicas as above.
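For reference, this is how I've been checking the node-level limit (nodes info API via Kibana Dev Tools; I'm assuming the hosted deployment exposes it the same way):

GET _nodes/http?filter_path=nodes.*.http.max_content_length_in_bytes

If that comes back as the 100mb default, then I'd guess the 413 is coming from a proxy in front of the cluster rather than Elasticsearch itself, but I can't confirm that.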

Thanks!