Logstash and Filebeat config for a log count over 10,000

I am trying to configure my BELK server for a large log count (more than 20,000).

My current config is:

Logstash:

input {
  beats {
    port => 27080
    congestion_threshold => 1500
  }
}

Filebeat:

filebeat:
  prospectors:
    - paths:
        - /var/log/*.log
      input_type: log
      document_type: log
  registry_file: /etc/filebeat
  spool_size: 9000
  idle_timeout: 500s

It's working fine for a log count of less than 1,000.

But above that, it gives me this error:

beats_input_codec_plain_applied, _grokparsefailure

Does anyone know of a resolution to this issue? What parameters do I need to change?
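
For context, `_grokparsefailure` is the tag the grok filter adds by default (via its `tag_on_failure` setting) when an event does not match any configured pattern, so the higher log volume may simply be surfacing lines the filter cannot parse. A minimal sketch of a filter that would behave this way, using a hypothetical syslog-style pattern that is not from the original post:

filter {
  grok {
    # Hypothetical pattern: any event whose message does not match it
    # gets tagged with "_grokparsefailure" (the default tag_on_failure value).
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:host} %{GREEDYDATA:msg}" }
  }
}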

This looks more like an issue on the Logstash side, but I am not sure it is related to the number of log files, as it seems to be a grok failure. Can you post more of the error message you get? I assume this error comes from Logstash?
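
One way to gather that information (a sketch, not from the original thread; the file path and Elasticsearch host below are placeholders) is to route events tagged `_grokparsefailure` to a separate file output so the raw, unparsed messages can be inspected:

output {
  if "_grokparsefailure" in [tags] {
    # Placeholder path: dump unparsed events here for inspection
    file {
      path => "/var/log/logstash/grok_failures.log"
      codec => json_lines
    }
  } else {
    # Placeholder Elasticsearch host
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  }
}

Checking that file alongside Logstash's own log should show which lines fail to match the grok pattern.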