Ingest node conditional and max script compilations

I have an ingest node that contains:

      - date:
          field: "event.start"
          formats:
            - "ISO8601"
          if: "ctx.event?.start != null && ctx.agent.type == 'packetbeat'"
          ignore_failure: true
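For reference, here is the same processor as a minimal standalone pipeline definition, in case that makes it easier to reproduce (the pipeline id `my-pipeline` is made up for illustration):

```
PUT _ingest/pipeline/my-pipeline
{
  "description": "Parse event.start for Packetbeat events",
  "processors": [
    {
      "date": {
        "field": "event.start",
        "formats": ["ISO8601"],
        "if": "ctx.event?.start != null && ctx.agent.type == 'packetbeat'",
        "ignore_failure": true
      }
    }
  ]
}
```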

Elasticsearch runs behind Logstash, and Packetbeat ships its events to Logstash.

On the machine where Packetbeat runs, I start a process that creates hundreds of connections per second, and therefore hundreds of events to be logged. In the Packetbeat logs, I see the following error repeated many times:

    May 16 21:08:56 my-machine packetbeat[908]: 2020-05-16T21:08:56.155Z        ERROR        [logstash]        logstash/async.go:279        Failed to publish events caused by: write tcp> write: connection reset by peer
    May 16 21:08:57 my-machine packetbeat[908]: 2020-05-16T21:08:57.624Z        ERROR        [publisher_pipeline_output]        pipeline/output.go:127        Failed to publish events: write tcp> write: connection reset by peer

In the Elasticsearch logs, I see:

    "Caused by: org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.max_compilations_rate] setting",
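The error message itself points at the `script.max_compilations_rate` setting. As a stopgap (a sketch only; the `500/5m` rate here is an arbitrary example, and on some versions this setting must be placed in `elasticsearch.yml` rather than set dynamically), the limit can be raised via the cluster settings API:

```
PUT _cluster/settings
{
  "transient": {
    "script.max_compilations_rate": "500/5m"
  }
}
```

Raising the limit does not address why so many compilations happen in the first place, so it is worth treating as a workaround rather than a fix.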

How can I use a stored script with the if conditional of the processor? And, in general, how am I supposed to solve this issue?
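For what it's worth, stored scripts themselves are created with the `_scripts` API; a sketch with a hypothetical id `packetbeat-start-check` would be:

```
PUT _scripts/packetbeat-start-check
{
  "script": {
    "lang": "painless",
    "source": "ctx.event?.start != null && ctx.agent.type == 'packetbeat'"
  }
}
```

Whether a processor's `if` field can reference a stored script by id depends on the Elasticsearch version, so that part should be checked against the ingest conditionals documentation for your release.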
