Assistance Needed with TCP Socket Log Transmission to Elastic

Hello Elastic Community,

I'm encountering an issue with sending logs to Elastic's Netskope integration using the Python socket library, and I'd appreciate some guidance.

Workflow Overview

Here's a brief outline of my current process:

  1. Log Collection: I collect logs from a Netskope Tenant.
  2. Data Processing: The logs are parsed into ECS (Elastic Common Schema) format, compatible with the Netskope integration.
  3. Socket Connection: I establish a TCP socket connection to the Elastic server.
  4. Data Splitting: Each event is serialized with json.dumps, and the events are joined with newline characters (\n) to form the payload (a minimal sketch of this step follows this list).
  5. Data Transmission: I send alerts/events to the Elastic server using the sendall method from the socket library.
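
For context, this is roughly how I build the payload in step 4 before sending it; the events list and its field names here are illustrative placeholders, not my exact production code:

import json

def build_payload(events):
    # Serialize each ECS event to JSON and join them with newlines (step 4)
    return "\n".join(json.dumps(event) for event in events) + "\n"

# Illustrative example: two ECS-style events become one newline-delimited payload
payload = build_payload([
    {"event": {"kind": "alert"}, "message": "example alert"},
    {"event": {"kind": "event"}, "message": "example event"},
])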

TCP Connection Code

This is how I establish the TCP connection:

# One long-lived TCP connection, reused for every batch
self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.sock.connect((<Server IP>, <Server Port>))

Sending Logs Code

# data is the newline-delimited JSON payload built in step 4
self.sock.sendall(data.encode("utf-8"))
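
In case it helps, this is a chunked variant I'm considering as a workaround; the chunk_size value and the send_in_chunks name are arbitrary choices on my side, not anything taken from the Elastic docs:

import json

def send_in_chunks(self, events, chunk_size=1000):
    # Send the batch as several smaller newline-delimited payloads
    # over the existing connection, instead of one very large sendall call
    for i in range(0, len(events), chunk_size):
        chunk = events[i:i + chunk_size]
        payload = "\n".join(json.dumps(event) for event in chunk) + "\n"
        self.sock.sendall(payload.encode("utf-8"))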

The Issue

I'm sending batches of up to 10,000 logs at a time, and it generally works. However, when I increase the rate to 100,000 EPM (Events Per Minute), some logs get truncated or lost. Here are the error messages I'm seeing in Elastic:

Questions

  1. Elastic's Ingestion Rate: Is there a limit on how fast Elastic can accept data? If so, what's the recommended rate to avoid data loss or truncation?
  2. TCP Socket Best Practices: Is my approach of sending large batches through a single long-lived socket connection reasonable, or could it be causing the issue? Should I instead open a new socket per event or per batch? (A sketch of the per-batch approach I'm considering follows this list.)
  3. Optimal Approach: What are the best practices for sending high-throughput logs to Elastic via TCP socket? Would using a different tool like Logstash or Beats be a better option?
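
Regarding question 2, this is the per-batch connection pattern I'm considering as an alternative to a single long-lived socket; server_ip, server_port, and send_batch are placeholder names of mine, not anything from an Elastic API:

import json
import socket

def send_batch(server_ip, server_port, events):
    # Open a fresh TCP connection per batch, send the newline-delimited
    # payload, and let the context manager close the socket cleanly
    payload = "\n".join(json.dumps(event) for event in events) + "\n"
    with socket.create_connection((server_ip, server_port), timeout=10) as sock:
        sock.sendall(payload.encode("utf-8"))

Would something like this be more reliable than reusing one socket, or is it just unnecessary connection overhead?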

I'd really appreciate any advice or best practices on addressing this issue and ensuring reliable log transmission to Elastic.

Thank you in advance for your help!