Does Elastic Agent store logs if Elasticsearch goes down for some time?

I want to confirm: when Elasticsearch goes down due to some issue, are logs temporarily saved to a file, and once Elasticsearch is up again, are the stored logs automatically sent to Elasticsearch? Is this a feature of Elastic Agent?
Elastic Agent has built-in support for persistent queues, which allow logs to be temporarily stored on disk if Elasticsearch becomes unavailable.

1. Configure Persistent Queues in elastic-agent.yml:

queue.spool:
  file:
    enable: true
    path: "/var/lib/elastic-agent/spool"
    max_file_size: 50MB
    max_total_size: 1GB

Effect: If Elasticsearch is unavailable, logs are stored locally on disk and can be processed once Elasticsearch is reachable again. This prevents data loss.

2. Use Retry and Backoff Settings

To avoid data loss during short outages or network issues, you can configure retry and backoff settings in elastic-agent.yml:

output.elasticsearch:
  hosts: ["http://your-es-cluster:9200"]
  max_retries: 5
  backoff:
    init: 1s
    max: 60s

Effect: The agent will retry sending logs to Elasticsearch several times before giving up, backing off after each failed attempt and waiting longer before the next retry.

Where did you get this information from? This is not correct.

Elastic Agent does not have support for a persistent queue; the only queue it uses is the memory queue.

If Elasticsearch goes down, the memory queue will start to fill up; once it is full, the inputs will stop accepting new logs.

Depending on the input and on how long Elasticsearch was unavailable, this can lead to data loss.
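
As a side note, on a standalone Elastic Agent you can make the memory queue larger so it can buffer more events during a short outage. A minimal sketch, assuming the queue settings go under the output in elastic-agent.yml (key names and defaults vary by version, so check the reference docs for yours):

outputs:
  default:
    type: elasticsearch
    hosts: ["http://your-es-cluster:9200"]
    # Bigger in-memory buffer: more events can wait while Elasticsearch is unreachable,
    # at the cost of extra RAM. The queue still lives in memory, so buffered events are
    # lost if the Agent process restarts before Elasticsearch comes back.
    queue.mem.events: 16384
    queue.mem.flush.min_events: 1024
    queue.mem.flush.timeout: 5s

This only buys time; once the queue is full the inputs will still stop accepting new logs, as described above.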

So, is there any alternative to Kafka? In my current setup all logs go to Kafka as a buffer, but with Elastic Agent I have to use the integrations through Elastic Agent.

Elastic Agent can send to Kafka too.
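
For example, in a standalone elastic-agent.yml the default output can point at Kafka instead of Elasticsearch. A minimal sketch, with placeholder broker addresses and topic name (the Kafka output is only available in reasonably recent Agent versions, so verify the supported options against the docs for your version):

outputs:
  default:
    type: kafka
    hosts:
      - "kafka-broker-1:9092"
      - "kafka-broker-2:9092"
    # Topic the Agent publishes events to (placeholder name)
    topic: "elastic-agent-logs"

If the Agent is Fleet-managed, the equivalent is done by adding a Kafka output to the agent policy in Fleet rather than editing the file locally.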