When using the recommended elastic/elastic-agent image running in OTel Gateway mode, errors appear when the 10m and 60m aggregated data is exported.
Example setup
- Start an Elasticsearch server using the elasticsearch/elasticsearch:9.2.1 Docker image.
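For reference, a minimal single-node run along these lines is enough (the network name and password are placeholders):

```sh
# Shared network so the gateway can reach Elasticsearch by container name (placeholder name)
docker network create elastic-net

# Single-node Elasticsearch; ELASTIC_PASSWORD is a placeholder
docker run -d --name elasticsearch --network elastic-net \
  -p 9200:9200 \
  -e "discovery.type=single-node" \
  -e "ELASTIC_PASSWORD=changeme" \
  docker.elastic.co/elasticsearch/elasticsearch:9.2.1
```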
- Generate an API key for the agent to use.
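Something like the following works (Kibana's UI does too); -k is only there because of the default self-signed certificate, and the "encoded" value in the response is what I use as the API key for the gateway:

```sh
# Create an API key for the gateway (name is arbitrary)
curl -k -u elastic:changeme -X POST "https://localhost:9200/_security/api_key" \
  -H "Content-Type: application/json" \
  -d '{"name":"edot-gateway"}'
```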
- Start the elastic/elastic-agent:9.2.1 image in EDOT Gateway mode, following "Default configuration of the EDOT Collector (standalone)" in the Elastic Agent docs, using the recommended defaults from that page (can't post the GitHub link here).
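Roughly how I run it, with the sample gateway config from that page saved as otel-gateway.yml (the file name and the ELASTIC_ENDPOINT / ELASTIC_API_KEY variable names just follow that sample config, so adjust to your copy):

```sh
# Run the agent as a standalone EDOT Collector in gateway mode, mounting the
# recommended gateway config and exposing the OTLP gRPC/HTTP ports
docker run -d --name edot-gateway --network elastic-net \
  -p 4317:4317 -p 4318:4318 \
  -v "$PWD/otel-gateway.yml:/etc/otel-gateway.yml:ro" \
  -e ELASTIC_ENDPOINT="https://elasticsearch:9200" \
  -e ELASTIC_API_KEY="<encoded API key from the previous step>" \
  docker.elastic.co/elastic-agent/elastic-agent:9.2.1 \
  elastic-agent otel --config /etc/otel-gateway.yml
```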
- Start a simple app and point it at the Gateway, for example:
```js
// my-app.js
import express from 'express';

const app = express();
const port = process.env.PORT || 3000;

app.get('/hello', (_req, res) => {
  res.type('text/plain').send('Hello, World!');
});

if (process.env.NODE_ENV !== 'test') {
  app.listen(port, () => {
    console.log(`my-app listening on http://localhost:${port}`);
  });
}

export default app;
```
Run it with the EDOT Node.js SDK pointed at the gateway:

```sh
export OTEL_EXPORTER_OTLP_ENDPOINT="<GATEWAY_URL>"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"
export OTEL_RESOURCE_ATTRIBUTES="deployment.environment.name=dev,service.name=sample-app"
node --import @elastic/opentelemetry-node my-app.js
```
- Check the EDOT Gateway logs. After a while, when the 10m or 60m metrics are due to be exported, you'll see the following error messages:
```
2026-01-20T07:40:00.001Z error internal/base_exporter.go:114 Exporting failed. Rejecting data. {"resource": {"service.instance.id": "dd20735c-d1d9-4976-84da-e07a145c8ee6", "service.name": "elastic-agent", "service.version": "9.2.3"}, "otelcol.component.id": "debug", "otelcol.component.kind": "exporter", "otelcol.signal": "metrics", "error": "sending queue is full", "rejected_items": 8}
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*BaseExporter).Send
go.opentelemetry.io/collector/exporter/exporterhelper@v0.139.0/internal/base_exporter.go:114
go.opentelemetry.io/collector/exporter/exporterhelper/internal.NewMetricsRequest.newConsumeMetrics.func1
go.opentelemetry.io/collector/exporter/exporterhelper@v0.139.0/internal/new_request.go:176
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics
go.opentelemetry.io/collector/consumer@v1.45.0/metrics.go:27
go.opentelemetry.io/collector/service/internal/refconsumer.refMetrics.ConsumeMetrics
go.opentelemetry.io/collector/service@v0.139.0/internal/refconsumer/metrics.go:29
go.opentelemetry.io/collector/internal/fanoutconsumer.(*metricsConsumer).ConsumeMetrics
go.opentelemetry.io/collector/internal/fanoutconsumer@v0.139.0/metrics.go:71
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics
go.opentelemetry.io/collector/consumer@v1.45.0/metrics.go:27
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor.(*Processor).exportForInterval
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor@v0.20.0/processor.go:597
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor.(*Processor).export
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor@v0.20.0/processor.go:463
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor.(*Processor).commitAndExport
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor@v0.20.0/processor.go:445
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor.(*Processor).Start.func1
github.com/elastic/opentelemetry-collector-components/processor/lsmintervalprocessor@v0.20.0/processor.go:201
2026-01-20T07:40:00.001Z warn lsmintervalprocessor@v0.20.0/processor.go:202 failed to export {"resource": {"service.instance.id": "dd20735c-d1d9-4976-84da-e07a145c8ee6", "service.name": "elastic-agent", "service.version": "9.2.3"}, "otelcol.component.id": "elasticapm", "otelcol.component.kind": "connector", "otelcol.signal": "traces", "otelcol.signal.output": "metrics", "error": "failed to export: failed to export interval 10m0s for end time 1768894800: failed to consume the decoded value: sending queue is full", "end_time": "2026-01-20T07:40:00.000Z"}
```