Redis metrics to Elastic APM using the OpenTelemetry Collector

Kibana + Elasticsearch + APM Server version: 9.0.1

Original install method: Docker

OTel Collector Contrib Version: 0.128.0

Redis Version: 8.0.2-alpine3.21

Steps to reproduce:

  1. Deploy the services in Docker (a minimal compose sketch follows this list).
  2. Run the OpenTelemetry Collector with the configuration below.
  3. Check the services in the APM Services section.
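For reference, a minimal docker-compose sketch for step 1 might look like this; the service names, ports, and mounted config path are assumptions, and only the image tags come from the versions listed above:

services:
  redis:
    image: redis:8.0.2-alpine3.21
    ports:
      - "6379:6379"

  otel-collector:
    image: otel/opentelemetry-collector-contrib:0.128.0
    volumes:
      # mount the collector configuration shown below; the in-container path is an assumption
      - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml
    depends_on:
      - redis
    ports:
      - "4317:4317"

The collector configuration from step 2: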

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: "0.0.0.0:4317"
  redis:
    endpoint: "redis:6379"
    collection_interval: 2s
    tls:
      insecure_skip_verify: true
      insecure: true

...

service:
  pipelines:
    metrics:
      receivers:
        - otlp
        - redis
      processors: [batch]
      exporters:
        - otlp/elastic
        - debug


Errors in the APM UI:

All metrics are received with the service name set to unknown.

Are there any default dashboards for viewing Redis metrics, possibly inside the APM UI?

What is the best practice here?

Provide logs and/or server output (if relevant):

{
  "_index": ".ds-metrics-apm.app.unknown-default-2025.06.18-000001",
  "_id": "FAwKg5cBonjAsBhFCit6",
  "_version": 1,
  "_source": {
    "observer": {
      "hostname": "6d2236ca0a04",
      "type": "apm-server",
      "version": "9.0.1"
    },
    "agent": {
      "name": "otlp",
      "version": "unknown"
    },
    "@timestamp": "2025-06-18T12:36:01.775Z",
    "data_stream": {
      "namespace": "default",
      "type": "metrics",
      "dataset": "apm.app.unknown"
    },
    "service": {
      "framework": {
        "name": "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/redisreceiver",
        "version": "0.128.0"
      },
      "name": "unknown",
      "language": {
        "name": "unknown"
      }
    },
    "redis.db.avg_ttl": 0,
    "metricset": {
      "name": "app"
    },
    "event": {},
    "redis.db.expires": 0,
    "redis.db.keys": 2,
    "labels": {
      "redis_version": "8.0.2",
      "db": "0"
    }
  },
  "fields": {
    "service.framework.version": [
      "0.128.0"
    ],
    "redis.db.expires": [
      0
    ],
    "redis.db.keys": [
      2
    ],
    "service.language.name": [
      "unknown"
    ],
    "agent.name.text": [
      "otlp"
    ],
    "processor.event": [
      "metric"
    ],
    "agent.name": [
      "otlp"
    ],
    "metricset.name.text": [
      "app"
    ],
    "labels.db": [
      "0"
    ],
    "service.name": [
      "unknown"
    ],
    "service.framework.name": [
      "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/redisreceiver"
    ],
    "data_stream.namespace": [
      "default"
    ],
    "service.language.name.text": [
      "unknown"
    ],
    "labels.redis_version": [
      "8.0.2"
    ],
    "data_stream.type": [
      "metrics"
    ],
    "observer.hostname": [
      "6d2236ca0a04"
    ],
    "service.framework.name.text": [
      "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/redisreceiver"
    ],
    "metricset.name": [
      "app"
    ],
    "@timestamp": [
      "2025-06-18T12:36:01.775Z"
    ],
    "observer.type": [
      "apm-server"
    ],
    "observer.version": [
      "9.0.1"
    ],
    "redis.db.avg_ttl": [
      0
    ],
    "service.name.text": [
      "unknown"
    ],
    "data_stream.dataset": [
      "apm.app.unknown"
    ],
    "agent.version": [
      "unknown"
    ]
  }
}

Hi @rpanand24,

Welcome back! Can you confirm whether the service.name resource attribute is present on the metrics coming from the receiver (perhaps by printing them to the console)? I see you are using the OTel Collector Redis receiver, which is still in beta, so I want to rule that out.
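For example, the debug exporter already in your pipeline will print every resource attribute to the console when its verbosity is raised (a minimal sketch, assuming the config you posted):

exporters:
  debug:
    verbosity: detailed   # prints resource attributes such as service.name for each exported batch

If service.name is missing from the printed resource attributes, that would explain the unknown you see in the stored document.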

Let us know!


Hi @rpanand24

I am perhaps a bit confused...

So those look like operational Redis metrics, so they will show up in the metrics-* data view...

A) What is the exact flow?
and
B) Can you share the exporter configuration?

Redis > OTel Collector > Elasticsearch exporter > Elasticsearch

It is unclear to me (and yes, it is confusing right now) why you are sending Redis metrics through the APM Server path... yes, it works, but it is not really needed...
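If all you need are the operational Redis metrics, the more direct path could look roughly like this; this is only a sketch, the endpoint and API key are placeholders, and I have left out the other Elasticsearch exporter options:

exporters:
  elasticsearch:
    endpoints: ["https://your-elasticsearch-host:9200"]   # placeholder endpoint
    api_key: "${env:ELASTIC_API_KEY}"                     # placeholder credential

service:
  pipelines:
    metrics:
      receivers: [redis]
      processors: [batch]
      exporters: [elasticsearch]

That sends the Redis metrics straight to Elasticsearch instead of through APM Server.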

With respect to the unknown service... service.name is usually set by the SDK / agent for app traces etc., so I am not surprised it is not set here.
You can set the field in the collector (see the sketch below) or add it in an ingest pipeline.
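A minimal sketch with the contrib resource processor; the value redis is just a placeholder, pick whatever name you want to see in the UI:

processors:
  resource/redis:
    attributes:
      - key: service.name
        value: redis      # placeholder service name
        action: insert    # insert only adds the attribute when it is missing, so OTLP data keeps its own service.name

service:
  pipelines:
    metrics:
      receivers: [otlp, redis]
      processors: [resource/redis, batch]
      exporters: [otlp/elastic, debug]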
