Failed to index documents in 'traces-apm-default' (resource_not_found_exception) after recreating integration

Hi,

We are trying to configure one of our Elasticsearch clusters to receive APM traces. The expectation is that, after configuring the APM integration, it will create all the required data streams to ingest traces into this Elasticsearch server.

Instead in the APM server we are receiving the following message:

{"log.level":"error","@timestamp":"2025-01-16T23:29:43.380Z","log.origin":{"function":"github.com/elastic/go-docappender/v2.(*Appender).flush","file.name":"v2@v2.3.1/appender.go","file.line":443},"message":"failed to index documents in 'traces-apm-default' (resource_not_found_exception): [require_data_stream] request flag is [true] and [traces-apm-default] is not a data stream","service.name":"apm-server","documents":5,"ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2025-01-16T23:30:01.032Z","log.origin":{"function":"github.com/elastic/go-docappender/v2.(*Appender).flush","file.name":"v2@v2.3.1/appender.go","file.line":443},"message":"failed to index documents in 'traces-apm-default' (resource_not_found_exception): [require_data_stream] request flag is [true] and [traces-apm-default] is not a data stream","service.name":"apm-server","documents":7,"ecs.version":"1.6.0"}

We have deleted and recreated the agent policies and integration multiple times and restarted the APM server, Elasticsearch, and Kibana, but it does not create the required artifacts that we can see on other Elasticsearch systems, where we can ingest traces without a problem.

We have followed the comments mentioned here

I have confirmed that there is neither an index nor a data stream with the name "traces-apm-default".
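For reference, this can be confirmed in Kibana Dev Tools with commands like the following (in our case, both return a 404 / empty result):

GET _data_stream/traces-apm-default

GET _cat/indices/traces-apm-default?v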

This is an initial setup that we are doing on these systems, based on version 8.16.2; the other systems were configured on older versions.

Thanks in advance for your help,

Zareh

Hi @zvazquez

What exactly do you mean? Did you install the managed APM integration or the standalone APM Server binary?

Do you have the correct index templates?

Can you run the following in Dev Tools and show the command and results?

GET _cat/indices/*traces*?v
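To check the index templates as well, a command along these lines should work (assuming the standard traces-apm template naming):

GET _index_template/traces-apm*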

Hi stephenb,

Currently we are using the APM Server binary approach:

We have enabled the integration and the status seems ok:

We do have the trace-apm* Index templates available:

But there are no indexes created:

Or datastreams:

Zareh

Can you share the latest logs from an APM agent trying to send traces?
Show the startup logs for the agent as well?
Are you self-managed or Elastic Cloud?
What version?

Seems you are in some inconsistent state...

@simitt Any thoughts? ^^^

Hi stephenb,

The error message in the APM server is what I originally reported:

{"log.level":"error","@timestamp":"2025-01-16T23:29:43.380Z","log.origin":{"function":"github.com/elastic/go-docappender/v2.(*Appender).flush","file.name":"v2@v2.3.1/appender.go","file.line":443},"message":"failed to index documents in 'traces-apm-default' (resource_not_found_exception): [require_data_stream] request flag is [true] and [traces-apm-default] is not a data stream","service.name":"apm-server","documents":5,"ecs.version":"1.6.0"}

We are using Elasticsearch version 8.16.2 and APM Server version 8.16.2. I have also tried with the previous APM Server version that we were using, 8.14.3.

Zareh

Are you self-hosted or in Elastic Cloud?

Can you go and try to re-install the integration assets?
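One way to do this (a sketch, assuming version 8.16.2 of the APM package; the same thing can be done from the Integrations UI under APM → Settings → Reinstall) is to call the Fleet API from Dev Tools:

POST kbn:/api/fleet/epm/packages/apm/8.16.2
{
	"force": true
}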

And can you try this... and see if it creates the data stream etc
This is just a sample document...
Try this in Kibana - Dev Tools

POST traces-apm-default/_doc
{
	"container": {
		"id": "7d9741755612a1628ea1c8e5afbb561958d22503f46085891e92b265f325c438"
	},
	"kubernetes": {
		"node": {
			"name": "gke-stephen-brown-gke-de-default-pool-38f4baa2-l3jk"
		},
		"pod": {
			"name": "my-otel-demo-cartservice-6c4cc6f5d5-m448l"
		},
		"namespace": "default"
	},
	"parent": {
		"id": "0c3abc34a1aa7f03"
	},
	"agent": {
		"name": "opentelemetry/dotnet/elastic-dotnet",
		"version": "1.0.0-alpha.6"
	},
	"processor": {
		"event": "transaction"
	},
	"url": {
		"path": "/oteldemo.CartService/GetCart",
		"original": "/oteldemo.CartService/GetCart",
		"scheme": "http",
		"port": 8080,
		"domain": "my-otel-demo-cartservice",
		"full": "http://my-otel-demo-cartservice:8080/oteldemo.CartService/GetCart"
	},
	"labels": {
		"transaction_id": "80d9070a43110718",
		"grpc_method": "/oteldemo.CartService/GetCart",
		"k8s_pod_ip": "127.0.0.6",
		"network_protocol_version": "2",
		"http_route": "/oteldemo.CartService/GetCart",
		"app_user_id": "632b0f49-37ee-4101-8853-c4c230c16144",
		"grpc_status_code": "0",
		"service_namespace": "opentelemetry-demo"
	},
	"observer": {
		"hostname": "f8032c3ca6ee",
		"type": "apm-server",
		"version": "8.17.1"
	},
	"trace": {
		"id": "033d7d2680d4a1ce24353d3a0e807303"
	},
	"@timestamp": "2025-01-21T20:09:55.332Z",
	"data_stream": {
		"namespace": "default",
		"type": "traces",
		"dataset": "apm"
	},
	"numeric_labels": {
		"app_cart_items_count": 0
	},
	"service": {
		"node": {
			"name": "cc09ca68-05df-4173-80ca-e1f0590fa35a"
		},
		"environment": "opentelemetry-demo",
		"framework": {
			"name": "Microsoft.AspNetCore"
		},
		"name": "cartservice",
		"language": {
			"name": "dotnet"
		}
	},
	"host": {
		"hostname": "gke-stephen-brown-gke-de-default-pool-38f4baa2-l3jk",
		"name": "gke-stephen-brown-gke-de-default-pool-38f4baa2-l3jk"
	},
	"http": {
		"request": {
			"method": "POST"
		},
		"response": {
			"status_code": 200
		}
	},
	"event": {
		"ingested": "2025-01-21T20:09:56Z",
		"success_count": 1,
		"outcome": "success"
	},
	"transaction": {
		"result": "HTTP 2xx",
		"duration": {
			"us": 1228
		},
		"representative_count": 1,
		"name": "POST /oteldemo.CartService/GetCart",
		"id": "80d9070a43110718",
		"type": "request",
		"sampled": true
	},
	"user_agent": {
		"original": "grpc-node-js/1.10.11",
		"name": "Other",
		"device": {
			"name": "Other"
		}
	},
	"span": {
		"id": "80d9070a43110718"
	},
	"timestamp": {
		"us": 1737490195332732
	}
}

You can clean up afterwards if you like.
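For example, to remove just the sample document once real traces are flowing, something like a delete-by-query on its transaction id should work:

POST traces-apm-default/_delete_by_query
{
	"query": {
		"term": {
			"transaction.id": "80d9070a43110718"
		}
	}
}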

Hi stephenb,

Indexing the document did the trick: it created the data stream and, with it, the underlying backing index.

Do you think there is any other standard data stream that we should initialize?

Thanks for your help,

Zareh