APM Server /intake/v2/events returns 202 but takes too long

Kibana version: 7.9.1

Elasticsearch version: 7.9.1

APM Server version: 7.9.1

APM Agent language and version: golang, 1.8.0

Original install method (e.g. download page, yum, deb, from source, etc.) and version:
Helm install

Fresh install or upgraded from other version?
Fresh install

The output is Elasticsearch.
The load balancer is an nginx ingress.

APM Server uses all default settings.

Apologies for my limited English.
Everything works, but the APM Server's POST /intake/v2/events responses take too long:
almost 10 seconds.

The flow is:
golang agent -> APM domain name -> nginx ingress -> APM Server pods
There are 3 APM Server pods.

I use the Go agent modules apmzap, apmchi, and apmclient.

(screenshot: APM index)

(screenshot: APM responses)

Provide logs and/or server output (if relevant):
There are no error messages in the APM Server logs:

2021-04-20T02:27:52.487Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "13348fd9-918d-415e-870c-d0b79c478d14", "method": "POST", "URL": "/intake/v2/events", "content_length": 899, "remote_address": "10.202.79.242", "user-agent": "elasticapm-go/1.8.0 go/go1.15.2", "response_code": 202}
2021-04-20T02:27:52.814Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "74664d0e-bb43-4255-ab8d-f7fcebe7a0cb", "method": "POST", "URL": "/intake/v2/events", "content_length": 892, "remote_address": "10.202.77.26", "user-agent": "elasticapm-go/1.8.0 go/go1.15.2", "response_code": 202}
2021-04-20T02:27:54.664Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "c8d6c0d0-0368-4b01-b30c-5745edc1426f", "method": "POST", "URL": "/intake/v2/events", "content_length": 908, "remote_address": "10.202.6.188", "user-agent": "elasticapm-go/1.8.0 go/go1.15.2", "response_code": 202}
2021-04-20T02:27:56.126Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "3c6056d5-5bd7-470b-9636-4bbc8c901710", "method": "POST", "URL": "/intake/v2/events", "content_length": 896, "remote_address": "10.202.77.26", "user-agent": "elasticapm-go/1.8.0 go/go1.16", "response_code": 202}
2021-04-20T02:27:56.851Z    INFO    [request]    middleware/log_middleware.go:97    request ok    {"request_id": "554cdcc8-316c-49cd-bcfe-4f84e2156458", "method": "GET", "URL": "/", "content_length": 0, "remote_address": "10.240.3.1", "user-agent": "kube-probe/1.15", "response_code": 200}
2021-04-20T02:27:58.733Z    INFO    [request]    middleware/log_middleware.go:97    not modified    {"request_id": "c6477486-b3ed-4f23-b379-4b6e8248e22c", "method": "GET", "URL": "/config/v1/agents?service.environment=dev&service.name=kakaoenterprise_stf-kcgift", "content_length": 0, "remote_address": "10.202.79.242", "user-agent": "elasticapm-go/1.8.0 go/go1.15.2", "response_code": 304}
2021-04-20T02:28:01.507Z    INFO    [request]    middleware/log_middleware.go:97    request ok    {"request_id": "13d145e3-156f-40be-bd46-f04b8ac5f71e", "method": "GET", "URL": "/", "content_length": 0, "remote_address": "10.240.3.1", "user-agent": "kube-probe/1.15", "response_code": 200}
2021-04-20T02:28:02.187Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "e475cc9d-1adb-4703-a0fb-58581ac31267", "method": "POST", "URL": "/intake/v2/events", "content_length": 884, "remote_address": "10.202.78.143", "user-agent": "elasticapm-go/1.8.0 go/go1.15.2", "response_code": 202}
2021-04-20T02:28:02.609Z    INFO    [request]    middleware/log_middleware.go:97    not modified    {"request_id": "75f73a17-862e-4387-9efc-241ca1e73276", "method": "GET", "URL": "/config/v1/agents?service.environment=dev&service.name=kakaoenterprise_leah-test1", "content_length": 0, "remote_address": "10.202.77.26", "user-agent": "elasticapm-go/1.8.0 go/go1.15.2", "response_code": 304}
2021-04-20T02:28:02.876Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "074f3cc4-e558-4940-8783-c771ed075421", "method": "POST", "URL": "/intake/v2/events", "content_length": 889, "remote_address": "10.202.9.117", "user-agent": "elasticapm-go/1.8.0 go/go1.16", "response_code": 202}
2021-04-20T02:28:06.851Z    INFO    [request]    middleware/log_middleware.go:97    request ok    {"request_id": "a5dd493d-d1ee-4fb6-8e1b-13fc80fd47da", "method": "GET", "URL": "/", "content_length": 0, "remote_address": "10.240.3.1", "user-agent": "kube-probe/1.15", "response_code": 200}
2021-04-20T02:28:07.690Z    INFO    [request]    middleware/log_middleware.go:97    request accepted    {"request_id": "6db8202f-3f47-429b-927c-81ce5654c567", "method": "POST", "URL": "/intake/v2/events", "content_length": 905, "remote_address": "10.202.0.123", "user-agent": "elasticapm-go/1.8.0 go/go1.16", "response_code": 202}

Welcome to the forum, @badtimo!

Everything works, but the APM Server's POST /intake/v2/events responses take too long:
almost 10 seconds.

This is expected. The Go agent (and most other agents) streams events to the server continuously. Each HTTP request is closed after 10 seconds, and a new one is created.

@axw

This is expected. The Go agent (and most other agents) streams events to the server continuously. Each HTTP request is closed after 10 seconds, and a new one is created.

  1. Is this streaming connection keep-alive? How many connections will there be per pod?
  2. Will this connection have a performance impact on other requests?
    I'm serving several domains through one nginx ingress controller.
    I think this long-lived connection will count against nginx's concurrent connection limits.

Thank you for your kind response.

  1. Is this streaming connection keep-alive? How many connections will there be per pod?

Yes. The Go agent creates just a single connection at a time, and reuses it in the same way that short requests reuse a keep-alive connection.

  2. Will this connection have a performance impact on other requests?
    I'm serving several domains through one nginx ingress controller.
    I think this long-lived connection will count against nginx's concurrent connection limits.

I wouldn't expect there to be any difference in connections if you have a busy application, unless keep-alive were disabled. We have never heard of any issues from users related to this behaviour.
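If the ingress's concurrent-connection capacity ever does become a concern, it can be raised through the ingress-nginx controller's ConfigMap. A sketch, assuming the standard ingress-nginx `max-worker-connections` option; the ConfigMap name and namespace below are illustrative and depend on your Helm install:

```yaml
# Illustrative ingress-nginx ConfigMap raising the per-worker
# connection limit. Each Go agent holds only one long-lived
# connection at a time, so the default is rarely a problem.
apiVersion: v1
kind: ConfigMap
metadata:
  name: ingress-nginx-controller   # depends on your release name
  namespace: ingress-nginx
data:
  max-worker-connections: "16384"
```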

Nevertheless, you can reduce the request time if you wish:

Configuration | APM Go Agent Reference [2.x] | Elastic

I would only recommend changing away from the defaults if you observe some issues with them.
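For reference, the setting involved is the agent's API request time, configurable via an environment variable; a minimal sketch (check the linked configuration reference for your agent version):

```shell
# Shorten each streaming intake request from the default of 10s
# to 2s. Shorter requests mean the agent opens new requests more
# often, adding a little overhead, so only change this if the
# long-lived requests actually cause you problems.
export ELASTIC_APM_API_REQUEST_TIME=2s
```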

This topic was automatically closed 20 days after the last reply. New replies are no longer allowed.