APM does not show any information with the Node.js agent


Kibana version: 7.2.0

Elasticsearch version: 7.2.0

APM Server version: 7.2.0

APM Agent language and version: Node.js, elastic-apm-node 2.12.0

Browser version:

Original install method (e.g. download page, yum, deb, from source, etc.) and version: all installed with docker

Fresh install or upgraded from other version? no

Is there anything special in your setup? For example, are you using the Logstash or Kafka outputs? Are you using a load balancer in front of the APM Servers? Have you changed index pattern, generated custom templates, changed agent configuration etc. no

Description of the problem including expected versus actual behavior. Please include screenshots (if relevant):
APM does not show any information. The Node.js agents run in Docker containers whose base image is PM2 from Keymetrics, and inside the container the app is started with Node.js.
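For reference, a minimal sketch of how the Node.js agent is typically initialized; the service name and server URL below are placeholders, not taken from the setup described here. A common cause of missing data is the agent being started after other modules have already been loaded:

```javascript
// The agent must be started before any other module is required,
// otherwise it cannot instrument them.
const apm = require('elastic-apm-node').start({
  serviceName: 'my-service',            // placeholder; defaults to the package.json name
  serverUrl: 'http://apm-server:8200'   // placeholder; the APM Server reachable from this container
})

const express = require('express')      // required only after the agent has started
```

With PM2, an alternative is to preload the agent via `node -r elastic-apm-node/start app.js` and configure it through environment variables such as ELASTIC_APM_SERVER_URL.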


Errors in browser console (if relevant):
none

Provide logs and/or server output (if relevant):

2019-07-13T16:43:42.356Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "60c8b165-fb63-4271-a679-bd9f869d8def", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:44:12.356Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "e857d46f-4117-47bf-806a-a474539b12f6", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:44:42.357Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "abac1175-688b-42b8-8747-4cdcc28604aa", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:45:21.104Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "ba98d221-c745-4fda-90e6-89e5d4e24cdf", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:45:51.077Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "ff838f54-e0a0-4752-846f-534e442e7db6", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:46:21.075Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "6ca91867-d383-4a12-9856-d30dbddf4848", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:46:51.080Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "1aecd3ef-c95d-4979-8104-509e2515b59b", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:47:21.079Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "8cd9fe69-13b3-462e-bbc2-10b818ef9274", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:47:51.078Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "aa48fe1e-7d18-4bb8-a192-ac65ef509bfb", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:48:21.078Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "6e2fd831-ad46-4547-98c6-a1ec357c03d4", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:48:46.970Z INFO [request] beater/common_handler.go:185 handled request {"request_id": "99052471-782f-4a83-87a0-bc2171c2c3e5", "method": "POST", "URL": "/intake/v2/events", "content_length": -1, "remote_address": "192.168.128.1", "user-agent": "elastic-apm-node/2.12.1 elastic-apm-http-client/8.0.0", "response_code": 202}
2019-07-13T16:49:16.932Z INFO [request] beater/common_handler.go:185 handled request

It looks like the data is successfully sent to the APM Server. To figure out what is causing the issue, I suggest checking whether the data is actually stored in Elasticsearch. You can do that by sending a query to it, e.g. in the form of GET <ES-host>/apm*/_search. If no data is indexed, please check the APM Server logs for any errors, e.g. errors connecting to Elasticsearch.
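For example, assuming Elasticsearch is reachable on localhost:9200 with the default apm-* index naming of 7.x (adjust host, port, and credentials to your setup):

```shell
# Count documents in the APM indices; a non-zero count means data is being indexed
curl -s "http://localhost:9200/apm-*/_count?pretty"

# Fetch a few sample documents to inspect what was actually stored
curl -s "http://localhost:9200/apm-*/_search?size=3&pretty"
```

If the count is zero while the agent logs show 202 responses like the ones above, the problem is most likely between the APM Server and Elasticsearch rather than between the agent and the APM Server.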