No Http/Kafka transactions captured by APM Java Agent

Hi, we have a Java application. Each time we run it, it sends HTTP requests, collects the results within a certain time period, and pushes the results to Kafka. We want to trace the HTTP operations and Kafka transactions during that process.
We packaged it into a jar file and attached elastic-apm-agent-1.43.0 as the java agent. However, no transactions are captured or shown in Elastic; it displays "No transactions found / No dependencies found / No data to display".

When we ran it in debug mode, there was no "startTransaction" or "startSpan" logged, but we have checked the versions of the HTTP/Kafka packages we are using and they are all supported versions:

org.apache.httpcomponents httpclient 4.5.14

org.apache.kafka kafka-clients 3.4.0

org.apache.httpcomponents httpcore 4.4.15

We didn't touch the original codebase of our application and didn't add any APM-related dependency to it. Do we have to inject some code into the code base? In our understanding, the apm-java-agent allows us to monitor the application without touching the code, doesn't it?

Could anyone please give us any idea about what's the problem? Thank you for your help in advance!

APM Server version: 8.5.3
APM Agent language and version: Java, elastic-apm-agent-1.43.0

Provide logs and/or server output (if relevant):

2023-11-13 11:53:11,034 [elastic-apm-metadata-0] INFO  co.elastic.apm.agent.impl.metadata.SystemInfo - Failed to execute command "powershell.exe [System.Net.Dns]::GetHostEntry($env:computerName).HostName" with exit code 1
2023-11-13 11:53:14,040 [elastic-apm-server-healthcheck] INFO - Elastic APM server is available: {  "build_date": "xxx",  "build_sha": "xxxx",  "publish_ready": true,  "version": "8.5.3"}
2023-11-13 11:53:14,142 [elastic-apm-remote-config-poller] INFO  co.elastic.apm.agent.configuration.ApmServerConfigurationSource - Received new configuration from APM Server: {}
2023-11-13 11:53:15,432 [main] INFO  co.elastic.apm.agent.impl.ElasticApmTracer - Tracer switched to RUNNING state
2023-11-13 11:53:55,433 [elastic-apm-init-instrumentation-shutdown-hook] INFO  co.elastic.apm.agent.bci.InstrumentationStatsLifecycleListener - Used instrumentation groups: [apache-httpclient, executor, executor-collection, fork-join, kafka, log-correlation, log-error, log-reformatting, logging, process, ssl-context, urlconnection]
2023-11-13 11:53:55,433 [elastic-apm-circuit-breaker] INFO  co.elastic.apm.agent.impl.circuitbreaker.CircuitBreaker - Stopping the Circuit Breaker thread.
2023-11-13 11:53:55,939 [elastic-apm-init-instrumentation-shutdown-hook] INFO  co.elastic.apm.agent.impl.ElasticApmTracer - Tracer switched to STOPPED state
2023-11-13 11:53:55,941 [elastic-apm-init-instrumentation-shutdown-hook] INFO - dropped events because of full queue: 0
2023-11-13 11:53:56,041 [elastic-apm-init-instrumentation-shutdown-hook] INFO - Reported events: 18
2023-11-13 11:53:56,041 [elastic-apm-init-instrumentation-shutdown-hook] INFO - Dropped events: 0

Another dependency, not sure whether it's related to this issue.


Hi !

From the description you give of your application, it seems to be a "batch" application that makes calls to an HTTP service and also to Kafka.

As there is no startTransaction in the logs, this likely means that your application is NOT driven by a library supported for automatic instrumentation, which is very common with such batch applications. In other words, the agent can't automatically tell where each transaction starts and ends.
A transaction is required for Elastic agents to represent the entry points of the application in APM as the root elements of the trace.

You can make the agent aware of the transaction scope through the trace_methods configuration option.
See How to find slow methods | APM Java Agent Reference [1.x] | Elastic for the pros/cons of this approach and how to configure it.

That requires knowing a bit about the implementation details of the application in order to decide which method should be instrumented to represent the "unit of work" that should be captured as a transaction.
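For example, assuming the batch's entry point were a method like `com.example.batch.JobRunner#runJob` (an illustrative name, not taken from your application), the agent could be started with something like:

```shell
# Illustrative launch command; service name, server URL, and the
# class#method passed to trace_methods are placeholders you would
# replace with your own values.
java -javaagent:/path/to/elastic-apm-agent-1.43.0.jar \
     -Delastic.apm.service_name=my-batch-app \
     -Delastic.apm.server_url=http://localhost:8200 \
     -Delastic.apm.trace_methods="com.example.batch.JobRunner#runJob" \
     -jar my-application.jar
```

Every invocation of the matched method would then be recorded as a transaction.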

Once your application is capturing transactions, the other parts of the trace, like outgoing HTTP calls or messages sent to Kafka, should be captured as spans, and you should get an end-to-end trace in the Kibana UI.
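To make that concrete, here is a minimal sketch of such a "unit of work" method. The class and method names are hypothetical, and the HTTP/Kafka calls are stubbed out with prints so the example runs on its own; in a real application these would be the actual httpclient and kafka-clients calls, which the agent would capture as child spans of the transaction.

```java
// Hypothetical batch entry point (names are illustrative, not from this thread).
// If targeted with -Delastic.apm.trace_methods="JobRunner#runJob", each call
// to runJob() becomes one transaction; the HTTP and Kafka operations inside
// it would then be picked up by the agent's apache-httpclient and kafka
// instrumentations as child spans.
public class JobRunner {

    public void runJob() {
        String results = fetchOverHttp(); // would appear as an HTTP span
        publishToKafka(results);          // would appear as a Kafka span
    }

    // Stand-in for an org.apache.httpcomponents httpclient request.
    private String fetchOverHttp() {
        System.out.println("HTTP fetch");
        return "results";
    }

    // Stand-in for a kafka-clients producer send.
    private void publishToKafka(String payload) {
        System.out.println("Kafka send: " + payload);
    }

    public static void main(String[] args) {
        new JobRunner().runJob();
    }
}
```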

Hi, Thank you sooooo much for your reply! It works now!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.