With code like the following, I can't see any details of what runs inside the callables.
Code
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ForkJoinPool;

public class ApmTest {
    static ForkJoinPool fjp = new ForkJoinPool();
    static ExecutorService pool = Executors.newFixedThreadPool(10);

    public static void main(String[] args) throws ExecutionException, InterruptedException {
        String result1 = fjp.submit(() -> runInForkJoinPool()).get();
        String result2 = pool.submit(() -> runInExecutorService()).get();
        System.out.println("result is " + result1);
        System.out.println("result is " + result2);
        fjp.shutdown();
        pool.shutdown();
    }

    public static String runInExecutorService() {
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return "runInExecutorService";
    }

    public static String runInForkJoinPool() {
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return "runInForkJoinPool";
    }
}
Result
Configuration
-javaagent:path of elastic-apm-agent-1.13.0.jar \
-Delastic.apm.service_name=myservicename \
-Delastic.apm.server_urls=myserverurl \
-Delastic.apm.trace_methods=mypackage.ApmTest#* \
-Delastic.apm.transaction_max_spans=50000 \
-Delastic.apm.trace_methods_duration_threshold=50ms \
-Delastic.apm.profiling_spans_enabled=true \
-Delastic.apm.profiling_included_classes=mypackage.* \
-Delastic.apm.application_packages=mypackage
Background
Some tasks need to run in a ForkJoinPool, for example parallelStream().map(...) calls deep in the call chain, so I run the whole task in a ForkJoinPool.
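For reference, this is why the whole task ends up in the pool: when a parallel stream's terminal operation is invoked from inside a ForkJoinPool worker, its subtasks are forked into that same pool rather than the common pool. A minimal sketch of that behavior (the class name and pool size are mine, not from the question):

```java
import java.util.List;
import java.util.Set;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Collectors;

public class PoolDemo {
    public static void main(String[] args) throws Exception {
        ForkJoinPool fjp = new ForkJoinPool(4);
        // Because the terminal operation runs inside a ForkJoinPool worker,
        // the stream's subtasks execute on that pool's worker threads.
        Set<String> threads = fjp.submit(() ->
                List.of(1, 2, 3, 4, 5, 6, 7, 8).parallelStream()
                        .map(i -> Thread.currentThread().getName())
                        .collect(Collectors.toSet()))
                .get();
        System.out.println("workers: " + threads);
        fjp.shutdown();
    }
}
```

Printing the thread names shows only ForkJoinPool worker threads, never main.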
Question
I expect the transaction to start from main, and the method calls subsequently executed inside the ForkJoinPool to be profiled as part of the same transaction.
Is it not possible to trace into a ForkJoinPool?
Or is there a workaround to manually propagate the tracing context to each parallel execution?
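A sketch of what manual propagation would look like in general terms (the ThreadLocal here is a hypothetical stand-in for the tracer's active-transaction context, not the agent's actual internals): capture the context on the submitting thread, re-install it inside each lambda, and clear it afterwards. If I recall the Elastic APM public API (elastic-apm-api) correctly, the analogous calls would be ElasticApm.currentTransaction() in the caller and Transaction#activate() in a try-with-resources inside the task.

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class ContextPropagation {
    // Hypothetical stand-in for the tracer's thread-local active transaction.
    static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    public static void main(String[] args) {
        CONTEXT.set("txn-1");                  // context exists on main only
        final String captured = CONTEXT.get(); // capture before forking

        Set<String> seen = List.of(1, 2, 3, 4, 5, 6, 7, 8).parallelStream()
                .map(i -> {
                    CONTEXT.set(captured);     // re-install on the worker thread
                    try {
                        return CONTEXT.get();  // the worker now sees the context
                    } finally {
                        CONTEXT.remove();      // don't leak into pooled threads
                    }
                })
                .collect(Collectors.toSet());

        System.out.println(seen);
    }
}
```

Every worker observes the propagated value, so the set collapses to a single entry.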
Update
When I execute the methods via stream(), they are profiled, but methods executed via parallelStream() are not.
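That difference matches where each variant runs its work: a sequential stream maps every element on the calling thread (which the agent already instruments), while parallelStream() splits elements across common-pool workers that were never associated with the transaction. A quick check (class name is mine):

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class StreamThreads {
    public static void main(String[] args) {
        // Sequential: every element is mapped on the calling thread.
        Set<String> seq = List.of(1, 2, 3, 4).stream()
                .map(i -> Thread.currentThread().getName())
                .collect(Collectors.toSet());
        System.out.println("stream(): " + seq);

        // Parallel: elements are split across common-pool workers
        // (plus possibly the calling thread itself).
        Set<String> par = List.of(1, 2, 3, 4, 5, 6, 7, 8).parallelStream()
                .map(i -> Thread.currentThread().getName())
                .collect(Collectors.toSet());
        System.out.println("parallelStream(): " + par);
    }
}
```

The sequential set contains only main; the parallel set typically adds ForkJoinPool.commonPool worker names.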