I am using the Java agent in a project that uses Apache Spark; the Java program itself is the Spark driver (same JVM).
Since Apache Spark does not implement OpenTracing, I am using a Spark listener to log Spark events such as task start, stage start, etc. I create spans, which is OK.
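For context, here is roughly what that looks like: a `SparkListener` that opens a span when a task starts and closes it when the task ends. This is a minimal sketch, assuming the `apm-agent-api` and Spark dependencies are on the classpath; the class name and the type/subtype/action strings are my own choices, not anything prescribed by either library.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import co.elastic.apm.api.ElasticApm;
import co.elastic.apm.api.Span;
import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerTaskEnd;
import org.apache.spark.scheduler.SparkListenerTaskStart;

// Registered on the driver, e.g. via spark.extraListeners or SparkContext#addSparkListener.
public class ApmSparkListener extends SparkListener {

    // Start and end arrive as separate callbacks, so open spans are kept by task id.
    private final Map<Long, Span> openSpans = new ConcurrentHashMap<>();

    @Override
    public void onTaskStart(SparkListenerTaskStart taskStart) {
        // "spark"/"task"/"execute" are arbitrary labels for filtering in the UI.
        Span span = ElasticApm.currentTransaction()
                .startSpan("spark", "task", "execute");
        span.setName("task-" + taskStart.taskInfo().taskId());
        openSpans.put(taskStart.taskInfo().taskId(), span);
    }

    @Override
    public void onTaskEnd(SparkListenerTaskEnd taskEnd) {
        Span span = openSpans.remove(taskEnd.taskInfo().taskId());
        if (span != null) {
            span.end();
        }
    }
}
```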
But I would like to create another service named "spark". Is it possible to have two services in the same JVM ("application" and "spark")?
Hi Thomas
At the moment, it is not possible to separate traces coming from the same JVM into separate services manually.
However, what I can suggest is using different transaction types for your application transactions and your Spark transactions through the API. Then you would be able to toggle between them with the transaction-type dropdown filter. You can also set the type if you are using the @CaptureTransaction annotation.
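To make that concrete, here is a minimal sketch of both options, assuming the `apm-agent-api` dependency is on the classpath; the class, method names, and the type string "spark" are illustrative, not required by the agent.

```java
import co.elastic.apm.api.CaptureTransaction;
import co.elastic.apm.api.ElasticApm;

public class SparkEventHandler {

    // Option 1: set the type on the current transaction programmatically.
    public void onSparkEvent() {
        // "spark" is an arbitrary type name; pick whatever you want to filter on.
        ElasticApm.currentTransaction().setType("spark");
    }

    // Option 2: let the annotation create the transaction and set its type.
    @CaptureTransaction(value = "spark-stage", type = "spark")
    public void onStageStart() {
        // ... your listener logic ...
    }
}
```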
I hope this helps.
Could you elaborate on what you mean by "join" transactions here?
Transactions are top-level items that break down into spans (db calls, external calls to an HTTP API, a Spark job, ...).
Thus, if you have Spark jobs that are triggered outside of any application transaction, they aren't monitored. What you have above is a Spark task/job that is triggered from within an HTTP transaction.
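If you do want such standalone Spark jobs monitored, you can wrap them in a transaction yourself via the API. A minimal sketch, again assuming the `apm-agent-api` dependency; the transaction name and the `doSparkWork` method are hypothetical placeholders for your actual job.

```java
import co.elastic.apm.api.ElasticApm;
import co.elastic.apm.api.Scope;
import co.elastic.apm.api.Transaction;

public class SparkJobRunner {

    public void runJob() {
        // Start a transaction manually, since no HTTP request wraps this job.
        Transaction transaction = ElasticApm.startTransaction();
        try (Scope scope = transaction.activate()) {
            transaction.setName("my-spark-job"); // hypothetical name
            transaction.setType("spark");        // arbitrary type, used for filtering
            doSparkWork();                       // placeholder for the real Spark job
        } catch (Exception e) {
            transaction.captureException(e);
            throw e;
        } finally {
            transaction.end();
        }
    }

    private void doSparkWork() { /* ... */ }
}
```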
This seems OK to me. Is the data above consistent with your knowledge of your application's architecture and internals?