Python agent only recording some background tasks

(Scott Burns) #1

Hi there! :wave:

We're using the settings and decorator below to record the performance of background jobs (run with RQ).


    ELASTIC_APM = {
        'INCLUDE_PATH': ['health'],
        'SERVICE_NAME': 'health-{}'.format(ENVIRONMENT),
        'SERVER_URL': os.environ['APM_HOST'],
    }

    import elasticapm
    from django.conf import settings

    def apm_decorator(*apm_args):
        def decorator(func):
            def wrapper(*args, **kwargs):
                client = elasticapm.Client(settings.ELASTIC_APM)
                # 'jobs' (passed in via apm_args) becomes the transaction type
                client.begin_transaction(*apm_args)
                context = {
                    'args': args,
                    'kwargs': kwargs,
                }  # custom context with the job's args/kwargs (how it is attached was elided above)
                result = func(*args, **kwargs)
                full_name = '{}.{}'.format(func.__module__, func.__name__)
                client.end_transaction(full_name, 'SUCCESS')
                return result
            return wrapper
        return decorator

    @apm_decorator('jobs')
    def real_function():
        ...  # actual work (body elided in the original post)

    def background_task():
        # other stuff
        real_function()
        # other stuff

    def view(request):
        ...  # enqueues background_task (body elided in the original post)
(The 'jobs' arg passed to the apm_decorator puts these transactions into the "jobs" tab in APM.)

real_function is reported, but nothing about background_task itself appears in APM, so anything that runs before or after real_function goes unreported. (HTTP requests from Django are reported normally, so our APM server seems to be fine.)

Python 3.6.4 (also occurred on Python 2.7.x), APM Agent version 3.0.0

Thanks for your help!

(Andrew Wilkins) #2

Hi Scott, thanks for trying out Elastic APM!

RQ uses a fork/exec pattern, so I suspect what's happening is that some of your jobs are completing before the transactions can be sent off to the APM Server.

By default the agent won't send data to the server until 10 seconds have elapsed, or 500 transactions have been recorded. These can be controlled with ELASTIC_APM_FLUSH_INTERVAL and ELASTIC_APM_MAX_QUEUE_SIZE respectively.
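For example (a sketch, assuming a POSIX shell; the values of 1 are illustrative), the worker's environment can be set before starting it:

```shell
# flush queued transactions after 1 second instead of the default 10
export ELASTIC_APM_FLUSH_INTERVAL=1
# flush after every recorded transaction instead of every 500
export ELASTIC_APM_MAX_QUEUE_SIZE=1
```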

Adding MAX_QUEUE_SIZE: 1 to the ELASTIC_APM settings dict will force each job transaction to be sent immediately.
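In Django settings that would look roughly like this (a sketch based on the snippet in the first post; the ENVIRONMENT default and the os.environ.get fallback for APM_HOST are assumptions for illustration):

```python
import os

# assumed; how ENVIRONMENT is set was not shown in the original post
ENVIRONMENT = os.environ.get('ENVIRONMENT', 'dev')

ELASTIC_APM = {
    'INCLUDE_PATH': ['health'],
    'SERVICE_NAME': 'health-{}'.format(ENVIRONMENT),
    'SERVER_URL': os.environ.get('APM_HOST', 'http://localhost:8200'),
    # send each job transaction immediately instead of batching 500
    'MAX_QUEUE_SIZE': 1,
}
```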

(Scott Burns) #3

Thanks for your reply, Andrew; this was very helpful. For anyone who ends up on this page, here's what I ended up doing:

  • Subclass rq.job.Job and override its perform method with essentially what I used above: begin the transaction, set custom context with the job's args and kwargs, call super().perform(), then end the transaction and close the client. django-rq allows setting a global custom Job class, so it was relatively easy to hook this up.
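A structural sketch of that subclass (the stub Job and Client classes here stand in for rq.job.Job and elasticapm.Client, so this shows the shape of the override, not a drop-in implementation):

```python
class Job:
    """Stand-in for rq.job.Job (assumes the real class exposes func_name and perform)."""
    def __init__(self, func, args=(), kwargs=None):
        self.func = func
        self.args = args
        self.kwargs = kwargs or {}
        self.func_name = func.__name__

    def perform(self):
        return self.func(*self.args, **self.kwargs)


class Client:
    """Stand-in for elasticapm.Client; records calls instead of sending them."""
    def __init__(self, config=None):
        self.events = []

    def begin_transaction(self, transaction_type):
        self.events.append(('begin', transaction_type))

    def end_transaction(self, name, result):
        self.events.append(('end', name, result))

    def close(self):
        self.events.append(('close',))


class APMJob(Job):
    """Override perform() to wrap the job in an APM transaction, as described above."""
    def perform(self):
        # the real version would use elasticapm.Client(settings.ELASTIC_APM)
        self.client = Client({})
        self.client.begin_transaction('jobs')
        try:
            return super().perform()
        finally:
            self.client.end_transaction(self.func_name, 'SUCCESS')
            self.client.close()
```

In the real setup, you would then point django-rq's custom job class setting (or the worker's job-class option) at the subclass so every queued job uses it.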

(system) #4

This topic was automatically closed 20 days after the last reply. New replies are no longer allowed.