How to connect Django to Elasticsearch using Logstash?

Hi, I'm trying to send data from Django's SQLite DB to Elasticsearch using Logstash.
I made two tables in Django, and they live in the SQLite DB that comes embedded with Django. I want to send this data to Elasticsearch using Logstash, but I'm not sure which Logstash input plugin to choose...
I'm torn between the two approaches below.
First, the python-logstash handler in Django's settings.py together with Logstash's tcp input plugin.
Second, Logstash's sqlite input plugin.
The problem with the second approach is that the sqlite input plugin was last released in 2018, which suggests the Elastic team doesn't support it officially.
I've tried the first approach, but it didn't work. Here is the settings.py I've written for Django.

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {                       
        'logstash': {
            'level': 'INFO',
            'class': 'logstash.TCPLogstashHandler',
            'host': 'localhost',
            'port': 5959,  # Default value: 5959
            'version': 1,
            'message_type': 'django',
            'fqdn': False,
            'tags': ['django.request'],
        },
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler'
        },
    },
    'loggers': {                        
        'django.request': {
            'handlers': ['logstash'],
            'level': 'INFO',
            'propagate': True,
        },

        'django': {
            'handlers': ['console'],
            'propagate': True,
        },
    }
}
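Before wiring Django in, it can help to confirm that the Logstash tcp input is reachable at all, independent of the logging config. Here is a minimal sketch using only the Python standard library; the field names mirror the logstash v1 event format that python-logstash emits, but they are assumptions for illustration, not the library's verbatim output:

```python
import json
import socket
from datetime import datetime, timezone

def build_message(message, message_type="django", host="myhost"):
    """Build a logstash v1-style JSON event (field names assumed,
    modeled on what python-logstash sends)."""
    return json.dumps({
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "@version": "1",
        "message": message,
        "type": message_type,
        "host": host,
    })

def send_test_event(host="localhost", port=5959):
    """Send one JSON line to the Logstash tcp input.
    Raises an exception if Logstash is not listening."""
    payload = (build_message("connectivity test") + "\n").encode("utf-8")
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(payload)
```

If `send_test_event()` raises ConnectionRefusedError, the problem is between you and Logstash (port, bind address, firewall), not in Django's LOGGING dict.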

and this is my Logstash conf file. I just wanted to check via "stdout" whether the Django data had been sent to Logstash, but there was no output.

input {
    tcp {
        port => 5959
        codec => json
    }
}
output {
    stdout {
    }
}

In my situation, is it right to use logstash.TCPLogstashHandler in settings.py with Logstash's tcp input plugin? Or is there another method you'd recommend?
Thank you.

I see you want to send the request/access logs as well as the error logs.

While you could emit these over TCP, I would caution you against it because it introduces coupling; consider what happens when Logstash is backlogged or unavailable. The TCP connection might:

  • immediately fail (e.g. connection reset) -- you then lose logs
  • connect very slowly (e.g. packet loss) -- Django threads build up, and this can turn into a service issue
  • run very slowly, triggering TCP flow control and blocking writes -- a normally fast call like the logger can then block, causing your application to hang

So emitting logs synchronously is not something I would recommend. I once accidentally took down a whole network by having a DNS server send its logs over TCP syslog, for this very reason: the syslog server's disk was full, sockets backed up, and the syslog(3) libc function blocked (surprise: yes, it can block).

If you want to log remotely, consider using UDP messages instead, or use something specifically designed for remote monitoring, such as Elastic APM. (https://www.elastic.co/guide/en/apm/agent/python/current/django-support.html)
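For the UDP route, a minimal sketch of the settings.py side, assuming python-logstash is installed and its default handler (which sends over UDP) is used, with a matching `udp { port => 5959 codec => json }` input on the Logstash side:

```python
# settings.py -- hedged sketch of the UDP variant; host/port are assumptions.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'logstash': {
            'level': 'INFO',
            'class': 'logstash.LogstashHandler',  # UDP-based handler
            'host': 'localhost',
            'port': 5959,
            'version': 1,
        },
    },
    'loggers': {
        'django.request': {
            'handlers': ['logstash'],
            'level': 'INFO',
            'propagate': True,
        },
    },
}
```

Note the trade-off: UDP won't block your application, but dropped datagrams mean dropped log lines.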

Better yet, write to log files instead: ensure the log files are suitably constrained/rotated, and then tail those log files with something like Filebeat. That way, if Logstash or Filebeat is unavailable, your application is unaffected. It also means you can put (generally MUCH better) access logging in at your web server level.
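To make the file-based route concrete, here is a sketch of a rotating-file setup using the standard library's RotatingFileHandler (the path and sizes are assumptions; adjust to your layout):

```python
# settings.py -- hedged sketch: write request logs to a rotating file
# that Filebeat can tail.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'file': {
            'level': 'INFO',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': '/var/log/django/app.log',  # assumed path
            'maxBytes': 10 * 1024 * 1024,           # rotate at ~10 MB
            'backupCount': 5,                       # keep 5 old files
        },
    },
    'loggers': {
        'django.request': {
            'handlers': ['file'],
            'level': 'INFO',
            'propagate': True,
        },
    },
}
```

On the Filebeat side you would then point an input at /var/log/django/*.log and ship to Logstash or directly to Elasticsearch.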

Hope that helps.
