Configure Elasticsearch on Django App

:wave: Hello everyone,

I have been trying for weeks now to configure Elasticsearch/Logstash for my Django app.
But all the tutorials I found deal with a local Elasticsearch, while the one I'm dealing with is an Enterprise edition.

So I need help with the configuration to finally be able to send data from Django to Elasticsearch using Logstash, if that's what I need. Any link, paper, or method is welcome.

Thanks in advance.

These are the configs I made. (But I'm sure the ports and hosts are wrong, because I don't know how to find the correct ones on my Elasticsearch Enterprise.) Please help!

In settings.py

INSTALLED_APPS = [
  # .... 
    'django_elasticsearch_dsl',
]

ELASTICSEARCH_DSL = {
    'default': {
        'hosts': 'localhost:9200'
    },
}
LOGGING = {
  'version': 1,
  'disable_existing_loggers': False,
  'formatters': {
      'simple': {
            'format': '%(levelname)s %(message)s'
        },
  },
  'handlers': {
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler',
            'formatter': 'simple'
        },
        'logstash': {
            'level': 'WARNING',
            'class': 'logstash.TCPLogstashHandler',
            'host': 'localhost',
            'port': 5959, # Default value: 5959
            'version': 1, # Version of logstash event schema. Default value: 0 (for backward compatibility of the library)
            'message_type': 'django',  # 'type' field in logstash message. Default value: 'logstash'.
            'fqdn': False, # Fully qualified domain name. Default value: false.
            'tags': ['django.request'], # list of tags. Default: None.
        },
  },
  'loggers': {
        'django.request': {
            'handlers': ['logstash'],
            'level': 'WARNING',
            'propagate': True,
        },
        'django': {
            'handlers': ['console'],
            'propagate': True,
        },
    }
}
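To check that the LOGGING dict itself is valid before wiring in Logstash, here is a minimal, stdlib-only sketch of the same configuration (an assumption on my part, not from the thread): the 'logstash' handler is left out so it runs without the python-logstash package, and the console handler writes to a StringIO so the corrected 'simple' formatter can be verified.

```python
import logging
import logging.config
from io import StringIO

stream = StringIO()  # stand-in for stderr so the output can be inspected

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        # Note the corrected format string: '%(levelname)s %(message)s'
        'simple': {'format': '%(levelname)s %(message)s'},
    },
    'handlers': {
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler',
            'formatter': 'simple',
            'stream': stream,
        },
    },
    'loggers': {
        'django': {'handlers': ['console'], 'level': 'INFO', 'propagate': True},
    },
}

logging.config.dictConfig(LOGGING)
logging.getLogger('django').info('hello')
print(stream.getvalue().strip())  # INFO hello
```

Once this passes, the 'logstash' handler can be added back on top of the same dict.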

By Enterprise Edition you mean Elastic Cloud?

Elastic Cloud does not provide Logstash; if you want to use Logstash you need to install and manage it yourself on your infrastructure. Do you have a Logstash instance running already?

Hello @leandrojmp ,

It's not Elastic Cloud; I think it's a private Elasticsearch managed by our company.
And there's a Logstash pipeline running already.

input {
  kafka {
    bootstrap_servers => "broker-201.streaming.iaas.cagip.group.gca:9093, broker-202.streaming.iaas.cagip.group.gca:9093, broker-203.streaming.iaas.cagip.group.gca:9093, broker-204.streaming.iaas.cagip.group.gca:9093, broker-205.streaming.iaas.cagip.group.gca:9093,broker-206.streaming.iaas.cagip.group.gca:9093"
    decorate_events => true
    group_id => "elisa.cagip.care-oaps-prd.group.elisa"
    topics_pattern => "elisa.cagip.care-oaps-prd.*"
    codec => "json"
    ssl_endpoint_identification_algorithm => ""
    ssl_truststore_location => "/usr/share/logstash/config/ssl/truststore.jks"
    ssl_truststore_password => "TrustPass%%"
    ssl_truststore_type => "jks"
    security_protocol => "SASL_SSL"
    sasl_mechanism => "PLAIN"
    jaas_path => "/usr/share/logstash/config/jaas-kafka-client.conf"
  }
}
###
filter {
}
###
output {
  elasticsearch {
    hosts => "https://#######.elisa.###.group.gca:9243"
    user => "#####" 
    password => "##############@"
    action => create
    ssl => true
    ssl_certificate_verification => false
    cacert => "/usr/share/logstash/config/ssl/ca.crt"
    index => "logstash-example"
  }
}

You will need to create another pipeline to receive the logs from your application, then.

This pipeline needs an input listening on a TCP port (the same port you will configure in your application), plus an elasticsearch output to store the data in an index.

Something like this:

input {
    tcp {
        host => "0.0.0.0"
        port => "5959"
    }
}
#
output {
  elasticsearch {
    hosts => "https://#######.elisa.###.group.gca:9243"
    user => "#####" 
    password => "##############@"
    ssl => true
    ssl_certificate_verification => false
    cacert => "/usr/share/logstash/config/ssl/ca.crt"
    index => "index-for-your-django-logs"
  }
}
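At the wire level, what a TCP log handler does against this input is just open a socket to the Logstash host and write newline-delimited JSON events. Here is a self-contained sketch of that handoff, with a throwaway local server standing in for Logstash so the round trip can be verified without any infrastructure; in real use the client would connect to the Logstash host on port 5959 instead.

```python
import json
import socket
import threading

received = []

# Throwaway server standing in for the Logstash tcp input.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def accept_one():
    conn, _ = server.accept()
    received.append(conn.makefile().readline())  # one newline-delimited event
    conn.close()

t = threading.Thread(target=accept_one)
t.start()

# What the Django-side handler effectively sends: one JSON object per line.
event = {"message": "something broke", "type": "django", "tags": ["django.request"]}
with socket.create_connection(("127.0.0.1", port)) as sock:
    sock.sendall((json.dumps(event) + "\n").encode())

t.join()
server.close()
print(json.loads(received[0])["message"])  # something broke
```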

The port is the same one you used in the example configuration you shared before, but the host in your Django log handler needs to point to the Logstash host.
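Concretely, that means the 'logstash' handler from the earlier settings.py changes only in its host value. A sketch of the corrected handler (assuming the python-logstash package; 'logstash.example.internal' is a placeholder, not a real address):

```python
# Only 'host' differs from the original settings; everything else is unchanged.
LOGSTASH_HANDLER = {
    'logstash': {
        'level': 'WARNING',
        'class': 'logstash.TCPLogstashHandler',
        'host': 'logstash.example.internal',  # the Logstash server, not localhost
        'port': 5959,                         # must match the tcp input in the pipeline
        'version': 1,
        'message_type': 'django',
        'fqdn': False,
        'tags': ['django.request'],
    },
}
```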

Then you will need to edit the pipelines.yml file and add another pipeline pointing to this logstash configuration.

Thanks very much @leandrojmp for your prompt answers; I'm going to try this ASAP and come back.

Please @leandrojmp, if I understand this => "Then you will need to edit the pipelines.yml file and add another pipeline pointing to this logstash configuration."
It means I need to create 2 Logstash pipelines?

Yes, Logstash can run multiple pipelines. You need to check how your pipelines.yml is configured, then add another pipeline in this format.

- pipeline.id: pipeline name
  path.config: "/path/to/the/configuration.conf"
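Put together, a pipelines.yml running both pipelines side by side could look like this sketch (the pipeline ids and file paths are placeholders, not your actual locations):

```yaml
- pipeline.id: kafka-elisa
  path.config: "/usr/share/logstash/config/pipelines/kafka-elisa.conf"
- pipeline.id: django-tcp
  path.config: "/usr/share/logstash/config/pipelines/django-tcp.conf"
```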

Hello @leandrojmp, it's me again.
For the pipelines.yml, I don't know where to find it.

And again, I'm dealing with something like this, the GUI.

So where can I get this pipelines.yml?

The pipelines.yml resides on the Logstash server, in the Logstash directory.

But it seems that you are using the Centralized Pipeline Management GUI, which means that you will need to create and edit your pipeline using this interface, and it should start the pipeline on your Logstash server.

I never used this feature, so I'm not sure how it works, but it seems that you need to click on the Create Pipeline button to create a new one.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.