ELK Stack and CI/CD

Can you please share your views on using the ELK stack with CI and CD? Have any standard procedures/processes been established to accomplish this as part of DevOps activity?

Hey!

  • What is your aim in terms of deployment for the ELK stack?
  • What do you see changing a lot for the stack?
  • Where are you thinking of deploying the stack to?
  • What sort of usage do you think the stack will undergo?

CI/CD for what exactly?
Integration with an application and testing changes against Elasticsearch versions?
Ingestion of data (logs, metrics) from such platforms for analysis?
Something else?

Also we’ve renamed ELK to the Elastic Stack, otherwise Beats and APM feel left out! :wink:


Continuous Integration and Continuous Deployment. Just as with Java/J2EE applications, we want to know whether it is possible/recommended to check the BELK (Beats + ELK) configs into the source code control system and do regular deployments to the different environments, say dev, QA, UAT, stage, and production, through the deployment tools.

So the aim is to have a process to deploy files like elasticsearch.yml, kibana.yml etc to various environments?
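If that is the aim, one common approach (a sketch only, not an Elastic-prescribed layout; all paths and file names here are illustrative) is to keep the stack configs in one repository with a variables file per environment, so the deployment tool can render and push the right config to each target:

```
elk-config/
├── elasticsearch/
│   └── elasticsearch.yml        # settings common to all environments
├── kibana/
│   └── kibana.yml
├── filebeat/
│   └── filebeat.yml.tmpl        # templated, filled in per environment
└── environments/
    ├── dev.vars
    ├── qa.vars
    ├── uat.vars
    ├── stage.vars
    └── prod.vars
```

The same pipeline then promotes an identical set of files from dev through to production, with only the variables file changing.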

How are the environments structured? As in, are they on your own hosted hardware, cloud (AWS, Azure etc)?

Have you had any experience with the Elastic Stack before?

Two things here.

  1. Deploying the products themselves with a configuration management tool like Ansible. I think there are playbooks available from Elastic, though I need to check whether they exist for Elasticsearch, Kibana, Filebeat, and Logstash. This would also help keep the configuration consistent across a multi-node environment, and would help with upgrading the ELK stack to the latest 6.0 version.

  2. Application-related configuration files, such as the Filebeat and Logstash configs, which can be deployed to different environments with a few environment-specific changes within the files.
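For point 1, a minimal playbook sketch using Elastic's published Ansible role (github.com/elastic/ansible-elasticsearch) might look like the following; the host group and the variable values are illustrative assumptions, so check the role's README for the exact variable names it supports:

```yaml
# Sketch only: install Elasticsearch on a group of nodes via Elastic's
# Ansible role from Galaxy. Values below are examples, not defaults.
- hosts: elasticsearch_nodes
  roles:
    - role: elastic.elasticsearch
  vars:
    es_version: "6.0.0"
    es_config:
      cluster.name: "my-cluster"
      network.host: "0.0.0.0"
```

Running the same playbook against dev, stage, and production inventories is what gives you the consistent multi-node configuration mentioned above.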

BELK is installed in our data center.
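For point 2, the environment-specific changes can be generated rather than hand-edited. A minimal sketch in Python, assuming a single Filebeat template and one variable (`logstash_host`) per environment; the hostnames, paths, and variable names are all made up for illustration:

```python
# Render an environment-specific filebeat.yml from one shared template.
# Template contents and environment names are illustrative assumptions.
from string import Template

FILEBEAT_TEMPLATE = Template("""\
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log
output.logstash:
  hosts: ["$logstash_host:5044"]
""")

# One set of substitutions per target environment.
ENVIRONMENTS = {
    "dev":  {"logstash_host": "logstash.dev.example.com"},
    "qa":   {"logstash_host": "logstash.qa.example.com"},
    "prod": {"logstash_host": "logstash.prod.example.com"},
}

def render(env: str) -> str:
    """Return the filebeat.yml contents for the given environment."""
    return FILEBEAT_TEMPLATE.substitute(ENVIRONMENTS[env])

if __name__ == "__main__":
    print(render("dev"))
```

A deployment pipeline would run this (or an equivalent templating step in Ansible/Jinja2) once per environment and ship the rendered file, so the only thing checked into source control is the template plus the per-environment variables.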

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.