Hi everybody! Kudos to the @elastic team! It's impressive what this piece of software can do!
I've been going through the docs and Googling, to no avail. I've hit Stack Overflow and GitHub, studied the repos, and tried different image combinations... I'm stuck.
I'm currently on Rails 5.1.5 and Docker 17.12 CE, using Compose file version 3. I just want to ship the logs from a small Rails app to Elasticsearch 6.2. I don't care how pretty they look; they just need to be there.
This is everything that I've done:
- I've read the setup documentation for ELK + Beats (Filebeat).
- I've set Rails to log to STDOUT, then used Docker's GELF driver to forward the output to the Logstash URL.
- I've tried loading the image from docker.elastic.co and passing my own .yml.
- I've tried the sebp/elk image, but got lost on how to wire it up to Filebeat or Logstash. I pulled a Filebeat image, but couldn't get it working.
- I've checked the first two pages of Google results (yes, I went as far as the second page) on how to set this up with Rails. Unfortunately, all of them are outdated, and their configuration doesn't carry over to these newer versions.
- Created a StackOverflow question.
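To be concrete about the GELF attempt in the second bullet: it looked roughly like this (a sketch of what I had, assuming Logstash is listening with a gelf input on UDP 12201):

```yaml
# Sketch of the GELF attempt: Docker's gelf logging driver ships the app's
# stdout to a Logstash gelf input. Port 12201 is what I assumed Logstash
# was listening on.
services:
  app:
    build: .
    logging:
      driver: gelf
      options:
        gelf-address: 'udp://localhost:12201'
```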
I'm able to load Kibana, but Kibana can't read any of the logs (I don't know whether it's even receiving them), so I can't create an index.
I've changed the code so many times that I don't have a list of all the different attempts.
Here's what's in application.rb (an earlier version tried to send the logs directly to Logstash, but that never worked):
config.lograge.formatter = Lograge::Formatters::Logstash.new
config.lograge.logger = ActiveSupport::Logger.new(STDOUT)
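For context, those two lines sit inside a block like this (a sketch of my setup; the module name is a placeholder, and lograge plus its Logstash formatter are in the Gemfile):

```ruby
# config/application.rb -- sketch; module/class names are placeholders.
module CPrintServiceHub
  class Application < Rails::Application
    # Collapse each request into a single log line.
    config.lograge.enabled = true
    # Format that line as a Logstash-style JSON event
    # (Lograge::Formatters::Logstash needs the logstash-event gem).
    config.lograge.formatter = Lograge::Formatters::Logstash.new
    # Write to STDOUT so Docker's logging driver can capture it.
    config.lograge.logger = ActiveSupport::Logger.new(STDOUT)
  end
end
```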
Here's my full docker-compose.yml (sorry for the verbosity and the comments):
# WARNING!! Indentation is important! Be careful how you indent.
# All paths that point to the actual disk (not the Docker image)
# are relative to the location of the file!
version: '3'

services:
  # elasticsearch:
  #   build:
  #     context: elk/elasticsearch/
  #   volumes:
  #     - ./elk/elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro
  #   ports:
  #     - "9200:9200"
  #     - "9300:9300"
  #   environment:
  #     ES_JAVA_OPTS: "-Xmx256m -Xms256m"
  #   networks:
  #     - elk

  # logstash:
  #   build:
  #     context: elk/logstash/
  #   volumes:
  #     - ./elk/logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
  #     - ./elk/logstash/pipeline:/usr/share/logstash/pipeline:ro
  #   ports:
  #     - "5000:5000"
  #   environment:
  #     LS_JAVA_OPTS: "-Xmx256m -Xms256m"
  #   networks:
  #     - elk
  #   depends_on:
  #     - elasticsearch

  # kibana:
  #   build:
  #     context: elk/kibana/
  #   volumes:
  #     - ./elk/kibana/config/:/usr/share/kibana/config:ro
  #   ports:
  #     - "5601:5601"
  #   networks:
  #     - elk
  #   depends_on:
  #     - elasticsearch

  elk:
    image: sebp/elk
    ports:
      - "5601:5601"
      - "9200:9200"
      - "5044:5044"
    volumes:
      - elasticsearch:/usr/share/elasticsearch/data

  filebeat:
    image: docker.elastic.co/beats/filebeat:6.2.3
    depends_on:
      - elk
    volumes:
      # - ./elk/filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml
      - /var/run/docker.sock:/tmp/docker.sock
    environment:
      - LOGSTASH_HOST=localhost
      - LOGSTASH_PORT=5044

  app:
    # These are configuration options that are applied at build time.
    # In this case, the Dockerfile in the root will be used.
    build: .
    # Tells Docker not to start this service before the listed
    # ones are running.
    depends_on:
      - db
      # - elk
      - filebeat
    # We create environment variables.
    environment:
      RAILS_ENV: development
      LOGSTASH_HOST: localhost
      # Same as the one in MYSQL above:
      # LOCAL
      SECRET_MYSQL_HOST: 'db'
      SECRET_MYSQL_DATABASE: 'dev'
      SECRET_MYSQL_USERNAME: 'ruby'
      SECRET_MYSQL_PASSWORD: 'userPassword'
    command: bash -c "rm -f tmp/pids/server.pid && bundle exec rails s -p 3001 -b '0.0.0.0'"
    # This will allow you to hook yourself into the debugger.
    # How you use it:
    #   docker-compose up -d   # the -d flag is important!
    #   docker attach cprintservicehub_app_1   # if that doesn't work, try the container's id
    stdin_open: true
    tty: true
    # logging:
    #   driver: gelf
    #   options:
    #     gelf-address: 'udp://localhost:12201'
    links:
      - db
    volumes:
      - "./:/var/www/cprint"
    ports:
      - "3001:3001"
    expose:
      - "3001"

# Volumes are the recommended storage mechanism of Docker.
volumes:
  db-data:
    driver: local
  elasticsearch:
    driver: local

networks:
  elk:
    driver: bridge
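In case it helps, this is roughly the filebeat.yml I'd mount at the commented-out volume above (a sketch for Filebeat 6.2; the hosts value is a guess on my part — inside Compose, localhost is the Filebeat container itself, so I suspect it should be the elk service name, which Compose resolves over its network):

```yaml
# elk/filebeat/filebeat.yml -- sketch; paths and hosts are my assumptions.
filebeat.prospectors:
  - type: docker
    containers.ids:
      - '*'

output.logstash:
  # 'elk' is the Compose service name for the sebp/elk container,
  # which exposes Beats input on 5044.
  hosts: ["elk:5044"]
```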
Do you have any ideas? Do I need to elaborate more?
Thank you!