Docker[ELK+Filebeat] => No logs

Hi everyone,

I'm learning how to use ELK, so I decided to do my own "Hello World": I plan to build some graphs from my syslog/auth.log.

I also decided to put each service in its own Docker container.

I set everything up the way I found online, but I still cannot get any data into Elasticsearch.

Here are my config files; please help me find out why.

docker-compose.yml:

elasticsearch:
  image: elasticsearch:latest
  command: elasticsearch -Des.network.host=0.0.0.0
  ports:
    - "9200:9200"
    - "9300:9300"
logstash:
  image: logstash:latest
  command: logstash -f /etc/logstash/conf.d/logstash.conf
  volumes:
    - ./logstash/config:/etc/logstash/conf.d
  ports:
    - "5000:5000"
  links:
    - elasticsearch
kibana:
  build: kibana/
  volumes:
    - ./kibana/config/kibana.yml:/opt/kibana/config/kibana.yml
  ports:
    - "5601:5601"
  links:
    - elasticsearch

logstash.conf:

input {
        beats {
                port => 5000
                type => "logs"
        }
}

filter {
        if [type] == "syslog" {
                grok {
                        match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
                        add_field => [ "received_at", "%{@timestamp}" ]
                        add_field => [ "received_from", "%{host}" ]
                }
                syslog_pri { }
                date {
                        match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
                }
        }
}

output {
        elasticsearch {
                hosts => "elasticsearch:9200"
        }
}

filebeat.yml:

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    # Each - is a prospector. Below are the prospector specific configurations
    -
      # Paths that should be crawled and fetched. Glob based paths.
      # For each file found under this path, a harvester is started.
      paths:
        - "/var/log/*.log"
        - "/var/log/syslog"
      # - c:\programdata\elasticsearch\logs\*

      # Type of the files. Based on this the way the file is read is decided.
      # The different types cannot be mixed in one prospector
      #
      # Possible options are:
      # * log: Reads every line of the log file (default)
      # * stdin: Reads the standard in
      input_type: log

      document_type: syslog

output:
  logstash:
    hosts:["localhost:5000"]

Everything is set up on the same machine.
I have looked at the config files many times and I cannot figure it out.
I think Filebeat isn't sending the data to Logstash correctly. I'm new to this and I don't know how to debug it by myself.

Thanks.
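
One Logstash-side sanity check that does not involve Filebeat at all is Logstash's config test mode. A sketch of how it could be run as a one-off container, assuming the ./logstash/config path from the docker-compose.yml above:

# validate the pipeline file without starting the pipeline
docker run --rm \
  -v "$(pwd)/logstash/config:/etc/logstash/conf.d" \
  logstash:latest \
  logstash -f /etc/logstash/conf.d/logstash.conf --configtest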

I've made some progress on resolving my problem.

docker ps -a shows me that the filebeat container exits right after starting.
So obviously I don't have anything to index.

I'm investigating in that direction.

Any log output from filebeat? Have you tried starting filebeat with debug logging enabled?

I'm trying to find where the filebeat logs are in the documentation, and also how to get them out of a Docker container. I think I have to mount a volume to persist Filebeat's data on my server, given how volatile Docker container data is.

Start filebeat with the -e flag to have it log to stderr. You can use docker logs <container_id/name> to show all of Filebeat's log output.
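
For example, assuming the container ends up being named filebeat, as in the run command further down:

# follow Filebeat's stderr output from outside the container
docker logs -f filebeat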

Thank you!

I got:

Loading config file error: Failed to read /etc/filebeat/filebeat.yml: read /etc/filebeat/filebeat.yml: is a directory. Exiting.

How can this path be a directory?

I launch my service with:

sudo docker run -d -v filebeat.yml:/etc/filebeat/filebeat.yml --name filebeat prima/filebeat -e
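
One likely explanation for the "is a directory" error: with docker run -v, a host path that is not absolute is treated as the name of a Docker volume rather than a bind mount, and mounting a volume at /etc/filebeat/filebeat.yml creates a directory there. A sketch of the same command with an absolute host path:

sudo docker run -d \
  -v "$(pwd)/filebeat.yml:/etc/filebeat/filebeat.yml" \
  --name filebeat prima/filebeat -e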

OK, I've moved forward.

The main reason Filebeat exited after launch was a typo in the config file.

YAML files are like Python: they rely on whitespace indentation.

Instead of:

output:
  logstash:
    hosts:["localhost:5000"]

I had to add a space, like this:

output:
  logstash:
    hosts : ["localhost:5000"]

Now all my services are running in their containers, but I still don't have any output data.

Not even with nc localhost 5000 < /var/log/syslog.
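
As an aside, the beats input on port 5000 speaks Filebeat's own protocol rather than plain TCP, so the nc test would not produce events even when everything else works. A quick way to see whether anything reached Elasticsearch at all, assuming the 9200 port mapping from the compose file:

# any logstash-YYYY.MM.DD index with a non-zero docs.count means data arrived
curl 'http://localhost:9200/_cat/indices?v'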

Can you share some debug log output from filebeat, i.e. running it with -e -d "*"?

Thanks @ruflin

The only suspicious log line is:
WARN DNS lookup failure "logstash": lookup logstash on 8.8.8.8:53: no such host

For the full logs: http://pastebin.com/kJjeMUbs

What is strange is that your config has localhost for LS, yet here it tries to look up a "logstash" host?

Why not remove Logstash from the equation, since Filebeat can write natively to Elasticsearch?

That would help narrow down the problem.
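
A sketch of what that could look like in filebeat.yml (Filebeat 1.x syntax; it assumes the Filebeat container can resolve the elasticsearch hostname, e.g. through a Docker link):

output:
  elasticsearch:
    # bypass Logstash and index straight into Elasticsearch
    hosts: ["elasticsearch:9200"]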

When containers are linked, Docker uses the destination container's name as a hostname, but you don't seem to link anything to the logstash container.

BTW, which container does Filebeat run in? It's not clear to me...
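
If it runs in its own container started with docker run, it also needs a link (or a shared network) so that whatever hostname is in filebeat.yml resolves. A sketch, with the Logstash container name left as a placeholder:

# <logstash_container_name> is whatever name docker-compose gave the
# logstash container (docker ps shows it); the alias "logstash" then
# resolves inside the filebeat container, so filebeat.yml can use
# hosts: ["logstash:5000"]. Mounting the host's /var/log is an
# assumption about where your syslog lives.
sudo docker run -d \
  -v "$(pwd)/filebeat.yml:/etc/filebeat/filebeat.yml" \
  -v /var/log:/var/log:ro \
  --link <logstash_container_name>:logstash \
  --name filebeat prima/filebeat -e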

Also, your logs on Pastebin don't show anything abnormal. The watched log files are too old for Filebeat to care about (> 24h), or the files haven't changed within the last 24 hours.
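
If that is the case, one option (a sketch, assuming Filebeat 1.x, where ignore_older defaults to 24h) is to raise ignore_older in the prospector so older, unchanged files are still picked up:

filebeat:
  prospectors:
    -
      paths:
        - "/var/log/*.log"
        - "/var/log/syslog"
      input_type: log
      document_type: syslog
      # example value only; files not modified within this window are ignored
      ignore_older: 168h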