Not able to find indices, learning about the Elastic Stack

Hello everyone,

I am new to the Elastic Stack. I have been reading about it and learning for three days now, but I still have some missing pieces.

I tried to orchestrate a simple sample using docker/docker-compose in order to see everything working together, but I haven't been able to get any indices into Kibana, so I am not sure what is missing, or how to test whether Logstash or Filebeat is the problem.

I will begin with my understanding; please correct me if I am missing something:

  • Logstash: the entry point for injecting data/messages into the system. It can parse and also enrich the data.
  • Elasticsearch: stores and indexes the data; it will be our search engine.
  • Kibana: the dashboard that interacts with Elasticsearch to extract and display the data.
  • Beats: small agents that fetch data somehow and pass it on to Logstash to be processed.
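As a toy illustration of the "parse and enrich" step (outside the stack entirely, and not real Logstash code): the shell one-liner below splits a raw log line into named fields, roughly the way a grok filter does. The regex and the field names (`timestamp`, `level`, `thread`, `logger`, `message`) are my own sketch based on the Tomcat line format shown further down:

```shell
# A raw Tomcat-style log line
line='02-Jun-2019 14:49:50.598 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]'

# Hand-rolled regex (illustration only) that extracts fields into a JSON-ish
# event, mimicking what a grok filter produces:
parsed=$(echo "$line" | sed -E 's/^([0-9]{2}-[A-Za-z]{3}-[0-9]{4} [0-9:.]+) ([A-Z]+) \[([^]]+)\] ([^ ]+) (.*)$/{"timestamp":"\1","level":"\2","thread":"\3","logger":"\4","message":"\5"}/')
echo "$parsed"
```

Grok does the same thing with reusable named patterns (`%{LOGLEVEL:level}` and friends) instead of raw capture groups, which is what makes it more maintainable than a plain regex.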

With these concepts in mind, I have the following docker-compose:

version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.1.1
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      - discovery.type=single-node
  kibana:
    image: docker.elastic.co/kibana/kibana:7.1.1
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
    depends_on:
      - elasticsearch
  logstash:
    image: docker.elastic.co/logstash/logstash:7.1.1
    command: logstash -f /etc/logstash/conf.d/logstash.conf
    volumes:
      - ./logstash:/etc/logstash/conf.d
      - ./logstash/patterns:/opt/logstash/patterns
    depends_on:
      - elasticsearch
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.1.1
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - ../fake-logs:/opt/tomcat/logs
    depends_on:
      - logstash

The Filebeat configuration looks like:

filebeat.inputs:
- type: log
  paths:
    - "/opt/tomcat/logs/catalina.out"
  fields:
    apache: true
  fields_under_root: true
  scan_frequency: 2s
output.logstash:
  hosts: ["logstash:5044"]
setup.kibana:
  host: "kibana:5601"
logging.level: warning
logging.selectors: ["*"]
logging.files:
  rotateeverybytes: 10485760 # = 10MB

The Logstash configuration looks like:

input {
  beats {
    port => 5044
  }
}

filter {
  if [message] !~ /(.+)/ {
    drop { }
  }
  grok {
    patterns_dir => "./patterns"
    match => [ "message", "%{CATALINA_DATESTAMP:timestamp} %{NOTSPACE:className} %{WORD:methodName}\r\n%{LOGLEVEL:logLevel}: %{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  grok {
    match => [ "path", "/%{USERNAME:app}.20%{NOTSPACE}.log" ]
    tag_on_failure => [ ]
  }
  # Aug 25, 2014 11:23:31 AM
  date {
    match => [ "timestamp", "MMM dd, YYYY hh:mm:ss a" ]
    remove_field => [ "timestamp" ]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch"
  }
}
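
One way to sanity-check a grok timestamp pattern locally, before putting it into the pipeline, is to test the equivalent regex against a real log line with grep. The `new_ts` pattern below is my own guess at the `dd-MMM-yyyy HH:mm:ss.SSS` format that appears in the logs further down; if I remember correctly, the stock CATALINA_DATESTAMP pattern targets the older `MMM dd, yyyy hh:mm:ss AM` Tomcat format instead, which these lines do not use:

```shell
# Actual line from the Tomcat log below
line='02-Jun-2019 14:49:50.598 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]'

# Newer dd-MMM-yyyy HH:mm:ss.SSS timestamp (hand-written, not a stock grok pattern)
new_ts='^[0-9]{2}-[A-Z][a-z]{2}-[0-9]{4} [0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3}'
# Rough equivalent of the old "MMM dd, yyyy ..." style that CATALINA_DATESTAMP expects
old_ts='^[A-Z][a-z]{2} [0-9]{1,2}, [0-9]{4}'

echo "$line" | grep -Eq "$new_ts" && echo "new-style timestamp: match"
echo "$line" | grep -Eq "$old_ts" || echo "old-style timestamp: no match"
```

If the new-style format is what the logs actually use, the `date` filter would also need a matching format string (something like `dd-MMM-yyyy HH:mm:ss.SSS`) rather than `MMM dd, YYYY hh:mm:ss a`.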

Finally, my log looks like this:

02-Jun-2019 14:49:50.598 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
02-Jun-2019 14:49:50.602 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.19]
02-Jun-2019 14:49:50.654 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deploying web application archive [/opt/tomcat/webapps/isurge.war]
....
 :: Spring Boot ::        (v2.1.3.RELEASE)

2019-06-02 14:49:56.645  INFO 31600 --- [           main] i.s.core.info.surge.ServletInitializer   : Starting ServletInitializer on vps678506 with PID 31600 (/opt/tomcat/webapps/isurge/WEB-INF/classes started by tomcat in /)
2019-06-02 14:49:56.658  INFO 31600 --- [           main] i.s.core.info.surge.ServletInitializer   : The following profiles are active: prod
2019-06-02 14:49:59.126  INFO 31600 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data repositories in DEFAULT mode.
2019-06-02 14:49:59.269  INFO 31600 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 131ms. Found 9 repository interfaces.
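
Note, incidentally, that the excerpt mixes two timestamp formats: the Tomcat lines start with `02-Jun-2019 14:49:50.598` while the Spring Boot lines start with `2019-06-02 14:49:56.645`, so a single grok pattern will not match both. A quick shell check (both regexes are my own sketches) makes the difference visible:

```shell
# Two sample lines from the excerpt above
tomcat='02-Jun-2019 14:49:50.598 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]'
spring='2019-06-02 14:49:56.645  INFO 31600 --- [           main] i.s.core.info.surge.ServletInitializer   : Starting ServletInitializer on vps678506'

# Count how many of the two lines each prefix matches:
printf '%s\n%s\n' "$tomcat" "$spring" | grep -Ec '^[0-9]{2}-[A-Z][a-z]{2}-[0-9]{4}'   # dd-MMM-yyyy -> 1
printf '%s\n%s\n' "$tomcat" "$spring" | grep -Ec '^[0-9]{4}-[0-9]{2}-[0-9]{2}'        # yyyy-MM-dd -> 1
```

Each pattern matches exactly one of the two lines, so a pipeline that only handles one format will silently miss (or mis-parse) the other.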

So finally: is there any other concept I should take into account that I am missing? Is there a way to test whether Filebeat is actually sending events to Logstash, and how or where can I check that? I think my problem is in Logstash and the filter; I built it with the grok debugger, but I am not sure it is totally correct. I hate regexps :slight_smile:

I am sorry for these noob questions, but there are so many concepts involved that it is hard to keep up. Anyway, this project and its capabilities caught my attention, and I would like to use it.

Kind regards.
