No index created in Elasticsearch. Can see logs in Docker

Hi,

I'm trying to get a simple ELK setup working with Docker. The issue is that the index I've defined in the Logstash output never gets created.

docker-compose.yml

  • I've verified that the log files are mounted correctly into the Logstash container (see the check after the compose file)
services:
  ...
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.15.2
    container_name: elasticsearch
    environment:
      - node.name=elasticsearch
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - 'ES_JAVA_OPTS=-Xms512m -Xmx512m'
    volumes:
      - es-data:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - elk
  logstash:
    image: docker.elastic.co/logstash/logstash:7.15.2
    container_name: logstash
    volumes:
      - ./logstash/pipeline/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./logs:/usr/share/logstash/logs
    ports:
      - 5000:5000
    environment:
      - 'LS_JAVA_OPTS=-Xmx256m -Xms256m' # Sets the Java options for Logstash heap size.
    networks:
      - elk
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:7.15.2
    container_name: kibana
    ports:
      - 5601:5601
    environment: # Sets the Elasticsearch hosts and Node.js options for the Kibana container.
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
      - 'NODE_OPTIONS=--max-old-space-size=2048'
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge

volumes:
  es-data:
    driver: local
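
For reference, this is how I checked that the files are actually visible inside the container (using the container_name from the compose file above):

docker exec logstash ls -l /usr/share/logstash/logs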

logstash.conf

  • My understanding is that a @timestamp field is needed for the output to create the index. Since the input logs do not have @timestamp, the date filter converts the ISO8601 timestamps into a @timestamp field (see the sample line after the config)
input {
  file {
    path => "/usr/share/logstash/logs"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} %{JAVACLASS:class} %{GREEDYDATA:message}" }
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}

output {
  elasticsearch {
    hosts => "http://elasticsearch:9200"
    index => "spring-boot-logs-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
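
For context, here's a hypothetical log line the grok pattern is meant to match; the class name and message text are invented for illustration:

2025-01-16T09:00:00,123 INFO com.example.demo.MyService Application started on port 8080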

curl -X GET "http://localhost:9200/_cat/indices?v=true&pretty"

gives me

health status index                           uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   .geoip_databases                5KcSY9VIQ56YyzMgKIwkRQ   1   0         35            0     33.6mb         33.6mb
green  open   .apm-custom-link                584pRkqYSvSEonkwV1j2Cw   1   0          0            0       208b           208b
green  open   .kibana-event-log-7.15.2-000001 Vs1frDseTeqfVT0B3TVAfA   1   0         17            0     49.8kb         49.8kb
green  open   .apm-agent-configuration        H7oR8SbKSyWMp3DJ0n4z-w   1   0          0            0       208b           208b
green  open   .kibana_7.15.2_001              VpTODGWeTgWzUx_zEouLTA   1   0         21            8      2.3mb          2.3mb
green  open   .kibana_task_manager_7.15.2_001 Q8g6taJtTnKlPnFOmr1w-g   1   0         15          617    479.5kb        479.5kb
green  open   .tasks                          AxjLiUffTTa1UlbBL5eNLg   1   0         13            3     45.5kb         45.5kb
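
Filtering for the expected index pattern directly also returns nothing:

curl -X GET "http://localhost:9200/_cat/indices/spring-boot-logs-*?v=true&pretty"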

Since /usr/share/logstash/logs is a directory, you need to specify a file pattern, or just add a * to the end, like this:

path => "/usr/share/logstash/logs/*"
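
For example, the full input block would then look like this (unchanged from yours apart from the path):

input {
  file {
    path => "/usr/share/logstash/logs/*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}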

Thanks for the reply. That didn't seem to work; I still get the same result from curl -X GET "http://localhost:9200/_cat/indices?v=true&pretty"

What do you have in your Logstash container logs?

Restart it and share them.
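
For example, using the container_name from your compose file:

docker logs logstash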

[2025-01-16T14:04:07,319][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x19e985d run>"}
[2025-01-16T14:04:08,518][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>1.4}
[2025-01-16T14:04:08,682][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2025-01-16T14:04:08,734][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.41}
[2025-01-16T14:04:08,960][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-01-16T14:04:09,003][INFO ][filewatch.observingtail  ][main][3405ca177c36b4b66b2bb9e9c8360e55e23c901fa1005c000fe71844c0c20c66] START, creating Discoverer, Watch with file and sincedb collections
[2025-01-16T14:04:09,008][INFO ][logstash.agent           ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}

I've included the lines I think are most relevant.