Hi,
I'm trying to write logs to ELK (running in Docker).
Elasticsearch is already installed and working properly.
I've installed Logstash in Docker as well.
The problem is that I can't see any logs being generated in Kibana.
I thought I would at least see some logs on Logstash itself. If I open Docker -> logstash -> Logs, I see the following:
2024-02-08 15:04:19 /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
2024-02-08 15:04:19 /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
2024-02-08 15:03:59 Using bundled JDK: /usr/share/logstash/jdk
2024-02-08 15:04:29 Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
2024-02-08 15:04:29 [2024-02-08T13:04:29,759][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
2024-02-08 15:04:29 [2024-02-08T13:04:29,770][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.12.1", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
2024-02-08 15:04:29 [2024-02-08T13:04:29,775][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
2024-02-08 15:04:29 [2024-02-08T13:04:29,778][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
2024-02-08 15:04:29 [2024-02-08T13:04:29,779][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
2024-02-08 15:04:29 [2024-02-08T13:04:29,816][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
2024-02-08 15:04:29 [2024-02-08T13:04:29,820][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
2024-02-08 15:04:30 [2024-02-08T13:04:30,250][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"b32b5bec-6616-4068-b36c-31bf31609da7", :path=>"/usr/share/logstash/data/uuid"}
2024-02-08 15:04:31 [2024-02-08T13:04:31,064][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
2024-02-08 15:04:31 [2024-02-08T13:04:31,067][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
2024-02-08 15:04:31 Please configure Elastic Agent to monitor Logstash. Documentation can be found at:
2024-02-08 15:04:31 https://www.elastic.co/guide/en/logstash/current/monitoring-with-elastic-agent.html
2024-02-08 15:04:31 [2024-02-08T13:04:31,563][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
2024-02-08 15:04:31 [2024-02-08T13:04:31,948][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
2024-02-08 15:04:31 [2024-02-08T13:04:31,951][INFO ][logstash.licensechecker.licensereader] Elasticsearch version determined (7.9.2) {:es_version=>7}
2024-02-08 15:04:31 [2024-02-08T13:04:31,951][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
2024-02-08 15:04:32 [2024-02-08T13:04:32,086][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
2024-02-08 15:04:32 [2024-02-08T13:04:32,087][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
2024-02-08 15:04:32 [2024-02-08T13:04:32,317][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
2024-02-08 15:04:33 [2024-02-08T13:04:33,079][INFO ][org.reflections.Reflections] Reflections took 193 ms to scan 1 urls, producing 132 keys and 468 values
2024-02-08 15:04:33 [2024-02-08T13:04:33,476][INFO ][logstash.javapipeline ] Pipeline `.monitoring-logstash` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
2024-02-08 15:04:33 [2024-02-08T13:04:33,519][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elasticsearch:9200"]}
2024-02-08 15:04:33 [2024-02-08T13:04:33,529][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
2024-02-08 15:04:33 /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/amazing_print-1.5.0/lib/amazing_print/formatter.rb:37: warning: previous definition of cast was here
2024-02-08 15:04:33 [2024-02-08T13:04:33,560][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
2024-02-08 15:04:33 [2024-02-08T13:04:33,560][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (7.9.2) {:es_version=>7}
2024-02-08 15:04:33 [2024-02-08T13:04:33,561][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
2024-02-08 15:04:33 [2024-02-08T13:04:33,651][WARN ][logstash.javapipeline ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
2024-02-08 15:04:33 [2024-02-08T13:04:33,665][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x62602e4e /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
2024-02-08 15:04:33 [2024-02-08T13:04:33,990][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
2024-02-08 15:04:34 [2024-02-08T13:04:34,029][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
2024-02-08 15:04:34 [2024-02-08T13:04:34,048][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
2024-02-08 15:04:34 [2024-02-08T13:04:34,072][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
2024-02-08 15:04:34 [2024-02-08T13:04:34,073][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.9.2) {:es_version=>7}
2024-02-08 15:04:34 [2024-02-08T13:04:34,073][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
2024-02-08 15:04:34 [2024-02-08T13:04:34,136][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `true`
2024-02-08 15:04:34 [2024-02-08T13:04:34,137][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
2024-02-08 15:04:34 [2024-02-08T13:04:34,261][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf", "/usr/share/logstash/pipeline/logstash.conf \\logstash"], :thread=>"#<Thread:0x31900df7 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
2024-02-08 15:04:34 [2024-02-08T13:04:34,644][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.98}
2024-02-08 15:04:34 [2024-02-08T13:04:34,668][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
2024-02-08 15:04:34 [2024-02-08T13:04:34,891][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.63}
2024-02-08 15:04:34 [2024-02-08T13:04:34,974][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5044"}
2024-02-08 15:04:34 [2024-02-08T13:04:34,989][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
2024-02-08 15:04:35 [2024-02-08T13:04:35,000][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}
2024-02-08 15:04:35 [2024-02-08T13:04:35,073][INFO ][org.logstash.beats.Server][main][1816582b30d0806a9972e27ed0b7c278fffa1aaad974e5b8a1fae2318bfa24df] Starting server on port: 5044
This is my logstash.conf file:
input {
  stdin {}
}
filter {
  # Define any filters you need to apply to your incoming data, such as parsing, formatting, or enriching
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  # Define where you want to send your processed data, such as Elasticsearch, a file, or a message queue
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
  stdout { codec => rubydebug }
  gelf {
    port => 12201
    type => "gelf"
  }
}
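(Regarding my second question below: as far as I understand, input { stdin {} } only reads the Logstash process's own stdin, so the read-only Docker log view can't feed it. My untested guess is that I'd have to pipe a line in myself, for example with a throwaway Logstash run inside the container. The sample line is the standard combined Apache log example, and --path.data points at a scratch directory so this run doesn't collide with the data-directory lock held by the main instance:

echo '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"' \
  | docker exec -i logstash /usr/share/logstash/bin/logstash --path.data /tmp/stdin-test \
      -e 'input { stdin {} } filter { grok { match => { "message" => "%{COMBINEDAPACHELOG}" } } } output { stdout { codec => rubydebug } }'

Is that even the intended way to use this input?)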
And this is my docker-compose file:
version: '3.4'
services:
  iai-modernization-service-webapi:
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - ASPNETCORE_URLS=http://0.0.0.0:80
    ports:
      - "80:80"
      - "443:443"
      - "9999:9999"
    networks:
      - webapi-network
    logging:
      driver: gelf
      options:
        gelf-address: "udp://localhost:12201" # Logstash UDP input port
        tag: "myapp"
  zipkin:
    image: "openzipkin/zipkin:latest"
    container_name: zipkin
    ports:
      - "9411:9411"
  elasticsearch:
    container_name: elasticsearch
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.2
    ports:
      - 9200:9200
    volumes:
      - elasticsearch-data:/usr/share/elasticsearch/data
    environment:
      - xpack.monitoring.enabled=true
      - xpack.watcher.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
    networks:
      - webapi-network
  kibana:
    container_name: kibana
    image: docker.elastic.co/kibana/kibana:7.9.2
    ports:
      - 5601:5601
    depends_on:
      - elasticsearch
      - logstash
    logging:
      driver: gelf
      options:
        gelf-address: "udp://localhost:12201"
        tag: "demo2_kibana"
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    networks:
      - webapi-network
  rabbitmq:
    container_name: rabbitmq
    image: rabbitmq:3-management
    ports:
      - "5672:5672"   # RabbitMQ default port
      - "15672:15672" # RabbitMQ management plugin port
    environment:
      RABBITMQ_DEFAULT_USER: "guest"
      RABBITMQ_DEFAULT_PASS: "guest"
    volumes:
      - ~/.docker-conf/rabbitmq/data/:/var/lib/rabbitmq/
      - ~/.docker-conf/rabbitmq/log/:/var/log/rabbitmq/
    networks:
      webapi-network:
        aliases:
          - rabbitmq
  logstash:
    container_name: logstash
    image: docker.elastic.co/logstash/logstash:8.12.1
    links:
      - elasticsearch
    ports:
      - "12201:12201/udp"
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf \logstash:latest
    networks:
      - webapi-network
    depends_on:
      - elasticsearch
volumes:
  elasticsearch-data:
networks:
  webapi-network:
    driver: bridge
# volumes:
#   - ${APPDATA}/Microsoft/UserSecrets:/root/.microsoft/usersecrets:ro
#   - ${APPDATA}/ASP.NET/Https:/root/.aspnet/https:ro
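Since all my services log through Docker's gelf driver to udp://localhost:12201 (which, if I understand correctly, the Docker daemon resolves on the host, so it only works because the logstash service publishes 12201/udp), my guess is that Logstash needs a gelf input listening on that port instead of (or in addition to) stdin. Something like this is what I have in mind, untested:

input {
  gelf {
    host => "0.0.0.0"
    port => 12201   # matches the 12201/udp port mapping and the gelf-address in my compose file
  }
}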
3 questions:
- What am I doing wrong?
- How do I even write to stdin? I thought I had to use the Docker -> logstash -> Logs terminal, but it is read-only (hence my piping guess above).
- How do I change my code so that my logs get written to Elasticsearch?
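(To check whether anything reaches Elasticsearch at all, I assume I can list the indices on the published port:

curl -s 'http://localhost:9200/_cat/indices?v'

Given the "Data streams auto configuration ... resolved to `true`" line in the Logstash log above, I'd expect something like a logs-generic-default data stream to appear once events actually flow.)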
thanks