Logs not pushed to Elasticsearch

I have an issue with Logstash not pushing logs to Elasticsearch. Below is the pipeline config (logstash-sample.conf) that is used. I have checked in Kibana whether the index was even created, and it was not.

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => {
      "message" => "%{IPORHOST:client_ip} - %{DATA:user_name} \[%{HTTPDATE:date}\] \"%{WORD:http_method} %{DATA:url} HTTP/%{NUMBER:http_version}\" %{NUMBER:response_code} %{NUMBER:sent_bytes} \"%{DATA:referrer}\" \"%{DATA:agent}\""
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "apache-log"
    user => "elastic"
    password => "F20Wgn77_CRjQmy3Bvjm"
  }
}
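As a quick sanity check of the grok pattern, here is a rough Python approximation using the standard `re` module. This is a hedged sketch, not grok itself: the named groups are simplified stand-ins for grok's `IPORHOST`/`DATA`/`HTTPDATE` patterns, and the sample log line is made up:

```python
import re

# Simplified stand-in for the grok pattern: each named group mirrors one grok
# field, but uses looser regexes than IPORHOST/DATA/HTTPDATE.
LOG_RE = re.compile(
    r'(?P<client_ip>\S+) - (?P<user_name>\S+) '
    r'\[(?P<date>[^\]]+)\] '
    r'"(?P<http_method>\S+) (?P<url>\S+) HTTP/(?P<http_version>[\d.]+)" '
    r'(?P<response_code>\d+) (?P<sent_bytes>\d+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Made-up Apache combined-log sample line for illustration.
sample = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
          '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
          '"http://www.example.com/start.html" "Mozilla/4.08 (Win98)"')

m = LOG_RE.match(sample)
print(m.groupdict())
```

Note that even if the grok pattern never matches your real lines, events still pass through to the output (tagged `_grokparsefailure` by default), so a non-matching pattern would not by itself explain a missing index.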

There is no error in the logs:

Sending Logstash logs to D:/programs/ELK/logstash/logs which is now configured via log4j2.properties
[2023-04-19T22:23:08,665][INFO ][logstash.runner          ] Log4j configuration path used is: D:\programs\ELK\logstash\config\log4j2.properties
[2023-04-19T22:23:08,672][WARN ][logstash.runner          ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2023-04-19T22:23:08,673][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.7.0", "jruby.version"=>"jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.6+10 on 17.0.6+10 +indy +jit [x86_64-mswin32]"}
[2023-04-19T22:23:08,677][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-04-19T22:23:08,720][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-04-19T22:23:10,176][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-04-19T22:23:10,536][INFO ][org.reflections.Reflections] Reflections took 129 ms to scan 1 urls, producing 132 keys and 462 values
[2023-04-19T22:23:11,219][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-04-19T22:23:11,242][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2023-04-19T22:23:11,350][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@localhost:9200/]}}
[2023-04-19T22:23:11,466][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@localhost:9200/"}
[2023-04-19T22:23:11,474][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.7.0) {:es_version=>8}
[2023-04-19T22:23:11,475][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-04-19T22:23:11,488][INFO ][logstash.outputs.elasticsearch][main] **Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"apache-log"}**
[2023-04-19T22:23:11,489][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2023-04-19T22:23:11,491][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2023-04-19T22:23:11,493][WARN ][logstash.filters.grok    ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2023-04-19T22:23:11,503][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2023-04-19T22:23:11,636][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["D:/programs/ELK/logstash/logstash-sample.conf"], :thread=>"#<Thread:0x5a74ed33@D:/programs/ELK/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-04-19T22:23:12,395][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.76}
[2023-04-19T22:23:12,403][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2023-04-19T22:23:12,416][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-04-19T22:23:12,441][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

Maybe Filebeat has already read the log. The Logstash log looks OK.

Have you tried using a dynamic index name?

index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}-apache"
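To illustrate what that sprintf-style name resolves to per event (a hedged sketch: Filebeat sets `[@metadata][beat]` to `filebeat`, and `%{+YYYY.MM.dd}` is a Joda-time date pattern applied to the event's `@timestamp`; the timestamp below is an invented example):

```python
from datetime import datetime, timezone

# Rough illustration of how Logstash resolves
# "%{[@metadata][beat]}-%{+YYYY.MM.dd}-apache" for one event.
beat = "filebeat"                                   # value of [@metadata][beat]
ts = datetime(2023, 4, 19, tzinfo=timezone.utc)     # example event @timestamp
index = f"{beat}-{ts.strftime('%Y.%m.%d')}-apache"  # Joda YYYY.MM.dd ~ strftime %Y.%m.%d
print(index)  # filebeat-2023.04.19-apache
```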

Are you sure Filebeat is sending to Logstash?

I have noticed that the Filebeat service doesn't start because of this error:

{"log.level":"error","@timestamp":"2023-04-25T22:38:19.456+0200","log.origin":{"file.name":"instance/beat.go","file.line":1071},"message":"Exiting: Failed to start crawler: creating module reloader failed: could not create module registry for filesets: module iis is configured but has no enabled filesets","service.name":"filebeat","ecs.version":"1.6.0"}

You have enabled the iis module, but you haven't set a log path.


It works indeed:

- module: iis
  # Access logs
  access:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: 
    - D:\Learning\log\access.log

  # Error logs
  error:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: 
    - D:\Learning\log\error.log
