Hi @leandrojmp, thank you for the quick response. I've added data_stream => false to the elasticsearch output in my logstash.conf (the relevant part of the config is at the end of this post), and that error disappeared. But when I try to create a data view and type logstash* in the "Index pattern" field, I get the following message:
Here are the logs:
logstash | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash | [2023-09-25T22:32:17,029][WARN ][deprecation.logstash.runner] NOTICE: Running Logstash as superuser is not recommended and won't be allowed in the future. Set 'allow_superuser' to 'false' to avoid startup errors in future releases.
logstash | [2023-09-25T22:32:17,048][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
logstash | [2023-09-25T22:32:17,050][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.7.1", "jruby.version"=>"jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.7+7 on 17.0.7+7 +indy +jit [x86_64-linux]"}
logstash | [2023-09-25T22:32:17,056][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
logstash | [2023-09-25T22:32:18,036][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
logstash | [2023-09-25T22:32:18,586][INFO ][org.reflections.Reflections] Reflections took 206 ms to scan 1 urls, producing 132 keys and 462 values
logstash | [2023-09-25T22:32:19,011][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
logstash | [2023-09-25T22:32:19,029][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://es01:9200"]}
logstash | [2023-09-25T22:32:19,204][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@es01:9200/]}}
logstash | [2023-09-25T22:32:19,485][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@es01:9200/"}
logstash | [2023-09-25T22:32:19,495][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.7.1) {:es_version=>8}
logstash | [2023-09-25T22:32:19,495][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
logstash | [2023-09-25T22:32:19,510][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
logstash | [2023-09-25T22:32:19,524][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
logstash | [2023-09-25T22:32:19,532][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x28662611@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
logstash | [2023-09-25T22:32:20,423][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.89}
logstash | [2023-09-25T22:32:20,470][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_58135e48542f6990bba1ac621e1c7fce", :path=>["/var/log/docker/docker-valheim.log"]}
logstash | [2023-09-25T22:32:20,474][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
logstash | [2023-09-25T22:32:20,480][INFO ][filewatch.observingtail ][main][8a8a57189f722cb8c2a51e15426fc81e5b0903e7dbd0a0a3451ca75c14c78001] START, creating Discoverer, Watch with file and sincedb collections
logstash | [2023-09-25T22:32:20,494][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
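For reference, the relevant parts of my logstash.conf now look roughly like this. The password, the TLS options, and the index name are placeholders here; the file path, host, and user match what the log above shows:

```
input {
  file {
    # same file the pipeline log above is tailing
    path => "/var/log/docker/docker-valheim.log"
  }
}

output {
  elasticsearch {
    hosts => ["https://es01:9200"]
    user => "elastic"
    password => "changeme"              # placeholder
    # TLS settings (cacert / ssl_certificate_verification) omitted here
    data_stream => false
    # illustrative index name — this is what the logstash* pattern in the data view should match
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```

My understanding is that with data_stream => false the output writes to plain indices, so the data view will only find something once at least one event has actually been indexed under a name matching logstash*.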