Hi all, I am facing this issue:

Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"info_apim_event"}

input {
    beats {
        port => 5044
    }
}

filter {
    grok {
        match => {
            "message" => "TID: \[%{DATA:thread}\] \[%{DATA:component}\] \[%{TIMESTAMP_ISO8601:timestamp}\] %{LOGLEVEL:logLevel} \{%{JAVACLASS:javaClass}\} - %{GREEDYDATA:message}"
        }
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "info_apim_event"
        # user => "elastic"
        # password => ""
        ssl_certificate_verification => false
    }
}

This is my Logstash config file. Can someone please help me?

What version of Logstash and Elasticsearch are you running?

Try adding this to the output section

action => "create"
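For example, placed in the output block from your first post (just a sketch reusing the host and index name you shared; on its own this does not enable data streams, see below):

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "info_apim_event"
        action => "create"    # forces the create op type, which data streams require
        ssl_certificate_verification => false
    }
}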

Please format your code going forward...

Can you share the entire log from Logstash?

The kind of log line you shared normally appears as an INFO message.

Interesting....

To resolve the issue, modify your Logstash output to use data streams by adding data_stream => true and specifying data_stream_type, data_stream_dataset, and data_stream_namespace. Ensure your Elasticsearch version supports data streams (7.9+).
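A minimal sketch of what that output could look like, assuming the localhost host from the original config and a hypothetical dataset name of "apim" (choose your own type/dataset/namespace values):

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        data_stream => "true"
        data_stream_type => "logs"
        data_stream_dataset => "apim"        # hypothetical name, pick your own
        data_stream_namespace => "default"
        # note: the index setting must be removed, it conflicts with data streams
    }
}

With these settings, events are written to the logs-apim-default data stream instead of a regular index.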


Hi leandrojmp, here I have attached the entire logs.

Hello, please do not share screenshots of logs; share your log as plain text.

It is pretty hard to read and impossible to search on an image.

Also, from what I was able to see, there is no log line with the message you shared in your first post.

But as mentioned, this kind of message is not an error. It is just an informational message saying that your output settings are incompatible with data streams, so your data will be written to normal indices.

C:\devsoftwares\logstash-8.12.0\bin>logstash -f .\config\logstash-sample.conf --config.reload.automatic
"Using bundled JDK: C:\devsoftwares\logstash-8.12.0\jdk\bin\java.exe"
C:/devsoftwares/logstash-8.12.0/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
C:/devsoftwares/logstash-8.12.0/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
Sending Logstash logs to C:/devsoftwares/logstash-8.12.0/logs which is now configured via log4j2.properties
[2024-03-04T18:04:34,406][INFO ][logstash.runner ] Log4j configuration path used is: C:\devsoftwares\logstash-8.12.0\config\log4j2.properties
[2024-03-04T18:04:34,410][WARN ][logstash.runner ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2024-03-04T18:04:34,411][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.12.0", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.9+9 on 17.0.9+9 +indy +jit [x86_64-mswin32]"}
[2024-03-04T18:04:34,413][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2024-03-04T18:04:34,417][INFO ][logstash.runner ] Jackson default value override logstash.jackson.stream-read-constraints.max-string-length configured to 200000000
[2024-03-04T18:04:34,418][INFO ][logstash.runner ] Jackson default value override logstash.jackson.stream-read-constraints.max-number-length configured to 10000
[2024-03-04T18:04:34,467][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-03-04T18:04:35,936][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-03-04T18:04:36,282][INFO ][org.reflections.Reflections] Reflections took 122 ms to scan 1 urls, producing 132 keys and 468 values
[2024-03-04T18:04:36,954][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "ssl_certificate_verification" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Set 'ssl_verification_mode' instead. If you have any questions about this, please visit the logstash channel on freenode irc. {:name=>"ssl_certificate_verification", :plugin=><LogStash::Outputs::Elasticsearch ssl_certificate_verification=>false, index=>"info_log", password=>, id=>"ec32f064b5c66637f4aa83c47b7c00186ba58a031ce2a474ed75181614a9630c", user=>"elastic", hosts=>[https://:9200], enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_63b1362f-e61b-4c3b-b1e2-6ac3fa85c059", enable_metric=>true, charset=>"UTF-8">, workers=>1, ssl_verification_mode=>"full", sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>true, compression_level=>1, retry_initial_interval=>2, retry_max_interval=>64, dlq_on_failed_indexname_interpolation=>true, data_stream_type=>"logs", data_stream_dataset=>"generic", data_stream_namespace=>"default", data_stream_sync_fields=>true, data_stream_auto_routing=>true, manage_template=>true, template_overwrite=>false, template_api=>"auto", doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy">}
[2024-03-04T18:04:36,992][INFO ][logstash.javapipeline ] Pipeline main is configured with pipeline.ecs_compatibility: v8 setting. All plugins in this pipeline will default to ecs_compatibility => v8 unless explicitly configured otherwise.
[2024-03-04T18:04:37,016][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["https://:9200"]}
[2024-03-04T18:04:37,021][WARN ][logstash.outputs.elasticsearch][main] You have enabled encryption but DISABLED certificate verification, to make sure your data is secure set ssl_verification_mode => full
[2024-03-04T18:04:37,159][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>, :added=>[https://elastic:xxxxxx@*:9200/]}}
[2024-03-04T18:04:38,548][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@:9200/"}
[2024-03-04T18:04:38,549][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.12.0) {:es_version=>8}
[2024-03-04T18:04:38,551][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>8}
[2024-03-04T18:04:38,841][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"info_log"}
[2024-03-04T18:04:38,842][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (data_stream => auto or unset) resolved to false
[2024-03-04T18:04:38,851][INFO ][logstash.filters.json ][main] ECS compatibility is enabled but target option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the target option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-03-04T18:04:38,852][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-03-04T18:04:38,959][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["C:/devsoftwares/logstash-8.12.0/config/logstash-sample.conf"], :thread=>"#<Thread:0x175a8ed8 C:/devsoftwares/logstash-8.12.0/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-03-04T18:04:39,131][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-03-04T18:04:39,854][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.89}
[2024-03-04T18:04:39,878][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/devsoftwares/logstash-8.12.0/data/plugins/inputs/file/.sincedb_fd85d4ae41dc4a798a3e1a19dc33bdb7", :path=>["D:/infologs/a_apim_metrics.log"]}
[2024-03-04T18:04:39,880][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-03-04T18:04:39,886][INFO ][filewatch.observingtail ][main][fefa0a2d85c42002c3f7d065d181ea5d9cda24d2daa3c5522804d1f80a6ea537] START, creating Discoverer, Watch with file and sincedb collections
[2024-03-04T18:04:39,903][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}

Hi @leandrojmp, please help me with this issue. I am unable to see the index in Index Management.

This is an INFO log, not an issue. It only means that your output configuration is not compatible with data streams, so it will write to a normal index.

This is not an error, and there are no ERROR or WARN lines in the log you shared that would stop Logstash from writing to an index.

So it is not clear what is your issue here.
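If the question is whether anything was indexed at all, you can also list the matching indices directly instead of relying on the Index Management UI. A sketch assuming the elastic user and a local cluster (adjust the host and credentials to your setup; -k skips certificate verification, matching your config):

curl -k -u elastic:yourpassword "https://localhost:9200/_cat/indices/info*?v"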

Also, the configuration you shared has a beats input, but from your Logstash logs there is no beats input running, just a file input.

The Logstash configuration you shared is not the configuration Logstash is running; you need to verify which configuration you are actually using.
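One way to check: your startup log shows pipeline.sources pointing at logstash-sample.conf, so validate that exact file (same paths as in your log) and then open it to confirm it actually contains the beats input you posted:

C:\devsoftwares\logstash-8.12.0\bin>logstash -f .\config\logstash-sample.conf --config.test_and_exit

If that file does not contain the beats input, you have been editing a different file from the one Logstash loads.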

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.