Not eligible for data streams because config contains one or more settings that are not compatible with data streams

I tried to upload my template using Logstash, but it did not work.
It says:

[2023-07-05T00:03:53,496][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"template"=>"/etc/logstash/template/winlogbeat.template.json", "template_name"=>"my-index-template", "index"=>"my-index-stream*"}

My pipeline config:

input {
  beats{
    port => 6044
  }
}

filter {
    if [event_id] == 4624 {
        grok {
             match => { "message" => "%{DATA:messageTitle}\n\n%{DATA:subjectHeader}\n\t%{DATA:subjectSecurityIDTitle}\t\t%{DATA:subjectSecurityID}\n\t%{DATA:subjectNameTitle}\t\t%{DATA:subjectName}\n\t%{DATA:subjectDomainTitle}\t\t%{DATA:subjectDomain}\n\t%{DATA:subjectLogonIDTitle}\t\t%{DATA:subjectLogonID}\n\n%{DATA:logonTypeTitle}\t\t\t%{DATA:logonType}\n\n%{DATA:logonHeader}\n\t%{DATA:logonSIDTitle}\t\t%{DATA:logonSID}\n\t%{DATA:logonAccountTitle}\t\t%{DATA:logonAccount}\n\t%{DATA:logonDomainTitle}\t\t%{DATA:logonDomain}\n\t%{DATA:logonIDTitle}\t\t%{DATA:logonID}\n\t%{DATA:logonGUIDTitle}\t\t%{DATA:logonGUID}\n\n%{DATA:procInfoHeader}\n\t%{DATA:procIDTitle}\t\t%{DATA:procID}\n\t%{DATA:procNameTitle}\t\t%{DATA:procPath}\n\n%{DATA:networkInfoHeader}\n\t%{DATA:networkWorkstationTitle}\t%{DATA:networkWorkstation}\n\t%{DATA:networkAddressTitle}\t%{DATA:networkAddress}\n\t%{DATA:networkPortTitle}\t\t%{DATA:networkPort}\n\n%{DATA:authInfoHeader}\n\t%{DATA:authProcessTitle}\t\t%{DATA:authProcess}\n\t%{DATA:authPackageTitle}\t%{DATA:authPackage}\n\t%{DATA:authTransitedServiceTitle}\t%{DATA:authTransitedService}\n\t%{DATA:authPackageNameTitle}\t%{DATA:authPackageName}\n\t%{DATA:authKeyLengthTitle}\t\t%{DATA:authKeyLength}\n\n%{GREEDYDATA:messageEnd}" }
        }
    }

}

output {
  elasticsearch {
    hosts => ["https://172.16.7.75:9200"]
    user => "admin"
    password => "s2312s*"
    cacert => "/etc/elasticsearch/certs/http_ca.crt"
    ssl => true
    ssl_certificate_verification => false

# INDEX TEMPLATE
    template => "/etc/logstash/template/winlogbeat.template.json"
    template_name => "my-index-template"
    index => "my-data-stream*"
  }
  stdout {
    codec => rubydebug
  }
  file {
    path => "/var/log/logstash/output.log"
  }
}

Can someone tell me where I went wrong?
My template is from Winlogbeat; I just exported it using this command: .\winlogbeat.exe export template --es.version 8.7 | Out-File -Encoding UTF8 winlogbeat.template.json

Assuming your stack is all 8.7, an elasticsearch output will, by default, create a data stream, not an index. If you want to create an index instead then set data_stream => false on the output.
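A minimal sketch of what that could look like, reusing the host and template settings from the pipeline above (the index name `my-index` is a hypothetical example; any concrete name without a wildcard would do):

```
output {
  elasticsearch {
    hosts => ["https://172.16.7.75:9200"]
    # Opt out of the data-stream default so the output writes to a plain index
    data_stream => false
    # In index mode the output can load the template itself
    template => "/etc/logstash/template/winlogbeat.template.json"
    template_name => "my-index-template"
    index => "my-index"
  }
}
```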

If you do want a data stream then you cannot set the index name (you would use the various data_stream naming options).
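For the data-stream case, the elasticsearch output exposes `data_stream_type`, `data_stream_dataset`, and `data_stream_namespace` options, which together form the stream name as `<type>-<dataset>-<namespace>`. A sketch, with hypothetical values:

```
output {
  elasticsearch {
    hosts => ["https://172.16.7.75:9200"]
    data_stream => true
    data_stream_type => "logs"           # one of: logs, metrics, synthetics
    data_stream_dataset => "winlogbeat"  # hypothetical dataset name
    data_stream_namespace => "default"
    # events are routed to the data stream "logs-winlogbeat-default"
  }
}
```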

Also, when writing to a data stream the output will not load a template, but I believe a template is required. You would have to load it through one of elasticsearch's template APIs.
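Assuming the file exported by Winlogbeat 8.x is in the composable index template format, one way to load it could be a direct call to the `_index_template` API (credentials and paths taken from the config above; you may need to adjust the template's `index_patterns` so it matches your data stream name):

```
# Sketch: install the exported template under the name "my-index-template"
curl -k -u admin:s2312s* -X PUT \
  "https://172.16.7.75:9200/_index_template/my-index-template" \
  -H "Content-Type: application/json" \
  --data-binary @/etc/logstash/template/winlogbeat.template.json
```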

Thank you @Badger

I have uploaded the template using Logstash. My mistake was in the index naming: the index name must not include a wildcard (*).

index => "my-data-stream"

I have another question: is it possible to load an index and a data stream simultaneously through the Logstash pipeline? @Badger

If you want to load the same events into both an index and a data stream then you could use two outputs in the same Logstash pipeline. I don't run elasticsearch, but I am struggling to imagine a use case where that would be useful.
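A sketch of such a pipeline, assuming the host from this thread and hypothetical index/dataset names:

```
output {
  # First output: write to a plain index
  elasticsearch {
    hosts => ["https://172.16.7.75:9200"]
    data_stream => false
    index => "my-index"
  }
  # Second output: write the same events to a data stream
  elasticsearch {
    hosts => ["https://172.16.7.75:9200"]
    data_stream => true
    data_stream_type => "logs"
    data_stream_dataset => "winlogbeat"
    data_stream_namespace => "default"
  }
}
```

Each event passing through the pipeline is sent to every output in the `output` block, so both destinations receive a copy.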

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.