Hi, for context: in my current setup I send data from Filebeat to Logstash, and from there to Elasticsearch using tags and an output like the one below. On the Elasticsearch side I have an index template configured with data streams enabled:
elasticsearch {
  index => "%{[@metadata][index]}"
  action => "create"
  hosts => ["https://hostname:9200"]
  ssl => true
  ssl_certificate_verification => true
  cacert => 'path/ca.crt'
  user => "logstash_internal"
  password => "${LS_PWD}"
}
Now to the question - How can I send data to Elasticsearch using Filebeat Elasticsearch output only?
I know there is an index option, but it requires setting the following properties as well:
setup.template.name
setup.template.pattern
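For reference, the relevant part of my filebeat.yml looks roughly like this (a minimal sketch; the username is illustrative, the other values mirror my Logstash output above):

```yaml
output.elasticsearch:
  hosts: ["https://hostname:9200"]
  ssl.certificate_authorities: ["path/ca.crt"]
  username: "filebeat_internal"   # hypothetical user for this sketch
  password: "${FB_PWD}"
  index: "processbeat"

setup.template.name: "processbeat"
setup.template.pattern: "processbeat"
```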
I've tried setting all three of them to the name of my index template, but that gives me this error:
[2022-12-22T11:59:37,638][DEBUG][o.e.a.s.m.TransportMasterNodeAction] [ifcm_elastic_1] unexpected exception during publication
java.lang.IllegalArgumentException: legacy template [processbeat-8.0.0] has index patterns [processbeat-8.0.0-*] matching patterns from existing composable templates [logs_template,processbeat] with patterns (logs_template => [*int-logs*, *int1-logs*, *int2-logs*],processbeat => [processbeat*]), use composable templates (/_index_template) instead
at org.elasticsearch.cluster.metadata.MetadataIndexTemplateService.innerPutTemplate(MetadataIndexTemplateService.java:1062) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.metadata.MetadataIndexTemplateService$6.execute(MetadataIndexTemplateService.java:1009) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.metadata.MetadataIndexTemplateService.lambda$static$3(MetadataIndexTemplateService.java:134) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.service.MasterService.innerExecuteTasks(MasterService.java:1052) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:1017) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:278) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:170) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:110) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:148) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:825) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:257) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223) ~[elasticsearch-8.5.2.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
at java.lang.Thread.run(Thread.java:1589) ~[?:?]
(the index template name is "processbeat")
I had the same error with Logstash until I added the following setting to the output:
action => "create"
Can I do the same in Filebeat somehow?