Help with data streams

The first thing I learned was how to ingest logs into indices. Now I have been asked to do the same process using data streams, so I set myself the task of figuring it out.

I have already created the ILM (Index Lifecycle Management) and the index template.

I have named the policy “tmes_policy” and the template “tmes_template”.

This is my configuration in the .conf file:

The problem I have is that the data stream is created without any issue and the data shows up in Kibana, but it is applying a policy called “logs” through an index template called “logs”, and I don't know how to make it apply a different policy through my own index template.

output {
  if [type] == "tmes" {
    elasticsearch {
      id => "xxxx_output_tmes"
      hosts => ["https://1.1.1.1:9200"]
      data_stream => true
      data_stream_type => "logs"
      data_stream_dataset => "tmes"
      data_stream_namespace => "default"
      user => "elastic"
      password => "elastic"
      ssl_enabled => true
      ssl_certificate_authorities => "/etc/logstash/certs/certificado-ca.crt"
    }
  }
}

It looks like you're matching one of the built-in templates. See the "Avoid index pattern collisions" section of Index templates | Elasticsearch Guide [8.14] | Elastic. You can either use a different index pattern, or use a higher priority on your template.
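To confirm which template a new data stream would actually match, the simulate index API can help. Assuming the data stream name the output above resolves to (`logs-tmes-default`, from type-dataset-namespace):

```
POST _index_template/_simulate_index/logs-tmes-default
```

The response shows the settings, mappings, and ILM policy that would be applied, along with any overlapping lower-priority templates.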


I think the issue is along those lines, but I'm not sure what @leandrojmp thinks.

I would avoid any data stream naming scheme that could collide with the ones that Elastic uses, which are logs-*, metrics-*, synthetics-* and traces-*.

You could for example use just tmes as your datastream name.

But this leads to another issue, which is the fact that Logstash does not support custom data stream names. To work with those, you need to follow the workaround mentioned in the post you linked.


@leandrojmp thanks for your answer. From what little I understand, the solution you propose would not be using data streams, since with your suggestion we would be working with normal indices. Or am I wrong?

No, you can still use datastreams, you just need to use a custom datastream name that does not collide with the ones that elastic is already using.

But since logstash does not support this, you need to configure the output with data_stream as false, but in your index template you configure it to be a data stream.

This is a workaround to trick logstash into sending data to custom data stream names that it does not support yet.

For example, in your tmes_template you would have something like this:

{
  "index_patterns": ["tmes-*"],
  "data_stream": { },
  "composed_of": [ "your-mappings", "your-settings" ],
  "priority": 500
}

This creates a template that will match anything that starts with tmes-*, and it will create the indices as data streams.
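If you create the template through the API rather than in Kibana, the full request would look something like this (the component template names "your-mappings" and "your-settings" are placeholders from the example above):

```
PUT _index_template/tmes_template
{
  "index_patterns": ["tmes-*"],
  "data_stream": { },
  "composed_of": [ "your-mappings", "your-settings" ],
  "priority": 500
}
```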

Then, in your logstash output you would have something like this:

output {
  elasticsearch {
    hosts => ["HOSTS"]
    index => "tmes-logs"
    action => "create"
    http_compression => true
    data_stream => false
    manage_template => false
    ilm_enabled => false
    cacert => 'ca.crt'
    user => 'USER'
    password => 'PASSWORD'
  }
}

This would create a data stream named tmes-logs because the request will match a template that is configured to create data streams.
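To verify that the data stream was created correctly, you can inspect it afterwards:

```
GET _data_stream/tmes-logs
```

The response lists the backing indices, the matched template, and the `ilm_policy` in effect, so you can confirm that tmes_policy (and not the built-in "logs" policy) is being applied.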


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.