Logstash templates? Are they needed?

Might seem like a dumb question.

I've been trying to learn the ELK stack, and keep shifting focus between each component.

At work, I'm "playing" with a small cluster in an attempt to ingest and analyze Bro data. Outside of work, I'm running a single instance that's ingesting from an Apache VPS.

I have never created any templates, either in Logstash or Elasticsearch, yet my data always seems to arrive and be accessible in Kibana.

It might seem silly, but can anyone tell me if I should be doing something different? :slight_smile:

ALSO: I've posted in other forums, but if anyone has any recommendations for a printed book about the whole stack (Elasticsearch, Kibana, Logstash), I would very much appreciate it. I'd prefer a printed guide I can carry with me for instances when it's not practical to have two displays open (coffeeshop, etc.). But that's an aside.

Elasticsearch's dynamic mapper and Logstash's default index template are often, but not always, good enough. Occasions where you'd want a custom index template include:

  • You're not happy with the default data type used for string fields (string vs. keyword).
  • You want to make sure certain fields are always mapped as e.g. integers, even if a single bad event arrives just after an index is rolled over and a new one is created.
  • You have IP address fields.
  • You want additional geo_point fields or you're not happy with the name of the predefined one.
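To make the cases above concrete, here's a minimal sketch of a custom index template covering an IP field, a forced-integer field, and an extra geo_point. The field names (`clientip`, `bytes`, `geoip.location`) and the template name are hypothetical examples, not anything Logstash creates for you, and the exact request syntax (`template` vs. `index_patterns`, `_default_` mapping) varies by Elasticsearch version, so check the docs for yours:

```json
PUT _template/logstash-custom
{
  "template": "logstash-*",
  "order": 1,
  "mappings": {
    "_default_": {
      "properties": {
        "clientip": { "type": "ip" },
        "bytes":    { "type": "integer" },
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}
```

With `order` higher than the default Logstash template, your settings win on conflicting fields while the rest of the default mapping still applies. Alternatively, the elasticsearch output plugin in Logstash can install a template file for you via its `template` and `manage_template` options.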

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.