Is Logstash necessary?

Hi everyone

I'm testing the ELK stack in a virtual environment (Windows Server with AD + DNS, Ubuntu Server 22.04, Ubuntu Client 22.04, and a Windows 10 client).
I've installed the ELK stack on Ubuntu Server 22.04, following a YouTube video.

In this video, the person doesn't configure Logstash, so I didn't configure it either, and my ELK stack (rather "EK" in my case) works well.
I'm not using any cloud, and my cluster has just one node.

Logstash will then use default parameters, so it should work.
There are three configuration files:

  • logstash.yml - extra settings for Logstash
  • pipelines.yml - if you need extra pipelines for more advanced configurations
  • *.conf - the data-processing configuration, and the most important one. The input, filter, and output sections define how to process data and where to send it
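For instance, a minimal pipelines.yml might look like the sketch below (the pipeline id and config path are illustrative; adjust them to your install):

```yaml
# pipelines.yml - tells Logstash which pipeline(s) to load
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"
```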

Logstash isn't even started, and I've set up security on Kibana and Elasticsearch by changing the port, xpack settings, certificates...

Can you test LS with a simple.conf?

input {
  generator {
    message => "Info 11 Jan 2023 07:39:50.527 [Thread-A] - TaskScheduler, State=NORMAL"
    count => 1
  }
} # input

filter {
} # filter

output {
  file { path => "/somepath/test_%{+YYYY-MM-dd}.txt" }
  stdout { codec => rubydebug {} }
} # output
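To try it, you can point Logstash at that file directly (the path below assumes the Debian/Ubuntu package layout; adjust it to your install):

```
# Run a single pipeline config in the foreground so the
# rubydebug output is printed to your terminal
/usr/share/logstash/bin/logstash -f simple.conf
```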

It depends on your use case: what you want to collect, and what you want to do with your data.

You only need to use Logstash if you want.

I've left my work desktop, so I'll try it tomorrow. I'll be back, thanks for the reply.

It would be for monitoring the different machines in my infrastructure: CPU usage, temperature, bandwidth, etc.

It all depends on your use case. In the Elastic Stack (also known as ELK Stack), there are several components: Elasticsearch, Logstash, Kibana, and Beats or Elastic Agent. Elastic Agent is a first-class citizen in the Elastic ecosystem, and it is responsible for collecting and shipping data to Elasticsearch or Logstash.

Logstash is often used for data enrichment, processing, and distribution. If your use case doesn't require these operations, you can have Elastic Agent write directly to Elasticsearch, bypassing Logstash. However, if you need to transform, filter, or enrich the data before indexing it in Elasticsearch, using Logstash would be beneficial.

In summary, whether you need to use Logstash or not depends on your specific requirements and data processing needs. You can write directly from Elastic Agent to Elasticsearch if Logstash isn't necessary for your situation.
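As a sketch of the "bypass Logstash" option: with a standalone Elastic Agent, the outputs section of elastic-agent.yml can point straight at Elasticsearch (the host and credentials below are placeholders):

```yaml
# elastic-agent.yml (standalone mode) - ship data directly to Elasticsearch
outputs:
  default:
    type: elasticsearch
    hosts: ["https://localhost:9200"]
    username: "elastic"
    password: "<your-password>"
```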


At my employer we use a setup like this:

  1. Application writes log messages to disk-file
  2. Filebeat reads log messages from disk-file
  3. Filebeat sends messages (aka documents) to Logstash
  4. Logstash does some parsing (grokking) of the documents and extracts fields & values
  5. Logstash sends the documents to Elasticsearch for indexing and storage.
  6. Users log in to Kibana to view their log messages

It is entirely possible to skip steps 4-5 and have Filebeat send the documents directly to Elasticsearch. You can even skip steps 2-3 and have the Application send directly to Elasticsearch (if it knows how to talk to Elasticsearch).
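As a sketch of step 4, a grok filter that extracts fields from a log line like the generator message earlier in the thread might look like this (the pattern and field names are illustrative, not our production config):

```
filter {
  grok {
    # Extract level, timestamp, thread name, and the remaining text
    match => {
      "message" => "%{WORD:level} %{GREEDYDATA:log_timestamp} \[%{DATA:thread}\] - %{GREEDYDATA:details}"
    }
  }
}
```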

If your setup is for monitoring systems data, not application messages, then yes, you can do a setup with just 3 components:

  1. ElasticAgent sends monitoring data to Elasticsearch
  2. Elasticsearch indexes and stores the data
  3. You view the monitoring data from Kibana (and control your 'fleet' of ElasticAgents from Kibana)
