Logstash pipeline problem (Logstash not appearing in Kibana)

Hi guys, thank you for having me on this forum.

I am trying to set up ELK for the first time. I followed the installation instructions from elastic.co along with a number of YouTube videos, and I finally have ELK installed.

The problem is that when I open Kibana and monitor the health of the stack, I can see Kibana and Elasticsearch but not Logstash. Something seems to be off.

I did some research and found in one of the topics that in order for this to work I will need to configure pipelines (I wish this were described in the installation tutorial). Is that correct? Where can I find instructions on how to do it?

Finally, once I finish implementing it in my home lab, I want to roll it out in production. I will need to purchase a license, since all traffic in production needs to be encrypted. However, I was not able to find out how much this will cost us.

CONFIGURATION
jdk 16.0.1



elasticsearch.yml
# ---------------------------------- Cluster -----------------------------------
cluster.name: test-cluster
# ------------------------------------ Node ------------------------------------
node.name: test-node1
# ----------------------------------- Paths ------------------------------------
path.data: "C:\\ELK\\data"
path.logs: "C:\\ELK\\logs"

*everything else default



logstash.conf


input {
  beats {
    port => 5044
    type => "log"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+yyyy.ww}"
    document_type => "%{[@metadata][type]}"
  }
}
*everything else is default
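
Side note for anyone reproducing this: Logstash only loads the file above if it is pointed at it, either by passing it with -f on the command line or by listing it in pipelines.yml in the Logstash config directory. A minimal pipelines.yml entry would look roughly like this (the path is only an example, adjust it to wherever the file actually lives):

- pipeline.id: main
  path.config: "C:/ELK/logstash/config/logstash.conf"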



kibana
*default

Never mind, I could not wait for a response, so I found out what the problem was myself.

For users who face a similar issue:

You will find on the forum that you need to enable the following parameters:
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.url: ["http://localhost:9200"]
xpack.monitoring.collection.interval: 10s
xpack.monitoring.collection.pipeline.details.enabled: true

However, these are already deprecated. Instead use the following (assuming you are hosting Elasticsearch and Kibana on the same host):
xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
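
In other words, the relevant part of logstash.yml ends up looking roughly like this (a sketch, assuming Elasticsearch is reachable on localhost:9200 with security disabled):

# legacy (internal) stack monitoring, shipped from Logstash straight to Elasticsearch
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]

After restarting Logstash, the node should appear under Stack Monitoring in Kibana.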

This kind of monitoring is also deprecated and will not work in future versions; you should monitor the stack using Metricbeat.

For Logstash you should follow this: Collect Logstash monitoring data with Metricbeat | Logstash Reference [7.12] | Elastic
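
In short, that page comes down to enabling the logstash-xpack module in Metricbeat and pointing it at the Logstash API. Roughly, in modules.d/logstash-xpack.yml (a sketch, assuming Logstash's monitoring API is on its default localhost:9600):

# enable first with: metricbeat modules enable logstash-xpack
- module: logstash
  xpack.enabled: true
  period: 10s
  hosts: ["localhost:9600"]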


I am not sure if this is the right move from Elastic. I am planning to use Metricbeat down the road. However, at this stage (a raw, stock installation) I wanted to make sure that all three components of ELK are working well before I start sending data with Beats.

You can still monitor your cluster without Metricbeat; this is what is called 'legacy monitoring' now, but it is not recommended and will probably stop working sometime in the future.

In production it is recommended to monitor the stack using Metricbeat and to have a different cluster for monitoring; it can be a single-node cluster used just for monitoring.
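
In that setup, Metricbeat runs on every production node but ships its data to the monitoring cluster instead of the production cluster, so the output section of metricbeat.yml would look roughly like this (the monitoring host name is just a placeholder):

output.elasticsearch:
  # send the collected stack metrics to the dedicated monitoring cluster, not to production
  hosts: ["http://monitoring-node:9200"]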


Thank you landrojmp. That is very interesting.

If you don't mind, please help me better understand this.
First, in production, I set up a cluster with 3 nodes (let's call it the production cluster):

  • 1 for Elasticsearch,
  • 1 for Logstash,
  • 1 for Kibana.
    (What is the recommended minimal hardware configuration for each node?)

Next, I set up a separate cluster, with Elasticsearch, Logstash and Kibana on a single node (let's call it the monitoring node).

Finally, I install Metricbeat on each of the 3 nodes in the production cluster and configure the Beats to send data to the monitoring node.

Is this the correct approach? How can I ensure that all nodes in the production cluster "talk" to each other with no problems, and that everything is configured right?

BTW, this forum is amazing. It is difficult to find support like this nowadays.
Big thank you.

PS: this is very helpful.

I think I have the right idea; I just need to learn how to use the Elasticsearch API calls.
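
For anyone else following along, the checks I had in mind are just the standard cluster APIs, which can be run from Kibana Dev Tools (or curl), for example:

# overall cluster status (green / yellow / red)
GET _cluster/health

# every node that has joined the cluster
GET _cat/nodes?v

# indices and their health
GET _cat/indices?v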

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.