Metricbeat to Kafka: no dashboards appear in Kibana

Hello,

I have an issue with Metricbeat. I send data from Metricbeat to Kafka, Kafka sends it to Logstash, and Logstash then sends it on to Kibana.
I can display the data in Kibana, but I do not see the dashboard graphics, only the logs.

When I send the data directly to Elasticsearch, I do see the graphics.
So I have some questions:
Is it possible to display the graphics even if we use Kafka? If so, can you refer me to a tutorial or a guide so I can do it myself?

Thank you

When you are sending the events from Logstash → Elasticsearch, are you writing to the same index as when you send from Metricbeat → Elasticsearch?

Hi @emmanuel_stevens_LED

First: what version of the Stack are you on? This is important, as there are some configuration differences.

What I usually suggest is a progression:

Step 1
Metricbeat -> Elasticsearch
Works! This is good; you said it already works.
Complete.

Step 2
Next, get this working:
Metricbeat -> Logstash -> Elasticsearch
There are several steps to get this to work, and you need a proper Logstash config, especially if you are using Metricbeat modules. If you tell us what version you are on, I can share a sample config.
Get that to work...

Only after you get Step 2 to work should you proceed to Step 3.

Step 3
Metricbeat -> Kafka -> Logstash -> Elasticsearch
You will need to make sure Beats is correctly writing to Kafka and that Logstash is pulling the right messages from the topic.
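For reference, a minimal Kafka output on the Metricbeat side, plus a quick way to verify messages are arriving, might look like this (the host, topic, and consumer script path are placeholders, not taken from this thread):

# metricbeat.yml -- a minimal Kafka output
output.kafka:
  hosts: ["kafka-host:9092"]
  topic: "metricbeat"

# then confirm JSON events are flowing on the Kafka side:
bin/kafka-console-consumer.sh --bootstrap-server kafka-host:9092 --topic metricbeat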

I see people try to set it all up at once and spend a lot of frustrating time... this is just my experience and advice.

Hello @stephenb ,
I am using version 8.3.2 for Metricbeat.
Metricbeat is correctly writing to Kafka; I can see the logs coming in on the console as a consumer. Logstash sends everything to Kibana; I can see these logs in Observability -> Logs -> Stream.
I want to display proper graphics and maybe add some filters to display the logs in a better way.

I do not send the logs directly from Metricbeat to Elasticsearch. I send them to Kafka, and afterwards Logstash takes them from Kafka and sends the logs to Elasticsearch. My configuration is based on what is on the website, nothing more.

Did you index your Metricbeat events into a logs index/data stream? The Logs Stream UI defaults to either filebeat-* or logs-*, so either you changed this setting or you are indexing into the wrong index.

And based on the same screenshot, it seems the JSON is not parsed correctly. Did you use the JSON codec in Logstash when you consume the events from Kafka?
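For reference, a kafka input that parses Beats events would need the codec set, something like this (host and topic are placeholders, not your actual config):

input {
  kafka {
    bootstrap_servers => "kafka-host:9092"
    topics => ["metricbeat"]
    # Beats writes JSON to Kafka; without this codec each event arrives
    # as a single unparsed "message" string
    codec => "json"
  }
}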

May be worthwhile to share your Logstash pipeline configurations.

Here is my logstash config


input {
  kafka {
    bootstrap_servers => "192.168.208.160:9092"
    topics => ["test", "metricbeat"]
  }
}
#output {
#  stdout {}
#}

output {
  elasticsearch {
    hosts => ["https://192.168.208.135:9200","https://192.168.208.136:9200"]
    cacert => '/etc/logstash/elasticsearch-ca.pem'
    user => 'elastic'
    password => '+5lwdwgSNuMF_8aJKQlD'
  }
}



You may want to remove your elastic user password from the post, and you shouldn't be using the elastic user for ingestion.

Having said that, the index option is missing from the elasticsearch output, so most likely Logstash is indexing into the default logstash-* index.

Hi @emmanuel_stevens_LED

Seeing as you're going to ignore my advice :slight_smile: which is okay, but then I'll only be able to help a bit.

Seeing as you are skipping Step 2 above, you can try this...

Your Logstash configuration is not correct for the output... This assumes you ran
metricbeat setup -e
when you first set up.

Also, this will not work if you set up modules that need ingest pipelines... but for base Metricbeat it will probably work:

elasticsearch {
  hosts => "http://localhost:9200"
  user => "elastic"
  password => "password"
  manage_template => false
  # must match the Metricbeat version that ran setup (you said 8.3.2)
  index => "metricbeat-8.3.2"
  action => "create"
}

You might also want to have a look and learn here.

@stephenb
I already had everything set up on my VMware. I already had a Kafka cluster running.
I am going to test what you said.
One more question: do I have to set up a conf file for each topic? I send Metricbeat data on the metricbeat topic and Filebeat data on the test topic. Do you have a suggestion or a link to share?

Thank you for your answers @hendry.lim, they are really helpful.
Regarding the user: when I used the logstash_system user to ingest data, it gave me a "not authorized" error. Since I am testing, I use the elastic user.

My point is not to comment on your final architecture, which is fine. It is about how to build it up and understand each component.

I'm not the Kafka expert; perhaps someone else is.

I would definitely review the docs

No, I think you can do one config with 2 inputs and a conditional output based on tags...

Or

You can also create 2 separate configs and name them in the pipelines.yml file.

Your choice
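For the second option, pipelines.yml might look like this (the pipeline IDs and paths are hypothetical):

- pipeline.id: metricbeat
  path.config: "/etc/logstash/conf.d/metricbeat.conf"
- pipeline.id: filebeat
  path.config: "/etc/logstash/conf.d/filebeat.conf"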

You can consume from multiple topics with one kafka input and add conditional output based on the topic/beat name, for example.
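A minimal sketch of that idea, based on the hosts and topics posted earlier in this thread. It assumes decorate_events is enabled so the topic name is available in [@metadata][kafka][topic]; the user name and password are placeholders:

input {
  kafka {
    bootstrap_servers => "192.168.208.160:9092"
    topics => ["test", "metricbeat"]
    codec => "json"
    # exposes [@metadata][kafka][topic] on each event
    decorate_events => "basic"
  }
}
output {
  if [@metadata][kafka][topic] == "metricbeat" {
    elasticsearch {
      hosts => ["https://192.168.208.135:9200"]
      cacert => "/etc/logstash/elasticsearch-ca.pem"
      user => "logstash_ingest"   # placeholder: a dedicated ingest user
      password => "changeme"      # placeholder
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
    }
  } else {
    # events from the "test" (Filebeat) topic; print while testing
    stdout { codec => rubydebug }
  }
}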

Hello, I configured Logstash to send the logs to my ELK cluster; in that case I removed the Kafka cluster, so Metricbeat communicates directly with Logstash.
I put the index in the Logstash conf (output plugin) as shown in your answer.

The dashboard still does not receive the data. I suspect that Kibana is pointing to another index.
Do you have a document that explains clearly what to do, or maybe a tutorial? I really need to use those dashboards.
Thanks



[screenshot]

Read this... and follow the steps.
Where it says to clean up, that means completely removing any/all Metricbeat indices and/or data streams.

It gives you step by step instructions...

And with the new data streams, Logstash should look like this:

input {
  beats {
    port => 5044
  }
}
output {
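  # Metricbeat modules that use an ingest pipeline set [@metadata][pipeline];
  # when present, pass it along so Elasticsearch applies that pipeline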
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "http://localhost:9200"
      pipeline => "%{[@metadata][pipeline]}"
      user => "elastic"
      password => "password"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
    }
  } else {
    elasticsearch {
      hosts => "http://localhost:9200"
      user => "elastic"
      password => "password"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
    }
  }
}

These are the steps / configuration that work for me and have worked for others.

Also, your Logstash config is not correct; you're not looking closely at the elasticsearch output. You need to write to the data stream so it uses the correct mapping and then rolls over with ILM. When you added the date to the index name, it wasn't doing that correctly.
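To make that concrete (the dated pattern below is my guess at what was tried, not something posted in this thread):

# what not to do: appending a date creates a plain index that bypasses
# the Metricbeat template, so mappings and ILM rollover are wrong
index => "metricbeat-%{+YYYY.MM.dd}"

# what the provided config does: write into the metricbeat-<version>
# data stream, which requires action => "create"
index => "%{[@metadata][beat]}-%{[@metadata][version]}"
action => "create"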

Get that working then you can put Kafka back in.

Hello,

I have done what you said.
When I send the data directly from Metricbeat to Elasticsearch, I have all the graphics. When I include Logstash, I change the Logstash output to what you said and send the data; I get some statistics, but not all.

I also have an error message.
[screenshot]
Here is the JSON:

{
  "took": 16,
  "timed_out": false,
  "_shards": {
    "total": 2,
    "successful": 1,
    "skipped": 1,
    "failed": 1,
    "failures": [
      {
        "shard": 0,
        "index": "metricbeat-1",
        "node": "xD2iV5ovToG9T0iiO4pIqQ",
        "reason": {
          "type": "illegal_argument_exception",
          "reason": "Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default.
 Please use a keyword field instead. Alternatively, set fielddata=true on [system.network.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory."
        }
      }
    ]
  },
  "hits": {
    "total": 0,
    "max_score": 0,
    "hits": []
  }
}

My other question: do we have to go through that process for each Beat?

That is not the correct index; thus the mapping and fields are not correct, thus the data types are not correct, and thus the dashboards and graphics will not work.
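If you want to see this for yourself, compare the field's mapping in the index from your error message with one created by metricbeat setup (run in Kibana Dev Tools; the second index name assumes your 8.3.2 setup):

# mapping in the index Logstash created; system.network.name will be "text"
GET metricbeat-1/_mapping/field/system.network.name

# mapping in an index created via metricbeat setup; it will be "keyword"
GET metricbeat-8.3.2/_mapping/field/system.network.name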

This means that you did not use the Logstash config I provided above, or you changed it. I provided the exact config that will work; if you changed it, you will need to show us what you changed, because whatever you are running is not working.

You need to clean up
and follow the steps again AND use the Logstash conf I provided... I am giving you a working solution that has been used by many.

If you use the Logstash config EXACTLY as I have it above, it will work AFTER you clean up and run through the steps again:

  • Clean up
  • Run filebeat setup -e while configured to point to Elasticsearch
  • Run Filebeat and see the data loaded properly
  • Stop Filebeat
  • Point the Filebeat output to Logstash (see the sketch after this list)
  • Start Logstash with the EXACT conf that I provided
  • Start Filebeat and observe the correct behavior
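For the "point the output to Logstash" step, the change in the Beats config might look like this (hosts are placeholders; only one output may be enabled at a time):

# filebeat.yml / metricbeat.yml
# comment out the elasticsearch output...
#output.elasticsearch:
#  hosts: ["https://192.168.208.135:9200"]

# ...and enable the logstash output instead
output.logstash:
  hosts: ["localhost:5044"]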

Thank you, I will connect tonight and do it. I will get back to you.

thank you for your patience


@emmanuel_stevens_LED I am trying to give you / your team a working solution.

Elastic has a schema.. Metricbeat has a schema... if you send data to random index names, it will not work as you would expect...

When filebeat -> logstash -> elasticsearch works, THEN you can introduce Kafka...

Many teams use beats -> logstash -> elasticsearch but you need proper setup and configs..

There is even a section in the docs (but the example is out of date).

The config I gave above works... I currently use it...

input {
  beats {
    port => 5044
  }
}
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "http://localhost:9200"
      pipeline => "%{[@metadata][pipeline]}"
      user => "elastic"
      password => "password"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
    }
  } else {
    elasticsearch {
      hosts => "http://localhost:9200"
      user => "elastic"
      password => "password"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
    }
  }
}

I want to thank you @stephenb. I can now receive all the metrics with Logstash. The next step is to configure Kafka.
I have another question: do I have to do this for each server I want to monitor?
Do I have to configure them first to send data to Elasticsearch and then to Logstash, or is it OK for all the others?