Logstash Output to Kibana with SSL?

I am new to Logstash, having previous experience with Filebeat and Winlogbeat.

In the Filebeat/Winlogbeat configuration we have separate output sections for Elasticsearch and Kibana. We configure the Kibana output section with paths to certificate files for authentication with Kibana, because without authentication we cannot create Kibana visualizations.

But in Logstash, there is only an elasticsearch output. I can't find any documentation on a Kibana output plugin for Logstash, where we may configure the paths to cert files. Am I just searching in the wrong places?

Logstash does not provide support for the Kibana APIs.

Any suggestions then? We need to be able to display this data on Kibana dashboards.

This is unrelated to Logstash.

The beats (Filebeat, Winlogbeat, etc.) do not have any output to Kibana, because Kibana does not receive data; it displays data that is already in Elasticsearch. What they have is a way to set up the built-in dashboards made by Elastic for the data they collect, and that is what you configure.

Logstash has no integration with Kibana; if you are sending data from Logstash to Elasticsearch, you will need to create your own dashboards in Kibana.

Kibana is the visualization and management tool of the Elastic Stack; all data is sent to and stored in Elasticsearch.
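So for the SSL part of your question, the certificate paths belong in Logstash's elasticsearch output, not in any Kibana section. A minimal sketch for 7.x (the host name, index pattern, cert path, and credentials here are placeholders for your own values):

  output {
    elasticsearch {
      hosts    => ["https://elasticsearch.example.com:9200"]
      index    => "itential-jobs-%{+YYYYMMdd}"
      ssl      => true
      cacert   => "/etc/logstash/certs/ca.crt"
      user     => "logstash_writer"
      password => "changeme"
    }
  }

Kibana then reads whatever lands in Elasticsearch; no separate Kibana output is needed.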

Wait, if there's no output to Kibana, why is there a Kibana output section in filebeat.yml?

It seems like we have never been able to get even a simple chart to show up in Kibana without SSL files configured in the Kibana output section of filebeat.yml or winlogbeat.yml. We can see data in Kibana Discover, but that's it.

But I could have been looking at the wrong config(s) the whole time.

I'll try some things. If we're still struggling with this, maybe I'll start a thread on a more Kibana-appropriate forum, if there are objections to continuing this thread here.

Filebeat needs the Kibana URL to import:

  • index template
  • dashboards
  • ingest pipeline
  • space.id (optional)

This is not an output per se; it is the configuration Filebeat needs to be able to communicate with Kibana and install the dashboards, as explained here.

Kibana dashboards are loaded into Kibana via the Kibana API. This requires a Kibana endpoint configuration.
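In filebeat.yml that endpoint configuration looks something like the following (the host and cert path are placeholders):

  setup.kibana:
    host: "https://kibana.example.com:5601"
    ssl.enabled: true
    ssl.certificate_authorities: ["/etc/filebeat/certs/ca.crt"]

After configuring it, a one-off `filebeat setup` run loads the index template, ingest pipelines, and dashboards.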

This needs to be done on the first install, when Filebeat is updated, or when you enable another Filebeat module. It does not need to be done by the same Filebeat that collects your data; a common approach is to have a separate Filebeat that does not collect anything and is used only to set up the dashboards.

Perhaps you should take a look at the quick start guide for Filebeat.
It shows how to enable a module; you can use whichever module you like and configure it.

If you have SSL issues, it may be due to self-signed certs, and there are docs on how to handle that... or we can help with that.

Oh I've used Filebeat on other VMs and love it. I'd use it on this VM too if I didn't need Logstash's mutate to get rid of the "_id" field.
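For context, a minimal sketch of the kind of filter I mean (assuming the unwanted field is literally named "_id" in the event):

  filter {
    mutate {
      remove_field => ["_id"]
    }
  }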

Let me see if I can articulate my real problem more clearly.

In Kibana Discover, I see 3 documents that match index pattern itential-jobs-*. All 3 have a populated @timestamp field. All 3 were indexed by Logstash within the last 24 hours.

Then I do the following:

  1. create a new Dashboard
  2. set the time range to last 7 days (which should include the 24 hours)
  3. click Create Visualization
  4. select index pattern itential-jobs-*

I am unable to select @timestamp as the horizontal axis for a bar chart. I see the message @timestamp does not match any options when I type it into the field under "Select a field". But as mentioned previously, I can see 3 documents with populated @timestamp.

btw, it's version 7.17

Next time start with the actual issue... we will get to an answer quicker :slight_smile:

Ok so things to check

Go to the index pattern and check what type of field @timestamp is. Is it a date field?

2nd, when you created the index pattern, did you choose @timestamp as the time field?

If you go to Discover, does it use @timestamp?

Then, assuming you go to Lens, are you using Date Histogram for the X axis? That would be the correct choice... Are you trying a different type?
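One quick way to check the first item is to ask Elasticsearch directly for that field's mapping (run this in Kibana Dev Tools; the index pattern here is the one from your post):

  GET itential-jobs-*/_mapping/field/@timestamp

If the response comes back with an empty mapping for that field, the field was never indexed.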

This is odd. @timestamp is not listed under fields, even though, as I said, it is being populated in the documents. Logstash was used to index the documents.

example document seen in Discover

  "_index": "itential-jobs-20230828",
  "_type": "_doc",
  "_id": "Qgl8PooBatf_EgwfVDjB",
  "_version": 1,
  "_score": 1,
  "_source": {
    "groups": [
    "@timestamp": "2023-08-28T23:30:49.990922086Z",

For any index pattern that was used by Filebeat, @timestamp is listed.

Maybe I have to delete itential-jobs-*, and reset the mapping with @timestamp explicitly mapped? This is the mapping I set yesterday before indexing the 3 documents with Logstash.

  "itential-jobs-20230828" : {
    "mappings" : {
      "dynamic" : "false",
      "properties" : {
        "error" : {
          "type" : "flattened"
        "name" : {
          "type" : "text"
        "tasks" : {
          "type" : "flattened"
        "transitions" : {
          "type" : "flattened"
        "variables" : {
          "type" : "flattened"

So check the fields... It looks to me like @timestamp is in the _source but was not indexed as a field.

Yes, I think you are ignoring the @timestamp field, so it is not getting indexed. Add it to the mapping and try again.
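With "dynamic": "false", any field not listed under properties is kept in _source but never indexed, which is why @timestamp does not show up as a selectable field. Adding an explicit mapping for it could look like this (Dev Tools request against the index name from your post):

  PUT itential-jobs-20230828/_mapping
  {
    "properties": {
      "@timestamp": { "type": "date" }
    }
  }

Note that documents already indexed will not gain the field retroactively; they need to be reindexed (or the index recreated with the corrected mapping) for @timestamp to be searchable on them.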

That did it! Thank you very much!
