Azure Filebeat issues with dashboard setup and data ingestion

Hi people,

I am new to the ELK stack; I have set it up locally and also have an Elastic Cloud instance.

As part of the learning phase, I am trying to use Filebeat to consume data from Azure.

I have followed this documentation: Filebeat quick start: installation and configuration | Filebeat Reference [7.13] | Elastic

What I have done so far: I have a Service Bus resource whose logs and metrics I want to visualize in Kibana. I have set up an Event Hubs namespace, an event hub, and a storage account.

As per the documentation, I have enabled the Azure module and configured my filebeat.yml to use the local Elasticsearch (running on my machine).
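
For reference, I enabled the module with (roughly) this command:

.\filebeat.exe modules enable azure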

I have observed two things:

  1. When I run the setup command .\filebeat.exe setup, or just the dashboards with .\filebeat.exe setup --dashboards, I keep getting this error message, which indicates that the dashboards aren't being set up:

Exiting: 1 error: error loading index pattern: returned 408 to import file: <nil>. Response: {"statusCode":408,"error":"Request Time-out","message":"Request Time-out"}

I do, however, see the index in Index Management.

  2. When I create the index pattern, I don't see any logs being recorded in Elasticsearch.

My filebeat.yml is very basic:

 filebeat.inputs:
- type: log
   enabled: false
  paths:
  - /var/log/*.log
# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

   # Set to true to enable config reloading
   reload.enabled: true
# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

setup.template.overwrite: true
# =================================== Kibana ===================================

setup.kibana:

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["http://localhost:9200"]

My azure.yml is also very straightforward:

azure.yml

    - module: azure
      # All logs
      activitylogs:
        enabled: true
        var:      
          eventhub: "myeventhub"
          consumer_group: "$Default"
          connection_string: "myconnectionstringtoeventhubnamespace"
         storage_account: "mystorageaccount"
         storage_account_key: "mystorageaccountkey"

I am not sure where it is going wrong.
Does the setup command have to set up the dashboards for me to be able to visualize this data? Otherwise I can't see it. Or is the azure.yml not set up correctly, and hence not picking up the logs?

Is there a way to verify whether the azure.yml is working or not?

Any suggestions would be helpful

Thanks

Are Kibana and Elasticsearch on the same 7.13 version?

You can also test the config

.\filebeat.exe test config

And test the connection

.\filebeat.exe test output

Hi Stephen,

I tried both commands and they come back as ok.

My Elasticsearch is on 7.13.0 and so is the Kibana.

I have tested them using logs from my C: drive, which was working fine.

It's only with azure.yml that I am not seeing any data coming in.

I was wondering if you are still getting the 408?

With respect to the Azure connection, what do you see in the Filebeat logs? Do you see it connecting to and harvesting the Azure logs?

.\filebeat.exe -e

Hi Stephen,

I had a look at the Filebeat log and I don't see any reference to Azure in there:

INFO	instance/beat.go:665	Home path: [C:\ELK\Filebeat] Config path: [C:\ELK\Filebeat] Data path: [C:\ELK\Filebeat\data] Logs path: [C:\ELK\Filebeat\logs]
INFO	instance/beat.go:673	Beat ID: bba9a310-1d63-4a77-b387-991c85d50517
INFO	[beat]	instance/beat.go:1014	Beat info	{"system_info": {"beat": {"path": {"config": "C:\\ELK\\Filebeat", "data": "C:\\ELK\\Filebeat\\data", "home": "C:\\ELK\\Filebeat", "logs": "C:\\ELK\\Filebeat\\logs"}, "type": "filebeat", "uuid": "bba9a310-1d63-4a77-b387-991c85d50517"}}}
INFO	[beat]	instance/beat.go:1023	Build info	{"system_info": {"build": {"commit": "054e224d226b42a1dd7c72dcf48c3f18de452e22", "libbeat": "7.13.0", "time": "2021-05-19T22:28:57.000Z", "version": "7.13.0"}}}
INFO	[beat]	instance/beat.go:1026	Go runtime info	{"system_info": {"go": {"os":"windows","arch":"amd64","max_procs":4,"version":"go1.15.12"}}}
INFO	[beat]	instance/beat.go:1030	Host info	{"system_info": {"host": {"architecture":"x86_64","boot_time":"2021-05-31T06:46:08.26+10:00","name":"MELYNP0853","ip":["172.29.236.176/32","169.254.48.141/16","169.254.139.74/16","169.254.245.133/16","169.254.113.91/16","192.168.1.102/24","::1/128","127.0.0.1/8"],"kernel_version":"10.0.17763.1935 (WinBuild.160101.0800)","mac":["f4:30:b9:15:06:5f","14:ab:c5:b0:d4:2b","16:ab:c5:b0:d4:2a","00:ff:fb:d5:40:9c","14:ab:c5:b0:d4:2a"],"os":{"type":"windows","family":"windows","platform":"windows","name":"Windows 10 Enterprise","version":"10.0","major":10,"minor":0,"patch":0,"build":"17763.1935"},"timezone":"AEST","timezone_offset_sec":36000,"id":"b54908d4-d30d-4f7f-848c-a142edd395bd"}}}
INFO	[beat]	instance/beat.go:1059	Process info	{"system_info": {"process": {"cwd": "C:\\ELK\\Filebeat", "exe": "C:\\ELK\\Filebeat\\filebeat.exe", "name": "filebeat.exe", "pid": 10484, "ppid": 26008, "start_time": "2021-06-03T13:53:06.710+1000"}}}
INFO	instance/beat.go:309	Setup Beat: filebeat; Version: 7.13.0
INFO	[index-management]	idxmgmt/std.go:184	Set output.elasticsearch.index to 'filebeat-7.13.0' as ILM is enabled.
INFO	eslegclient/connection.go:99	elasticsearch url: http://localhost:9200
INFO	[publisher]	pipeline/module.go:113	Beat name: MyFileBeat
INFO	kibana/client.go:119	Kibana url: http://localhost:5601
INFO	[add_cloud_metadata]	add_cloud_metadata/add_cloud_metadata.go:101	add_cloud_metadata: hosting provider type not detected.
INFO	kibana/client.go:119	Kibana url: http://localhost:5601
ERROR	instance/beat.go:989	Exiting: 1 error: error loading index pattern: returned 408 to import file: <nil>. Response: {"statusCode":408,"error":"Request Time-out","message":"Request Time-out"}

As per the docs, I have initialized the azure module and provided the necessary connection string.
Is there anything else that needs to be set up to pull the logs?
Also, most of the examples on the internet are for activity logs. Can it be any other logs from Azure Monitor, or just activity logs?

Thanks

If the azure module is enabled you should see log lines like these (minus the ERROR):

2021-06-03T17:04:11.537-0700    INFO    [azure-eventhub input]  azureeventhub/input.go:104      azure-eventhub input worker has started.        {"connection string": "(redacted)"}
2021-06-03T17:04:11.539-0700    ERROR   [azure-eventhub input]  azureeventhub/input.go:110      failed parsing connection string due to unmatched key value separated by '='      {"connection string": "(redacted)"}
2021-06-03T17:04:11.539-0700    INFO    [azure-eventhub input]  azureeventhub/input.go:111      azure-eventhub input worker has stopped.        {"connection string": "(redacted)"}

Hi Stephen,

I can confirm that I have enabled the azure module.

Is there any tool or command I can use to confirm that azure has been enabled and is exporting data to Filebeat, and that Filebeat is receiving it?

I have already confirmed azure is enabled by using .\filebeat.exe modules list.

Thanks

Right, but if you are not seeing log lines similar to the ones above, I would say that you are not actually running the azure module.

Exactly how are you starting Filebeat?

Also, you can just put the azure module configuration right in filebeat.yml; if you look at filebeat.reference.yml you will see how to do that.
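
Something along these lines (just a sketch, reusing the placeholder values from your azure.yml; see filebeat.reference.yml for the exact layout):

    filebeat.modules:
    - module: azure
      activitylogs:
        enabled: true
        var:
          # same values you already have in modules.d/azure.yml
          eventhub: "myeventhub"
          consumer_group: "$Default"
          connection_string: "myconnectionstringtoeventhubnamespace"
          storage_account: "mystorageaccount"
          storage_account_key: "mystorageaccountkey"

You'd probably also want to disable the copy in modules.d so the module isn't configured twice.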

I am on a Windows 10 machine.

So I'm using Start-Service filebeat, as mentioned in the docs.

I can directly add the azure input in the filebeat.yml and give it a go.
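
To check whether the azure-eventhub worker actually starts, I'll stop the service and run Filebeat in the foreground first (paths as in my log above):

    Stop-Service filebeat
    cd C:\ELK\Filebeat
    .\filebeat.exe -e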

Hi Stephen,
I have got the same error. Do you know how to troubleshoot this? The key values are confirmed to be correct.

@wilsonwang
You need to look at the logs and see what the errors are.
If you can post the logs here (please format them), we might be able to help.

Here is the error log, a 404 ERROR; it looks like something is wrong with the Event Hub configuration:

ERROR [azure-eventhub input] azureeventhub/input.go:110 unhandled error link 45c2a4a6-d14c-49b6-be4a-e5446b78638e: status code 404 and description: The messaging entity 'sb://cfc-logging-dev.servicebus.chinacloudapi.cn/cfc-logging-dev' could not be found. To know more visit Azure Service Bus Resource Manager exceptions - Azure Service Bus | Microsoft Docs. {"connection string": "Endpoint=sb://cfc-logging-dev.servicebus.chinacloudapi.cn/"}

thank you!

The 408 is indicating that Kibana is timing out. Is Kibana running at localhost:5601? I don't see any config for Kibana in your posted YAML. Also, I don't know if it's just a copy-and-paste thing, but I see uneven indentation in both of the posted configs.
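
If Kibana really is at localhost:5601, explicitly pointing Filebeat at it would look something like this (just a sketch; adjust the host to wherever your Kibana actually runs):

    setup.kibana:
      host: "http://localhost:5601"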

Looks like that URI isn't valid or doesn't exist. I'd check your event hub config, as the error says.
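
In other words (a sketch with placeholders only), the eventhub value has to be the name of an event hub that actually exists under the namespace in the connection string; the 404 suggests there is no event hub called "cfc-logging-dev" in that namespace:

    activitylogs:
      enabled: true
      var:
        # namespace-level connection string
        connection_string: "Endpoint=sb://cfc-logging-dev.servicebus.chinacloudapi.cn/;SharedAccessKeyName=...;SharedAccessKey=..."
        # must be an event hub that exists in that namespace (placeholder name here)
        eventhub: "<your-event-hub-name>"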
