Azure App Services to ELK

I currently manage more than 50 Azure App Services and am looking for a solution to collect the logs and put them somewhere devs can look at them and troubleshoot issues. If I go into the App Log Stream, it displays what I need to export into ELK. I set up an Event Hub to collect all the logs, and now I'm trying to figure out how to get the Event Hub events into ELK. I was advised to use Filebeat, but the docs state that it needs to be installed on each of the servers... well, these are Azure App Services, and you don't have access to the server to install custom software. So I'll need to create a VM or an App Service with a Docker container and use the Filebeat Docker image. It just seems like there would be a cleaner and easier way to do this. Am I going down the best path?

Hi @madmunki, welcome to the community.

Yes, the Filebeat Event Hub integration should be the best approach. You do need to install Filebeat on a VM (or VMs, if you need to scale), but you should not need to install it on each server / app instance. Can you point out where it says that? (Perhaps our docs are not clear.)

Event Hub is a collect-and-forward service (much like Kafka), so Filebeat just connects to the Event Hub service.

In the future, Elastic Cloud will support "Cloud Native Integrations," so you would not need to install a Beat or Agent at all; you would just configure the connector in Elastic Cloud.

Here, under Installation and Configuration:

  • install Filebeat on each system you want to monitor

Right, that means if you are collecting local logs off a local host, you install it on each host.

However, for streaming / forwarding services like Kafka or Azure Event Hub, you install it on an additional VM, because Filebeat makes a network connection to the service. Does that make sense?

So in short: you will install Filebeat on an Azure VM or in a Docker container and point the Filebeat Event Hub input at the Event Hub endpoint. Then, depending on the scale, you may need to scale Filebeat horizontally or vertically, but it does not need to go on each host. Event Hub collects from each host / service and makes the logs available via the endpoint. A minimal config sketch is below.
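For illustration, a minimal filebeat.yml for this topology might look something like the following (all names and credentials here are placeholders; note that the azure-eventhub input also needs an Azure Storage account, which it uses to persist its consumer offsets):

filebeat.inputs:
- type: azure-eventhub
  eventhub: "my-eventhub-name"
  connection_string: "Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
  consumer_group: "$Default"
  # Blob storage is where the input checkpoints how far it has read
  storage_account: "mystorageaccount"
  storage_account_key: "..."
  storage_account_container: "filebeat-state"

output.elasticsearch:
  hosts: ["https://my-deployment.eastus2.azure.elastic-cloud.com:9243"]
  username: "elastic"
  password: "..."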

Here is a little diagram; hope it makes sense. It shows the different options.

For servers, each server should have its own log shipper so that servers are not dependent on each other; that makes complete sense.

What about using the built-in Logstash? Is that able to connect to the Event Hub and put the logs into ELK? It would eliminate the need to maintain a VM. You did say that Filebeat was the recommended solution, so I'll look into building out a Web App to host a Filebeat Docker image. Once I get it working, I'll do one Filebeat container per RSG. Each RSG contains 7-9 API App Services and one Event Hub that they all ship logs to.

Logstash Solution:

No, using Logstash requires a VM as well; there is no "built-in Logstash". It is a separate component, just like Filebeat, and you would manage it just like Filebeat.

Logstash would just replace Filebeat in the picture above. Logstash is a little more complex to manage than Filebeat.
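For comparison only, here is a rough sketch of what that pipeline might look like using Logstash's azure_event_hubs input plugin (every value is a placeholder):

input {
  azure_event_hubs {
    event_hub_connections => ["Endpoint=sb://my-namespace.servicebus.windows.net/;...;EntityPath=my-eventhub-name"]
    # Blob storage is used to checkpoint the consumer's position
    storage_connection => "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    consumer_group => "$Default"
  }
}

output {
  elasticsearch {
    hosts => ["https://my-deployment.eastus2.azure.elastic-cloud.com:9243"]
    user => "elastic"
    password => "..."
  }
}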

I have Filestash being deployed as a container now, but I need to finalize the configuration. Where do I find my Cloud ID? I didn't create our enterprise deployment, so I'm not sure if the guy who did wrote it down. We are using xxxx.eastus2.azure.elastic-cloud.com for the deployment, if that matters.

First, it's Filebeat :slight_smile:

Your Cloud ID can be found in the Elastic Cloud Console.
You will also need the auth credentials, which he should have written down / downloaded.

Otherwise, you can get the Elasticsearch and Kibana URLs and use the username and password approach.

Looks like I'll have to go with the UN/PW approach. This is my config so far; Filebeat still isn't initializing correctly. BTW, I really appreciate the help!

filebeat.inputs:
- type: azure-eventhub
  eventhub: "(redacted)-useast-hub"
  connection_string: "Endpoint=(redacted)"
  storage_account: "(redacted)"
  storage_account_key: "(redacted)"
  storage_account_container: "$logs"
  
output.elasticsearch:
  hosts: ["https://(redacted).eastus2.azure.elastic-cloud.com:9243"]
  username: "(redacted)"
  password: "(redacted)"

My management screen looks different.

Your picture is Kibana, not the Elastic Cloud Management Console; those are 2 different apps / UIs.

If you share the filebeat logs perhaps I can help...


Getting a failed to connect now. Pretty sure I don't have my elastic-cloud.com URL:port set right.

2021-05-20T00:18:45.093438783Z 2021-05-20T00:18:45.093Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(elasticsearch(https://[redacted].eastus2.azure.elastic-cloud.com:9243)): Failed to parse JSON response: invalid character '<' looking for beginning of value

2021-05-20T00:18:45.094333295Z 2021-05-20T00:18:45.093Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(elasticsearch(https://[redacted].eastus2.azure.elastic-cloud.com:9243)) with 8 reconnect attempt(s)

2021-05-20T00:18:45.095054405Z 2021-05-20T00:18:45.094Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer

2021-05-20T00:18:45.095601412Z 2021-05-20T00:18:45.095Z INFO [publisher] pipeline/retry.go:223 done

2021-05-20T00:19:01.071727632Z 2021-05-20T00:19:01.071Z INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cgroup":{"cpuacct":{"total":{"ns":24212953}},"memory":{"mem":{"usage":{"bytes":159744}}}},"cpu":{"system":{"ticks":310,"time":{"ms":3}},"total":{"ticks":1220,"time":{"ms":25},"value":1220},"user":{"ticks":910,"time":{"ms":22}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":14},"info":{"ephemeral_id":"0899e95a-19a4-4099-840f-70cdcb2b7082","uptime":{"ms":181415}},"memstats":{"gc_next":58588928,"memory_alloc":31129504,"memory_sys":262144,"memory_total":120574208,"rss":97832960},"runtime":{"goroutines":49}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"active":0},"read":{"bytes":22616},"write":{"bytes":952}},"pipeline":{"clients":1,"events":{"active":4117,"retry":50}}},"registrar":{"states":{"current":0}},"system":{"load":{"1":0.52,"15":0.76,"5":0.72,"norm":{"1":0.52,"15":0.76,"5":0.72}}}}}}

2021-05-20T00:19:17.424853935Z 2021-05-20T00:19:17.424Z ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(elasticsearch(https://[redacted].eastus2.azure.elastic-cloud.com:9243)): Failed to parse JSON response: invalid character '<' looking for beginning of value

2021-05-20T00:19:17.424894836Z 2021-05-20T00:19:17.424Z INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(elasticsearch(https://[redacted].eastus2.azure.elastic-cloud.com:9243)) with 9 reconnect attempt(s)

Since the admin of our current Elastic Cloud deployment is out, I signed up for a 14-day trial to get everything working and figured out. When he comes back, I can switch the Filebeat endpoint over to push to the enterprise Elasticsearch.

I think you have it configured to point at the Kibana endpoint, not the Elasticsearch endpoint.

That error above is what it looks like when you try to connect to Kibana instead of Elasticsearch: Kibana answers with an HTML page, so Filebeat fails to parse the leading '<' as JSON.

You can test this by simply trying to curl the endpoint.

curl -u "user:password" https://elasticsearchurl

If you have the right URL, username, and password, you'll get a happy little message about your cluster.
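For reference, a successful response looks something like this (your values will differ):

{
  "name" : "instance-0000000001",
  "cluster_name" : "...",
  "cluster_uuid" : "...",
  "version" : {
    "number" : "7.12.1",
    ...
  },
  "tagline" : "You Know, for Search"
}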


I created a 14-day trial so that I can access the Elastic Cloud Console and get the elastic password, Cloud ID, etc. Once my admin gets back from leave, I'll change it back over to the enterprise cluster.

So far, I am seeing a filebeat-7.12-* index being created in Elasticsearch, and I'm able to view the events as well! Now I am working on getting multiple environments' logs into Elastic and figuring out how to separate them back out. Production will have its own deployment, but non-prod will have 3 (Dev, QA, Staging), and that is what the developers will be given access to.
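In case it helps, one simple way to separate environments is to tag events in each Filebeat deployment with a custom field. The "env" field name below is hypothetical, not something Filebeat defines; fields and fields_under_root are standard Filebeat options:

filebeat.inputs:
- type: azure-eventhub
  # ...same Event Hub settings as above...
  fields:
    env: "dev"          # set to "qa" / "staging" in the other deployments
  fields_under_root: true

Devs can then filter on that field in Kibana, e.g. env:dev.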

I appreciate everything @stephenb! I have all the non-prod logs being imported into Elastic and can "somewhat" find them. I just need to learn Elastic now: how to build indices, reports, etc., but that is another Discuss topic for another day! Thanks again for all the help!

