Dec 5th, 2024: [EN] Keep track of your Steam Deck gaming with the Elastic Agent

This article is also available in Portuguese.

Let's give Observability a different spin and use our favorite tools to monitor our gaming performance. Today we'll explore how to use the Elastic Agent to monitor a Steam Deck, so we can see which games we play the most, how many resources they consume, and how the GPU is performing.

We'll cover:

  • Install an Elastic Agent
  • Set up integrations
  • Use the Custom API integration to collect custom data
  • Build Kibana dashboards
  • Export and Import dashboards

Process data

The System integration gives us all the data we need about the running processes: their names and how much CPU and memory they're using. We'll use this to see how many resources the processes are using, as well as which games we play the most.

GPU data

In general, Observability data does not include how the CPU or GPU cores are performing. While it's possible to collect NVIDIA GPU data, the Steam Deck uses a custom AMD GPU, and we want to get our hands dirty and do some work ourselves.

Linux distros usually include lm_sensors, and it's present on SteamOS, giving us all the information we need.

Running sensors on my Steam Deck I get the following:

nvme-pci-0100
Adapter: PCI adapter
Composite:    +45.9°C  (low  = -273.1°C, high = +82.8°C)
                       (crit = +84.8°C)
Sensor 1:     +45.9°C  (low  = -273.1°C, high = +65261.8°C)
 
BAT1-acpi-0
Adapter: ACPI interface
in0:           8.40 V  
curr1:         1.74 A  
 
amdgpu-pci-0400
Adapter: PCI adapter
vddgfx:      650.00 mV 
vddnb:       655.00 mV 
edge:         +44.0°C  
slowPPT:       7.12 W  (avg =  11.10 W, cap =  15.00 W)
fastPPT:      11.10 W  (cap =  15.00 W)
 
steamdeck_hwmon-isa-0000
Adapter: ISA adapter
PD Contract Voltage:   5.00 V  
System Fan:             0 RPM
Battery Temp:         +25.0°C  
PD Contract Current: 1000.00 mA 
 
acpitz-acpi-0
Adapter: ACPI interface
temp1:        +54.0°C  (crit = +105.0°C)

All we need is there, but not quite in the best format to ingest. Let's get JSON instead with sensors -j:

{
   "nvme-pci-0100":{
      "Adapter": "PCI adapter",
      "Composite":{
         "temp1_input": 31.850,
         "temp1_max": 82.850,
         "temp1_min": -273.150,
         "temp1_crit": 84.850,
         "temp1_alarm": 0.000
      },
      "Sensor 1":{
         "temp2_input": 31.850,
         "temp2_max": 65261.850,
         "temp2_min": -273.150
      }
   },
   "BAT1-acpi-0":{
      "Adapter": "ACPI interface",
      "in0":{
         "in0_input": 8.656
      },
      "curr1":{
         "curr1_input": 0.159
      }
   },
   "amdgpu-pci-0400":{
      "Adapter": "PCI adapter",
      "vddgfx":{
         "in0_input": 0.650
      },
      "vddnb":{
         "in1_input": 0.655
      },
      "edge":{
         "temp1_input": 41.000
      },
      "slowPPT":{
         "power1_average": 2.072,
         "power1_input": 2.040,
         "power1_cap": 15.000
      },
      "fastPPT":{
         "power2_average": 2.072,
         "power2_cap": 15.000
      }
   },
   "steamdeck_hwmon-isa-0000":{
      "Adapter": "ISA adapter",
      "PD Contract Voltage":{
         "in0_input": 5.000
      },
      "System Fan":{
         "fan1_input": 1522.000,
         "fan1_fault": 0.000
      },
      "Battery Temp":{
         "temp1_input": 25.000
      },
      "PD Contract Current":{
         "curr1_input": 1.000
      }
   },
   "acpitz-acpi-0":{
      "Adapter": "ACPI interface",
      "temp1":{
         "temp1_input": 42.000,
         "temp1_crit": 105.000
      }
   }
}
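To get a feel for how this nested output decodes, here's a minimal Go sketch that pulls the amdgpu edge temperature out of a trimmed sample of the payload above. The sample and field names come straight from the sensors -j output; the function name is mine, not part of any real API:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Trimmed sample of `sensors -j` output, taken from the amdgpu section above.
const sample = `{
  "amdgpu-pci-0400": {
    "Adapter": "PCI adapter",
    "edge": {"temp1_input": 41.0},
    "slowPPT": {"power1_average": 2.072, "power1_input": 2.040, "power1_cap": 15.0}
  }
}`

// gpuEdgeTemp extracts the amdgpu edge temperature from raw `sensors -j` output.
// sensors -j nests chip -> feature -> subfeature, so a map of maps fits.
func gpuEdgeTemp(raw []byte) (float64, error) {
	var chips map[string]map[string]any
	if err := json.Unmarshal(raw, &chips); err != nil {
		return 0, err
	}
	edge, ok := chips["amdgpu-pci-0400"]["edge"].(map[string]any)
	if !ok {
		return 0, fmt.Errorf("edge sensor not found")
	}
	t, ok := edge["temp1_input"].(float64)
	if !ok {
		return 0, fmt.Errorf("temp1_input not found")
	}
	return t, nil
}

func main() {
	t, err := gpuEdgeTemp([]byte(sample))
	if err != nil {
		panic(err)
	}
	fmt.Printf("GPU edge temperature: %.1f°C\n", t)
}
```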

We can't just have the Elastic Agent running arbitrary commands and parsing the output; therefore, to make this data available, we'll need a wee HTTP server that can serve it.

I've built one in Go; head to GitHub - AndersonQ/steamdeck-sensors-api and install it, or build your own version.

With this server we can configure the Elastic Agent to collect data from the API we just created. To make sure it's always up and running, steamdeck-sensors-api can install and register itself as a systemd service.

Now we have all the data we need, let's collect it.

Collecting data

Go ahead and install an Elastic Agent. By default, the System integration is added to any Elastic Agent policy. Create a policy (I'll call mine Steam Deck) and add an Elastic Agent to your Steam Deck.

Check the FAQ to see how to disable read-only mode with sudo steamos-readonly disable. By the way, did you forget your sudo password? You can easily reset it; see how here.

Now that we have an agent and a policy, we can add and configure the Custom API integration.
Go to Management > Integrations and search for "custom api":

Add it to your Steam Deck policy and let's configure it:

  • "Dataset name": system.gpu
  • "Request URL": http://localhost:4242/gpu
  • "Request Interval": 30s - you can choose another interval if you prefer. To start playing with it and testing visualizations, I used 1s to see new data in real time.
  • "Request HTTP Method": GET
  • "Processors":
- decode_json_fields:
    fields: ["message"]
    target: "system.gpu"
    overwrite_keys: true
    add_error_key: true
    expand_keys: true


The processor is crucial for formatting the final event correctly. Without it, the steamdeck-sensors-api response would be a string in the message field.
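For illustration, here is a trimmed, hypothetical event in both shapes (field names follow the processor configuration above; real events carry many more fields). Without the processor, the response stays a string:

   "message": "{\"amdgpu-pci-0400\":{\"edge\":{\"temp1_input\":41.0}}}"

With the processor, it becomes structured, queryable fields:

   "system": { "gpu": { "amdgpu-pci-0400": { "edge": { "temp1_input": 41.0 } } } }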

Now save and add the integration to your Steam Deck policy. If you already installed the agent, it'll automatically deploy the newly updated policy; if you haven't installed it yet, go and install the agent.

Checking the data

Go to Discover and select the metrics-* data view. Filter for event.dataset : "system.process". Then add process.name, system.process.cpu.total.pct and system.process.memory.size. Open some game on your Steam Deck and try to find it among all the process metrics. :slight_smile:

For the GPU data, go to the logs-* data view and filter for event.dataset : "system.gpu".

Now we just need to create a dashboard.

Steam Deck dashboard

Here is the dashboard I created:

Astro-Win64-Shi is Astroneer, great game, I totally recommend it.

You can import this dashboard; it's on my GitHub repo. I'll explain how to import it in a bit. But first, let's see how to create a dashboard and add a visualization to it.

Go to Dashboards > Create dashboard.

  • In the query bar, add a filter to display data only from your Steam Deck. Filter by host.hostname : "steamdeck", adjusting the hostname if you've changed it on your device.

  • Click on Create visualization

  • Select the metrics-* data view

  • Search for process.cpu

  • Drag and drop system.process.cpu.total.pct to the drop area

It'll create a chart like this:

  • Go to "Breakdown", choose process.name, set "Number of values" to 15, then close the panel
  • Instead of "Bar", choose "Treemap" for the visualization

It should look like this:

  • Go ahead and click "Save and Return"

Now let's create one for the most played games.

  • Click on "Create visualization"
  • In the search bar, add the filter process.working_directory.text :"/home/deck/.local/share/Steam/steamapps/common/*". This query on process.working_directory limits the process metrics to only the processes whose working directory is /home/deck/.local/share/Steam/steamapps/common/*. This should be the path for all your installed games, although depending on your setup (if you have an SD card, for example) it might differ. So if this does not work for you, check the process metrics, find the ones for your games, and see which working_directory they're using.
  • Drag and drop process.name to the drop area
  • Select "Tag cloud" as the visualization type
  • Click on "Top 5 values of process.name" and set the "Number of Values" to 15
  • Close the panel
  • Click on "Save and Return"

It should look like this before you click on "Save and Return":

The dashboard will look like this:

To build visualizations with the GPU data, select the logs-* data view and filter for event.dataset : "system.gpu", like this:

I'll leave the creation of the other visualizations as an exercise for you. :wink:

Importing and exporting dashboards

You can import the dashboards with the Kibana Saved Objects API.

curl -u USER:PASSWORD -X POST -H 'kbn-xsrf: true' YOUR-KIBANA-HOST/api/saved_objects/_import\?createNewCopies\=true --form file=@steam-deck-dashboard.ndjson

Note: don't set the Content-Type header manually here; when --form is used, curl generates the multipart/form-data header itself, including the required boundary.

Get the dashboard to import here.

To export a dashboard, use

curl -u USER:PASSWORD -X POST \
-H 'Content-Type: application/json' \
-H 'kbn-xsrf: true' \
YOUR-KIBANA-HOST/api/saved_objects/_export \
-d '{"objects":[{"id":"YOUR-DASHBOARD-ID","type":"dashboard"}],"excludeExportDetails":true,"includeReferencesDeep":true}' > steam-deck-dashboard.ndjson

To find the dashboard ID, open the dashboard and check the URL; it'll be something like this:

https://KIBANA-HOST/app/dashboards#/view/<dashboard-id>?

https://KIBANA-HOST/app/dashboards#/view/bfcd09b3-effe-4a65-b58b-b6c3d528cc3e?

Conclusion

It's great fun to use the Elastic Stack to monitor my Steam Deck, gaining insights into its performance, how games utilize resources, and identifying running programs and game binaries.

Most importantly, it's a fun way to get started with the Elastic Agent, ingesting monitoring data, and creating visualizations and dashboards. It also provides a glimpse into the vast amount of data we can collect, inspiring different ways to use it. :slight_smile:

You can grab a free trial on Elastic Cloud or easily run your own Elastic Stack on Docker or download and run it manually.

Now that you know how to monitor your Steam Deck and how to run an Elastic Stack, why don't you try it yourself?
