# Table: azure_compute_disk_metric_write_ops_daily - Query Azure Compute Disk Metrics using SQL
Azure Compute Disk is a resource within Microsoft Azure that provides scalable and secure disk storage for Azure Virtual Machines. It offers high-performance, highly durable block storage for your mission-critical workloads. You can use it to persist data by writing it to the disk, or to read data from the disk.
## Table Usage Guide

The `azure_compute_disk_metric_write_ops_daily` table provides insights into daily write operations on Azure Compute Disks. As a system administrator or a DevOps engineer, you can use this table to monitor disk performance and usage, enabling you to proactively address any potential issues. This can help you ensure optimal performance and availability of your Azure resources.
## Examples

### Basic info
Analyze the daily write operations on Azure compute disks to understand their usage patterns and performance metrics. This query can be useful in identifying potential bottlenecks or areas for optimization in your storage infrastructure.
```sql
select
  name,
  timestamp,
  minimum,
  maximum,
  average,
  sample_count
from
  azure_compute_disk_metric_write_ops_daily
order by
  name,
  timestamp;
```
### Operations averaging over 10

Identify the intervals in which the average daily write operations on an Azure Compute Disk exceeded 10. This can help you optimize disk usage by highlighting potential inefficiencies or areas of high activity.
```sql
-- PostgreSQL
select
  name,
  timestamp,
  round(minimum::numeric, 2) as min_write_ops,
  round(maximum::numeric, 2) as max_write_ops,
  round(average::numeric, 2) as avg_write_ops,
  sample_count
from
  azure_compute_disk_metric_write_ops_daily
where
  average > 10
order by
  name,
  timestamp;
```

```sql
-- SQLite
select
  name,
  timestamp,
  round(minimum, 2) as min_write_ops,
  round(maximum, 2) as max_write_ops,
  round(average, 2) as avg_write_ops,
  sample_count
from
  azure_compute_disk_metric_write_ops_daily
where
  average > 10
order by
  name,
  timestamp;
```
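Beyond the examples above, the same columns support other aggregate checks. As an illustrative sketch (not part of the original examples), the following query uses the `sum` column to surface days on which a disk recorded no write operations at all; it assumes `sum` is populated for every data point:

```sql
-- Illustrative sketch: days with zero recorded write operations per disk.
-- Assumes the sum column is populated for each daily data point.
select
  name,
  timestamp,
  sum as total_write_ops
from
  azure_compute_disk_metric_write_ops_daily
where
  sum = 0
order by
  name,
  timestamp;
```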
## Schema for azure_compute_disk_metric_write_ops_daily

| Name | Type | Operators | Description |
|---|---|---|---|
| _ctx | jsonb | | Steampipe context in JSON form. |
| average | double precision | | The average of the metric values that correspond to the data point. |
| cloud_environment | text | | The Azure Cloud Environment. |
| maximum | double precision | | The maximum metric value for the data point. |
| minimum | double precision | | The minimum metric value for the data point. |
| name | text | | The name of the disk. |
| resource_group | text | | The resource group which holds this resource. |
| sample_count | double precision | | The number of metric values that contributed to the aggregate value of this data point. |
| sp_connection_name | text | =, !=, ~~, ~~*, !~~, !~~* | Steampipe connection name. |
| sp_ctx | jsonb | | Steampipe context in JSON form. |
| subscription_id | text | =, !=, ~~, ~~*, !~~, !~~* | The Azure Subscription ID in which the resource is located. |
| sum | double precision | | The sum of the metric values for the data point. |
| timestamp | timestamp with time zone | | The time stamp used for the data point. |
| unit | text | | The units in which the metric value is reported. |
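The Operators column lists the qualifiers Steampipe can push down to the plugin for a given key column. As a sketch (the subscription ID below is a placeholder, not a real value), filtering on `subscription_id` with the supported `=` operator scopes the query to a single subscription:

```sql
-- Sketch: scope the query to one subscription via the supported = operator.
-- The subscription ID is a placeholder.
select
  name,
  timestamp,
  average,
  unit
from
  azure_compute_disk_metric_write_ops_daily
where
  subscription_id = '00000000-0000-0000-0000-000000000000'
order by
  name,
  timestamp;
```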
## Export

This table is available via the standalone exporter CLI. Steampipe exporters are stand-alone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install them with the `steampipe_export_installer.sh` script:
```shell
/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- azure
```
You can pass the configuration to the command with the `--config` argument:
```shell
steampipe_export_azure --config '<your_config>' azure_compute_disk_metric_write_ops_daily
```
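As a sketch of what that might look like, assuming the Azure plugin's standard `subscription_id` connection argument (the value below is a placeholder, and credentials are assumed to come from your environment, e.g. an Azure CLI login):

```shell
# Sketch: export this table for a single subscription.
# The subscription ID is a placeholder value.
steampipe_export_azure \
  --config 'subscription_id = "00000000-0000-0000-0000-000000000000"' \
  azure_compute_disk_metric_write_ops_daily
```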