steampipe plugin install azure

Table: azure_compute_disk_metric_read_ops_hourly - Query Azure Compute Disk Metrics using SQL

Azure Compute Disks are block-level storage volumes available in Microsoft Azure. They are used to store data for Azure Virtual Machines and other services, and provide high-performance, durable storage for I/O-intensive workloads.

Table Usage Guide

The azure_compute_disk_metric_read_ops_hourly table provides insights into read operations of Azure Compute Disks on an hourly basis. As a system administrator or a DevOps engineer, explore disk-specific details through this table, including the number of read operations, the time of operations, and associated metadata. Utilize it to monitor disk performance, identify usage patterns, and detect potential performance issues.
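
For example, to follow the hourly read-activity trend of a single disk, you can filter on the name column. This is a minimal sketch using only the columns documented in the schema below; 'my-data-disk' is a placeholder disk name.

select
  name,
  timestamp,
  average,
  maximum,
  sample_count
from
  azure_compute_disk_metric_read_ops_hourly
where
  -- 'my-data-disk' is a placeholder; replace it with the name of one of your disks
  name = 'my-data-disk'
order by
  timestamp;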

Examples

Basic info

Review hourly read-operation statistics for your Azure compute disks. This can help in identifying patterns, understanding usage trends, and planning for capacity or performance optimization.

select
  name,
  timestamp,
  minimum,
  maximum,
  average,
  sample_count
from
  azure_compute_disk_metric_read_ops_hourly
order by
  name,
  timestamp;

Operations Over 10 average read ops

This query monitors disk read operations on Azure, focusing on hours where the average number of read operations exceeds 10. This is useful for identifying potential performance issues or bottlenecks, allowing for proactive management and optimization of resources.

select
  name,
  timestamp,
  round(minimum :: numeric, 2) as min_read_ops,
  round(maximum :: numeric, 2) as max_read_ops,
  round(average :: numeric, 2) as avg_read_ops,
  sample_count
from
  azure_compute_disk_metric_read_ops_hourly
where
  average > 10
order by
  name,
  timestamp;
The same query in SQLite syntax (SQLite does not support the :: numeric cast):

select
  name,
  timestamp,
  round(minimum, 2) as min_read_ops,
  round(maximum, 2) as max_read_ops,
  round(average, 2) as avg_read_ops,
  sample_count
from
  azure_compute_disk_metric_read_ops_hourly
where
  average > 10
order by
  name,
  timestamp;

Schema for azure_compute_disk_metric_read_ops_hourly

Name               Type                      Operators  Description
_ctx               jsonb                                Steampipe context in JSON form, e.g. connection_name.
average            double precision                     The average of the metric values that correspond to the data point.
cloud_environment  text                                 The Azure Cloud Environment.
maximum            double precision                     The maximum metric value for the data point.
minimum            double precision                     The minimum metric value for the data point.
name               text                                 The name of the disk.
resource_group     text                                 The resource group which holds this resource.
sample_count       double precision                     The number of metric values that contributed to the aggregate value of this data point.
subscription_id    text                                 The Azure Subscription ID in which the resource is located.
sum                double precision                     The sum of the metric values for the data point.
timestamp          timestamp with time zone             The time stamp used for the data point.
unit               text                                 The units in which the metric value is reported.
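
Columns that do not appear in the examples above, such as resource_group, can be combined with the metric columns for broader aggregations. As a rough sketch in the same PostgreSQL syntax as the examples, using only columns from this schema, the following ranks disks by their average hourly read operations within each resource group:

select
  resource_group,
  name,
  -- average of the hourly average values across all data points for each disk
  round(avg(average) :: numeric, 2) as avg_hourly_read_ops
from
  azure_compute_disk_metric_read_ops_hourly
group by
  resource_group,
  name
order by
  avg_hourly_read_ops desc;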

Export

This table is available as a standalone Exporter CLI. Steampipe exporters are standalone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install them with the steampipe_export_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- azure

You can pass the configuration to the command with the --config argument:

steampipe_export_azure --config '<your_config>' azure_compute_disk_metric_read_ops_hourly