Table: databricks_sql_warehouse_config - Query Databricks SQL Warehouse Configurations using SQL
Databricks SQL Warehouse Configuration is a set of parameters that define the behavior and capacity of a SQL warehouse in Databricks. It determines the size, type, and other attributes of the warehouse, influencing its performance and cost. These configurations can be adjusted to optimize the warehouse for specific workloads or usage patterns.
Table Usage Guide
The databricks_sql_warehouse_config table provides insights into the configuration parameters of SQL warehouses within Databricks. As a data engineer or DBA, explore the details of these configurations through this table, including size, type, and other attributes. Utilize it to monitor and optimize the performance and cost of your SQL warehouses, based on specific workloads or usage patterns.
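As a starting point, you can list the full configuration for the workspace. The query below is a minimal sketch that selects a handful of columns taken from this table's schema:

```sql
select
  security_policy,
  instance_profile_arn,
  google_service_account,
  account_id
from
  databricks_sql_warehouse_config;
```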
Examples
Get warehouse SQL configuration
Explore the security policies and configuration parameters of your SQL warehouse in Databricks. This is useful for assessing the current configuration and identifying any potential areas for optimization or security enhancement.
Postgres:

```sql
select
  security_policy,
  cp ->> 'key' as config_parameter_key,
  cp ->> 'value' as config_parameter_value,
  account_id
from
  databricks_sql_warehouse_config,
  jsonb_array_elements(sql_configuration_parameters -> 'configuration_pairs') as cp;
```

SQLite:

```sql
select
  security_policy,
  json_extract(cp.value, '$.key') as config_parameter_key,
  json_extract(cp.value, '$.value') as config_parameter_value,
  account_id
from
  databricks_sql_warehouse_config,
  json_each(sql_configuration_parameters, '$.configuration_pairs') as cp;
```
Check if warehouse config uses a security policy
Analyze the settings to understand if the warehouse configuration employs a security policy. This information is crucial to ensure the security and integrity of your data.
Postgres:

```sql
select
  security_policy,
  google_service_account,
  instance_profile_arn,
  account_id
from
  databricks_sql_warehouse_config
where
  security_policy <> 'NONE';
```

SQLite:

```sql
select
  security_policy,
  google_service_account,
  instance_profile_arn,
  account_id
from
  databricks_sql_warehouse_config
where
  security_policy <> 'NONE';
```
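As a variant of the check above, you can label the policy directly in the result. The sketch below assumes the security policy values NONE, DATA_ACCESS_CONTROL, and PASSTHROUGH reported by the Databricks API; any other value falls through to the else branch:

```sql
select
  security_policy,
  case security_policy
    when 'DATA_ACCESS_CONTROL' then 'Data access control enforced'
    when 'PASSTHROUGH' then 'Credential passthrough'
    when 'NONE' then 'No security policy'
    else 'Unknown policy'
  end as policy_description,
  account_id
from
  databricks_sql_warehouse_config;
```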
Get data access configuration
Analyze the settings to understand the security policy and data access configurations within your Databricks SQL warehouse. This can be beneficial to maintain and review your data security standards across the account.
Postgres:

```sql
select
  security_policy,
  ac ->> 'key' as config_parameter_key,
  ac ->> 'value' as config_parameter_value,
  account_id
from
  databricks_sql_warehouse_config,
  jsonb_array_elements(data_access_config) as ac;
```

SQLite:

```sql
select
  security_policy,
  json_extract(ac.value, '$.key') as config_parameter_key,
  json_extract(ac.value, '$.value') as config_parameter_value,
  account_id
from
  databricks_sql_warehouse_config,
  json_each(data_access_config) as ac;
```
Get all enabled warehouse types for the workspace
Explore which warehouse types are currently active within a workspace. This information aids in understanding the resources available for data processing and analytics tasks.
Postgres:

```sql
select
  security_policy,
  wt ->> 'warehouse_type' as warehouse_type,
  wt ->> 'enabled' as enabled,
  account_id
from
  databricks_sql_warehouse_config,
  jsonb_array_elements(enabled_warehouse_types) as wt
where
  wt ->> 'enabled' = 'true';
```

SQLite:

```sql
select
  security_policy,
  json_extract(wt.value, '$.warehouse_type') as warehouse_type,
  json_extract(wt.value, '$.enabled') as enabled,
  account_id
from
  databricks_sql_warehouse_config,
  json_each(enabled_warehouse_types) as wt
where
  json_extract(wt.value, '$.enabled') = 'true';
```
Get details of the instance profile used to pass IAM role to the cluster
Analyze the settings to understand the linkage between the Databricks SQL warehouse configuration and the compute instance profile. This can be particularly useful in assessing how IAM roles are passed to clusters for managing permissions and access control.
Postgres:

```sql
select
  c.google_service_account,
  i.iam_role_arn,
  i.is_meta_instance_profile,
  i.account_id
from
  databricks_sql_warehouse_config as c
  left join databricks_compute_instance_profile as i on c.instance_profile_arn = i.instance_profile_arn;
```

SQLite:

```sql
select
  c.google_service_account,
  i.iam_role_arn,
  i.is_meta_instance_profile,
  i.account_id
from
  databricks_sql_warehouse_config as c
  left join databricks_compute_instance_profile as i on c.instance_profile_arn = i.instance_profile_arn;
```
Schema for databricks_sql_warehouse_config
Name | Type | Operators | Description
---|---|---|---
_ctx | jsonb | | Steampipe context in JSON form, e.g. connection_name.
account_id | text | | The Databricks Account ID in which the resource is located.
channel | jsonb | | Channel selection details.
data_access_config | jsonb | | Spark confs for external hive metastore configuration JSON serialized.
enabled_warehouse_types | jsonb | | List of Warehouse Types allowed in this workspace.
google_service_account | text | | Google Service Account used to pass to cluster to access Google Cloud Storage.
instance_profile_arn | text | | Instance profile used to pass IAM role to the cluster.
security_policy | text | | Security policy for warehouses.
sql_configuration_parameters | jsonb | | SQL configuration parameters.
Export
This table is available as a standalone Exporter CLI. Steampipe exporters are stand-alone binaries that allow you to extract data using Steampipe plugins without a database.
You can download the tarball for your platform from the Releases page, but it is simplest to install them with the steampipe_export_installer.sh script:

```sh
/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- databricks
```
You can pass the configuration to the command with the --config argument:

```sh
steampipe_export_databricks --config '<your_config>' databricks_sql_warehouse_config
```