# Table: databricks_sharing_provider - Query Databricks Sharing Providers using SQL
A Databricks sharing provider represents an organization that makes data available to your account through Delta Sharing. The provider object records the authentication type, details of the provider's Unity Catalog metastore, and the shares the provider exposes, giving you a central place to review and audit incoming data-sharing relationships.
## Table Usage Guide
The `databricks_sharing_provider` table provides insights into the sharing providers within Databricks. As a DevOps engineer, explore sharing provider-specific details through this table, including configurations and associated metadata. Utilize it to uncover information about sharing providers, such as those with specific configurations, the relationships between sharing providers, and the verification of sharing settings.
## Examples
### Basic info
Explore basic information about your Databricks sharing providers to understand who created specific data and when. This can help you track data provenance and maintain accountability within your team.
```sql+postgres
select
  name,
  comment,
  data_provider_global_metastore_id,
  metastore_id,
  created_at,
  created_by,
  account_id
from
  databricks_sharing_provider;
```

```sql+sqlite
select
  name,
  comment,
  data_provider_global_metastore_id,
  metastore_id,
  created_at,
  created_by,
  account_id
from
  databricks_sharing_provider;
```
### List providers created in the last 7 days
Explore which data sharing providers were created in the past week. This can be beneficial to maintain an up-to-date overview of recent changes in your Databricks environment.
```sql+postgres
select
  name,
  comment,
  data_provider_global_metastore_id,
  metastore_id,
  created_at,
  created_by,
  account_id
from
  databricks_sharing_provider
where
  created_at >= now() - interval '7' day;
```

```sql+sqlite
select
  name,
  comment,
  data_provider_global_metastore_id,
  metastore_id,
  created_at,
  created_by,
  account_id
from
  databricks_sharing_provider
where
  created_at >= datetime('now', '-7 day');
```
### List providers authenticated by Databricks
Explore which providers are authenticated by Databricks to understand the security and access configuration of your data sharing setup. This is useful in auditing your data sharing environment and ensuring only authorized providers have access.
```sql+postgres
select
  name,
  comment,
  data_provider_global_metastore_id,
  metastore_id,
  created_at,
  created_by,
  account_id
from
  databricks_sharing_provider
where
  authentication_type = 'DATABRICKS';
```

```sql+sqlite
select
  name,
  comment,
  data_provider_global_metastore_id,
  metastore_id,
  created_at,
  created_by,
  account_id
from
  databricks_sharing_provider
where
  authentication_type = 'DATABRICKS';
```
### List all shares by each provider
Explore the distribution of shares among different providers to better manage and optimize resource allocation. This can assist in identifying which providers have the most shares, aiding in strategic decision-making.
```sql+postgres
select
  name as provider_name,
  s ->> 'name' as provider_share_name,
  account_id
from
  databricks_sharing_provider,
  jsonb_array_elements(shares) as s;
```

```sql+sqlite
select
  name as provider_name,
  json_extract(s.value, '$.name') as provider_share_name,
  account_id
from
  databricks_sharing_provider,
  json_each(shares) as s;
```
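The SQLite variant relies on the built-in `json_each` table-valued function to expand the `shares` JSON array into one row per element. A minimal sketch of that expansion, run against an in-memory SQLite database with hypothetical share data (the table and values below are illustrative, not the real Steampipe-backed table):

```python
import sqlite3

# In-memory database standing in for the Steampipe table;
# the shares payload is hypothetical sample data.
conn = sqlite3.connect(":memory:")
conn.execute("create table providers (name text, shares text)")
conn.execute(
    "insert into providers values (?, ?)",
    ("acme_provider", '[{"name": "sales_share"}, {"name": "hr_share"}]'),
)

# json_each yields one row per array element; json_extract pulls the name field,
# mirroring the cross-join pattern in the query above.
rows = conn.execute(
    """
    select p.name, json_extract(s.value, '$.name')
    from providers p, json_each(p.shares) s
    """
).fetchall()
print(rows)  # [('acme_provider', 'sales_share'), ('acme_provider', 'hr_share')]
```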
### Get recipient profile for each shared provider
Explore which shared providers have been configured with token-based authentication. This can help in understanding the distribution and usage of different authentication types across your shared providers.
```sql+postgres
select
  name as share_name,
  recipient_profile ->> 'bearer_token' as bearer_token,
  recipient_profile ->> 'endpoint' as endpoint,
  recipient_profile ->> 'share_credentials_version' as share_credentials_version,
  account_id
from
  databricks_sharing_provider
where
  authentication_type = 'TOKEN';
```

```sql+sqlite
select
  name as share_name,
  json_extract(recipient_profile, '$.bearer_token') as bearer_token,
  json_extract(recipient_profile, '$.endpoint') as endpoint,
  json_extract(recipient_profile, '$.share_credentials_version') as share_credentials_version,
  account_id
from
  databricks_sharing_provider
where
  authentication_type = 'TOKEN';
```
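In the SQLite variant, `json_extract` reads individual fields out of the `recipient_profile` JSON object by path. A small illustration with a made-up profile document (the field values are hypothetical, standing in for a real recipient profile):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical recipient_profile payload mirroring the fields queried above.
profile = (
    '{"bearer_token": "abc123",'
    ' "endpoint": "https://example.com/delta-sharing",'
    ' "share_credentials_version": 1}'
)

# json_extract pulls one field per '$.path' expression.
row = conn.execute(
    "select json_extract(?, '$.endpoint'), json_extract(?, '$.share_credentials_version')",
    (profile, profile),
).fetchone()
print(row)  # ('https://example.com/delta-sharing', 1)
```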
### List owners in order of the number of providers they own
Discover the segments that have the highest number of providers, organized by the owner. This can help assess the distribution and management of providers across different owners.
```sql+postgres
select
  owner,
  count(*) as provider_count
from
  databricks_sharing_provider
group by
  owner
order by
  provider_count desc;
```

```sql+sqlite
select
  owner,
  count(*) as provider_count
from
  databricks_sharing_provider
group by
  owner
order by
  provider_count desc;
```
## Schema for databricks_sharing_provider
| Name | Type | Operators | Description |
|---|---|---|---|
| _ctx | jsonb | | Steampipe context in JSON form, e.g. connection_name. |
| account_id | text | | The Databricks Account ID in which the resource is located. |
| authentication_type | text | | The delta sharing authentication type. |
| cloud | text | | Cloud vendor of the provider's UC metastore. |
| comment | text | | Description about the provider. |
| created_at | timestamp with time zone | | Timestamp when the provider was created. |
| created_by | text | | User who created the provider. |
| data_provider_global_metastore_id | text | = | The global UC metastore id of the data provider. |
| metastore_id | text | | UUID of the provider's UC metastore. |
| name | text | = | Name of the provider. |
| owner | text | | User who owns the provider. |
| recipient_profile | jsonb | | The recipient profile description. |
| recipient_profile_str | text | | The recipient profile as a string. |
| region | text | | Cloud region of the provider's UC metastore. |
| shares | jsonb | | An array of provider's shares. |
| title | text | | The title of the resource. |
| updated_at | timestamp with time zone | | Timestamp when the provider was last updated. |
| updated_by | text | | User who last modified the provider. |
## Export
This table is available as a standalone Exporter CLI. Steampipe exporters are stand-alone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install them with the `steampipe_export_installer.sh` script:
```shell
/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- databricks
```
You can pass the configuration to the command with the `--config` argument:
```shell
steampipe_export_databricks --config '<your_config>' databricks_sharing_provider
```