steampipe plugin install gcp

Table: gcp_bigquery_table - Query BigQuery Tables using SQL

BigQuery is a fully-managed, serverless data warehouse that enables super-fast SQL queries using the processing power of Google's infrastructure. It allows you to analyze large datasets in real-time with a SQL-like syntax, without the need to manage the underlying infrastructure. BigQuery is designed to be not only fast, but also easy to use, and cost-effective.

Table Usage Guide

The gcp_bigquery_table table provides insights into BigQuery Tables within Google Cloud Platform (GCP). As a data analyst or data engineer, explore table-specific details through this table, including creation time, expiration time, labels, and more. Utilize it to uncover information about tables, such as those with specific labels, the status of tables, and the verification of table metadata.

Examples

Basic info

Discover the segments that are part of your Google Cloud BigQuery data, including their creation time and geographical location. This can aid in understanding the distribution and age of your data across different regions.

select
  table_id,
  dataset_id,
  self_link,
  creation_time,
  location
from
  gcp_bigquery_table;

List tables that are not encrypted using CMK

Discover the segments that are potentially vulnerable due to the absence of encryption using a Customer-Managed Key (CMK). This aids in identifying areas that require enhanced security measures.

select
  table_id,
  dataset_id,
  location,
  kms_key_name
from
  gcp_bigquery_table
where
  kms_key_name is null;

List tables that do not have an owner tag key

Discover the segments that lack an owner tag key within your Google Cloud Platform's BigQuery datasets. This can be beneficial for identifying potential areas of unclaimed resources or orphaned datasets that need attention.

PostgreSQL:

select
  dataset_id,
  location
from
  gcp_bigquery_table
where
  tags -> 'owner' is null;

SQLite:

select
  dataset_id,
  location
from
  gcp_bigquery_table
where
  json_extract(tags, '$.owner') is null;
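
The expiration_time column can be used in the same way to spot tables that are scheduled to disappear. A sketch (PostgreSQL syntax; assumes a working gcp connection, and the 30-day window is an arbitrary illustrative choice):

```sql
select
  table_id,
  dataset_id,
  expiration_time
from
  gcp_bigquery_table
where
  expiration_time is not null
  and expiration_time < now() + interval '30 days';
```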

Schema for gcp_bigquery_table

Name | Type | Operators | Description
---- | ---- | --------- | -----------
_ctx | jsonb | | Steampipe context in JSON form.
akas | jsonb | | Array of globally unique identifier strings (also known as) for the resource.
clustering_fields | jsonb | | One or more fields on which data should be clustered.
creation_time | timestamp with time zone | | The time when this table was created, in milliseconds since the epoch.
dataset_id | text | = | The ID of the dataset containing this table.
description | text | | A user-friendly description of this table.
etag | text | | A hash of the table metadata, used to ensure there were no concurrent modifications to the resource when attempting an update.
expiration_time | timestamp with time zone | | The time when this table expires, in milliseconds since the epoch. If not present, the table will persist indefinitely. Expired tables will be deleted and their storage reclaimed. The defaultTableExpirationMs property of the encapsulating dataset can be used to set a default expirationTime on newly created tables.
external_data_configuration | jsonb | | Describes the data format, location, and other properties of a table stored outside of BigQuery.
id | text | | An opaque ID uniquely identifying the table.
kind | text | | The type of resource ID.
kms_key_name | text | | Describes the Cloud KMS encryption key that will be used to protect the destination BigQuery table.
labels | jsonb | | The labels associated with this table. You can use these to organize and group your tables. Label keys and values can be no longer than 63 characters, can only contain lowercase letters, numeric characters, underscores and dashes. International characters are allowed. Label values are optional. Label keys must start with a letter and each label in the list must have a different key.
last_modified_time | timestamp with time zone | | The time when this table was last modified.
location | text | | The GCP multi-region, region, or zone in which the resource is located.
materialized_view | jsonb | | Describes the materialized view definition.
num_bytes | bigint | | The size of this table in bytes, excluding any data in the streaming buffer.
num_long_term_bytes | bigint | | The number of bytes in the table that are considered 'long-term storage'.
num_physical_bytes | bigint | | The physical size of this table in bytes, excluding any data in the streaming buffer.
num_rows | bigint | | The number of rows of data in this table, excluding any data in the streaming buffer.
project | text | =, !=, ~~, ~~*, !~~, !~~* | The GCP Project in which the resource is located.
range_partitioning | jsonb | | If specified, configures range partitioning for this table.
require_partition_filter | boolean | | If set to true, queries over this table require a partition filter that can be used for partition elimination to be specified.
schema_fields | jsonb | | Describes the fields in a table.
self_link | text | | A URL that can be used to access this resource again.
snapshot_time | timestamp with time zone | | The time at which the base table was snapshotted.
sp_connection_name | text | =, !=, ~~, ~~*, !~~, !~~* | Steampipe connection name.
sp_ctx | jsonb | | Steampipe context in JSON form.
streaming_buffer | jsonb | | Contains information regarding this table's streaming buffer, if one is present.
table_id | text | = | The ID of the table resource.
tags | jsonb | | A map of tags for the resource.
time_partitioning | jsonb | | If specified, configures time-based partitioning for this table.
title | text | | Title of the resource.
type | text | | The type of table. Possible values are: TABLE, VIEW.
view | jsonb | | Additional details for a view.
view_query | text | | A query that BigQuery executes when the view is referenced.
view_use_legacy_sql | boolean | | True if the view is defined in legacy SQL dialect, false if in standard SQL.
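
The num_bytes and num_rows columns above lend themselves to quick capacity checks. As a sketch (PostgreSQL syntax; assumes a configured gcp connection), the following lists the ten largest tables; pg_size_pretty is a standard PostgreSQL function for human-readable byte counts:

```sql
select
  table_id,
  dataset_id,
  num_rows,
  pg_size_pretty(num_bytes) as size
from
  gcp_bigquery_table
order by
  num_bytes desc nulls last
limit 10;
```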

Export

This table is available as a standalone Exporter CLI. Steampipe exporters are stand-alone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install them with the steampipe_export_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- gcp

You can pass the configuration to the command with the --config argument:

steampipe_export_gcp --config '<your_config>' gcp_bigquery_table
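
For example, the GCP project can be set inline in the config argument ("my-gcp-project" is an illustrative placeholder; see the GCP plugin's connection documentation for the full list of config arguments):

```sh
steampipe_export_gcp --config 'project = "my-gcp-project"' gcp_bigquery_table
```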