turbot/terraform

Terraform + Steampipe

A Terraform configuration file is used to declare resources, variables, modules, and more.

Steampipe is an open source CLI to instantly query data using SQL.

The plugin supports scanning Terraform configuration files from various sources (e.g., local files, Git repositories, S3 buckets), as well as parsing Terraform state and plan files.

Get Started

Install

Download and install the latest Terraform plugin:

steampipe plugin install terraform

Configuration

Installing the latest Terraform plugin will create a config file (~/.steampipe/config/terraform.spc) with a single connection named terraform:

connection "terraform" {
plugin = "terraform"
configuration_file_paths = ["*.tf"]
plan_file_paths = ["tfplan.json", "*.tfplan.json"]
state_file_paths = ["*.tfstate"]
}

For a full list of configuration arguments, please see the default configuration file.

Run a Query

Run steampipe:

steampipe query

Query all resources in your Terraform files:

select
  name,
  type,
  jsonb_pretty(arguments) as args
from
  terraform_resource;
> select name, type, jsonb_pretty(arguments) as args from terraform_resource;
+------------+----------------+--------------------------------------------+
| name       | type           | args                                       |
+------------+----------------+--------------------------------------------+
| app_server | aws_instance   | {                                          |
|            |                |     "ami": "ami-830c94e3",                 |
|            |                |     "tags": {                              |
|            |                |         "Name": "ExampleAppServerInstance" |
|            |                |     },                                     |
|            |                |     "instance_type": "t2.micro"            |
|            |                | }                                          |
| app_volume | aws_ebs_volume | {                                          |
|            |                |     "size": 40,                            |
|            |                |     "tags": {                              |
|            |                |         "Name": "HelloWorld"               |
|            |                |     },                                     |
|            |                |     "availability_zone": "us-west-2a"      |
|            |                | }                                          |
| app_bucket | aws_s3_bucket  | {                                          |
|            |                |     "acl": "private",                      |
|            |                |     "tags": {                              |
|            |                |         "Name": "Test bucket",             |
|            |                |         "Environment": "Dev"               |
|            |                |     },                                     |
|            |                |     "bucket": "my-app-bucket"              |
|            |                | }                                          |
+------------+----------------+--------------------------------------------+
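
Since the arguments column is JSONB, you can also filter on individual arguments. A quick sketch based on the sample resources above, listing S3 bucket resources and their acl setting:

select
  name,
  arguments ->> 'acl' as acl
from
  terraform_resource
where
  type = 'aws_s3_bucket';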

Configuring Paths

The plugin requires a list of locations to search for Terraform configuration files. Paths can be local file paths, Git URLs, or S3 URLs.

Note: Local file paths are resolved relative to the current working directory (CWD).

connection "terraform" {
plugin = "terraform"
configuration_file_paths = [
"terraform_test.tf",
"github.com/turbot/steampipe-plugin-aws//aws-test/tests/aws_acm_certificate//variables.tf"
]
}

Paths may include wildcards and support ** for recursive matching. For example:

connection "terraform" {
plugin = "terraform"
configuration_file_paths = [
"*.tf",
"~/*.tf",
"github.com/turbot/steampipe-plugin-aws//aws-test/tests/aws_acm_certificate//*.tf",
"github.com/hashicorp/terraform-guides//infrastructure-as-code//**/*.tf",
"bitbucket.org/benturrell/terraform-arcgis-portal//modules/shared//*.tf",
"gitlab.com/gitlab-org/configure/examples/gitlab-terraform-aws//*.tf",
"s3::https://bucket.s3.us-east-1.amazonaws.com/test_folder//*.tf"
]
}

Note: If any path matches on * without .tf, all files (including non-Terraform configuration files) in the directory will be matched, which may cause errors if incompatible file types exist.

Configuring Local File Paths

You can define a list of local directory paths to search for Terraform configuration files. Paths are resolved relative to the current working directory. For example:

  • *.tf matches all Terraform configuration files in the CWD.
  • **/*.tf matches all Terraform configuration files in the CWD and all sub-directories.
  • ../*.tf matches all Terraform configuration files in the CWD's parent directory.
  • steampipe*.tf matches all Terraform configuration files starting with "steampipe" in the CWD.
  • /path/to/dir/*.tf matches all Terraform configuration files in a specific directory. For example:
    • ~/*.tf matches all Terraform configuration files in the home directory.
    • ~/**/*.tf matches all Terraform configuration files recursively in the home directory.
  • /path/to/dir/main.tf matches a specific file.
connection "terraform" {
plugin = "terraform"
configuration_file_paths = [ "*.tf", "~/*.tf", "/path/to/dir/main.tf" ]
}
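
To verify which files were matched by your configured paths, you can query the file path recorded for each resource. A minimal sketch, assuming the plugin's standard path column:

select distinct
  path
from
  terraform_resource;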

Configuring Remote Git Repository URLs

You can also configure paths using remote Git repository URLs, e.g., GitHub, BitBucket, GitLab. The plugin will then attempt to retrieve any Terraform configuration files from the remote repositories.

For example:

  • github.com/turbot/steampipe-plugin-aws//*.tf matches all top-level Terraform configuration files in the specified repository.
  • github.com/turbot/steampipe-plugin-aws//**/*.tf matches all Terraform configuration files in the specified repository and all subdirectories.
  • github.com/turbot/steampipe-plugin-aws//**/*.tf?ref=fix_7677 matches all Terraform configuration files in the specified branch or tag (ref) of the repository.
  • github.com/turbot/steampipe-plugin-aws//aws-test/tests/aws_acm_certificate//*.tf matches all Terraform configuration files in the specified folder path.

If the example formats above do not work for private repositories, this could be due to git credentials being stored by another tool, e.g., VS Code. An alternative format you can try is:

  • git::ssh://git@github.com/test_org/test_repo//*.tf

You can specify a subdirectory after a double-slash (//) if you want to download only a specific subdirectory from the repository.

connection "terraform" {
plugin = "terraform"
configuration_file_paths = [ "github.com/turbot/steampipe-plugin-aws//aws-test/tests/aws_acm_certificate//*.tf" ]
}

Similarly, you can define a list of GitLab and BitBucket URLs to search for Terraform configuration files:

connection "terraform" {
plugin = "terraform"
configuration_file_paths = [
"github.com/turbot/steampipe-plugin-aws//**/*.tf",
"github.com/hashicorp/terraform-guides//infrastructure-as-code//**/*.tf",
"bitbucket.org/benturrell/terraform-arcgis-portal//modules/shared//*.tf",
"bitbucket.org/benturrell/terraform-arcgis-portal//modules//**/*.tf",
"gitlab.com/gitlab-org/configure/examples/gitlab-terraform-aws//*.tf",
"gitlab.com/gitlab-org/configure/examples/gitlab-terraform-aws//**/*.tf"
]
}

Configuring S3 URLs

You can also query all Terraform configuration files stored inside an S3 bucket (public or private) using the bucket URL.

Accessing a Private Bucket

In order to access your files in a private S3 bucket, you will need to configure your credentials. You can use your configured AWS profile from local ~/.aws/config, or pass the credentials using the standard AWS environment variables, e.g., AWS_PROFILE, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION.

We recommend using AWS profiles for authentication.

Note: Make sure that the region is configured in your AWS config. If it is not set there, the region will be read from the standard AWS_REGION environment variable.

You can also authenticate your request by setting the AWS profile and region in paths. For example:

connection "terraform" {
plugin = "terraform"
configuration_file_paths = [
"s3::https://bucket-2.s3.us-east-1.amazonaws.com//*.tf?aws_profile=<AWS_PROFILE>",
"s3::https://bucket-2.s3.us-east-1.amazonaws.com/test_folder//*.tf?aws_profile=<AWS_PROFILE>"
]
}

Note: In order to access the bucket, the IAM user or role will require the following IAM permissions:

  • s3:ListBucket
  • s3:GetObject
  • s3:GetObjectVersion

If the bucket is in another AWS account, the bucket policy will need to grant access to your user or role. For example:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadBucketObject",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:user/YOUR_USER"
      },
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:GetObjectVersion"],
      "Resource": ["arn:aws:s3:::test-bucket1", "arn:aws:s3:::test-bucket1/*"]
    }
  ]
}

Accessing a Public Bucket

Public access granted to buckets and objects through ACLs and bucket policies allows any user access to data in the bucket. We do not recommend making S3 buckets public, but if there are specific objects you'd like to make public, please see How can I grant public read access to some objects in my Amazon S3 bucket?.

You can query any public S3 bucket directly using the URL without passing credentials. For example:

connection "terraform" {
plugin = "terraform"
configuration_file_paths = [
"s3::https://bucket-1.s3.us-east-1.amazonaws.com/test_folder//*.tf",
"s3::https://bucket-2.s3.us-east-1.amazonaws.com/test_folder//**/*.tf"
]
}

Scanning Terraform Plans

The plugin supports scanning Terraform plans in JSON format and allows users to query them using Steampipe.

Note: The plugin only scans resources from the Terraform plan. Tables will return details of the resources as they will be after the plan has been applied.

To get the Terraform plan in JSON format, follow the steps below:

  • Run terraform plan with the -out flag to store the generated plan in the given file. Terraform allows any filename for the plan file, but a typical convention is to name it tfplan.
terraform plan -out=tfplan
  • Run the terraform show command with the -json flag to get the plan in JSON format, and store the output in a file.
terraform show -json tfplan > tfplan.json
  • Finally, add the path of tfplan.json to the plan_file_paths argument in the config so Steampipe can read the plan.
connection "terraform" {
plugin = "terraform"
plan_file_paths = [
"/path/to/tfplan.json",
"github.com/turbot/steampipe-plugin-aws//aws-test/tests/plan_files//tfplan.json",
"s3::https://bucket-1.s3.us-east-1.amazonaws.com/test_plan//*.json"
]
}
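
Once the plan file path is configured, the planned resources appear in the same tables as configuration files. A minimal sketch, assuming the plugin's standard path column, to list resources parsed from plan files:

select
  name,
  type,
  path
from
  terraform_resource
where
  path like '%tfplan.json';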

Scanning Terraform State

The plugin supports scanning Terraform state files and allows users to query them using Steampipe.

Note: The plugin only scans the outputs and resources from the Terraform state.

To get the Terraform state file, follow the steps below:

  • Run terraform apply to automatically generate the state file terraform.tfstate.
terraform apply
  • Add the path of the file terraform.tfstate to the state_file_paths argument in the config to read the state using Steampipe.
connection "terraform" {
plugin = "terraform"
state_file_paths = [
"terraform.tfstate",
"github.com/turbot/steampipe-plugin-aws//aws-test/tests/state_files//terraform.tfstate",
"s3::https://bucket-1.s3.us-east-1.amazonaws.com/state_files//*.tfstate"
]
}
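
With the state file path configured, you can then query the outputs recorded in the state. A minimal sketch, assuming the terraform_output table exposes name and value columns:

select
  name,
  value
from
  terraform_output;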

Postgres FDW

This plugin is available as a native Postgres FDW. Unlike the Steampipe CLI, which ships with an embedded Postgres server instance, the Postgres FDW can be installed in any supported Postgres database version.

You can download the tarball for your platform from the Releases page, but it is simplest to install it with the steampipe_postgres_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/postgres.sh)" -- terraform

The installer will prompt you for the plugin name and version, then download and install the appropriate files for your OS, system architecture, and Postgres version.

To configure the Postgres FDW, you will create an extension, foreign server, and schema, then import the foreign schema.

CREATE EXTENSION IF NOT EXISTS steampipe_postgres_terraform;
CREATE SERVER steampipe_terraform FOREIGN DATA WRAPPER steampipe_postgres_terraform OPTIONS (config '<your_config>');
CREATE SCHEMA terraform;
IMPORT FOREIGN SCHEMA terraform FROM SERVER steampipe_terraform INTO terraform;
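
Once the foreign schema is imported, the plugin's tables can be queried like ordinary Postgres tables through the terraform schema created above. For example:

SELECT name, type FROM terraform.terraform_resource;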

SQLite Extension

This plugin is available as a SQLite Extension, making the tables available as SQLite virtual tables.

You can download the tarball for your platform from the Releases page, but it is simplest to install it with the steampipe_sqlite_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/sqlite.sh)" -- terraform

The installer will prompt you for the plugin name, version, and destination directory. It will then determine the OS and system architecture, and it will download and install the appropriate package.

To configure the SQLite extension, load the extension module and then run the steampipe_configure_terraform function to configure it with plugin-specific options.

$ sqlite3
sqlite> .load ./steampipe_sqlite_extension_terraform.so
sqlite> select steampipe_configure_terraform('<your_config>');
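
After loading the extension and configuring it, the virtual tables can be queried directly in the same session. For example:

sqlite> select name, type from terraform_resource;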

Export

This plugin is available as a standalone Export CLI. Steampipe exporters are standalone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install it with the steampipe_export_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- terraform

You can pass the configuration to the command with the --config argument:

steampipe_export_terraform --config '<your_config>' <table_name>