Table: aws_pipes_pipe - Query AWS Pipes using SQL
AWS Pipes (Amazon EventBridge Pipes) is a service for building point-to-point integrations between event producers and consumers. A pipe connects a source (such as an SQS queue, Kinesis stream, or DynamoDB stream) to a target (such as a Lambda function or Step Functions state machine), with optional filtering, enrichment, and input transformation steps in between, removing the need to write integration code.
Table Usage Guide
The aws_pipes_pipe table in Steampipe provides you with information about individual pipes within AWS Pipes. This table allows you, as a DevOps engineer, to query pipe-specific details, including the pipe name, ARN, creation time, current and desired state, and the source, enrichment, and target resources. You can utilize this table to gather insights on pipes, such as their status, configuration, and the IAM role they use. The schema outlines the various attributes of the pipe for you.
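For example, if you already know the pipe you are interested in, you can look it up directly by name. The query below is a minimal sketch that uses a hypothetical pipe name; since the name column supports the = operator (see the schema below), Steampipe can pass the filter through to the AWS API.

```sql
select
  name,
  arn,
  current_state,
  role_arn
from
  aws_pipes_pipe
where
  -- 'my-pipe' is a hypothetical name; replace it with one of your own pipes.
  name = 'my-pipe';
```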
Examples
Basic info
Review the current state of your AWS pipes, when they were created, and the IAM role each pipe uses. This information is useful for auditing, helping you track the lifecycle and usage of each pipe within your AWS infrastructure.
```sql+postgres
select
  name,
  arn,
  current_state,
  creation_time,
  role_arn
from
  aws_pipes_pipe;
```

```sql+sqlite
select
  name,
  arn,
  current_state,
  creation_time,
  role_arn
from
  aws_pipes_pipe;
```
List pipes that are not in desired state
Identify instances where the current state of your AWS pipes does not match the desired state. This can be useful for troubleshooting or identifying potential issues within your AWS environment.
```sql+postgres
select
  name,
  arn,
  description,
  creation_time,
  current_state,
  desired_state
from
  aws_pipes_pipe
where
  desired_state <> current_state;
```

```sql+sqlite
select
  name,
  arn,
  description,
  creation_time,
  current_state,
  desired_state
from
  aws_pipes_pipe
where
  desired_state != current_state;
```
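As a follow-on sketch, you can also summarize your pipes by state, which makes stopped or failed pipes easy to spot. The same syntax works in both PostgreSQL and SQLite.

```sql
select
  current_state,
  count(*) as pipe_count
from
  aws_pipes_pipe
group by
  current_state;
```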
Get the target parameters information for each pipe
Explore the different types of target parameters configured for each pipe, which can help in understanding the specific configurations and settings associated with each target. This can be useful in identifying potential bottlenecks or areas for improvement in your data processing.
```sql+postgres
select
  name,
  target_parameters ->> 'BatchJobParameters' as batch_job_parameters,
  target_parameters ->> 'CloudWatchLogsParameters' as cloudwatch_logs_parameters,
  target_parameters ->> 'EcsTaskParameters' as ecs_task_parameters,
  target_parameters ->> 'EventBridgeEventBusParameters' as eventbridge_event_bus_parameters,
  target_parameters ->> 'HttpParameters' as http_parameters,
  target_parameters ->> 'InputTemplate' as input_template,
  target_parameters ->> 'KinesisStreamParameters' as kinesis_stream_parameters,
  target_parameters ->> 'LambdaFunctionParameters' as lambda_function_parameters,
  target_parameters ->> 'RedshiftDataParameters' as redshift_data_parameters,
  target_parameters ->> 'SageMakerPipelineParameters' as sage_maker_pipeline_parameters,
  target_parameters ->> 'SqsQueueParameters' as sqs_queue_parameters,
  target_parameters ->> 'StepFunctionStateMachineParameters' as step_function_state_machine_parameters
from
  aws_pipes_pipe;
```

```sql+sqlite
select
  name,
  json_extract(target_parameters, '$.BatchJobParameters') as batch_job_parameters,
  json_extract(target_parameters, '$.CloudWatchLogsParameters') as cloudwatch_logs_parameters,
  json_extract(target_parameters, '$.EcsTaskParameters') as ecs_task_parameters,
  json_extract(target_parameters, '$.EventBridgeEventBusParameters') as eventbridge_event_bus_parameters,
  json_extract(target_parameters, '$.HttpParameters') as http_parameters,
  json_extract(target_parameters, '$.InputTemplate') as input_template,
  json_extract(target_parameters, '$.KinesisStreamParameters') as kinesis_stream_parameters,
  json_extract(target_parameters, '$.LambdaFunctionParameters') as lambda_function_parameters,
  json_extract(target_parameters, '$.RedshiftDataParameters') as redshift_data_parameters,
  json_extract(target_parameters, '$.SageMakerPipelineParameters') as sage_maker_pipeline_parameters,
  json_extract(target_parameters, '$.SqsQueueParameters') as sqs_queue_parameters,
  json_extract(target_parameters, '$.StepFunctionStateMachineParameters') as step_function_state_machine_parameters
from
  aws_pipes_pipe;
```
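As an illustration, you can combine the target ARN with these parameters to focus on a single target type. The sketch below assumes Lambda targets use the standard arn:aws:lambda: ARN prefix and works in both PostgreSQL and SQLite.

```sql
select
  name,
  target,
  current_state
from
  aws_pipes_pipe
where
  -- Match pipes whose target is a Lambda function by its ARN prefix.
  target like 'arn:aws:lambda:%';
```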
Get enrichment parameter details for each pipe
Determine the enrichment parameters configured for each pipe, providing insight into how each pipe enriches events before they reach the target. This can be useful in optimizing data processing and overall system performance.
```sql+postgres
select
  name,
  enrichment_parameters ->> 'HttpParameters' as http_parameters,
  enrichment_parameters ->> 'InputTemplate' as input_template
from
  aws_pipes_pipe;
```

```sql+sqlite
select
  name,
  json_extract(enrichment_parameters, '$.HttpParameters') as http_parameters,
  json_extract(enrichment_parameters, '$.InputTemplate') as input_template
from
  aws_pipes_pipe;
```
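If you only care about pipes that actually have an enrichment step configured, a simple null check on the enrichment ARN column (a sketch that works in both dialects) narrows the results.

```sql
select
  name,
  enrichment,
  current_state
from
  aws_pipes_pipe
where
  -- Keep only pipes with an enrichment resource configured.
  enrichment is not null;
```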
List pipes created within the last 30 days
Identify pipes that have been created within the past month. This can be useful for tracking recent changes or additions to your AWS resources.
```sql+postgres
select
  name,
  creation_time,
  current_state,
  desired_state,
  enrichment,
  target
from
  aws_pipes_pipe
where
  creation_time >= now() - interval '30' day;
```

```sql+sqlite
select
  name,
  creation_time,
  current_state,
  desired_state,
  enrichment,
  target
from
  aws_pipes_pipe
where
  creation_time >= datetime('now', '-30 day');
```
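A related sketch, shown here in PostgreSQL syntax, finds pipes that have not been updated recently by filtering on last_modified_time; the SQLite version follows the same datetime('now', ...) pattern as above.

```sql+postgres
select
  name,
  creation_time,
  last_modified_time,
  current_state
from
  aws_pipes_pipe
where
  -- Pipes not modified in the last 90 days.
  last_modified_time < now() - interval '90' day;
```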
Get IAM role details for pipes
This query is useful for identifying the IAM role associated with each of your AWS pipes. It provides insights into the role's permissions boundary, inline policies, trust policy, and recent usage, helping you manage security and access controls more effectively.
```sql+postgres
select
  p.name,
  r.arn as role_arn,
  r.role_id,
  r.permissions_boundary_arn,
  r.role_last_used_region,
  r.inline_policies,
  r.assume_role_policy
from
  aws_pipes_pipe as p,
  aws_iam_role as r
where
  p.role_arn = r.arn;
```

```sql+sqlite
select
  p.name,
  r.arn as role_arn,
  r.role_id,
  r.permissions_boundary_arn,
  r.role_last_used_region,
  r.inline_policies,
  r.assume_role_policy
from
  aws_pipes_pipe as p,
  aws_iam_role as r
where
  p.role_arn = r.arn;
```
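Building on this join, the following sketch flags pipes whose execution role has no permissions boundary attached, which you may want to review as part of an access audit. The same syntax works in both PostgreSQL and SQLite.

```sql
select
  p.name,
  r.arn as role_arn
from
  aws_pipes_pipe as p
  join aws_iam_role as r on p.role_arn = r.arn
where
  -- Roles without a permissions boundary.
  r.permissions_boundary_arn is null;
```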
Schema for aws_pipes_pipe
| Name | Type | Operators | Description |
|------|------|-----------|-------------|
| _ctx | jsonb | | Steampipe context in JSON form. |
| account_id | text | =, !=, ~~, ~~*, !~~, !~~* | The AWS Account ID in which the resource is located. |
| akas | jsonb | | Array of globally unique identifier strings (also known as) for the resource. |
| arn | text | | The Amazon Resource Name (ARN) of the pipe. |
| creation_time | timestamp with time zone | | The time the pipe was created. |
| current_state | text | = | The state the pipe is in. |
| description | text | | A description of the pipe. |
| desired_state | text | = | The state the pipe should be in. |
| enrichment | text | | The ARN of the enrichment resource. |
| enrichment_parameters | jsonb | | The parameters required to set up enrichment on your pipe. |
| last_modified_time | timestamp with time zone | | When the pipe was last updated. |
| log_configuration | jsonb | | The logging configuration settings for the pipe. |
| name | text | = | The name of the pipe. |
| partition | text | | The AWS partition in which the resource is located (aws, aws-cn, or aws-us-gov). |
| region | text | | The AWS Region in which the resource is located. |
| role_arn | text | | The ARN of the role that allows the pipe to send data to the target. |
| source | text | | The ARN of the source resource. |
| source_parameters | jsonb | | The parameters required to set up a source for your pipe. |
| source_prefix | text | = | The prefix matching the pipe source. |
| sp_connection_name | text | =, !=, ~~, ~~*, !~~, !~~* | Steampipe connection name. |
| sp_ctx | jsonb | | Steampipe context in JSON form. |
| state_reason | text | | The reason the pipe is in its current state. |
| tags | jsonb | | A map of tags for the resource. |
| target | text | | The ARN of the target resource. |
| target_parameters | jsonb | | The parameters required to set up a target for your pipe. |
| target_prefix | text | = | The prefix matching the pipe target. |
| title | text | | Title of the resource. |
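The Operators column lists the qualifiers that can be used against each key column. For example, source_prefix supports =; assuming it maps to the ListPipes SourcePrefix filter, the sketch below restricts results to pipes whose source ARN starts with a given prefix.

```sql
select
  name,
  source,
  target,
  current_state
from
  aws_pipes_pipe
where
  -- Illustrative prefix; matches pipes whose source ARN begins with 'arn:aws:sqs'.
  source_prefix = 'arn:aws:sqs';
```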
Export
This table is also available through the standalone Steampipe Exporter CLI. Steampipe exporters are stand-alone binaries that allow you to extract data using Steampipe plugins without a database.
You can download the tarball for your platform from the Releases page, but it is simplest to install it with the steampipe_export_installer.sh script:

```sh
/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- aws
```
You can pass the configuration to the command with the --config argument:

```sh
steampipe_export_aws --config '<your_config>' aws_pipes_pipe
```
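As an illustration, and assuming --config accepts the same HCL arguments as the AWS plugin's connection configuration (such as regions), an invocation limited to a single region might look like this:

```sh
# Assumption: the exporter accepts AWS plugin connection arguments via --config.
steampipe_export_aws --config 'regions = ["us-east-1"]' aws_pipes_pipe
```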