steampipe plugin install aws

Table: aws_pipes_pipe - Query AWS Pipes using SQL

AWS Pipes (Amazon EventBridge Pipes) is a service that connects event sources to targets, reducing the need for custom integration code in event-driven architectures. A pipe reads events from a supported source, optionally filters, enriches, and transforms them, and then delivers them to a target such as a Lambda function, an ECS task, or another AWS service.

Table Usage Guide

The aws_pipes_pipe table in Steampipe provides you with information about individual pipes in Amazon EventBridge Pipes. This table allows you, as a DevOps engineer, to query pipe-specific details, including the pipe name, ARN, creation time, current and desired state, and the source, enrichment, and target configuration. You can utilize this table to gather insights on pipes, such as pipe states, IAM roles, and parameter settings. The schema outlines the various attributes of the pipe for you.
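
For instance, a quick way to get an overview before diving into the examples below is to group pipes by state. This is a minimal PostgreSQL sketch using only the current_state column from the schema; adapt it to your own reporting needs.

select
  current_state,
  count(*)
from
  aws_pipes_pipe
group by
  current_state;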

Examples

Basic info

Review the current state of your AWS pipes, when they were created, and the IAM role each one uses. This information is useful for auditing purposes, helping to track the lifecycle and usage of each pipe within your AWS infrastructure.

-- PostgreSQL
select
  name,
  arn,
  current_state,
  creation_time,
  role_arn
from
  aws_pipes_pipe;

-- SQLite
select
  name,
  arn,
  current_state,
  creation_time,
  role_arn
from
  aws_pipes_pipe;

List pipes that are not in desired state

Identify instances where the current state of your AWS pipes does not match the desired state. This can be useful for troubleshooting or identifying potential issues within your AWS environment.

-- PostgreSQL
select
  name,
  arn,
  description,
  creation_time,
  current_state,
  desired_state
from
  aws_pipes_pipe
where
  desired_state <> current_state;

-- SQLite
select
  name,
  arn,
  description,
  creation_time,
  current_state,
  desired_state
from
  aws_pipes_pipe
where
  desired_state != current_state;

Get the target parameters information for each pipe

Explore the target parameters configured for each pipe to understand the specific settings applied to each target type. This can be useful in identifying potential bottlenecks or areas for improvement in your data processing.

-- PostgreSQL
select
  name,
  target_parameters ->> 'BatchJobParameters' as batch_job_parameters,
  target_parameters ->> 'CloudWatchLogsParameters' as cloudwatch_logs_parameters,
  target_parameters ->> 'EcsTaskParameters' as ecs_task_parameters,
  target_parameters ->> 'EventBridgeEventBusParameters' as eventbridge_event_bus_parameters,
  target_parameters ->> 'HttpParameters' as http_parameters,
  target_parameters ->> 'InputTemplate' as input_template,
  target_parameters ->> 'KinesisStreamParameters' as kinesis_stream_parameters,
  target_parameters ->> 'LambdaFunctionParameters' as lambda_function_parameters,
  target_parameters ->> 'RedshiftDataParameters' as redshift_data_parameters,
  target_parameters ->> 'SageMakerPipelineParameters' as sage_maker_pipeline_parameters,
  target_parameters ->> 'SqsQueueParameters' as sqs_queue_parameters,
  target_parameters ->> 'StepFunctionStateMachineParameters' as step_function_state_machine_parameters
from
  aws_pipes_pipe;

-- SQLite
select
  name,
  json_extract(target_parameters, '$.BatchJobParameters') as batch_job_parameters,
  json_extract(target_parameters, '$.CloudWatchLogsParameters') as cloudwatch_logs_parameters,
  json_extract(target_parameters, '$.EcsTaskParameters') as ecs_task_parameters,
  json_extract(target_parameters, '$.EventBridgeEventBusParameters') as eventbridge_event_bus_parameters,
  json_extract(target_parameters, '$.HttpParameters') as http_parameters,
  json_extract(target_parameters, '$.InputTemplate') as input_template,
  json_extract(target_parameters, '$.KinesisStreamParameters') as kinesis_stream_parameters,
  json_extract(target_parameters, '$.LambdaFunctionParameters') as lambda_function_parameters,
  json_extract(target_parameters, '$.RedshiftDataParameters') as redshift_data_parameters,
  json_extract(target_parameters, '$.SageMakerPipelineParameters') as sage_maker_pipeline_parameters,
  json_extract(target_parameters, '$.SqsQueueParameters') as sqs_queue_parameters,
  json_extract(target_parameters, '$.StepFunctionStateMachineParameters') as step_function_state_machine_parameters
from
  aws_pipes_pipe;

Get enrichment parameter details for each pipe

Determine the enrichment parameters configured for each pipe, providing insights into how each pipe enhances the data flowing from its source to its target. This can help you optimize data processing and overall system performance.

-- PostgreSQL
select
  name,
  enrichment_parameters ->> 'HttpParameters' as http_parameters,
  enrichment_parameters ->> 'InputTemplate' as input_template
from
  aws_pipes_pipe;

-- SQLite
select
  name,
  json_extract(enrichment_parameters, '$.HttpParameters') as http_parameters,
  json_extract(enrichment_parameters, '$.InputTemplate') as input_template
from
  aws_pipes_pipe;

List pipes created within the last 30 days

Identify pipes that have been created within the past month. This can be useful for tracking recent changes or additions to your AWS resources.

-- PostgreSQL
select
  name,
  creation_time,
  current_state,
  desired_state,
  enrichment,
  target
from
  aws_pipes_pipe
where
  creation_time >= now() - interval '30' day;

-- SQLite
select
  name,
  creation_time,
  current_state,
  desired_state,
  enrichment,
  target
from
  aws_pipes_pipe
where
  creation_time >= datetime('now', '-30 day');

Get IAM role details for pipes

This query is useful for identifying the IAM role associated with each of your pipes. It provides insights into the permissions, boundaries, and recent usage of these roles, helping you manage security and access controls more effectively.

-- PostgreSQL
select
  p.name,
  r.arn as role_arn,
  r.role_id,
  r.permissions_boundary_arn,
  r.role_last_used_region,
  r.inline_policies,
  r.assume_role_policy
from
  aws_pipes_pipe as p,
  aws_iam_role as r
where
  p.role_arn = r.arn;

-- SQLite
select
  p.name,
  r.arn as role_arn,
  r.role_id,
  r.permissions_boundary_arn,
  r.role_last_used_region,
  r.inline_policies,
  r.assume_role_policy
from
  aws_pipes_pipe as p,
  aws_iam_role as r
where
  p.role_arn = r.arn;

Schema for aws_pipes_pipe

Name | Type | Operators | Description
_ctx | jsonb | | Steampipe context in JSON form.
account_id | text | =, !=, ~~, ~~*, !~~, !~~* | The AWS Account ID in which the resource is located.
akas | jsonb | | Array of globally unique identifier strings (also known as) for the resource.
arn | text | | The Amazon Resource Name (ARN) of the pipe.
creation_time | timestamp with time zone | | The time the pipe was created.
current_state | text | = | The state the pipe is in.
description | text | | A description of the pipe.
desired_state | text | = | The state the pipe should be in.
enrichment | text | | The ARN of the enrichment resource.
enrichment_parameters | jsonb | | The parameters required to set up enrichment on your pipe.
last_modified_time | timestamp with time zone | | When the pipe was last updated.
log_configuration | jsonb | | The logging configuration settings for the pipe.
name | text | = | The name of the pipe.
partition | text | | The AWS partition in which the resource is located (aws, aws-cn, or aws-us-gov).
region | text | | The AWS Region in which the resource is located.
role_arn | text | | The ARN of the role that allows the pipe to send data to the target.
source | text | | The ARN of the source resource.
source_parameters | jsonb | | The parameters required to set up a source for your pipe.
source_prefix | text | = | The prefix matching the pipe source.
sp_connection_name | text | =, !=, ~~, ~~*, !~~, !~~* | Steampipe connection name.
sp_ctx | jsonb | | Steampipe context in JSON form.
state_reason | text | | The reason the pipe is in its current state.
tags | jsonb | | A map of tags for the resource.
target | text | | The ARN of the target resource.
target_parameters | jsonb | | The parameters required to set up a target for your pipe.
target_prefix | text | = | The prefix matching the pipe target.
title | text | | Title of the resource.
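
The Operators column above shows which qualifiers Steampipe can pass down when querying the table. As a small illustrative sketch, columns such as name that support the = operator can be filtered on directly; the pipe name below is a placeholder, not a value from this document.

select
  name,
  arn,
  current_state,
  tags
from
  aws_pipes_pipe
where
  name = 'my-pipe'; -- placeholder pipe name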

Export

This table is available as a standalone Exporter CLI. Steampipe exporters are stand-alone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install them with the steampipe_export_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- aws

You can pass the configuration to the command with the --config argument:

steampipe_export_aws --config '<your_config>' aws_pipes_pipe
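
As a further usage sketch, you can narrow the export to specific columns and rows. The --select and --where flags below are assumptions about the exporter CLI rather than something documented on this page; confirm the exact flag names with steampipe_export_aws --help.

steampipe_export_aws aws_pipes_pipe --select "name,current_state,creation_time" --where "current_state = 'RUNNING'"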