steampipe plugin install aws

Table: aws_cloudtrail_import - Query AWS CloudTrail using SQL

AWS CloudTrail is a service that enables governance, compliance, operational auditing, and risk auditing of your AWS account. It allows you to log, continuously monitor, and retain account activity related to actions across your AWS infrastructure. CloudTrail provides event history of your AWS account activity, including actions taken through the AWS Management Console, AWS SDKs, command line tools, and other AWS services.

Table Usage Guide

The aws_cloudtrail_import table in Steampipe provides you with information about imported trail files within AWS CloudTrail. This table allows you, as a DevOps engineer, to query import-specific details, including the file name, import time, hash value, and more. You can utilize this table to gather insights on imported trail files, such as their import status, hash type, and hash value. The schema outlines the various attributes of the imported trail file for you, including the import ID, import time, file name, and associated metadata.

Examples

Basic info

Explore which AWS CloudTrail imports have been created, their current status, and the event data stores they deliver to. This helps you assess the flow of imported trail data and confirm it is reaching the intended destinations.

PostgreSQL:

select
  import_id,
  created_timestamp,
  import_status,
  destinations
from
  aws_cloudtrail_import;

SQLite:

select
  import_id,
  created_timestamp,
  import_status,
  destinations
from
  aws_cloudtrail_import;

List imports that are not completed

Identify instances where CloudTrail imports are still in progress. This is useful for tracking the progress of data import tasks and identifying any potential issues or delays.

PostgreSQL:

select
  import_id,
  created_timestamp,
  import_source
from
  aws_cloudtrail_import
where
  import_status <> 'COMPLETED';

SQLite:

select
  import_id,
  created_timestamp,
  import_source
from
  aws_cloudtrail_import
where
  import_status != 'COMPLETED';

List imports created in the last 30 days

Identify imports created within the last 30 days to track their status and the event time window they cover. This is useful for understanding recent import activity.

PostgreSQL:

select
  import_id,
  created_timestamp,
  import_status,
  start_event_time,
  end_event_time
from
  aws_cloudtrail_import
where
  created_timestamp >= now() - interval '30' day;

SQLite:

select
  import_id,
  created_timestamp,
  import_status,
  start_event_time,
  end_event_time
from
  aws_cloudtrail_import
where
  created_timestamp >= datetime('now', '-30 day');
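
If you also want to gauge how long each recent import took, a rough duration can be derived from the created and updated timestamps. This is a minimal sketch in the PostgreSQL dialect, assuming updated_timestamp reflects the import's most recent state change:

select
  import_id,
  import_status,
  created_timestamp,
  updated_timestamp,
  -- assumes updated_timestamp marks the last state change of the import
  updated_timestamp - created_timestamp as approximate_duration
from
  aws_cloudtrail_import
where
  created_timestamp >= now() - interval '30' day;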

Get import source details of each import

Identify the origins of each import by examining the access role, region, and URI of the S3 bucket used. This can be useful for auditing purposes or to troubleshoot issues related to specific imports.

PostgreSQL:

select
  import_id,
  import_status,
  import_source ->> 'S3BucketAccessRoleArn' as s3_bucket_access_role_arn,
  import_source ->> 'S3BucketRegion' as s3_bucket_region,
  import_source ->> 'S3LocationUri' as s3_location_uri
from
  aws_cloudtrail_import;

SQLite:

select
  import_id,
  import_status,
  json_extract(import_source, '$.S3BucketAccessRoleArn') as s3_bucket_access_role_arn,
  json_extract(import_source, '$.S3BucketRegion') as s3_bucket_region,
  json_extract(import_source, '$.S3LocationUri') as s3_location_uri
from
  aws_cloudtrail_import;
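
To see how imports are spread across source buckets, you can aggregate on the same JSON fields. A minimal sketch in the PostgreSQL dialect:

select
  -- group imports by the region of their source S3 bucket
  import_source ->> 'S3BucketRegion' as s3_bucket_region,
  count(*) as import_count
from
  aws_cloudtrail_import
group by
  import_source ->> 'S3BucketRegion';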

Get import statistics of each import

Gain insights into the performance of each import operation by assessing the number of completed events, failed entries, and completed files. This is useful for monitoring the efficiency and reliability of data import processes.

PostgreSQL:

select
  import_id,
  import_status,
  import_statistics -> 'EventsCompleted' as events_completed,
  import_statistics -> 'FailedEntries' as failed_entries,
  import_statistics -> 'FilesCompleted' as files_completed,
  import_statistics -> 'PrefixesCompleted' as prefixes_completed,
  import_statistics -> 'PrefixesFound' as prefixes_found
from
  aws_cloudtrail_import;

SQLite:

select
  import_id,
  import_status,
  json_extract(import_statistics, '$.EventsCompleted') as events_completed,
  json_extract(import_statistics, '$.FailedEntries') as failed_entries,
  json_extract(import_statistics, '$.FilesCompleted') as files_completed,
  json_extract(import_statistics, '$.PrefixesCompleted') as prefixes_completed,
  json_extract(import_statistics, '$.PrefixesFound') as prefixes_found
from
  aws_cloudtrail_import;
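
To surface only the imports that reported failures, you can filter on the FailedEntries counter. A minimal sketch in the PostgreSQL dialect, assuming FailedEntries holds a numeric value:

select
  import_id,
  import_status,
  import_statistics -> 'FailedEntries' as failed_entries
from
  aws_cloudtrail_import
where
  -- cast the JSON text value to an integer; assumes FailedEntries is numeric
  (import_statistics ->> 'FailedEntries')::int > 0;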

Schema for aws_cloudtrail_import

| Name | Type | Operators | Description |
| --- | --- | --- | --- |
| _ctx | jsonb | | Steampipe context in JSON form, e.g. connection_name. |
| account_id | text | | The AWS Account ID in which the resource is located. |
| created_timestamp | timestamp with time zone | | The timestamp of the import's creation. |
| destinations | jsonb | | The ARN of the destination event data store. |
| end_event_time | timestamp with time zone | | Used with StartEventTime to bound a StartImport request, and limit imported trail events to only those events logged within a specified time period. |
| import_id | text | = | The ID of the import. |
| import_source | jsonb | | The source S3 bucket. |
| import_statistics | jsonb | | Provides statistics for the import. |
| import_status | text | = | The status of the import. |
| partition | text | | The AWS partition in which the resource is located (aws, aws-cn, or aws-us-gov). |
| region | text | | The AWS Region in which the resource is located. |
| start_event_time | timestamp with time zone | | Used with EndEventTime to bound a StartImport request, and limit imported trail events to only those events logged within a specified time period. |
| title | text | | Title of the resource. |
| updated_timestamp | timestamp with time zone | | The timestamp of the import's last update. |
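
The = operator listed for import_id and import_status indicates those columns can be used as qualifiers to look up specific records. As an illustrative sketch in the PostgreSQL dialect (the import ID shown is a placeholder):

select
  import_id,
  import_status,
  import_source,
  import_statistics
from
  aws_cloudtrail_import
where
  import_id = 'your-import-id'; -- placeholder; substitute a real import ID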

Export

This table is also available through the standalone Exporter CLI. Steampipe exporters are standalone binaries that allow you to extract data using Steampipe plugins without a database.

You can download the tarball for your platform from the Releases page, but it is simplest to install them with the steampipe_export_installer.sh script:

/bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)" -- aws

You can pass the configuration to the command with the --config argument:

steampipe_export_aws --config '<your_config>' aws_cloudtrail_import
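
For example, assuming your connection config sets the AWS plugin's regions argument (the region shown is illustrative), the invocation might look like:

steampipe_export_aws --config 'regions = ["us-east-1"]' aws_cloudtrail_import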