Welcome to the Scope AWS documentation section. This section will guide you through connecting AWS to Scope and getting the necessary data to start analyzing your AWS environment.
To use Scope effectively, you’ll need an AWS user with appropriate permissions. Here’s how to create one:
1. Sign in to the AWS Management Console and open the IAM console.
2. Create a new policy:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "cloudtrail:LookupEvents",
        "cloudtrail:DescribeTrails",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "*"
    }
  ]
}
```
3. Create a new user and attach the policy you just created.
4. Save the credentials; you will need them when running the `scope aws configure` command.

Note: Consider using more restrictive permissions by limiting the `Resource` section to specific S3 buckets and CloudTrail trails.
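For instance, a tightened variant of the policy above might scope the S3 permissions to a single CloudTrail bucket (the bucket name below is a placeholder; the CloudTrail actions keep `"Resource": "*"` because they do not support resource-level restrictions in the same way):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "cloudtrail:LookupEvents",
        "cloudtrail:DescribeTrails"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::your-cloudtrail-bucket",
        "arn:aws:s3:::your-cloudtrail-bucket/*"
      ]
    }
  ]
}
```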
Scope supports multiple authentication methods:
Interactive configuration:

```shell
# Configure AWS credentials interactively
scope aws configure

# Configure for a specific profile
scope aws configure --profile my-profile
```
Command-line arguments:

```shell
scope aws --access-key YOUR_ACCESS_KEY --secret-key YOUR_SECRET_KEY --region us-east-1 discover
```
Environment variables:

```shell
# Windows
set AWS_ACCESS_KEY_ID=your_access_key
set AWS_SECRET_ACCESS_KEY=your_secret_key
set AWS_DEFAULT_REGION=us-east-1

# macOS/Linux
export AWS_ACCESS_KEY_ID=your_access_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_DEFAULT_REGION=us-east-1
```
AWS credentials file (`~/.aws/credentials`)

IAM role (if running on an EC2 instance with an IAM role)
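For the credentials-file method, Scope reads the standard AWS credentials file format; a minimal `~/.aws/credentials` looks like this (placeholder values, and `my-profile` is just an example profile name):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY

[my-profile]
aws_access_key_id = YOUR_OTHER_ACCESS_KEY
aws_secret_access_key = YOUR_OTHER_SECRET_KEY
```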
To list all available CloudTrail trails in your AWS account:

```shell
scope aws discover
```
This command will display information about each trail, including its name, S3 bucket location, and whether it logs management events.
To explore the structure of an S3 bucket and automatically detect CloudTrail logs:

```shell
scope aws explore-bucket --bucket your-cloudtrail-bucket
```

This command lists the bucket's contents and attempts to identify CloudTrail log prefixes automatically.
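Automatic detection is possible because CloudTrail writes logs under a predictable key layout: `AWSLogs/<account-id>/CloudTrail/<region>/<year>/<month>/<day>/<file>.json.gz`. As an illustration of how such detection can work (a sketch, not Scope's actual implementation), a regex over listed object keys can recover the base CloudTrail prefixes:

```python
import re

# CloudTrail object keys follow:
#   AWSLogs/<account-id>/CloudTrail/<region>/<YYYY>/<MM>/<DD>/<file>.json.gz
CLOUDTRAIL_KEY = re.compile(
    r"(?P<prefix>AWSLogs/(?P<account>\d{12})/CloudTrail/)"
    r"(?P<region>[a-z0-9-]+)/(?P<year>\d{4})/(?P<month>\d{2})/(?P<day>\d{2})/"
)

def detect_cloudtrail_prefixes(keys):
    """Return the set of CloudTrail base prefixes found among S3 object keys."""
    found = set()
    for key in keys:
        m = CLOUDTRAIL_KEY.match(key)
        if m:
            found.add(m.group("prefix"))
    return found

keys = [
    "AWSLogs/123456789012/CloudTrail/us-east-1/2023/04/15/"
    "123456789012_CloudTrail_us-east-1_20230415T0000Z_abc.json.gz",
    "some-other-data/file.txt",
]
print(detect_cloudtrail_prefixes(keys))
# {'AWSLogs/123456789012/CloudTrail/'}
```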
To collect CloudTrail management events:

```shell
scope aws management --days 7 --output-file timeline.csv --format csv
```
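Under the hood, management-event collection amounts to flattening CloudTrail events (which carry fields such as `EventTime`, `EventName`, and `Username` in the LookupEvents response shape) into chronological rows. A minimal sketch of that flattening step, using hypothetical field choices rather than Scope's internal code:

```python
import csv
import io
from datetime import datetime, timezone

def events_to_csv(events):
    """Flatten LookupEvents-style records into a CSV timeline string."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["EventTime", "EventName", "Username", "EventSource"]
    )
    writer.writeheader()
    # Sort chronologically so the output reads as a timeline.
    for ev in sorted(events, key=lambda e: e["EventTime"]):
        writer.writerow({
            "EventTime": ev["EventTime"].isoformat(),
            "EventName": ev.get("EventName", ""),
            "Username": ev.get("Username", ""),
            "EventSource": ev.get("EventSource", ""),
        })
    return buf.getvalue()

sample = [
    {"EventTime": datetime(2023, 4, 16, 12, 0, tzinfo=timezone.utc),
     "EventName": "ConsoleLogin", "Username": "alice",
     "EventSource": "signin.amazonaws.com"},
    {"EventTime": datetime(2023, 4, 15, 9, 30, tzinfo=timezone.utc),
     "EventName": "CreateUser", "Username": "admin",
     "EventSource": "iam.amazonaws.com"},
]
print(events_to_csv(sample))
```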
Available parameters:

- `--days`: Number of days to look back (default: 7)
- `--output-file`: Path to save the timeline (required)
- `--format`: Choose between `csv` or `json` (default: `csv`)

To collect CloudTrail logs stored in an S3 bucket:
```shell
scope aws s3 --bucket your-cloudtrail-bucket --output-file timeline.csv
```

The command will automatically locate CloudTrail logs in the bucket, download them, and build the timeline.
For more control, you can specify additional parameters:

```shell
scope aws s3 --bucket your-cloudtrail-bucket \
  --prefix AWSLogs/123456789012/CloudTrail/ \
  --regions us-east-1 us-west-2 \
  --start-date 2023-04-15 --end-date 2023-04-22 \
  --output-dir ./raw_logs \
  --output-file timeline.csv --format json
```
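Because CloudTrail keys embed the region and date, a collector can narrow its S3 listing to exact day prefixes instead of scanning the whole bucket. A sketch of that prefix generation (a hypothetical helper illustrating the idea, not Scope's code):

```python
from datetime import date, timedelta

def day_prefixes(base_prefix, regions, start, end):
    """Yield one S3 prefix per region per day in the inclusive range [start, end]."""
    day = start
    while day <= end:
        for region in regions:
            yield f"{base_prefix}{region}/{day.year}/{day.month:02d}/{day.day:02d}/"
        day += timedelta(days=1)

prefixes = list(day_prefixes(
    "AWSLogs/123456789012/CloudTrail/",
    ["us-east-1", "us-west-2"],
    date(2023, 4, 15),
    date(2023, 4, 16),
))
print(len(prefixes))   # 4  (2 regions x 2 days)
print(prefixes[0])     # AWSLogs/123456789012/CloudTrail/us-east-1/2023/04/15/
```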
Available parameters:

- `--bucket`: S3 bucket containing CloudTrail logs (required)
- `--prefix`: S3 prefix to filter logs (optional)
- `--regions`: Specific regions to collect from (space-separated list)
- `--start-date`: Start date in YYYY-MM-DD format (default: 7 days ago)
- `--end-date`: End date in YYYY-MM-DD format (default: today)
- `--output-dir`: Directory to save raw logs (optional)
- `--output-file`: Path to save the timeline (required)
- `--format`: Choose between `csv` or `json` (default: `csv`)

To process CloudTrail logs that have already been downloaded to your local machine:
```shell
scope aws local --directory /path/to/logs --output-file timeline.csv
```
For recursive processing of all subdirectories:
```shell
scope aws local --directory /path/to/logs --recursive --output-file timeline.csv --format json
```
Note for Windows users: When specifying file paths, use one of these formats:

- Forward slashes: `C:/Users/username/Desktop/CloudTrail`
- Escaped backslashes: `C:\\Users\\username\\Desktop\\CloudTrail`
- Quoted paths: `"C:\Users\username\Desktop\CloudTrail"`
Available parameters:

- `--directory`: Directory containing CloudTrail logs (required)
- `--recursive`: Process subdirectories recursively
- `--output-file`: Path to save the timeline (required)
- `--format`: Choose between `csv` or `json` (default: `csv`)

This command will find all CloudTrail log files (`.json` or `.json.gz`) in the specified directory and process them into a single timeline.

By default, Scope exports timelines to the specified output file. You can choose between `csv` and `json` formats.
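The local workflow boils down to walking the directory for `.json`/`.json.gz` files and pulling the events out of each one, since CloudTrail log files are JSON objects with a top-level `Records` list. A sketch of that loop (an illustration of the general approach, not Scope's actual source):

```python
import gzip
import json
import tempfile
from pathlib import Path

def iter_cloudtrail_records(directory, recursive=False):
    """Yield every event from .json/.json.gz CloudTrail files under directory."""
    root = Path(directory)
    pattern = "**/*" if recursive else "*"
    for path in root.glob(pattern):
        if path.name.endswith(".json.gz"):
            fh = gzip.open(path, "rt")
        elif path.name.endswith(".json"):
            fh = open(path)
        else:
            continue  # skip directories and unrelated files
        with fh:
            data = json.load(fh)
        # CloudTrail log files store events under a top-level "Records" key.
        yield from data.get("Records", [])

# Tiny demonstration with a synthetic log file:
with tempfile.TemporaryDirectory() as tmp:
    log = {"Records": [{"eventName": "ConsoleLogin"}]}
    Path(tmp, "log.json").write_text(json.dumps(log))
    print([r["eventName"] for r in iter_cloudtrail_records(tmp)])
    # ['ConsoleLogin']
```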