AWS Cost Management Lab

Overview

Time to understand where your AWS money is going using the command line. This lab is split into strategic (monthly analysis) and tactical (weekly deep-dive) cost management. You’ll learn to pull spending data, analyze cost patterns, and master the AWS Cost Explorer API.

Prerequisites

- A running EC2 instance reachable via EC2 Instance Connect, with the AWS CLI installed
- IAM permissions for the Cost Explorer API (ce:GetCostAndUsage)
- Cost Explorer enabled on the account (after first enabling, data can take up to 24 hours to appear)
- jq installed (used in Part 6)

What You'll Accomplish

- Pull total monthly spend, plus breakdowns by service and region
- Drill into the last week's costs for tactical, day-by-day analysis
- Filter Cost Explorer results with JMESPath queries and jq

Part 1: Connect to Your Instance

Get CLI Ready

  1. EC2 Console → your instance → “Connect” → EC2 Instance Connect
  2. Verify AWS CLI is working:

Read and Learn:

aws --version
aws sts get-caller-identity

Copy to Execute:

aws --version && aws sts get-caller-identity
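Before moving on, it's worth confirming two assumptions the rest of this lab makes: Part 6 pipes output through jq, and every date range uses GNU date's -d flag (the default on Amazon Linux). A quick sanity-check sketch:

```shell
# Environment checks (a hedged sketch - adjust to your distro):
# Part 6 requires jq, and the $(date -d ...) substitutions below need GNU date.
command -v jq >/dev/null 2>&1 || echo "jq not found - install it before Part 6"
date -d '1 week ago' '+%Y-%m-%d' >/dev/null 2>&1 || echo "GNU date (-d flag) required"
```

If either line prints a warning, fix that before continuing.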

Part 2: Strategic Analysis - Monthly Cost Patterns

Total Monthly Spending

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') \
    --granularity MONTHLY \
    --metrics BlendedCost \
    --query 'ResultsByTime[0].Total.BlendedCost.Amount' \
    --output text

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') --granularity MONTHLY --metrics BlendedCost --query 'ResultsByTime[0].Total.BlendedCost.Amount' --output text

What did you observe? Because Start is the first of last month and End is the first of this month, this returns your total AWS spend for the previous full calendar month - the number you need for strategic budget planning and monthly reviews.
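If the inline $(date ...) substitutions are hard to read, the same call can be sketched with the time window captured in variables first (functionally identical, assuming GNU date):

```shell
# Same monthly-total query, with the date window made explicit.
# Start = first day of last month, End = first day of this month (exclusive).
START=$(date -d '1 month ago' '+%Y-%m-01')
END=$(date '+%Y-%m-01')

aws ce get-cost-and-usage \
    --time-period Start=$START,End=$END \
    --granularity MONTHLY \
    --metrics BlendedCost \
    --query 'ResultsByTime[0].Total.BlendedCost.Amount' \
    --output text
```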

Monthly Spending by Service

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') \
    --granularity MONTHLY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE \
    --query 'ResultsByTime[0].Groups[*].[Keys[0],Metrics.BlendedCost.Amount]' \
    --output table

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') --granularity MONTHLY --metrics BlendedCost --group-by Type=DIMENSION,Key=SERVICE --query 'ResultsByTime[0].Groups[*].[Keys[0],Metrics.BlendedCost.Amount]' --output table

What did you observe? You’ll see a table showing which services consumed the most budget. This helps identify major cost drivers for strategic decisions like service consolidation or architecture changes.
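The table above is not sorted by cost. One hedged variation: request JSON instead and pipe through jq to rank services by spend, so the biggest cost drivers surface first.

```shell
# Same query as above, but sorted by cost (descending) via jq.
aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') \
    --granularity MONTHLY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE \
    --output json |
  jq -r '.ResultsByTime[0].Groups
         | sort_by(.Metrics.BlendedCost.Amount | tonumber) | reverse
         | .[] | "\(.Keys[0])\t\(.Metrics.BlendedCost.Amount)"'
```

Amounts come back as strings, hence the tonumber before sorting.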

Monthly Spending by Region

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') \
    --granularity MONTHLY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=REGION \
    --query 'ResultsByTime[0].Groups[*].[Keys[0],Metrics.BlendedCost.Amount]' \
    --output table

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 month ago' '+%Y-%m-01'),End=$(date '+%Y-%m-01') --granularity MONTHLY --metrics BlendedCost --group-by Type=DIMENSION,Key=REGION --query 'ResultsByTime[0].Groups[*].[Keys[0],Metrics.BlendedCost.Amount]' --output table

What did you observe? This shows your geographic cost distribution. Useful for decisions about multi-region deployments, data locality, and compliance requirements.


Part 3: Tactical Analysis - Weekly Operations Focus

From here forward, we’re focusing on last week only for tactical decision-making. Weekly analysis helps you spot trends, catch anomalies, and make operational adjustments before they become expensive problems.

Total Weekly Spending

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --query 'sum(ResultsByTime[*].Total.BlendedCost.Amount | [?@ != null] | map(&to_number(@), @))' \
    --output text

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') --granularity DAILY --metrics BlendedCost --query 'sum(ResultsByTime[*].Total.BlendedCost.Amount | [?@ != null] | map(&to_number(@), @))' --output text

What did you observe? This shows your week’s total spend. Compare this to previous weeks to spot trends or unusual spikes that need immediate attention.
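To make that comparison concrete, the same query can be run twice with shifted windows. A week-over-week sketch (assumes GNU date; the helper function name is ours, not an AWS CLI command):

```shell
# Run the weekly-total query for an arbitrary Start/End window.
weekly_total () {
    aws ce get-cost-and-usage \
        --time-period Start=$1,End=$2 \
        --granularity DAILY \
        --metrics BlendedCost \
        --query 'sum(ResultsByTime[*].Total.BlendedCost.Amount | [?@ != null] | map(&to_number(@), @))' \
        --output text
}

echo "this week: $(weekly_total $(date -d '1 week ago' '+%Y-%m-%d') $(date '+%Y-%m-%d'))"
echo "last week: $(weekly_total $(date -d '2 weeks ago' '+%Y-%m-%d') $(date -d '1 week ago' '+%Y-%m-%d'))"
```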


Part 4: Weekly Regional Cost Analysis - Targeted

We’re focusing specifically on ap-southeast-1 (Asia Pacific - Singapore) for tactical cost analysis. Singapore is a key regional hub for Asia-Pacific operations, and understanding costs here helps optimize your regional deployment strategy.

Singapore Region Weekly Spending

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=REGION \
    --query 'ResultsByTime[*].{Date:TimePeriod.Start,Region:Groups[?Keys[0]==`ap-southeast-1`].Keys[0] | [0],Cost:Groups[?Keys[0]==`ap-southeast-1`].Metrics.BlendedCost.Amount | [0]}' \
    --output table

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') --granularity DAILY --metrics BlendedCost --group-by Type=DIMENSION,Key=REGION --query 'ResultsByTime[*].{Date:TimePeriod.Start,Region:Groups[?Keys[0]==`ap-southeast-1`].Keys[0] | [0],Cost:Groups[?Keys[0]==`ap-southeast-1`].Metrics.BlendedCost.Amount | [0]}' --output table

What did you observe? You’ll see output like:

--------------------------------------------
|  Date        | Region         | Cost     |
--------------------------------------------
|  2025-08-12  | ap-southeast-1 | 12.45    |
|  2025-08-13  | ap-southeast-1 | 15.23    |
|  2025-08-14  | ap-southeast-1 | 8.67     |
|  2025-08-15  | ap-southeast-1 | 14.55    |
|  2025-08-16  | ap-southeast-1 | 16.78    |
|  2025-08-17  | ap-southeast-1 | 11.23    |
|  2025-08-18  | ap-southeast-1 | 9.87     |
--------------------------------------------

Look for patterns:

- Day-to-day variation: in the sample above, daily spend swings between roughly $8 and $17
- Sudden spikes that don't line up with a known deployment or traffic event
- A steady baseline that creeps upward week over week


Part 5: Weekly Service Cost Analysis

Focus on the services that matter most for week-to-week operational decisions.

EC2 Weekly Costs

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE \
    --query 'ResultsByTime[*].{Date:TimePeriod.Start,Service:Groups[?Keys[0]==`Amazon Elastic Compute Cloud - Compute`]|[0].Keys[0],Cost:Groups[?Keys[0]==`Amazon Elastic Compute Cloud - Compute`]|[0].Metrics.BlendedCost.Amount}' \
    --output table

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') --granularity DAILY --metrics BlendedCost --group-by Type=DIMENSION,Key=SERVICE --query 'ResultsByTime[*].{Date:TimePeriod.Start,Service:Groups[?Keys[0]==`Amazon Elastic Compute Cloud - Compute`]|[0].Keys[0],Cost:Groups[?Keys[0]==`Amazon Elastic Compute Cloud - Compute`]|[0].Metrics.BlendedCost.Amount}' --output table

What did you observe? You’ll see output like:

---------------------------------------------------------------------
|  Date         | Service                                | Cost     |
---------------------------------------------------------------------
|  2025-08-12   | Amazon Elastic Compute Cloud - Compute | 45.67    |
|  2025-08-13   | Amazon Elastic Compute Cloud - Compute | 52.34    |
|  2025-08-14   | Amazon Elastic Compute Cloud - Compute | 48.92    |
|  2025-08-15   | Amazon Elastic Compute Cloud - Compute | 51.23    |
|  2025-08-16   | Amazon Elastic Compute Cloud - Compute | 49.78    |
|  2025-08-17   | Amazon Elastic Compute Cloud - Compute | 35.12    |
|  2025-08-18   | Amazon Elastic Compute Cloud - Compute | 33.45    |
---------------------------------------------------------------------

Typical patterns:

- Weekday costs stay fairly flat when instance counts are stable
- Spend drops when workloads scale down (see the lower 08-17 and 08-18 rows above)
- A sustained jump usually means new or resized instances - verify it was intentional

S3 Weekly Costs

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE \
    --query 'ResultsByTime[*].Groups[?Keys[0]==`Amazon Simple Storage Service`].Metrics.BlendedCost.Amount' \
    --output text

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') --granularity DAILY --metrics BlendedCost --group-by Type=DIMENSION,Key=SERVICE --query 'ResultsByTime[*].Groups[?Keys[0]==`Amazon Simple Storage Service`].Metrics.BlendedCost.Amount' --output text

What did you observe? You’ll see output like:

0.0000002225
0.0000002476
0.0000002463
0.0000002464
0.0000672604
0.0056448208
0.0001368392
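Those per-day amounts are fractions of a cent. To roll them up into one weekly figure, the same command can feed an awk one-liner (a post-processing sketch):

```shell
# Sum the per-day S3 amounts printed by the query above into a weekly total.
aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE \
    --query 'ResultsByTime[*].Groups[?Keys[0]==`Amazon Simple Storage Service`].Metrics.BlendedCost.Amount' \
    --output text |
  awk '{ total += $1 } END { printf "weekly S3 total: $%.6f\n", total }'
```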

S3 costs should show these patterns:

- A tiny, nearly constant baseline for stored objects (the 0.0000002xxx days above)
- Occasional spikes from bulk uploads, heavy request traffic, or lifecycle transitions (the 0.0056 day above)
- Gradual growth over time as stored data accumulates

Data Transfer Weekly Costs

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=USAGE_TYPE \
    --query 'ResultsByTime[*].{Date:TimePeriod.Start,UsageType:Groups[?contains(Keys[0], `DataTransfer-Out`)].Keys[0] | [0],Cost:Groups[?contains(Keys[0], `DataTransfer-Out`)].Metrics.BlendedCost.Amount | [0]}' \
    --output table

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') --granularity DAILY --metrics BlendedCost --group-by Type=DIMENSION,Key=USAGE_TYPE --query 'ResultsByTime[*].{Date:TimePeriod.Start,UsageType:Groups[?contains(Keys[0], `DataTransfer-Out`)].Keys[0] | [0],Cost:Groups[?contains(Keys[0], `DataTransfer-Out`)].Metrics.BlendedCost.Amount | [0]}' --output table

What did you observe? You’ll see output like:

----------------------------------------------------
|  Date        | UsageType              | Cost     |
----------------------------------------------------
|  2025-08-12  | DataTransfer-Out-Bytes | 5.23     |
|  2025-08-13  | DataTransfer-Out-Bytes | 7.45     |
|  2025-08-14  | DataTransfer-Out-Bytes | 4.12     |
|  2025-08-15  | DataTransfer-Out-Bytes | 8.67     |
|  2025-08-16  | DataTransfer-Out-Bytes | 6.34     |
|  2025-08-17  | DataTransfer-Out-Bytes | 3.89     |
|  2025-08-18  | DataTransfer-Out-Bytes | 4.56     |
----------------------------------------------------

Data transfer patterns to watch:

- Outbound transfer should track your traffic: spikes (like the 08-15 row above) should line up with known events
- Unexplained increases, which can indicate misconfigured routing or data leaving AWS unnecessarily
- Cross-region transfer you could avoid by keeping compute and data co-located


Part 6: Singapore Service Cost Breakdown

Focus on understanding how key services are performing specifically in the Singapore region for tactical optimization.

EC2 Costs in Singapore (Weekly)

Read and Learn:

aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE Type=DIMENSION,Key=REGION \
    --output json | jq -r '.ResultsByTime[] | .TimePeriod.Start as $date | .Groups[] | select(.Keys[0] == "Amazon Elastic Compute Cloud - Compute" and .Keys[1] == "ap-southeast-1") | "\($date)\t\(.Keys[0])\t\(.Keys[1])\t\(.Metrics.BlendedCost.Amount)"'

Copy to Execute:

aws ce get-cost-and-usage --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') --granularity DAILY --metrics BlendedCost --group-by Type=DIMENSION,Key=SERVICE Type=DIMENSION,Key=REGION --output json | jq -r '.ResultsByTime[] | .TimePeriod.Start as $date | .Groups[] | select(.Keys[0] == "Amazon Elastic Compute Cloud - Compute" and .Keys[1] == "ap-southeast-1") | "\($date)\t\(.Keys[0])\t\(.Keys[1])\t\(.Metrics.BlendedCost.Amount)"'

What did you observe? You’ll see output like:

2026-01-31      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  0.6859728824
2026-02-01      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  0.7008
2026-02-02      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  0.7008
2026-02-03      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  0.7008
2026-02-04      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  0.7198324836
2026-02-05      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  17.7764483866
2026-02-06      Amazon Elastic Compute Cloud - Compute  ap-southeast-1  12.6115706301
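To collapse those daily rows into a single weekly Singapore EC2 total, the same query can feed a jq reduction instead (a sketch extending the pipeline above):

```shell
# Total the week's Singapore EC2 spend: collect matching amounts into an
# array, convert to numbers, and add them.
aws ce get-cost-and-usage \
    --time-period Start=$(date -d '1 week ago' '+%Y-%m-%d'),End=$(date '+%Y-%m-%d') \
    --granularity DAILY \
    --metrics BlendedCost \
    --group-by Type=DIMENSION,Key=SERVICE Type=DIMENSION,Key=REGION \
    --output json |
  jq '[.ResultsByTime[].Groups[]
       | select(.Keys[0] == "Amazon Elastic Compute Cloud - Compute"
                and .Keys[1] == "ap-southeast-1")
       | .Metrics.BlendedCost.Amount | tonumber]
      | add'
```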

What You’ve Built

You now have both strategic (monthly) and tactical (weekly) cost analysis capabilities. The monthly view helps with budget planning and architectural decisions, while the weekly view gives you the operational visibility to catch cost anomalies quickly and optimize day-to-day spending patterns.