
FinOps with AWS MCP Servers

Give your AI coding assistant direct access to AWS cost data, billing, and pricing information — all authenticated automatically through Atmos. Ask Claude questions like "What did we spend on EC2 last month?" and get real answers from your actual AWS account.

The Problem

FinOps teams need visibility into AWS costs, but the tools are scattered across consoles and APIs. Meanwhile, AI coding assistants like Claude Code can't access your AWS cost data because they need authenticated credentials — and setting that up for AWS MCP servers manually is tedious.

You end up juggling SSO sessions, environment variables, and credential files across 20+ MCP servers. It shouldn't be this hard.

The Solution

Use Atmos to wire everything together so AI assistants can query your AWS cost data directly:

  • Custom Commands define atmos mcp aws install/start/test subcommands
  • Auth wraps each MCP server process with atmos auth exec, injecting authenticated credentials automatically
  • Toolchain ensures uv (the Python package manager) is available for installing MCP packages
  • .mcp.json tells Claude Code to start each server via atmos mcp aws start <name>

The result: Claude Code gets authenticated access to AWS Billing, Cost Explorer, Pricing, and 18 other AWS services — all through a single pattern.

Features Used

  • Custom Commands — nested subcommands for install, start, and test
  • Auth — atmos auth exec wraps processes with authenticated AWS credentials
  • Toolchain — ensures uv is available via toolchain aliases
  • AI/MCP — enables MCP server support

How It Works


  1. You ask Claude a question about AWS costs, infrastructure, or pricing
  2. Claude invokes the relevant MCP server (e.g., aws-cost-explorer, aws-pricing)
  3. The .mcp.json config runs atmos mcp aws start <server-name>
  4. The custom command resolves the short name to the full Python package name
  5. atmos auth exec -i core-root/terraform handles AWS SSO authentication
  6. The MCP server process inherits the authenticated credentials and returns real data
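
Step 3 hinges on the .mcp.json wiring. A minimal entry for one server might look like the following — a sketch using the standard Claude Code `mcpServers`/`command`/`args`/`env` fields; the server key and region value here are illustrative:

```json
{
  "mcpServers": {
    "aws-pricing": {
      "command": "atmos",
      "args": ["mcp", "aws", "start", "pricing"],
      "env": { "AWS_REGION": "us-east-1" }
    }
  }
}
```

Because the `command` is `atmos` rather than the MCP server binary itself, every server launch passes through the custom command and therefore through `atmos auth exec`.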

Getting Started

Prerequisites

  • Atmos installed
  • AWS account with SSO configured
  • Python 3.13 (for MCP server packages)

Setup

  1. Copy the configuration files from this gist into your project
  2. Adjust the identity (-i core-root/terraform) and profile (ATMOS_PROFILE=managers) to match your environment
  3. Install all MCP server packages:
     atmos mcp aws install all
  4. Test that authentication works:
     atmos mcp aws test all
  5. Start using MCP servers with Claude Code — the .mcp.json file handles the rest.

Configuration Files

| File                    | Purpose                                              |
|-------------------------|------------------------------------------------------|
| atmos.yaml              | Imports configuration from .atmos.d/                 |
| .atmos.d/mcp.yaml       | Custom commands for atmos mcp aws install/start/test |
| .atmos.d/toolchain.yaml | Toolchain alias for the uv package manager           |
| .atmos.d/ai.yaml        | Enables AI/MCP support in Atmos                      |
| .mcp.json               | Claude Code MCP server configuration                 |

Usage

# Install a specific MCP server package
atmos mcp aws install pricing

# Install all 21 AWS MCP server packages
atmos mcp aws install all

# Start a specific server with automatic AWS auth
atmos mcp aws start pricing

# Test that authentication is working
atmos mcp aws test all

Available Servers

This gist includes 21 AWS MCP servers. The FinOps-relevant ones are highlighted:

FinOps & Cost Management

| Server                  | Package                                    | What You Can Ask                                |
|-------------------------|--------------------------------------------|-------------------------------------------------|
| billing-cost-management | awslabs.billing-cost-management-mcp-server | Billing summaries, payment history              |
| cost-explorer           | awslabs.cost-explorer-mcp-server           | Spend breakdowns, cost trends, forecasts        |
| pricing                 | awslabs.aws-pricing-mcp-server             | On-demand vs reserved pricing, cost comparisons |

Infrastructure & Operations

| Server             | Package                               |
|--------------------|---------------------------------------|
| terraform          | awslabs.terraform-mcp-server          |
| cfn                | awslabs.cfn-mcp-server                |
| cdk                | awslabs.cdk-mcp-server                |
| iac                | awslabs.aws-iac-mcp-server            |
| ecs                | awslabs.ecs-mcp-server                |
| eks                | awslabs.eks-mcp-server                |
| serverless         | awslabs.aws-serverless-mcp-server     |
| lambda-tool        | awslabs.lambda-tool-mcp-server        |
| stepfunctions-tool | awslabs.stepfunctions-tool-mcp-server |

Observability & Security

| Server                    | Package                                      |
|---------------------------|----------------------------------------------|
| cloudwatch                | awslabs.cloudwatch-mcp-server                |
| cloudtrail                | awslabs.cloudtrail-mcp-server                |
| iam                       | awslabs.iam-mcp-server                       |
| well-architected-security | awslabs.well-architected-security-mcp-server |
| network                   | awslabs.aws-network-mcp-server               |

Data & Support

| Server        | Package                             |
|---------------|-------------------------------------|
| dynamodb      | awslabs.dynamodb-mcp-server         |
| s3-tables     | awslabs.s3-tables-mcp-server        |
| documentation | awslabs.aws-documentation-mcp-server |
| support       | awslabs.aws-support-mcp-server      |

Customization

Different AWS Account/Identity

Change the identity flag in .atmos.d/mcp.yaml:

# Before
exec env ATMOS_PROFILE=managers atmos auth exec -i core-root/terraform -- \

# After (your identity)
exec env ATMOS_PROFILE=your-profile atmos auth exec -i your-stack/terraform -- \

Different AWS Region

Update AWS_REGION in .mcp.json for each server entry:

"env": { "AWS_REGION": "us-west-2" }

Adding New Servers

  1. Add the server name to the ALL_SERVERS array in the install command
  2. Add package-name resolution logic if the package follows a non-standard naming pattern (e.g. an extra aws- prefix)
  3. Add a new entry to .mcp.json
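
The name-to-package resolution in step 2 can be sketched as a small shell function. This is a hypothetical reconstruction, not the gist's actual implementation — the package names themselves are taken from the tables above. Most packages follow `awslabs.<name>-mcp-server`, while a handful carry an extra `aws-` prefix and need explicit cases:

```shell
#!/bin/sh
# Resolve a short server name to its full awslabs package name.
resolve_package() {
  case "$1" in
    # Non-standard names: package inserts an extra "aws-" prefix.
    pricing|iac|serverless|network|documentation|support)
      echo "awslabs.aws-$1-mcp-server" ;;
    # Standard pattern: awslabs.<name>-mcp-server.
    *)
      echo "awslabs.$1-mcp-server" ;;
  esac
}

resolve_package pricing    # awslabs.aws-pricing-mcp-server
resolve_package terraform  # awslabs.terraform-mcp-server
```

With a default case like this, most new servers only require steps 1 and 3; step 2 applies only when the package name deviates from the standard pattern.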

The Key Insight

atmos auth exec is the glue that makes this work. It wraps any command with authenticated credentials using exec, which replaces the current process — so the MCP server inherits the credentials directly. No temp files, no environment variable juggling, no credential expiration headaches.
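
The inheritance model can be illustrated with a tiny shell sketch. This is a simplified stand-in: the real `atmos auth exec` resolves short-lived SSO credentials and uses exec to replace its own process, while the stub below merely injects placeholder values into a child process's environment:

```shell
#!/bin/sh
# Simplified model of `atmos auth exec -- <cmd>`: resolve credentials,
# then run the command with them injected into its environment.
run_with_creds() {
  # In reality these come from AWS SSO; stubbed for illustration.
  AWS_ACCESS_KEY_ID="EXAMPLE_KEY" \
  AWS_SECRET_ACCESS_KEY="EXAMPLE_SECRET" \
  "$@"
}

# The child process sees the injected credentials directly:
run_with_creds sh -c 'echo "$AWS_ACCESS_KEY_ID"'  # EXAMPLE_KEY
```

Because the credentials live only in the wrapped process's environment, nothing is written to disk and nothing lingers after the MCP server exits.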

Combined with Custom Commands for the install/start/test workflow and Toolchain for dependency management, you get a complete, self-contained solution for AI-powered FinOps. Your team can ask natural language questions about AWS costs and get answers from real account data — without leaving their editor.