
Atmos AI

Atmos works with your existing AI tools and brings its own native, built-in capabilities. Use API providers with purchased tokens (Anthropic, OpenAI, Bedrock, Azure OpenAI, Gemini, Grok, Ollama), or use CLI providers that reuse your existing subscription (Claude Code, OpenAI Codex, Gemini CLI) — no API keys needed.

Experimental

Quick Start

With API Tokens

atmos.yaml

ai:
  enabled: true
  default_provider: "anthropic"
  providers:
    anthropic:
      model: "claude-sonnet-4-6"
      api_key: !env "ANTHROPIC_API_KEY"

With Your Existing Subscription (No API Key)

Use your locally installed Claude Code, OpenAI Codex, or Gemini CLI binary. The CLI tool handles auth via its own subscription — no API key configuration needed.

atmos.yaml

ai:
  enabled: true
  default_provider: "claude-code" # or "codex-cli" or "gemini-cli"
  providers:
    claude-code:
      max_turns: 10

Install and authenticate

# Claude Code
brew install --cask claude-code && claude auth login

# OpenAI Codex
npm install -g @openai/codex && codex login

# Gemini CLI (authenticates on first run)
npm install -g @google/gemini-cli && gemini

shell

export ANTHROPIC_API_KEY="your-api-key"
atmos ai chat # Interactive chat
atmos ai ask "What stacks do we have?" # Single question
atmos ai exec "validate stacks" --format json # CI/CD automation
atmos ai sessions list # List sessions
atmos ai skill list # List installed skills
atmos terraform plan vpc -s prod --ai # AI analysis of any command
atmos terraform plan vpc -s prod --ai --skill atmos-terraform # With domain expertise
atmos terraform plan vpc -s prod --ai --skill atmos-terraform,atmos-stacks # Multiple skills

AI Configuration

Configure AI providers, models, API keys, skills, tools, sessions, and instructions in your atmos.yaml.

Examples

AI with API Providers

Multi-provider AI configuration with sessions, tools, and custom skills using API tokens.

AI with Claude Code CLI

Use your Claude Pro/Max subscription with MCP server pass-through for AWS tools. No API keys needed.

AI Providers

Atmos supports two types of AI providers:

API providers call the provider's API directly with purchased tokens. Atmos manages the tool execution loop in-process.

| Provider | Config Key | Auth |
| --- | --- | --- |
| Anthropic | anthropic | ANTHROPIC_API_KEY |
| OpenAI | openai | OPENAI_API_KEY |
| Google Gemini | gemini | GEMINI_API_KEY |
| Grok (xAI) | grok | XAI_API_KEY |
| AWS Bedrock | bedrock | AWS IAM credentials |
| Azure OpenAI | azureopenai | AZURE_OPENAI_API_KEY |
| Ollama | ollama | None (local) |
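
Building on the table above, a multi-provider atmos.yaml sketch. Only the anthropic entry mirrors the Quick Start example; the openai model name and the ollama model are assumptions for illustration:

```yaml
ai:
  enabled: true
  default_provider: "anthropic"
  providers:
    anthropic:
      model: "claude-sonnet-4-6"
      api_key: !env "ANTHROPIC_API_KEY"
    openai:
      model: "gpt-4o"                  # assumed model name
      api_key: !env "OPENAI_API_KEY"
    ollama:
      model: "llama3"                  # assumed; runs locally, no auth needed
```

Switch providers by changing default_provider; each provider reads its key from the environment variable listed in the table.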

CLI providers invoke a locally installed AI tool as a subprocess, reusing your existing subscription. The CLI tool manages its own tool execution loop, and MCP servers are passed through for tool access.

| Provider | Config Key | Binary | Auth | MCP |
| --- | --- | --- | --- | --- |
| Claude Code | claude-code | claude | Claude Pro/Max subscription | Full |
| OpenAI Codex | codex-cli | codex | ChatGPT Plus/Pro subscription | Full |
| Gemini CLI | gemini-cli | gemini | Google account (free tier) | Blocked for personal accounts |
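
A combined CLI-provider configuration sketch, extending the Quick Start example. The max_turns key on claude-code comes from that example; applying the same key to codex-cli, and leaving gemini-cli with defaults, are assumptions:

```yaml
ai:
  enabled: true
  default_provider: "claude-code"
  providers:
    claude-code:
      max_turns: 10    # from the Quick Start example
    codex-cli:
      max_turns: 10    # assumption: same key supported
    gemini-cli: {}     # assumption: defaults; the CLI authenticates on first run
```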
When to use which
  • Interactive development with MCP — claude-code or codex-cli (subscription, full MCP), or any of the API providers
  • CI/CD pipelines — API providers (env var auth, no interactive login)
  • Cost-conscious — gemini-cli (free tier, prompt-only)
  • Enterprise — bedrock or azureopenai (compliance, audit trails)

AI-Powered Command Analysis

Add --ai to any Atmos command for instant AI-powered output analysis. Pair with --skill for domain-specific expertise — multiple skills can be combined with commas or repeated flags.

# Basic AI analysis
atmos terraform plan vpc -s prod --ai

# Single skill for domain expertise
atmos terraform plan vpc -s prod --ai --skill atmos-terraform

# Multiple skills (comma-separated)
atmos terraform plan vpc -s prod --ai --skill atmos-terraform,atmos-stacks

# Multiple skills (repeated flag)
atmos terraform plan vpc -s prod --ai --skill atmos-terraform --skill atmos-stacks

# Via environment variables
ATMOS_AI=true ATMOS_SKILL=atmos-terraform,atmos-stacks atmos terraform plan vpc -s prod
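
For CI/CD, the exec form pairs naturally with env-var auth. A hypothetical GitHub Actions step (the workflow and secret names are illustrative, and the sketch assumes Atmos is already installed on the runner):

```yaml
# .github/workflows/validate.yml (illustrative)
jobs:
  ai-validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: AI-validate stacks
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
        run: atmos ai exec "validate stacks" --format json
```

The --format json output makes the result easy to parse in a follow-up step.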


Commands

atmos ai chat
Interactive chat with session management, provider switching, and skill selection.
atmos ai ask
Ask a single question and get an immediate response. Ideal for scripting and CI/CD.
atmos ai exec
Execute Atmos and shell commands via AI prompts with structured output.
atmos ai sessions
Manage chat sessions: list, clean, export, and import.
atmos ai skill
Install, list, and uninstall community AI skills from GitHub.
atmos --ai
Add --ai flag to any command for AI-powered output analysis. Use --skill flag for domain-specific expertise (supports multiple skills via commas or repeated flag).
atmos mcp start
Start the Atmos MCP server for external AI clients.
atmos mcp list
List configured external MCP servers.
atmos mcp tools
List tools from an external MCP server.
atmos mcp test
Test connectivity to an external MCP server.
atmos mcp status
Show live connection status of all configured MCP servers.
atmos mcp restart
Restart an external MCP server.
atmos mcp export
Export .mcp.json from atmos.yaml for Claude Code / Cursor / IDE integration.

AI Assistants

Configure AI Assistants

Set up Claude Code, Cursor, Windsurf, GitHub Copilot, Gemini CLI, OpenAI Codex, and other AI coding assistants to use Atmos agent skills.

Claude Code Integration

Use Claude Code with the Atmos MCP server and create specialized atmos-expert subagents for deep infrastructure expertise.

Agent Skills

Atmos ships 21+ agent skills that give AI coding assistants deep knowledge of Atmos conventions. Skills follow the Agent Skills open standard and work across Claude Code, Gemini CLI, OpenAI Codex, Cursor, Windsurf, GitHub Copilot, and more.

Agent Skills Overview

How skills work, available skills, and the SKILL.md format.

Skill Marketplace

Install and share community skills from GitHub.

Skills Configuration

Configure skills in atmos.yaml.

MCP

Atmos supports the Model Context Protocol (MCP) in both directions:

Atmos as MCP Server — Exposes Atmos AI tools to external clients (Claude Desktop, Claude Code, VS Code, Cursor, Gemini CLI). Use Atmos tools inside your preferred AI assistant.

External MCP Servers — Connect to AWS, GCP, Azure, and custom MCP servers. Their tools become available in atmos ai chat, atmos ai ask, and atmos ai exec alongside native Atmos tools.

Smart Routing — When multiple MCP servers are configured with API providers, Atmos automatically selects only the servers relevant to your question using a lightweight routing call. This keeps tool payloads small and responses fast. Use the --mcp flag to override and specify servers directly.
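
Beyond the per-command --mcp flag, routing can presumably be disabled globally via the routing block shown in the configuration example below (the exact behavior when disabled, skipping the routing call and passing all servers, is an assumption):

```yaml
mcp:
  routing:
    enabled: false   # assumption: skips the lightweight routing call
```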

MCP Pass-Through — With CLI AI providers (claude-code, codex-cli), all configured MCP servers are passed to the CLI tool via its native config format. The CLI tool decides which servers to use. Smart routing is skipped — the AI model handles server selection internally.

CLI Provider MCP
  • Claude Code: MCP servers passed via --mcp-config temp file
  • Codex CLI: MCP servers written to ~/.codex/config.toml (backup/restore)
  • Gemini CLI: MCP blocked for personal Google accounts (oauth-personal auth)
# Manage external MCP servers
atmos mcp list # See configured servers
atmos mcp test aws-docs # Test connectivity
atmos mcp tools aws-security # List available tools
atmos mcp status # Show live status of all servers
atmos mcp export # Export .mcp.json for Claude Code / IDE

# Ask questions — AI auto-routes to the right MCP server
atmos ai ask "What did we spend on EC2 last month?" # routes to aws-billing
atmos ai ask "Is GuardDuty enabled in us-east-1?" # routes to aws-security
atmos ai ask "How do I configure S3 lifecycle rules?" # routes to aws-docs

# Manual MCP server(s) selection (skip auto-routing)
atmos ai ask --mcp aws-iam "List all admin roles"
atmos ai ask --mcp aws-iam,aws-cloudtrail "Who accessed the admin role?"
atmos ai chat --mcp aws-billing

atmos.yaml

mcp:
  servers:
    # FinOps
    aws-billing:
      command: uvx
      args: ["awslabs.billing-cost-management-mcp-server@latest"]
      env: { AWS_REGION: "us-east-1" }
      description: "AWS Billing — billing summaries and payment history"
      identity: "readonly" # Atmos Auth identity (from the auth section)

    # Security & Compliance
    aws-security:
      command: uvx
      args: ["awslabs.well-architected-security-mcp-server@latest"]
      env: { AWS_REGION: "us-east-1" }
      description: "AWS Security — Well-Architected security posture assessment"
      identity: "readonly" # Atmos Auth identity (from the auth section)

    # Documentation (no credentials needed)
    aws-docs:
      command: uvx
      args: ["awslabs.aws-documentation-mcp-server@latest"]
      description: "AWS Documentation — search and fetch AWS docs"

  # Smart routing (enabled by default, uses your configured AI provider)
  routing:
    enabled: true

MCP Configuration

Configure the Atmos MCP server and external MCP server connections.

Try the MCP Example

Explore a complete example with pre-configured AWS MCP servers for cost analysis, security, IAM, and documentation.

Troubleshooting

Having issues? See the Troubleshooting Guide for solutions to common problems with providers, tools, sessions, and connectivity.