Atmos AI
Atmos works with your existing AI tools and brings its own native, built-in capabilities. Bring your own credentials for supported providers — OpenAI, Anthropic, Bedrock, Azure OpenAI, Gemini, Grok, and Ollama — and Atmos handles provider-specific auth styles automatically.
Quick Start
atmos.yaml
AI Configuration
Configure AI providers, models, API keys, skills, tools, sessions, and instructions in your atmos.yaml.
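As an illustration, such a configuration might look like the following sketch. The key names below (`ai`, `default_provider`, `providers`, `skills`) and model identifiers are assumptions for illustration only; consult the atmos.yaml reference for the actual schema.

```yaml
# Hypothetical sketch: key names and values are illustrative assumptions,
# not the confirmed atmos.yaml schema.
ai:
  default_provider: anthropic        # assumed key
  providers:
    anthropic:
      model: claude-sonnet           # assumed model identifier
      api_key: "${ANTHROPIC_API_KEY}"
    openai:
      model: gpt-4o                  # assumed model identifier
      api_key: "${OPENAI_API_KEY}"
  skills:
    enabled:
      - atmos-terraform
      - atmos-stacks
```

Referencing API keys through environment variables, as sketched here, keeps credentials out of version control.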
Example
Try the Example
Explore a complete multi-region infrastructure example with Atmos AI, skills, sessions, and tool execution.
AI-Powered Command Analysis
Add --ai to any Atmos command for instant AI-powered output analysis. Pair with --skill for
domain-specific expertise — multiple skills can be combined with commas or repeated flags.
# Basic AI analysis
atmos terraform plan vpc -s prod --ai
# Single skill for domain expertise
atmos terraform plan vpc -s prod --ai --skill atmos-terraform
# Multiple skills (comma-separated)
atmos terraform plan vpc -s prod --ai --skill atmos-terraform,atmos-stacks
# Multiple skills (repeated flag)
atmos terraform plan vpc -s prod --ai --skill atmos-terraform --skill atmos-stacks
# Via environment variables
ATMOS_AI=true ATMOS_SKILL=atmos-terraform,atmos-stacks atmos terraform plan vpc -s prod
Commands
atmos ai chat - Interactive chat with session management, provider switching, and skill selection.
atmos ai ask - Ask a single question and get an immediate response. Ideal for scripting and CI/CD.
atmos ai exec - Execute Atmos and shell commands via AI prompts with structured output.
atmos ai sessions - Manage chat sessions: list, clean, export, and import.
atmos ai skill - Install, list, and uninstall community AI skills from GitHub.
atmos --ai - Add the --ai flag to any command for AI-powered output analysis. Use the --skill flag for domain-specific expertise (supports multiple skills via commas or repeated flags).
atmos mcp start - Start the MCP server with stdio (default) or HTTP transport.
atmos lsp start - Start the LSP server with stdio, TCP, or WebSocket transport.
AI Assistants
Configure AI Assistants
Set up Claude Code, Cursor, Windsurf, GitHub Copilot, Gemini CLI, OpenAI Codex, and other AI coding assistants to use Atmos agent skills.
Claude Code Integration
Use Claude Code with the Atmos MCP server and create specialized atmos-expert subagents for deep infrastructure expertise.
Agent Skills
Atmos ships 21+ agent skills that give AI coding assistants deep knowledge of Atmos conventions. Skills follow the Agent Skills open standard and work across Claude Code, Gemini CLI, OpenAI Codex, Cursor, Windsurf, GitHub Copilot, and more.
Agent Skills Overview
How skills work, available skills, and the SKILL.md format.
Skill Marketplace
Install and share community skills from GitHub.
Skills Configuration
Configure skills in atmos.yaml.
MCP Server
The Model Context Protocol (MCP) server exposes Atmos AI tools to any compatible client, including Claude Desktop, Claude Code, VS Code, Cursor, Gemini CLI, and many others. This lets you use Atmos tools inside your preferred AI assistant without custom integrations.
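As one example, a Claude Desktop client could register the server in its claude_desktop_config.json using the standard mcpServers format. The "atmos" server name here is arbitrary, and this sketch assumes the stdio default of `atmos mcp start`:

```json
{
  "mcpServers": {
    "atmos": {
      "command": "atmos",
      "args": ["mcp", "start"]
    }
  }
}
```

Other MCP clients use their own registration mechanisms; see the MCP Setup guide below for specifics.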
MCP Setup
Configure MCP clients and manage tool permissions.
Troubleshooting
Having issues? See the Troubleshooting Guide for solutions to common problems with providers, tools, sessions, and connectivity.