Migrating from Native Terraform
You're already 90% there. Your Terraform code doesn't need to change. Atmos gives you a documented, conventional way to manage your infrastructure—whether you're using Makefiles, shell scripts, or just raw Terraform commands.
Why This Guide?
Most teams don't use Terraform in isolation. You're probably already using:
- Makefiles to wrap common commands
- Shell scripts to set variables or loop through environments
- GitHub Actions/Jenkins with custom bash scripting
- Directory structures to separate dev/staging/prod
- .tfvars files scattered everywhere
Tool fatigue is real. Instead of duct-taping 25 different tools together, Atmos gives you one documented approach.
Crawl, Walk, Run
- Crawl: Get running in 20 minutes (this guide)
- Walk: Explore DRY configs and remote state
- Run: Advanced features when you need them (workflows, validation, component libraries)
You don't need to learn everything on day one. Get value in 20 minutes, not 20 hours.
Crawl: Get Running in 20 Minutes
What You're Going To Do
- Install Atmos
- Create a minimal atmos.yaml
- Create one stack YAML file
- Run atmos terraform plan
That's it. You'll be using Atmos.
Step 1: Install Atmos
There are many ways to install Atmos. See the full Installation Guide for all options.
Step 2: Create Minimal atmos.yaml
Point Atmos to where your Terraform root modules live. Atmos only cares about root modules—where you put child modules (reusable modules called via source) is entirely up to you and has no bearing on Atmos configuration.
atmos.yaml
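A minimal configuration only needs to tell Atmos where your components and stacks live. As a starting point (this mirrors the Hello World example later in this guide; adjust the paths and name_template to match your layout):

# Where your Terraform root modules live
components:
  terraform:
    base_path: "components/terraform"

# Where your stack manifests live, and how stack names are derived
stacks:
  base_path: "stacks"
  name_template: "{{ .vars.stage }}"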
The base_path setting is flexible. If your root modules are already in a terraform/ directory, set base_path: "terraform". If you only use Terraform (no Helmfile or other toolchains), you could use base_path: "components" or even just base_path: ".". The components/terraform convention exists because Atmos supports multiple toolchains (Terraform, Helmfile, etc.), but organize however makes sense for your project.
Step 3: Move Your Terraform Code
If your Terraform is in scattered directories, consolidate it:
- Before
- After
terraform/
├── vpc/
│ ├── main.tf
│ ├── variables.tf
│ ├── outputs.tf
│ └── envs/
│ ├── dev.tfvars
│ ├── staging.tfvars
│ └── prod.tfvars
└── database/
├── main.tf
├── variables.tf
├── outputs.tf
└── envs/
├── dev.tfvars
├── staging.tfvars
└── prod.tfvars
components/terraform/
├── vpc/
│ ├── main.tf
│ ├── variables.tf
│ ├── outputs.tf
│ └── envs/ # Optional: Keep your existing tfvars!
│ ├── dev.tfvars
│ ├── staging.tfvars
│ └── prod.tfvars
└── database/
├── main.tf
├── variables.tf
├── outputs.tf
└── envs/
├── dev.tfvars
├── staging.tfvars
└── prod.tfvars
Your Terraform code stays exactly the same. You can keep using your .tfvars files with !include and gradually migrate to stack YAML as you grow.
Step 4: Create Your First Stack
Create a stack YAML file for one environment. Start by referencing your existing .tfvars files:
- Keep Using tfvars
- Convert to YAML
Keep your existing .tfvars files and include them directly:
stacks/dev.yaml
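For example, a dev stack that reuses the tfvars file from the vpc component (the component name and tfvars path are illustrative):

vars:
  stage: dev

components:
  terraform:
    vpc:
      # Pull variables straight from your existing tfvars file
      vars: !include components/terraform/vpc/envs/dev.tfvars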
The !include function resolves paths relative to the Atmos base path and automatically parses .tfvars files (HCL format). This is the fastest migration path—your existing variable files keep working. You still get stack inheritance, imports, and all other Atmos features.
As you grow, convert your .tfvars to native YAML for deep merge benefits:
stacks/dev.yaml
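The same stack expressed with native YAML variables (values are illustrative):

vars:
  stage: dev

components:
  terraform:
    vpc:
      vars:
        cidr_block: "10.0.0.0/16"
        enable_dns_hostnames: true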
Native YAML lets you define variables directly in stacks, enabling deep merging across inherited files—no duplication between environments.
Step 5: Run Atmos
- Before (Native Terraform)
- After (Atmos)
cd terraform/dev
terraform plan -var-file=vpc.tfvars
terraform apply -var-file=vpc.tfvars
atmos terraform plan vpc -s dev
atmos terraform apply vpc -s dev
Or just run atmos to use the interactive UI.
Congratulations! You're now using Atmos.
What Just Happened?
Directory Structure: Before and After
Here's a comprehensive view of how your project structure transforms:
- Before (Native Terraform)
- After (Atmos)
my-infrastructure/
├── terraform/
│ ├── vpc/
│ │ ├── main.tf
│ │ ├── variables.tf
│ │ ├── outputs.tf
│ │ └── envs/
│ │ ├── dev.tfvars
│ │ ├── staging.tfvars
│ │ └── prod.tfvars
│ └── database/
│ ├── main.tf
│ ├── variables.tf
│ ├── outputs.tf
│ └── envs/
│ ├── dev.tfvars
│ ├── staging.tfvars
│ └── prod.tfvars
├── scripts/
│ ├── deploy.sh
│ └── plan-all.sh
└── Makefile
Challenges:
- .tfvars files duplicated across components
- Backend config managed manually or in scripts
- Custom scripts for orchestration
- No standard way to query infrastructure
my-infrastructure/
├── atmos.yaml # Single config file
├── components/
│ └── terraform/
│ ├── vpc/
│ │ ├── main.tf
│ │ ├── variables.tf
│ │ ├── outputs.tf
│ │ └── envs/ # Optional: Keep your existing tfvars!
│ │ ├── dev.tfvars
│ │ ├── staging.tfvars
│ │ └── prod.tfvars
│ └── database/
│ ├── main.tf
│ ├── variables.tf
│ ├── outputs.tf
│ └── envs/
│ ├── dev.tfvars
│ ├── staging.tfvars
│ └── prod.tfvars
└── stacks/
├── _defaults/
│ └── globals.yaml # Shared config (backend, etc.)
├── dev.yaml # References tfvars with !include
├── staging.yaml
└── prod.yaml
Benefits:
- Keep using your existing .tfvars files
- Centralized backend configuration in _defaults/
- No custom scripts needed
- Query with atmos list stacks, atmos describe component
Key Differences at a Glance
| Aspect | Native Terraform | Atmos |
|---|---|---|
| Terraform Code | main.tf, variables.tf | Same - no changes needed |
| Configuration | .tfvars files, TF_VAR_ env vars | YAML vars: (but .tfvars still work via !include!) |
| Environments | Directories or workspaces | Stack YAML files |
| Backend Config | In Terraform code | Centralized in stack config |
| Commands | terraform plan -var-file=... | atmos terraform plan <component> -s <stack> |
| Querying | Bash scripts, grep | atmos list stacks, atmos describe component |
What Stays The Same
- Your Terraform code works as-is
- Your .tfvars files still work (use !include to import them)
- Your TF_VAR_ environment variables still work
- Your backend configuration migrates cleanly
What You Added
- atmos.yaml - tells Atmos where your code lives
- Stack YAML files - one per environment
That's it. This is the "entry fee" for all the benefits below.
Walk: Immediate Value
Now that you're running Atmos, here's what you get immediately:
1. List Your Infrastructure
No more bash scripts or mental mapping:
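For example (using the vpc component from earlier; substitute your own component and stack names):

atmos list stacks                     # every stack Atmos knows about
atmos list components                 # every component across your stacks
atmos describe component vpc -s dev   # fully resolved configuration for one component in one stack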
Learn more: atmos list | atmos describe component
2. DRY Configuration
Instead of copying .tfvars files, use YAML imports and inheritance:
stacks/_defaults.yaml
stacks/dev.yaml
stacks/prod.yaml
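A sketch of what the three files might contain (component names and values are illustrative):

# stacks/_defaults.yaml
terraform:
  backend_type: s3
  backend:
    s3:
      bucket: my-terraform-state
      region: us-east-1

# stacks/dev.yaml
import:
  - _defaults

vars:
  stage: dev

components:
  terraform:
    vpc:
      vars:
        cidr_block: "10.0.0.0/16"

# stacks/prod.yaml
import:
  - _defaults

vars:
  stage: prod

components:
  terraform:
    vpc:
      vars:
        cidr_block: "10.1.0.0/16"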
Shared settings live in _defaults.yaml. Each environment only specifies what's different.
3. Query Remote State
Pull outputs from other components using the !terraform.output function—no custom bash needed:
stacks/dev.yaml
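For example, an eks component could read the vpc_id output of the vpc component in the same stack (component and output names are illustrative; see the !terraform.output documentation for the full argument forms):

components:
  terraform:
    eks:
      vars:
        # Read the vpc_id output from the vpc component's state
        vpc_id: !terraform.output vpc vpc_id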
Or use the Terraform module:
components/terraform/eks/remote_state.tf
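A sketch using the Cloud Posse remote-state module; the version pin and the context = module.this.context line (the null-label pattern) are assumptions you'll need to adapt to your setup:

module "vpc" {
  source  = "cloudposse/stack-config/yaml//modules/remote-state"
  version = "1.5.0" # illustrative; pin to the version you actually use

  component = "vpc"

  # Assumes the Cloud Posse null-label context pattern; pass namespace/tenant/
  # environment/stage inputs directly if your components don't use it.
  context = module.this.context
}

# Outputs of the vpc component are then available as, e.g.:
# module.vpc.outputs.vpc_id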
4. Centralized Backend
Stop managing backend config in every directory:
stacks/dev.yaml
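For example (bucket, key, and region are placeholders), define the backend once in your stack or _defaults file, with auto_generate_backend_file: true set under components.terraform in atmos.yaml:

terraform:
  backend_type: s3
  backend:
    s3:
      bucket: my-terraform-state
      key: terraform.tfstate
      region: us-east-1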
Atmos auto-generates backend.tf.json for you.
Run: When You're Ready
These advanced features are there when you need them. You don't need them now.
Workflows (Replace Your Makefiles)
stacks/workflows/deploy.yaml
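A sketch of a workflow file (the workflow name, steps, and components are illustrative):

workflows:
  deploy-dev:
    description: Plan and apply the dev environment
    steps:
      - command: terraform plan vpc -s dev
      - command: terraform apply vpc -s dev -auto-approve
      - command: terraform apply database -s dev -auto-approve

Run it with atmos workflow deploy-dev -f deploy.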
Validation with OPA and JSON Schema
Validate your configurations before running Terraform:
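For example (vpc and dev are placeholders for your own component and stack):

atmos validate stacks                 # validate all stack manifests
atmos validate component vpc -s dev   # run the OPA / JSON Schema validations configured for the component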
Component Inheritance
Reuse component configurations:
stacks/catalog/vpc-defaults.yaml
stacks/dev.yaml
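A sketch of the pattern (component names and values are illustrative): a reusable baseline lives in the catalog, and the dev stack imports and inherits from it.

# stacks/catalog/vpc-defaults.yaml
components:
  terraform:
    vpc-defaults:
      metadata:
        type: abstract   # baseline only; not deployed directly
      vars:
        enable_dns_hostnames: true

# stacks/dev.yaml
import:
  - catalog/vpc-defaults

components:
  terraform:
    vpc:
      metadata:
        component: vpc   # the Terraform root module in components/terraform/vpc
        inherits:
          - vpc-defaults
      vars:
        cidr_block: "10.0.0.0/16"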
Real Example: Hello World Migration
Let's migrate a simple "Hello World" Terraform configuration that creates an S3 bucket.
Before (Native Terraform)
- Structure
- main.tf
- variables.tf
- terraform.tfvars
- backend.tf
hello-world/
dev/
main.tf
variables.tf
outputs.tf
terraform.tfvars
backend.tf
prod/
main.tf
variables.tf
outputs.tf
terraform.tfvars
backend.tf
resource "aws_s3_bucket" "hello" {
bucket = var.bucket_name
tags = {
Environment = var.environment
Project = "hello-world"
}
}
variable "bucket_name" {
type = string
description = "Name of the S3 bucket"
}
variable "environment" {
type = string
description = "Environment name"
}
bucket_name = "hello-world-dev-bucket"
environment = "dev"
terraform {
backend "s3" {
bucket = "my-terraform-state"
key = "dev/hello-world/terraform.tfstate"
region = "us-east-1"
}
}
Commands:
cd hello-world/dev
terraform init
terraform plan -var-file=terraform.tfvars
terraform apply -var-file=terraform.tfvars
After (Atmos)
- Structure
- atmos.yaml
- stacks/dev.yaml
- stacks/prod.yaml
- Using !include
atmos.yaml
components/terraform/hello-world/
main.tf # Same code, no changes
variables.tf # Same code, no changes
outputs.tf # Same code, no changes
stacks/
dev.yaml
prod.yaml
components:
terraform:
base_path: "components/terraform"
stacks:
base_path: "stacks"
name_template: "{{ .vars.stage }}"
terraform:
backend_type: s3
backend:
s3:
bucket: my-terraform-state
key: "dev/hello-world/terraform.tfstate"
region: us-east-1
vars:
stage: dev
components:
terraform:
hello-world:
vars:
bucket_name: "hello-world-dev-bucket"
environment: dev
terraform:
backend_type: s3
backend:
s3:
bucket: my-terraform-state
key: "prod/hello-world/terraform.tfstate"
region: us-east-1
vars:
stage: prod
components:
terraform:
hello-world:
vars:
bucket_name: "hello-world-prod-bucket"
environment: prod
Want to keep using your existing .tfvars files? Use !include:
# stacks/dev.yaml
terraform:
backend_type: s3
backend:
s3:
bucket: my-terraform-state
key: "dev/hello-world/terraform.tfstate"
region: us-east-1
vars:
stage: dev
components:
terraform:
hello-world:
vars: !include ../legacy/dev/terraform.tfvars
Commands:
atmos terraform plan hello-world -s dev
atmos terraform apply hello-world -s dev
Why It's Worth It
Stop Duct-Taping Tools Together
Instead of:
- Makefiles + shell scripts + GitHub Actions + custom bash + .tfvars + workspaces
You get:
- One documented approach with Atmos
Real Benefits You'll Feel Immediately
- Documented convention - Not tribal knowledge
- Reduced cognitive load - Follow patterns, don't reinvent
- Easier onboarding - New team members productive in 20 minutes
- Query infrastructure - atmos list stacks instead of bash/grep
- DRY configs - Inheritance without copy-paste
- Workflows - Replace your Makefiles
- Separation of concerns - Terraform is code, YAML is configuration
What It Transforms
- Before: "Let me grep through directories to find where we deploy the VPC in staging"
- After: atmos describe component vpc -s staging
- Before: "New developer? Here's 45 minutes of tribal knowledge about our Makefile"
- After: "Read the stack YAML, run atmos terraform plan, you're good"
- Before: Custom bash scripts to pull remote state
- After: vpc_id: !terraform.output vpc.vpc_id
What Atmos Won't Do
Here's what to expect:
- Won't magically refactor your existing Terraform - Atmos doesn't provide automated refactoring tools
- Won't fix monolithic modules - That's still on you
- Won't require you to learn everything - Start with basics, grow as needed
But:
- Everything new you build will follow glorious conventions
- You can gradually refactor existing stuff as you see fit
- It's going to transform your day-to-day
Working with .tfvars Files
Option 1: Use !include (Recommended for Migration)
Keep your existing .tfvars files and import them directly:
stacks/dev.yaml
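For example, reusing the per-environment tfvars files from the directory layout shown earlier (paths are illustrative):

vars:
  stage: dev

components:
  terraform:
    vpc:
      vars: !include components/terraform/vpc/envs/dev.tfvars
    database:
      vars: !include components/terraform/database/envs/dev.tfvars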
The !include function:
- Automatically parses .tfvars (HCL format)
- Converts to proper YAML types (maps, lists, booleans)
- Works with local and remote files
- Supports YQ expressions for filtering
See the !include function documentation for more details.
Option 2: Convert to YAML (Recommended Long-term)
Convert your .tfvars to YAML for full Atmos features:
- Before (vpc.tfvars)
- After (stacks/dev.yaml)
cidr_block = "10.0.0.0/16"
enable_dns_hostnames = true
tags = {
Environment = "dev"
Team = "platform"
}
components:
terraform:
vpc:
vars:
cidr_block: "10.0.0.0/16"
enable_dns_hostnames: true
tags:
Environment: dev
Team: platform
Working with TF_VAR_ Environment Variables
Atmos supports Terraform's native environment variable pattern:
stacks/dev.yaml
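For example, a component's env section sets environment variables for every Terraform command Atmos runs for it (the variable names and values here are illustrative):

components:
  terraform:
    vpc:
      # Exported to the environment whenever Atmos runs Terraform for this component
      env:
        TF_VAR_region: us-east-1
      vars:
        cidr_block: "10.0.0.0/16"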
When you run atmos terraform plan vpc -s dev, these environment variables are set automatically.
Migration Checklist
- Install Atmos CLI (Installation Guide)
- Create atmos.yaml pointing to your Terraform code
- Reorganize Terraform code into components/terraform/<component>/
- Create your first stack YAML (start with dev)
- Test with atmos terraform plan <component> -s dev
- Create remaining stack files (staging, prod)
- (Optional) Use !include to import existing .tfvars files
- (Optional) Migrate .tfvars to YAML for full features
- (Optional) Set up workflows to replace Makefiles
- (Optional) Explore DRY configs with imports
Next Steps
You just did Crawl - you're running Atmos!
Walk: Explore DRY configs with imports, stack inheritance, and remote state next.
Run: When you're ready for advanced features, dig into workflows, validation, and component libraries.