Sandboxing AI Codegen CLIs: Part 1 - CLI Tooling

AI-powered code generation CLIs like Claude Code, Gemini CLI, OpenAI Codex, and GitHub Copilot are incredibly productive tools. They can scaffold projects, refactor code, and even run bash commands on your behalf.
But here's the catch: they can also hallucinate. And a hallucinated rm -rf or a runaway npm install that fills your disk is not something you want happening on your main OS.
The Problem
These tools are powerful, but when you grant them shell access, you're essentially trusting an LLM to not mess up your system. That's a gamble I'm not willing to take on my daily driver.
Consider these scenarios:
- A hallucinated command wipes important files
- An infinite loop fills your disk with garbage
- A misconfigured script modifies system files
- Dependencies get installed globally, polluting your environment
The Solution: Sandboxed Docker Containers
This is the first part in a series on sandboxing AI codegen CLIs. We're starting with the simplest, most portable approach: a Docker image with a lightweight wrapper script.
By running AI codegen CLIs inside a Docker container, you get:
- Full isolation - The container can't touch your main OS
- No root access - The container runs as a non-root user with no sudo
- Persistent AI config - The .ai/ folder survives between runs
- Batteries included - Claude, Codex, Copilot, Gemini, plus language runtimes and dev tools
- One command - A wrapper script handles everything
The beauty of this approach is its simplicity. No IDE plugins or special configuration needed - just Docker and a shell script.
What's in the Box?
The aidevtools Docker image ships with:
AI Assistants: Claude CLI, OpenAI Codex, GitHub Copilot, Google Gemini CLI
Language Runtimes: Java (SDKMAN), Node.js (NVM), Python, Go, Rust, Deno
Dev Tools: GitHub CLI, Neovim, tmux, direnv
The image is fully customizable via build args - disable what you don't need:
docker build --build-arg INSTALL_JAVA=false --build-arg INSTALL_RUST=false .
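The build logic behind those flags isn't shown in this post, but the pattern is straightforward. Here is a sketch of how a boolean build arg like INSTALL_RUST might gate an install step inside the image build - the variable names and echo messages are illustrative, not the actual aidevtools build script:

```shell
# Hypothetical sketch: a Dockerfile ARG arrives in the build script as an
# environment variable; default to "true" when the flag isn't passed.
# This mirrors the pattern, not the real aidevtools install logic.
INSTALL_RUST="${INSTALL_RUST:-true}"

if [ "$INSTALL_RUST" = "true" ]; then
  echo "installing Rust toolchain..."
  # e.g. curl --proto '=https' -sSf https://sh.rustup.rs | sh -s -- -y
else
  echo "skipping Rust install"
fi
```

Each optional runtime gets the same treatment, so passing --build-arg INSTALL_X=false simply skips that branch and keeps the image smaller.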
Getting Started
Two steps: pull the pre-built image, then launch the wrapper from your project directory:
# Pull the pre-built image
docker pull ghcr.io/sugarfreebytes/aidevtools:latest
# Run from your project directory
./aidevtools.sh # Interactive shell
./aidevtools.sh claude # Launch Claude directly
./aidevtools.sh gemini # Launch Gemini directly
The wrapper script mounts your current directory into the container and persists AI tool configurations in a local .ai/ folder between sessions.
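The full wrapper lives in the linked tutorial, but stripped to its essentials the behavior described above can be sketched in a few lines of shell. The mount paths, user flags, and /workspace directory here are assumptions about how such a wrapper typically works, not the script's exact contents:

```shell
#!/bin/sh
# Minimal sketch of an aidevtools-style wrapper, not the actual script.
set -eu

IMAGE="ghcr.io/sugarfreebytes/aidevtools:latest"

# Persist AI tool config between sessions in a local .ai/ folder.
mkdir -p "$PWD/.ai"

# No argument means an interactive shell; otherwise launch the named tool.
[ "$#" -gt 0 ] || set -- bash

# The sketch echoes the docker invocation instead of exec'ing it, so it is
# safe to run anywhere; the real script would exec this command:
echo docker run --rm -it \
  --user "$(id -u):$(id -g)" \
  -v "$PWD:/workspace" \
  -v "$PWD/.ai:/home/dev/.ai" \
  -w /workspace \
  "$IMAGE" "$@"
```

Because the container only sees the mounted project directory and runs as a non-root user, even a hallucinated rm -rf is confined to files you can restore from version control.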
Read the full tutorial: Sandboxing AI CLIs with Docker
What's Next?
This is Part 1 of a series on sandboxing AI codegen CLIs. In upcoming posts, we'll explore DevContainers, IDE integration, and more advanced isolation strategies.
