Cloud sandbox for AI agents.
Fork in 100 ms. Persistent REPL.

Secure, isolated environments your agents can branch, run code in, and tear down. State carries across calls; forks diverge in milliseconds.

from podflare import Sandbox

# Create a sandbox
sbx = Sandbox(template="python-datasci")

# Execute code (state persists across calls)
sbx.run_code("print('hello')")

# Fork — spawn N copies of the running sandbox
children = sbx.fork(n=5)

# Merge a branch back into the parent
children[0].merge_into(sbx)

# Destroy the sandbox
sbx.close()

pip install podflare · npm install podflare · Remote MCP for Claude / Cursor / Cline: https://mcp.podflare.ai

~100 ms

fork(n)

Snapshot a running sandbox, spawn N copies, each diverges for free. The primitive that tree-search agents have always wanted.

0 ms reimport

Persistent REPL

Variables, imports, and open files carry across run_code calls. Data-science templates preload pandas, numpy, scipy, and matplotlib.
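To make the persistence model concrete, here is a minimal stand-in sketch — not the podflare SDK, just plain Python. A single shared namespace dict plays the role of the sandbox's interpreter state, so variables and imports from one run_code call are visible in the next:

```python
# Toy stand-in for a persistent REPL (illustrative only — a real sandbox
# persists a whole interpreter process, not just a dict).
class ToyRepl:
    def __init__(self):
        # One namespace shared across every run_code call.
        self.namespace = {}

    def run_code(self, code):
        exec(code, self.namespace)

repl = ToyRepl()
repl.run_code("import math")      # the import sticks around...
repl.run_code("radius = 2.0")     # ...and so do variables
repl.run_code("area = math.pi * radius ** 2")
print(repl.namespace["area"])     # ~12.566
```

With a fresh process per call, the second and third calls would fail with NameError; a shared namespace is what turns isolated executions into a session.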

4 adapters

Every framework

OpenAI Agents SDK, Vercel AI SDK, Anthropic code_execution, MCP. Drop-in for any tool-use pattern.

Why agents need this

Tree search without state hell

Classic sandboxes give you one shell per agent. Branching a long-running computation means re-running setup from scratch. Forking lets an agent explore N hypotheses from a shared ancestor state — each branch inherits memory, files, and environment.
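The branching pattern can be sketched with a toy model — again not the podflare SDK, just an in-memory analogue where fork is a deep copy. The expensive ancestor state is built once; each branch inherits it and tests a different hypothesis:

```python
import copy

# Toy model of fork-based tree search. State is a dict and fork() is a
# deep copy; a real fork snapshots a running VM, memory and files included.
class ToySandbox:
    def __init__(self, state=None):
        self.state = state if state is not None else {}

    def fork(self, n):
        # Each child starts from an identical snapshot of the parent's state.
        return [ToySandbox(copy.deepcopy(self.state)) for _ in range(n)]

# Expensive setup happens exactly once, in the ancestor.
ancestor = ToySandbox({"data": [5, 3, 8, 1]})

# Each branch explores one hypothesis without re-running setup.
hypotheses = [min, max, sum]
branches = ancestor.fork(len(hypotheses))
for branch, fn in zip(branches, hypotheses):
    branch.state["score"] = fn(branch.state["data"])

# The agent keeps the best branch and discards the rest;
# the ancestor is untouched and can be forked again.
best = max(branches, key=lambda b: b.state["score"])
print(best.state["score"])  # 17 — the sum hypothesis wins
```

The key property is that mutating one branch never leaks into the ancestor or its siblings, so an agent can run N experiments from one shared setup and re-fork the ancestor at will.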

Secure by construction

Every sandbox runs in a hardware-isolated virtual machine with its own kernel boundary. No network egress by default — agents can't exfiltrate data or call arbitrary APIs without opt-in.

Five minutes from sign-up to first sandbox.

Mint a key, install the SDK, and you're running.