Mod
The filesystem for agents
Agents are becoming the primary way work gets done. They write code, execute tasks, and operate autonomously. But they all need the same thing: a filesystem they can read, write, and reason over.
Mod is the infrastructure layer for this shift. A collaborative, branchable filesystem designed for agents — with the performance to run at scale, the primitives developers need, and the flexibility for anyone to build on top.
Problem
The filesystem has become the core primitive for agents. It gives them shared context, pairs naturally with their ability to write and execute code, and provides persistence across sessions. Every agent — not just coding ones — will need this.
But today’s infrastructure wasn’t built for this world:
- Git doesn’t scale for agents. One branch at a time, manual commits, push/pull to sync. When you want 10 agents working in parallel, you need 10 clones, complex orchestration, and constant merge conflicts.
- Cloud storage isn’t versioned. Dropbox and Google Drive sync files, but they don’t branch. There’s no way to have parallel workstreams that merge back together.
- Apps are siloed. Users want their filesystem accessible by agents across the applications they use — not locked inside individual tools.
- Building is still too hard. Users are increasingly building workflow-specific apps for their own productivity, and coding agents make writing the app itself trivial. But wiring up filesystem access, collaboration, and sync is still complex.
Solution
Mod is a collaborative, branchable filesystem built for agents. It combines the versioning power of git with the real-time sync of modern collaboration tools — designed from the ground up for the way agents work.
- Branch and merge. Any agent or user can create a branch, work in isolation, and merge back. No clones, no push/pull — just instant branching with automatic sync.
- Built for scale. Run hundreds of agents in parallel sandboxes, each on their own branch, with high-performance infrastructure that handles the concurrency.
- Real-time collaboration. Changes sync instantly across all connected clients. Local-first architecture means it works offline and feels fast.
- Open by design. Your filesystem is accessible to any agent, any app, any integration. Not locked inside a single tool.
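The branch-and-merge model above can be sketched as a toy in-memory version. This is a minimal illustration of the semantics only, not Mod's implementation or API; every name here (`Workspace`, `branch`, `merge`) is a hypothetical stand-in:

```python
# Toy model of instant branching with merge-back.
# Illustration only -- not Mod's actual implementation or API.

class Workspace:
    """An in-memory filesystem: path -> contents."""

    def __init__(self, files=None):
        self.files = dict(files or {})

    def branch(self):
        # Instant branch: the new branch starts as a copy of the
        # current state. No clone, no push/pull, no remote setup.
        return Workspace(self.files)

    def write(self, path, contents):
        self.files[path] = contents

    def merge(self, other):
        # Last-writer-wins merge of a branch back into this workspace.
        # (A real system would detect and resolve conflicts.)
        self.files.update(other.files)


main = Workspace({"notes.md": "v1"})

# Two agents branch and work in isolation, in parallel.
a = main.branch()
b = main.branch()
a.write("notes.md", "v2 from agent A")
b.write("report.md", "draft from agent B")

# Each merges back without ever touching the other's branch.
main.merge(b)
main.merge(a)
```

The point of the sketch is the workflow shape: branching is a single cheap operation per agent, isolation is the default, and merging back is one call rather than a push/pull/conflict cycle.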
Product
Mod works at three layers: a workspace primitive that developers integrate into their agents and apps, a CLI for power users, and a growing ecosystem of canvases built on top.
For developers building agents
A workspace SDK that gives agents a filesystem they can branch, modify, and merge — without the complexity of git orchestration.
For developers working locally
A CLI that lets you work on multiple tasks in parallel, each in its own branch, with agents running alongside you.
For teams building custom apps
Users are building workflow-specific apps for their own productivity, and coding agents make writing the app itself trivial. Mod gives them filesystem context and real-time collaboration without the complexity of building that infrastructure from scratch.
The Mod workspace
For users who just want to get started, we provide a flexible workspace built on these primitives — a canvas for your files, real-time collaboration, and a growing ecosystem of integrations.
Market
We’re building infrastructure for the agent era. The market is every agent that needs to read, write, and operate on files — which is all of them.
AI inference spend is projected to reach $500B+ annually by 2028. A significant portion of that compute will be agents operating on filesystems — writing code, editing documents, processing data, executing workflows. We capture value at the infrastructure layer where this work happens.
- Agent compute. Every agent session that reads, writes, or branches a workspace flows through our infrastructure. As agent usage scales, so does our footprint.
- Developer infrastructure. Companies building agents need filesystem primitives — branching, sync, storage. This is foundational infrastructure they build on top of.
- End-user inference. Users running agents in their workspaces consume inference through us. We become the interface between users and AI compute.
Business Model
Two revenue streams that scale with usage: inference and infrastructure.
Inference
Users and developers consume AI compute through Mod. Every agent session, every background task, every intelligent operation flows through our inference layer. We meter and bill for this usage.
- End users pay for agent usage in their workspaces — running tasks, asking questions, generating content. Subscription tiers with usage-based overages.
- Developers pay for inference their apps and agents consume through our API. Usage-based pricing that scales with their products.
Storage & Sync
Developers building on Mod pay for the infrastructure their products use — workspace storage, real-time sync, branching operations, API calls.
- Storage — Per-GB pricing for workspace data
- Sync — Per-operation pricing for real-time collaboration and branch operations
- API — Usage-based pricing for workspace access and agent orchestration
The model compounds: more users drive more inference, more developers drive more infrastructure, and the ecosystem grows on top of both.
Traction
[Traction narrative: Add your current metrics and milestones here]
Milestones
- [Milestone 1 - e.g., Core infrastructure built]
- [Milestone 2 - e.g., Beta users onboarded]
- [Milestone 3 - e.g., Key integration shipped]
Competition
Existing tools solve pieces of this problem, but none are built for the agent-first world.
The deeper insight: the infrastructure layer is becoming the opportunity. As coding agents commoditize app development, the value shifts to the primitives everything is built on.
Team
We’ve been building companies together for 15 years. Weightlifting partners, MMA training partners, best men at each other’s weddings. We’ve been obsessed with platform businesses from the start — bootstrapped and scaled a marketplace in the UK to £5M annual revenue.
The last five years we’ve gone deep on developer platforms and version-controlled filesystems. Now we’re coming back together to build the filesystem for agents.
The Ask
We’re raising to build the foundational infrastructure for agent-first computing — the filesystem layer that every agent, every app, and every workflow will be built on.
Use of Funds
- Engineering — Scale the core infrastructure, build out SDK and API
- Go-to-market — Developer relations, documentation, community
- Operations — Infrastructure costs, security, compliance
What This Enables
- Production-ready infrastructure for agent workloads at scale
- Public SDK and API for developers building agents
- Growing ecosystem of integrations and canvases
- Path to enterprise adoption and revenue