Every feature you need.
Nothing you don't.

Halper is a complete AI agent platform — not a chat widget bolted onto a cloud service. Here's what you get when you deploy it.

🤖

Multi-agent AI, done right

Create unlimited agents, each with their own model, persona, tools, and workspace. Agents share infrastructure but operate independently — no context bleed, no shared secrets.

Multiple agents per account

Define as many specialized agents as you need: a coding assistant, a research agent, a DevOps bot. Each configured independently.

Per-agent model selection

Point each agent at a different model. Use a fast model for quick tasks and a powerful one for complex reasoning — without changing your workflow.

Custom system prompts

Give each agent its own persona, domain knowledge, and behavioral rules via a system prompt. Agents behave exactly as you define.

Tool & MCP scoping

Choose which tools and MCP servers each agent can access. A code agent gets bash and file access; a research agent gets web browsing. Principle of least privilege, built in.
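To make the idea concrete, per-agent scoping could be sketched like this (the field names are illustrative, not Halper's actual config schema):

```typescript
// Hypothetical per-agent scoping sketch; field names are illustrative,
// not Halper's actual configuration schema.
interface AgentConfig {
  name: string;
  tools: string[];       // built-in tools this agent may invoke
  mcpServers: string[];  // MCP servers this agent may reach
}

const codeAgent: AgentConfig = {
  name: "code-assistant",
  tools: ["bash", "files"],
  mcpServers: ["github"],
};

const researchAgent: AgentConfig = {
  name: "researcher",
  tools: ["web-browse"],
  mcpServers: [],
};
```

Each agent sees only its own allowlist; a tool absent from the list simply doesn't exist from that agent's point of view.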

Skills system

Compose reusable AI skill modules from local files or GitHub repos. Activate or deactivate skills per agent. Git SHA tracking ensures reproducibility.

Persistent agent memory

Agents can remember user preferences across conversations via a persistent memory file — no re-explaining your stack every session.

🖥️

A real Linux environment, not a toy

Every workspace gets a full Linux container (node:22 + python3). The container itself is ephemeral, but its storage persists across restarts. Agents execute code, run tests, manage files, and access the internet — exactly like a developer would.

Full Linux container

Agents run commands in a real shell environment with Node.js, Python, git, GitHub CLI, curl, pnpm, and more pre-installed.

Built-in terminal

Drop into a live PTY terminal directly from the chat UI. See exactly what your agent sees — or run commands yourself alongside it.

Persistent file storage

Files saved to /workspace/storage are backed by Cloudflare R2. Your codebase and data survive container restarts.

Multi-session architecture

Each agent gets isolated conversation sessions that preserve shell state across tool calls. Background tasks run asynchronously without blocking the conversation.

Workspace file browser

Browse, upload, and manage workspace files from the UI. Download files, view directory trees, and inspect agent output without SSH.

Multiple workspaces

Create separate sandboxes for different projects. Each workspace has its own environment variables, startup scripts, and file storage.

🧠

The best model for every job

Halper supports 12+ LLMs from multiple providers. Mix and match models across agents, switch on the fly, and bring your own API keys — no subscriptions required.

Cloudflare Workers AI

Run Llama 4, DeepSeek R1/V3, Qwen Coder, and other open models on Cloudflare's GPU infrastructure — fast, private, and included in your Workers AI quota.

OpenRouter gateway

Access Claude Opus/Sonnet 4.5, Kimi K2, and hundreds of other models via OpenRouter. One API key, every frontier model.

Custom model configs

Define model configurations with custom base URLs, API keys, and parameters. Point at any OpenAI-compatible endpoint.
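As an illustrative sketch (the keys below are hypothetical, not Halper's documented schema), a custom model config might look like:

```typescript
// Hypothetical custom model config; keys are illustrative only.
const customModel = {
  id: "local-llama",
  baseUrl: "http://localhost:8080/v1",   // any OpenAI-compatible endpoint
  apiKeyRef: "{{secret:LOCAL_LLM_KEY}}", // resolved at runtime, never shown to the agent
  params: { temperature: 0.2, max_tokens: 4096 },
};
```

The same shape works for a cloud provider or a model server on your own hardware — anything that speaks the OpenAI chat completions API.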

Per-agent model selection

Each agent uses the model that fits its task. No global setting forces a single model across your entire workflow.

🔗

Extend agents with any MCP server

Connect agents to external data sources, APIs, and tools via the Model Context Protocol. OAuth credentials are stored automatically and shared across agents.

Any MCP server

Connect to any MCP-compatible server: databases, APIs, code review tools, issue trackers, cloud consoles — if it speaks MCP, Halper can use it.

OAuth persistence

Halper stores OAuth tokens for each MCP server in SQLite and shares them across agents. Authenticate once, use everywhere.

Per-agent server selection

Assign specific MCP servers to specific agents. A DevOps agent can access AWS and GitHub; a writing agent accesses only Notion.

Tool filtering

Limit which MCP tools an agent can invoke. Expose only the capabilities each agent needs.

Multiple transport types

Supports SSE, streamable-HTTP, and auto-detected transports. Works with both hosted and locally-running MCP servers.

⏰

Agents that work while you sleep

Schedule agents on cron expressions, trigger them via HTTP webhooks, or integrate them into any CI/CD pipeline. Automation is a first-class citizen in Halper.

Cron scheduling

Schedule any agent to run on any cron pattern. Daily standups, weekly summaries, nightly code reviews — automate the routine.
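For example, two schedule entries might look like this (the object shape is illustrative, not Halper's exact schema; the cron patterns are standard five-field expressions):

```typescript
// Hypothetical schedule entries; shape is illustrative, not Halper's schema.
const nightlyReview = {
  agent: "code-reviewer",
  cron: "0 2 * * *",    // every day at 02:00
  prompt: "Review yesterday's commits and summarize any issues.",
};

const weekdayStandup = {
  agent: "standup-bot",
  cron: "0 9 * * 1-5",  // weekdays at 09:00
  prompt: "Draft today's standup summary from open issues.",
};
```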

HTTP webhook triggers

Trigger agents from external systems via POST webhooks. GitHub push events, PagerDuty alerts, Stripe payments — any HTTP event can start a Halper agent.

Authentication options

Secure webhooks with static header tokens or GitHub webhook signature verification. No unauthenticated execution.
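GitHub-style signature verification works by signing the raw request body with HMAC-SHA256 and sending `sha256=<hex>` in the `X-Hub-Signature-256` header. A minimal verification sketch (not Halper's internal implementation) looks like:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verify a GitHub-style webhook signature: the sender computes
// HMAC-SHA256 over the raw body and sends "sha256=<hex>".
function verifySignature(secret: string, body: string, header: string): boolean {
  const expected =
    "sha256=" + createHmac("sha256", secret).update(body).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(header);
  // Constant-time comparison to avoid timing side channels.
  return a.length === b.length && timingSafeEqual(a, b);
}
```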

Custom prompt templates

Inject webhook payload data into agent prompts via templates. Pass PR diffs, alert details, or form submissions directly into the agent's context.
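Conceptually, payload templating resolves dotted paths against the incoming JSON. A minimal sketch (the `{{payload.*}}` syntax here is hypothetical, not Halper's documented template language):

```typescript
// Illustrative payload templating sketch; the {{payload.*}} syntax is
// hypothetical, not Halper's documented template language.
function renderPrompt(template: string, payload: Record<string, unknown>): string {
  return template.replace(/\{\{payload\.([\w.]+)\}\}/g, (_, path: string) => {
    let value: unknown = payload;
    for (const key of path.split(".")) {
      value = (value as Record<string, unknown>)?.[key];
    }
    return String(value ?? "");
  });
}
```

So a template like `"Review PR: {{payload.pull_request.title}}"` would put the PR title straight into the agent's first message.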

Session tracking

Every scheduled and webhook-triggered run creates a conversation session with full history and status tracking.

🔧

Make your agents talk to anything

Define HTTP or CLI tools that agents can call. Inject secrets at runtime, run background commands, and connect agents to any internal API or service.

HTTP tools

Template URL, headers, and body with {{secret:KEY}} interpolation. Call any REST API with user-defined credentials — safely, without exposing keys to the agent.
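The core of the mechanism can be sketched in a few lines (a simplification: in Halper the resolved values come from the encrypted secret store, not a plain object):

```typescript
// Minimal sketch of {{secret:KEY}} interpolation. In practice the values
// come from an encrypted secret store, not a plain in-memory object.
function injectSecrets(template: string, secrets: Record<string, string>): string {
  return template.replace(/\{\{secret:(\w+)\}\}/g, (match, key: string) =>
    key in secrets ? secrets[key] : match // leave unknown keys untouched
  );
}
```

Interpolation happens at request time, so the agent only ever sees the template — for example `"Authorization: Bearer {{secret:API_TOKEN}}"` — never the resolved header.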

CLI tools

Define shell commands that run in the sandbox. Wrap complex scripts, internal CLIs, or database queries into simple agent-callable tools.

Background execution

CLI tools can run asynchronously in the background. Kick off long-running jobs without blocking the conversation.

Secret injection

Secrets are encrypted at rest and injected at tool execution time. Agents never see raw credentials — only the tool results.

Conditional workspace access

Scope tools to specific workspaces. A database tool, for example, can appear only when working in the production workspace.

🔒

Your data never leaves your infrastructure

Halper is deployed to your Cloudflare account. There's no Halper-operated server, no shared data store, no telemetry. You own every byte.

Self-hosted on Cloudflare

The entire platform runs in your Cloudflare account. Workers, Durable Objects, R2 storage — all isolated under your tenancy.

Encrypted secrets

Secrets are encrypted at rest in the Durable Object SQLite database. They're injected at runtime but never surfaced in conversation history or logs.

Per-user isolation

Multi-tenant architecture enforces strict user isolation. R2 paths, sandbox containers, and Durable Object namespaces are all scoped to the authenticated user.

Auth options

Deploy with a simple password, or integrate WorkOS OAuth for SSO with your existing identity provider.

Config change history

Every configuration change is tracked with a full revision history. Roll back to any previous configuration.

Ready to own your AI stack?

Deploy Halper to your Cloudflare account in minutes. MIT licensed, no strings attached.

Get Started on GitHub