Open source ecosystem

Build on Repull's
open-source ecosystem

A full channel-manager template, an MCP server for AI agents, a Vercel AI SDK provider, and six native SDKs — all open source, all forkable, all production-ready.

Channel Manager Template

A full open-source channel manager.

Fork the entire channel manager. Calendar, reservations, connections, messaging, reviews — every screen a multi-property operator needs. Built on Repull, with AI features powered by Vanio AI. It works against your own Repull API key from day one.

Multi-channel calendar with sync
Guest CRM and reservation drawer
Inbox with AI-assisted replies
Click-through demo signin
$ git clone https://github.com/ivannikolovbg/repull-channel-manager
repull-channel-manager.vercel.app
[Demo preview: dashboard with reservation, active-listing, and monthly-revenue stats; a multi-channel May calendar with 5 channels in sync; guests, inbox, and AI-replies panels]
MCP Server

Repull, native in your AI agent.

@repull/mcp · MIT License · OSS

Plug Repull into Claude Desktop, Cursor, Windsurf, Zed, or any MCP-aware client. Your AI calls reservations, properties, and messaging tools natively — no custom glue, no plugin to ship.

Works with every MCP client
18 tools out of the box
Stdio + SSE transports
Same API key as your dashboard
$ npx -y @repull/mcp
claude_desktop_config.json
{
  "mcpServers": {
    "repull": {
      "command": "npx",
      "args": ["-y", "@repull/mcp"],
      "env": {
        "REPULL_API_KEY": "sk_test_..."
      }
    }
  }
}

Then ask Claude:

“Show me reservations checking in this weekend across all my listings.”
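The config above uses the stdio transport; for clients that connect over SSE instead, an entry could look like the sketch below. The URL is illustrative only (Repull's hosted MCP endpoint, if one exists, isn't documented here), and field names follow common MCP client conventions, which vary by client.

```json
{
  "mcpServers": {
    "repull": {
      "url": "https://mcp.repull.example/sse",
      "headers": {
        "Authorization": "Bearer sk_test_..."
      }
    }
  }
}
```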
Vercel AI SDK Provider

Use Repull as a tool provider in any AI app.

@repull/ai-sdk · MIT License · OSS

Drop Repull into any streamText or generateText call. Your model can list reservations, send guest messages, and update pricing through native tool calls. Ships with a working chat demo you can fork as a starter.

Native Vercel AI SDK tools
Works with any model provider
Streaming-first by default
Forkable chat-demo included
$ npm install @repull/ai-sdk ai @ai-sdk/openai
app/api/chat/route.ts
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'
import { repullTools, RepullClient } from '@repull/ai-sdk'

// Expose Repull's API surface as AI SDK tool definitions
const tools = repullTools(
  new RepullClient({ apiKey: process.env.REPULL_API_KEY! })
)

export async function POST(req: Request) {
  const { messages } = await req.json()
  // Stream the response, letting the model call Repull tools as needed
  return streamText({
    model: openai('gpt-4o'),
    tools,
    messages,
  }).toDataStreamResponse()
}
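On the client, the AI SDK's standard useChat hook can drive this route (it posts to /api/chat by default). A minimal sketch, assuming the @ai-sdk/react package is installed; the component name and markup are ours, not part of @repull/ai-sdk:

```typescript
'use client'

import { useChat } from '@ai-sdk/react'

// Minimal chat UI wired to the /api/chat route above.
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat()
  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </p>
      ))}
      <input
        value={input}
        onChange={handleInputChange}
        placeholder="Ask about your reservations..."
      />
    </form>
  )
}
```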

Six SDKs

Native clients for every stack.

All open source: a hand-crafted TypeScript flagship, plus five generated bindings built from the same OpenAPI spec.

TypeScript · @repull/sdk
Repull Community License
$ npm install @repull/sdk
View on GitHub
Python · repull
MIT License
$ pip install repull
View on GitHub
PHP · repull/sdk
MIT License
$ composer require repull/sdk
View on GitHub
Go · github.com/ivannikolovbg/repull-go
MIT License
$ go get github.com/ivannikolovbg/repull-go
View on GitHub
Ruby · repull
MIT License
$ gem install repull
View on GitHub
.NET · Repull.SDK
MIT License
$ dotnet add package Repull.SDK
View on GitHub
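Whichever language you pick, the shape is the same: construct a client with an API key, then call typed resources. A TypeScript sketch of that pattern; the resource and field names here are illustrative guesses, so check the generated reference for the real surface:

```typescript
import { RepullClient } from '@repull/sdk'

// Hypothetical usage sketch: exact resource and method names may differ.
const repull = new RepullClient({ apiKey: process.env.REPULL_API_KEY! })

// List confirmed reservations and print guest names with check-in dates.
const reservations = await repull.reservations.list({ status: 'confirmed' })
for (const r of reservations) {
  console.log(r.guestName, r.checkIn)
}
```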

Our stance

Why open source?

No lock-in

Every adapter, the channel-manager template, and the AI runtimes are open. If you ever want to leave Repull, you can take the integration code with you.

Interoperability

Standard schemas, OpenAPI everywhere, and an MCP server mean Repull plays nicely with the tools you already use — including ones we haven’t heard of yet.

Built in the open

Issues, PRs, and releases happen on GitHub. Found a bug or want a new field on a connector? Open a PR and we’ll review it the same week.

Ship on top of Repull.

Get an API key in 30 seconds, fork the channel-manager template, and you're live by the weekend.