MCP Server
Connect AI coding assistants to OneSchema's API documentation via the Model Context Protocol (MCP)
OneSchema provides an MCP (Model Context Protocol) server that gives AI coding assistants — such as Claude, Codex, Cursor, Windsurf, and Devin — direct access to OneSchema's API specifications. This lets your AI tools discover endpoints, understand request/response schemas, and write correct integration code without relying on stale training data.
Overview
The MCP server exposes read-only tools for both API specifications and product guides:
| Tool | Description |
|---|---|
| `list_api_groups` | Lists all available API spec groups (e.g., `importer`, `templates`, `filefeeds`) with descriptions and endpoint counts |
| `get_api_spec` | Retrieves the full OpenAPI spec for a specific API group, including a summary of all endpoints |
| `list_guides` | Lists available product guides (setup, quickstart, how-to) with titles and product area tags |
| `get_guide` | Retrieves the full markdown content of a specific guide by slug |
It also exposes both specs and guides as MCP resources — `oneschema://api/{group}` and `oneschema://guides/{slug}` — for clients that support proactive context loading.
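For clients that load resources directly, reading one of these URIs is a standard MCP `resources/read` call (a method defined by the MCP specification, not something OneSchema-specific). A minimal sketch of the JSON-RPC body such a client would send, using the `importer` group from the table below:

```python
import json

# Sketch: the JSON-RPC body an MCP client sends to read a resource over
# Streamable HTTP. "resources/read" is the standard MCP method name; the
# oneschema:// URI scheme is the one documented above.
def read_resource_request(uri: str, request_id: int = 1) -> str:
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": uri},
    }
    return json.dumps(payload)

body = read_resource_request("oneschema://api/importer")
```

In practice your MCP client builds this message for you; the sketch only shows what travels over the wire.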
Setup
Prerequisites
- A OneSchema account with API access
- Your OneSchema API key (found in the dashboard under Settings > API Keys)
Configuration
Add the OneSchema MCP server to your AI tool's MCP configuration. The server uses Streamable HTTP transport — no packages to install.
Claude Desktop
Add to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "oneschema": {
      "url": "https://api.oneschema.co/mcp",
      "headers": {
        "X-Api-Key": "<your-oneschema-api-key>"
      }
    }
  }
}
```

Codex
Add to your ~/.codex/config.toml (or .codex/config.toml in your project root):
```toml
[mcp_servers.oneschema]
url = "https://api.oneschema.co/mcp"
env_http_headers = { "X-Api-Key" = "ONESCHEMA_API_KEY" }
```

Then set the environment variable:

```shell
export ONESCHEMA_API_KEY=<your-oneschema-api-key>
```

Cursor
Add to your .cursor/mcp.json in your project root:
```json
{
  "mcpServers": {
    "oneschema": {
      "url": "https://api.oneschema.co/mcp",
      "headers": {
        "X-Api-Key": "<your-oneschema-api-key>"
      }
    }
  }
}
```

Windsurf
Add to your Windsurf MCP configuration:
```json
{
  "mcpServers": {
    "oneschema": {
      "serverUrl": "https://api.oneschema.co/mcp",
      "headers": {
        "X-Api-Key": "<your-oneschema-api-key>"
      }
    }
  }
}
```

Devin
Your organization admin can add the MCP server in Devin Settings under the MCP configuration section. Every Devin session working on your codebase will then automatically have access to the OneSchema API specs.
Other MCP Clients
Any MCP client that supports Streamable HTTP transport can connect. Configure it with:
- URL: `https://api.oneschema.co/mcp`
- Header: `X-Api-Key: <your-oneschema-api-key>`
Available API Groups
The MCP server provides specs for the following API groups:
| API Group | Description |
|---|---|
| `importer` | Embeddable data importer endpoints |
| `templates` | Template creation and management |
| `templates-validations` | Template validation rules |
| `templates-code-hooks` | Template code hooks configuration |
| `filefeeds` | Automated file-based data ingestion |
| `sheets` | Spreadsheet data operations |
| `workflows` | Workflow orchestration |
| `s3` | S3 integration configuration |
| `sftp` | SFTP integration configuration |
Usage Examples
Once configured, your AI assistant can automatically discover and use the OneSchema API specs. Here are some example prompts:
Discovering available APIs
"What OneSchema APIs are available?"
Your AI assistant will call list_api_groups and return a summary of all API groups with their descriptions and endpoint counts.
Getting integration help
"Help me integrate OneSchema's importer into my app"
The assistant will call get_api_spec for the importer group, read the full OpenAPI spec, and use it to write correct API calls with the right endpoints, headers, and payload shapes.
Building automation
"Create a script that uses OneSchema FileFeeds to automatically ingest CSV files from S3"
The assistant will pull the filefeeds and s3 specs to understand the available endpoints and build a working integration.
How It Works
The MCP server implements the Model Context Protocol specification using JSON-RPC 2.0 over Streamable HTTP transport. When your AI tool connects:
- Discovery — The client sends an `initialize` request and receives the server's capabilities
- Tool listing — The client calls `tools/list` to discover available tools
- Tool calls — When the AI needs API information, it calls the appropriate tool (e.g., `get_api_spec`) and receives structured data
- Resource access — Clients can also list and read API specs as resources for proactive context loading
All requests are authenticated using your existing OneSchema API key. The server is read-only and does not modify any data.
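The sequence above can be sketched as three JSON-RPC messages. The method names (`initialize`, `tools/list`, `tools/call`) come from the MCP specification; the protocol version, client info, and the `group` argument name are illustrative placeholders, not confirmed parts of OneSchema's API:

```python
import json

def rpc(request_id: int, method: str, params: dict) -> dict:
    return {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}

handshake = [
    # 1. Discovery: announce the client and negotiate capabilities
    #    (protocolVersion and clientInfo values are placeholders)
    rpc(1, "initialize", {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    }),
    # 2. Tool listing
    rpc(2, "tools/list", {}),
    # 3. Tool call: fetch one group's OpenAPI spec
    #    (the "group" argument name is an assumption for illustration)
    rpc(3, "tools/call", {
        "name": "get_api_spec",
        "arguments": {"group": "importer"},
    }),
]

wire = [json.dumps(m) for m in handshake]
```

Your AI tool performs this handshake automatically when it starts; nothing here needs to be written by hand.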
Feature Availability
The MCP server must be enabled for your organization. Contact your OneSchema account team or reach out to [email protected] to enable it.
Troubleshooting
"401 Unauthorized" response
Verify your API key is correct and included in the request headers. The key must be passed in the `X-Api-Key` header.
"403 Forbidden" response
The MCP server feature is not enabled for your organization. Contact OneSchema support to enable it.
AI tool not discovering the server
Ensure your MCP configuration file is correctly formatted and saved. Restart your AI tool after making configuration changes. Check that the URL is exactly https://api.oneschema.co/mcp with no trailing slash.
Empty or missing API specs
If list_api_groups returns an empty list, this may indicate a temporary server issue. Try again in a few moments or contact support.
