Deploy via Manual API
When your API doesn't have a published OpenAPI specification — or you want full control over which endpoints become AI tools — Manual API is the path. You define the server, connect the credentials, and add tools on your own terms.
This is the method for internal APIs, legacy systems, custom microservices, and any endpoint you want to expose to AI with surgical precision.
When to use Manual API
- Your API has no published OpenAPI spec — no Swagger, no YAML, no JSON schema
- You want to expose specific endpoints only — not the entire API surface
- The API is internal or behind a firewall — accessible only via a base URL you define
- You're building a custom MCP server from scratch — prototyping before committing to a full spec
Step 1 — Connection
- Server Name — This is the display name AI clients see when discovering your server. Choose something descriptive: "Payments API", "Inventory Service", "Internal CRM".
- Base URL — The root endpoint of your API. All tools you add later will be relative to this URL. Must be HTTPS with a valid domain — no localhost, no IP addresses.
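The base URL rules above can be checked locally before you deploy. This is a minimal sketch of a hypothetical helper, not the platform's actual validator:

```python
from urllib.parse import urlparse
import ipaddress

def is_valid_base_url(url: str) -> bool:
    """Check the documented constraints: HTTPS scheme, a real domain,
    no localhost, no raw IP addresses. Illustrative helper only."""
    parsed = urlparse(url)
    if parsed.scheme != "https" or not parsed.hostname:
        return False
    host = parsed.hostname
    if host == "localhost":
        return False
    try:
        ipaddress.ip_address(host)
        return False  # raw IP addresses are rejected
    except ValueError:
        pass  # not an IP — treat as a hostname
    return "." in host  # require a dotted domain name

print(is_valid_base_url("https://api.example.com"))  # True
print(is_valid_base_url("http://api.example.com"))   # False (not HTTPS)
print(is_valid_base_url("https://localhost:8080"))   # False
print(is_valid_base_url("https://203.0.113.7"))      # False (raw IP)
```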
Step 2 — Security
| Auth type | What you provide |
|---|---|
| Bearer Token | Your API token or access key |
| Basic Auth | Username and password |
| Custom Header | Header name (e.g., X-API-Key) and value |
| None — Public API | Nothing. Step is skipped. |
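As a general illustration (not Vinkius Cloud internals), each auth type in the table typically translates into HTTP headers on the upstream call. The function and argument names below are invented for this sketch:

```python
import base64

def build_auth_headers(auth_type: str, **creds) -> dict:
    """Sketch of how each auth type maps to request headers.
    Illustrative only — not the platform's implementation."""
    if auth_type == "bearer":
        return {"Authorization": f"Bearer {creds['token']}"}
    if auth_type == "basic":
        raw = f"{creds['username']}:{creds['password']}".encode()
        return {"Authorization": "Basic " + base64.b64encode(raw).decode()}
    if auth_type == "custom_header":
        return {creds["header_name"]: creds["value"]}
    return {}  # None — public API, no headers added

print(build_auth_headers("custom_header", header_name="X-API-Key", value="k1"))
```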
Credentials are AES-256 encrypted at rest, decrypted in-memory only when an AI model invokes a tool, and immediately discarded after the upstream call.
Step 3 — Governance
Both governance controls — DLP and FinOps Guard — are enabled by default. You can toggle them off during setup or change them later from the server's settings page.
What happens when you deploy
Click Deploy. The server goes live in seconds — but it starts empty. No tools exist yet. This is intentional.
After deployment, you receive the same outputs as OpenAPI deploy: MCP URL, Config JSON, and a Connection Token (HMAC-SHA256, shown once).
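The Connection Token's exact format is opaque and shown only once, so the sketch below illustrates only the general HMAC-SHA256 sign-and-verify pattern such tokens are based on; all names are invented:

```python
import hashlib
import hmac
import secrets

def sign_token(secret: bytes, server_id: str) -> str:
    """Attach an HMAC-SHA256 tag to an identifier (illustrative only)."""
    tag = hmac.new(secret, server_id.encode(), hashlib.sha256).hexdigest()
    return f"{server_id}.{tag}"

def verify_token(secret: bytes, token: str) -> bool:
    """Recompute the tag and compare in constant time."""
    server_id, tag = token.rsplit(".", 1)
    expected = hmac.new(secret, server_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

secret = secrets.token_bytes(32)
tok = sign_token(secret, "srv_payments")
print(verify_token(secret, tok))  # True — tag matches the signing secret
```

Because the tag is derived from a server-side secret, a token presented by a client can be verified without storing the token itself.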
Adding tools
Navigate to your server's Tools tab. Each tool you add maps to a single API endpoint. You define:
- Path and HTTP verb — relative to your base URL
- Description — what the AI reads to decide when to use the tool
- Input schema — the parameters the AI provides
- Annotation — read-only, idempotent, or destructive
Start small
You don't need to map every endpoint at once. Deploy with the 3–5 tools the AI needs most, test them with a client, and add more as you go. Every new tool is available immediately — no redeployment needed.
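As a sketch, a single tool definition covering the required fields (path and verb, description, input schema, annotation) might look like the following. The field names and endpoint are hypothetical, not the platform's exact schema:

```python
# Hypothetical tool definition — field names and endpoint are invented
# to illustrate the four required pieces, not the dashboard's exact form.
refund_status_tool = {
    "path": "/v1/refunds/{refund_id}",   # relative to the base URL
    "verb": "GET",
    "description": "Look up the status of a refund by its ID.",
    "input_schema": {
        "type": "object",
        "properties": {"refund_id": {"type": "string"}},
        "required": ["refund_id"],
    },
    "annotation": "read-only",  # read-only | idempotent | destructive
}

print(refund_status_tool["path"], refund_status_tool["verb"])
```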
Manual vs. OpenAPI: which to choose
| | Manual API | OpenAPI Import |
|---|---|---|
| Best for | Internal APIs, prototypes, selective exposure | Public APIs with published specs |
| Tool creation | You define each tool individually | All tools generated automatically |
| Speed | Server deploys instantly; tools added incrementally | Full server with all tools in < 30s |
| Control | Maximum — you choose exactly which endpoints to expose | All endpoints imported; toggle off what you don't need |
| Spec required | No | Yes — OpenAPI 2.0, 3.0, or 3.1 |
Both methods produce the same result: a governed, sandboxed MCP server running on the global edge with HMAC-authenticated connection tokens, DLP, and FinOps Guard.
Next steps
Frequently Asked Questions
When should I use Manual API instead of OpenAPI Import?
Use Manual API when your API doesn't have a published OpenAPI or Swagger specification, when you want to expose only specific endpoints instead of the entire API surface, or when you're working with internal or legacy systems that require custom configuration.
Can I add tools after the initial deployment?
Yes. Manual API is designed for incremental tool building. Deploy the server first (it starts empty), then add tools one at a time from the dashboard. Each new tool is available immediately — no redeployment needed.
What do I need to define for each tool?
Each tool requires a path and HTTP verb (relative to your base URL), a description (what the AI reads to decide when to use the tool), an input schema (parameters the AI provides), and an annotation (read-only, idempotent, or destructive).
Does the base URL have to be publicly accessible?
The base URL must be a valid HTTPS domain — no localhost, no IP addresses. If your API is behind a firewall, you'll need to allowlist the Vinkius Cloud egress IPs or use a reverse proxy.
How are credentials handled for my API?
Vinkius Cloud supports Bearer Token, Basic Auth, Custom Header, and None (public APIs). Credentials are AES-256 encrypted at rest, decrypted only in-memory when the AI invokes a tool, and never logged or stored in plaintext.
What's the difference between Manual API and OpenAPI Import results?
Both produce identical results: a governed, sandboxed MCP server with HMAC-authenticated tokens, DLP, and FinOps Guard. The only difference is how tools are created — manually defined vs. auto-generated from a spec.