Deploy via Manual API

When your API doesn't have a published OpenAPI specification — or you want full control over which endpoints become AI tools — Manual API is the path. You define the server, connect the credentials, and add tools on your own terms.

This is the method for internal APIs, legacy systems, custom microservices, and any endpoint you want to expose to AI with surgical precision.

  • < 30s — server provisioned
  • Zero spec — no OpenAPI required
  • Add later — tools on your schedule

When to use Manual API

  • Your API has no published OpenAPI spec — no Swagger, no YAML, no JSON schema
  • You want to expose specific endpoints only — not the entire API surface
  • The API is internal or behind a firewall — accessible only via a base URL you define
  • You're building a custom MCP server from scratch — prototyping before committing to a full spec

THE WIZARD
Three steps. Tools come later.
Deploy first — an empty server ready for connections. Then add endpoints one at a time from the dashboard, exactly the ones you want the AI to access.

Step 1 — Connection

Give it a name.
Name your server and tell us where the API lives. Add endpoints later — whenever you're ready, from the detail page. The name is what AI agents see when they discover your tools.

Example: Server Name "My Internal API", Base URL https://api.yourcompany.com/v1 (HTTPS only, valid domain required).

  • Server Name — The display name AI clients see when discovering your server. Choose something descriptive: "Payments API", "Inventory Service", "Internal CRM".
  • Base URL — The root endpoint of your API. All tools you add later will be relative to this URL. Must be HTTPS with a valid domain — no localhost, no IP addresses.
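The Base URL rules above (HTTPS only, a real domain, no localhost, no raw IP addresses) can be sketched as a small validator. This is an illustrative check written for this guide, not the platform's actual validation code:

```python
from urllib.parse import urlparse
import ipaddress

def is_valid_base_url(url: str) -> bool:
    """Sketch of the Base URL rules: HTTPS only, a real domain name."""
    parsed = urlparse(url)
    if parsed.scheme != "https":
        return False                  # HTTPS only
    host = parsed.hostname or ""
    if host == "localhost" or "." not in host:
        return False                  # no localhost; must look like a domain
    try:
        ipaddress.ip_address(host)
        return False                  # no raw IP addresses
    except ValueError:
        pass                          # not an IP — acceptable
    return True
```

For example, `is_valid_base_url("https://api.yourcompany.com/v1")` passes, while `http://`, `https://localhost/...`, and `https://192.168.0.1/...` are all rejected.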

Step 2 — Security

Credentials never leave the vault.
AI agents call your API through us. Keys are AES-256 encrypted at rest, injected at runtime, and invisible to the LLM. Public API? Skip this step — no auth required.

  • Bearer Token — Your API token or access key
  • Basic Auth — Username and password
  • Custom Header — Header name (e.g., X-API-Key) and value
  • None (Public API) — Nothing; the step is skipped

Credentials are AES-256 encrypted at rest, decrypted in-memory only when an AI model invokes a tool, and immediately discarded after the upstream call.
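As a rough sketch of what "injected at runtime" means, here is how each auth type could map to headers on the upstream call. This is illustrative only — the vault, encryption, and injection logic are internal to the platform:

```python
import base64

def build_auth_headers(auth: dict) -> dict:
    """Illustrative mapping from auth type to upstream request headers."""
    kind = auth["type"]
    if kind == "bearer":
        return {"Authorization": f"Bearer {auth['token']}"}
    if kind == "basic":
        creds = f"{auth['username']}:{auth['password']}".encode()
        return {"Authorization": "Basic " + base64.b64encode(creds).decode()}
    if kind == "custom_header":
        return {auth["name"]: auth["value"]}   # e.g. X-API-Key
    return {}  # public API: no auth header injected
```

In practice the decrypted secret exists only for the duration of this kind of call, which is why the model never sees it.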

Step 3 — Governance

Invisible armor.
PII is masked before it ever touches the model. Large payloads shrink automatically — fewer tokens, same intelligence.

  • Zero-Trust PII Redaction — Masks emails, SSNs, and credit cards before data reaches the AI.
  • Smart Array Truncation — Shrinks oversized payloads to reduce token consumption.

Both controls are enabled by default. You can toggle them off during setup or change them later from the server's settings page.
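A toy approximation of the two governance controls — the patterns below are simplified for illustration, and a production redaction engine is far more thorough than this:

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def redact_pii(text: str) -> str:
    """Mask emails, SSNs, and card numbers before the payload reaches the model."""
    text = EMAIL.sub("[EMAIL]", text)
    text = SSN.sub("[SSN]", text)
    text = CARD.sub("[CARD]", text)
    return text

def truncate_array(items: list, limit: int = 20) -> list:
    """Cap oversized arrays so fewer tokens reach the model."""
    return items[:limit] if len(items) > limit else items
```

So a response like `"Contact jane@corp.com, SSN 123-45-6789"` would arrive at the model as `"Contact [EMAIL], SSN [SSN]"`, and a 500-element array would be cut to the first 20 items.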


DEPLOY
Hit Deploy. Then build your tools.

What happens when you deploy

Click Deploy. The server goes live in seconds — but it starts empty. No tools exist yet. This is intentional.

DEPLOYING...
Provisioning server…done
Configuring security layer…done
Activating data protection…done
Live on the edge…done

After deployment, you receive the same outputs as OpenAPI deploy: MCP URL, Config JSON, and a Connection Token (HMAC-SHA256, shown once).
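The Connection Token is an HMAC-SHA256 credential. The token's exact format is internal to the platform, but the primitive named here works like this:

```python
import hmac
import hashlib

def sign(payload: bytes, secret: bytes) -> str:
    """Generic HMAC-SHA256 signing — illustrates the primitive,
    not the product's actual token format."""
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, secret: bytes, signature: str) -> bool:
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(sign(payload, secret), signature)
```

Because only the holder of the secret can produce a valid signature, a server can reject tampered or forged connection attempts without storing the token in plaintext — which is also why the token can be shown only once.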


AFTER DEPLOY
Build your tools. One endpoint at a time.

Adding tools

Navigate to your server's Tools tab. Each tool you add maps to a single API endpoint. You define:

  • Endpoint — Path and HTTP verb. GET /customers, POST /orders, DELETE /invoices/{id} — the path is relative to the base URL you set during deployment.
  • Description — What this tool does. This is what the AI reads to decide when to use the tool. Be clear and specific — "Retrieves all invoices for a given customer ID" beats "Gets invoices".
  • Input Schema — The parameters the AI model needs to fill: path params, query params, request body fields. Each has a name, type, and description.
  • Annotation — Read-only, idempotent, or destructive. Tells the AI how to treat this tool: read-only tools are called freely; destructive tools require explicit user confirmation before execution.
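Putting the four fields together, a single tool definition might look like the sketch below. The field names are illustrative, not the dashboard's exact schema:

```python
# Hypothetical tool definition for GET /customers/{customer_id}/invoices,
# relative to the base URL set at deployment.
invoice_tool = {
    "endpoint": {"method": "GET", "path": "/customers/{customer_id}/invoices"},
    "description": "Retrieves all invoices for a given customer ID.",
    "input_schema": {
        "customer_id": {"type": "string", "in": "path",
                        "description": "Unique customer identifier"},
        "status": {"type": "string", "in": "query",
                   "description": "Optional filter: paid, open, or overdue"},
    },
    "annotation": "read-only",  # called freely; no user confirmation needed
}
```

A destructive counterpart (say, DELETE /invoices/{id}) would carry `"annotation": "destructive"` so the client prompts the user before executing it.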

Start small

You don't need to map every endpoint at once. Deploy with the 3–5 tools the AI needs most, test them with a client, and add more as you go. Every new tool is available immediately — no redeployment needed.

Manual vs. OpenAPI: which to choose

  • Best for — Manual API: internal APIs, prototypes, selective exposure. OpenAPI Import: public APIs with published specs.
  • Tool creation — Manual API: you define each tool individually. OpenAPI Import: all tools generated automatically.
  • Speed — Manual API: server deploys instantly, tools added incrementally. OpenAPI Import: full server with all tools in under 30 seconds.
  • Control — Manual API: maximum; you choose exactly which endpoints to expose. OpenAPI Import: all endpoints imported; toggle off what you don't need.
  • Spec required — Manual API: no. OpenAPI Import: yes (OpenAPI 2.0, 3.0, or 3.1).

Both methods produce the same result: a governed, sandboxed MCP server running on the global edge with HMAC-authenticated connection tokens, DLP, and FinOps Guard.



Frequently Asked Questions

When should I use Manual API instead of OpenAPI Import?

Use Manual API when your API doesn't have a published OpenAPI or Swagger specification, when you want to expose only specific endpoints instead of the entire API surface, or when you're working with internal or legacy systems that require custom configuration.

Can I add tools after the initial deployment?

Yes. Manual API is designed for incremental tool building. Deploy the server first (it starts empty), then add tools one at a time from the dashboard. Each new tool is available immediately — no redeployment needed.

What do I need to define for each tool?

Each tool requires a path and HTTP verb (relative to your base URL), a description (what the AI reads to decide when to use the tool), an input schema (parameters the AI provides), and an annotation (read-only, idempotent, or destructive).

Does the base URL have to be publicly accessible?

The base URL must be a valid HTTPS domain — no localhost, no IP addresses. If your API is behind a firewall, you'll need to allowlist the Vinkius Cloud egress IPs or use a reverse proxy.

How are credentials handled for my API?

Vinkius Cloud supports Bearer Token, Basic Auth, Custom Header, and None (public APIs). Credentials are AES-256 encrypted at rest, decrypted only in-memory when the AI invokes a tool, and never logged or stored in plaintext.

What's the difference between Manual API and OpenAPI Import results?

Both produce identical results: a governed, sandboxed MCP server with HMAC-authenticated tokens, DLP, and FinOps Guard. The only difference is how tools are created — manually defined vs. auto-generated from a spec.