Documentation Index

Fetch the complete documentation index at: https://developers.fireblocks.com/llms.txt

Use this file to discover all available pages before exploring further.

The Fireblocks developer experience is designed to work with AI agents. Start by connecting your agent to the docs — the Documentation MCP is the one step that applies regardless of how you build. From there, use whichever combination of SDKs, CLI, and platform tools fits your workflow.

Get Started

The quickstart walks you through connecting your agent to the docs, setting up API authentication, and making your first Fireblocks API call — with SDK and CLI paths both covered.
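As background for the authentication step, Fireblocks API requests carry a short-lived, RS256-signed JWT. The sketch below builds the claim set only; the field names (`uri`, `nonce`, `iat`, `exp`, `sub`, `bodyHash`) follow the documented scheme but should be verified against the current API reference, and the actual signing with your RSA secret key via a JWT library is deliberately omitted:

```python
import hashlib
import json
import time
import uuid

def build_jwt_claims(api_key: str, path: str, body: str = "") -> dict:
    """Build the claim set for a Fireblocks API JWT.

    The resulting claims must then be signed with your RSA private key
    (RS256) using a JWT library such as PyJWT -- not shown here.
    """
    now = int(time.time())
    return {
        "uri": path,                 # request path, including any query string
        "nonce": str(uuid.uuid4()),  # unique value per request
        "iat": now,                  # issued-at, seconds since epoch
        "exp": now + 30,             # short expiry; keep it well under a minute
        "sub": api_key,              # your Fireblocks API key
        # SHA-256 hex digest of the raw JSON body ("" for GET requests)
        "bodyHash": hashlib.sha256(body.encode()).hexdigest(),
    }

claims = build_jwt_claims("my-api-key", "/v1/vault/accounts_paged")
```

The quickstart's SDK and CLI paths handle this signing for you; the sketch is only useful if you are wiring up raw HTTP calls.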

Developer tools

These are the tools that matter most when you’re writing code against the Fireblocks API.

Fireblocks Documentation MCP

What it is: An MCP server that gives your coding agent (Cursor, Claude Code, Codex, or any MCP-compatible tool) real-time access to Fireblocks developer documentation. Your agent can search, read, and cross-reference docs without leaving your IDE.

When to use it: Always. Install this before anything else. It keeps your agent grounded in canonical, up-to-date documentation rather than training-data guesses — especially important for Fireblocks, where API behavior, authentication, and object models have specific details that general LLM training may not reflect accurately.

How to use it:
claude mcp add --transport http fireblocks-docs https://www.developers.fireblocks.com/mcp
Once installed, your agent can answer questions like:
  • What fields does a vault account have?
  • How does Fireblocks JWT signing work?
  • What webhooks fire when a transaction is confirmed?
Without an MCP client: Every page on this site has a Copy Page button — use it to paste a page directly into Claude, ChatGPT, or any other LLM. You can also open the page URL in Claude’s Projects or any tool with web access. The MCP handles search and freshness automatically, but copying a specific page works well when you need focused context on a single topic.
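For clients that use a config file instead of a CLI command, the equivalent registration looks roughly like the following. This is a sketch: the file location (`.cursor/mcp.json`) and the `url` key reflect Cursor's remote-server config format and should be checked against your client's own MCP documentation.

```json
{
  "mcpServers": {
    "fireblocks-docs": {
      "url": "https://www.developers.fireblocks.com/mcp"
    }
  }
}
```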

llms.txt and llms-full.txt

Fireblocks publishes documentation in the llms.txt format for tools and workflows that expect a static, crawlable text bundle:
  • llms.txt — a compact index and entry point to the docs.
  • llms-full.txt — a broader aggregate of documentation content for setups that load one large file.
Use these when you cannot run an MCP client, need an offline or snapshot ingest for a custom index, or are wiring a tool that only understands llms.txt URLs.
For everyday coding with an AI agent, the Fireblocks Documentation MCP (above) is the best option: it searches and retrieves the right pages on demand, reflects the current site, and avoids pulling huge static exports into context. Treat llms.txt / llms-full.txt as a fallback or supplement, not a replacement for the docs MCP.
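llms.txt is a simple Markdown convention: an H1 title, an optional summary, and sections of link lists. A minimal sketch of consuming such a file, assuming the common `- [Title](url): notes` entry shape (the sample text is illustrative, not the actual Fireblocks index):

```python
import re

# llms.txt entries are Markdown links of the form "- [Title](url): optional notes"
LINK_RE = re.compile(r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<notes>.*))?$")

def parse_llms_txt(text: str) -> list[dict]:
    """Extract title/url/notes entries from an llms.txt document."""
    entries = []
    for line in text.splitlines():
        m = LINK_RE.match(line.strip())
        if m:
            entries.append(m.groupdict())
    return entries

# Illustrative sample -- fetch the real index from
# https://developers.fireblocks.com/llms.txt
sample = """# Fireblocks Docs

- [Quickstart](https://developers.fireblocks.com/docs/quickstart): first API call
- [Webhooks](https://developers.fireblocks.com/docs/webhooks)
"""

entries = parse_llms_txt(sample)
```

A parser like this is mainly useful when building a custom index or offline snapshot; for interactive work, the docs MCP already does the retrieval.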

Fireblocks CLI

What it is: An agent-first command-line tool that exposes every Fireblocks API operation as a typed, JWT-signed command. The help-index command returns a compact JSON index of all commands sized to fit in an LLM context window.

When to use it: When you want to explore your workspace, prototype API calls, run scripts or CI jobs, or let your agent propose exact commands you can review before running. The CLI is the fastest path from “I want to do X” to a working, reviewable command — use it alongside an SDK in production, or on its own for operator and agent workflows.

How to use it:
npm install -g @fireblocks/fireblocks-cli
Connect it to your workspace, then verify:
fireblocks configure
fireblocks whoami
Key agent-friendly features:
  • fireblocks help-index — compact JSON command catalogue for LLM context
  • --dry-run — preview a request before it executes
  • --no-confirm — skip interactive prompts in automated pipelines
  • Structured JSON errors on stderr with distinct exit codes
For any operation that creates or modifies data, ask your agent to show the fireblocks command first and use --dry-run where available. Review before running.
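The structured stderr errors and distinct exit codes make the CLI straightforward to wrap in scripts. A minimal sketch of that pattern — the error fields shown are hypothetical (check the CLI reference for the actual schema), and the example runs against a stand-in command rather than the real `fireblocks` binary:

```python
import json
import subprocess
import sys

def run_cli(argv: list[str]) -> dict:
    """Run a CLI command; on failure, surface its structured stderr error."""
    proc = subprocess.run(argv, capture_output=True, text=True)
    if proc.returncode != 0:
        # The Fireblocks CLI emits machine-readable JSON on stderr;
        # fall back to raw text if the output is not valid JSON.
        try:
            error = json.loads(proc.stderr)
        except json.JSONDecodeError:
            error = {"message": proc.stderr.strip()}
        return {"ok": False, "exit_code": proc.returncode, "error": error}
    return {"ok": True, "stdout": proc.stdout}

# Stand-in for a failing `fireblocks` invocation: prints a JSON error
# to stderr and exits non-zero, mimicking the CLI's error contract.
fake_cmd = [sys.executable, "-c",
            "import sys; sys.stderr.write('{\"code\": \"NOT_FOUND\"}'); sys.exit(4)"]
result = run_cli(fake_cmd)
```

Branching on the exit code rather than scraping human-readable output is what makes the CLI reliable inside CI jobs and agent loops.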
Full reference: Fireblocks CLI

Platform AI tools

These tools extend AI capabilities to the Fireblocks Console and broader operations workflows.

AI Link

What it is: An MCP server that connects your AI tools (Claude, ChatGPT, Cursor) directly to your Fireblocks workspace data. Where the Documentation MCP gives your agent access to docs, AI Link gives it access to your live Fireblocks environment — balances, transactions, addresses, and workspace state.

When to use it: When you want to query live workspace data in natural language, build dashboards, or run operational queries without writing API code. AI Link is oriented towards operations and treasury workflows rather than code generation.

Two deployment modes:
| Mode | Capabilities | Deployment | Best for |
| --- | --- | --- | --- |
| Remote MCP | Read-only | Fireblocks-hosted | Connecting to ChatGPT, Claude, or other external AI platforms |
| Local MCP | Read + write (transactions) | Self-hosted, open-source | Custom workflows with full data control |
Example queries AI Link enables:
  • What percentage of my holdings are in stablecoins?
  • Summarize fees paid across all vaults this month.
  • What is the current balance of vault account 42?
Available via Fireblocks Labs for early access. The Local MCP server is open-source on GitHub.

Fireblocks Genie

What it is: A native AI assistant built directly into the Fireblocks Console — no external integrations required. Genie answers treasury and operations questions in real time using your workspace data, and can explain complex DeFi contract calls in plain language via the AI Transaction Summary feature.

When to use it: When you’re in the Console and want fast answers about your workspace state, holdings, or a specific transaction — without switching context to another tool. Genie is not a developer tool; it’s built for treasury, finance, and operations teams who work in the Console.

Capabilities:
  • Answer natural-language questions about balances, fees, and holdings
  • AI Transaction Summary: explains complex smart contract calls in human-readable terms
  • Respects your workspace policies, approval quorums, and user roles
Available via Fireblocks Labs for early access.

How the tools fit together

| Tool | Primary user | Primary context | Needs live workspace data |
| --- | --- | --- | --- |
| Documentation MCP | Developer / agent | IDE / coding agent | No — reads docs only |
| Fireblocks CLI | Developer / agent | Terminal / scripts | Yes — calls the API |
| AI Link | Developer / operator | External AI tools | Yes — reads/writes workspace |
| Genie | Operations / treasury | Fireblocks Console | Yes — console-native |
The Documentation MCP is the one tool that applies to every workflow — install it first. After that, use the SDK, CLI, or both depending on what you’re building. Add AI Link when you want natural-language access to live workspace data from your own AI tools. Genie is available in the Console for non-developer users on your team.