The data platform for agents
Dinobase syncs 100+ sources — APIs, databases, files, MCP servers — annotates your data, and makes it SQL-ready for agents.
01 — before / after
Drag to see what changes when your agent stack has a real data platform.
02 — benchmarks
query accuracy across benchmarked LLMs
fewer tokens vs API calling
avg query time
sources synced automatically
benchmarked on the Dinobase eval suite · 10+ LLMs including GPT-5.4, Claude 4.6, Gemini 3
03 — connect
100+ connectors — APIs, databases, files, MCP servers —
one place your agents look. Add them from the CLI, or
run dinobase setup to launch a local setup UI.
04 — understandable
After every sync, Dinobase automatically builds a semantic
layer. Agents call describe and get column descriptions, PII flags, enum values, and the
full relationship graph — so they write correct SQL on the first
try.
✗ Raw SQL without context
subscriptions
  id          VARCHAR
  customer_id VARCHAR
  plan_id     VARCHAR
  status      VARCHAR
  created     INTEGER
Agent has to guess: what is created?
Unix? ISO? What values does status accept?
How does customer_id join?
✓ Dinobase semantic layer
stripe.subscriptions — "Active and historical subscriptions"
  customer_id VARCHAR → stripe.customers.id [FK]
  plan_id     VARCHAR → stripe.plans.id [FK]
  status      VARCHAR "active|past_due|canceled|trialing"
  created     INTEGER "Unix timestamp — use to_timestamp()"
  amount      INTEGER "Amount in cents"
Related: stripe.customers, stripe.invoices
Agent knows the join path, the enum values, the timestamp format, and which fields are PII — before writing a single query.
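The contrast above can be made concrete: with column metadata in hand, an agent can check join paths and enum literals before emitting any SQL. A minimal sketch in plain Python — the payload shape is an assumption for illustration, not the actual Dinobase describe response:

```python
# Hypothetical describe-style payload, modeled on the panel above.
# The real Dinobase response format may differ.
describe_response = {
    "table": "stripe.subscriptions",
    "columns": {
        "customer_id": {"type": "VARCHAR", "fk": "stripe.customers.id"},
        "status": {
            "type": "VARCHAR",
            "enum": ["active", "past_due", "canceled", "trialing"],
        },
        "created": {"type": "INTEGER", "note": "Unix timestamp"},
    },
}

def join_path(meta, column):
    """Return the declared foreign-key target for a column, if any."""
    return meta["columns"].get(column, {}).get("fk")

def valid_enum(meta, column, value):
    """Check a literal against the column's declared enum values."""
    allowed = meta["columns"].get(column, {}).get("enum")
    return allowed is None or value in allowed

print(join_path(describe_response, "customer_id"))  # stripe.customers.id
print(valid_enum(describe_response, "status", "past_due"))  # True
print(valid_enum(describe_response, "status", "expired"))   # False
```

With checks like these, "what values does status accept?" stops being a guess: the agent validates its literals against the metadata before the query ever runs.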
05 — actionable
Dinobase gives your agent two ways to act. SQL mutations with preview → confirm for structured writes. A Python execution layer for everything else: chain MCP tools, mix SQL with Python logic, reach every connected source in one call.
Mutations docs →
Python execution docs →
✓ SQL mutations
dinobase query "UPDATE stripe.customers SET name = 'Acme Inc' WHERE id = 'cus_123'"
# → mutation_id: mut_abc123
# → rows_affected: 1
# → side_effects: will call Stripe API

dinobase confirm mut_abc123
# ✓ Stripe API called
# ✓ Local data updated
One SQL primitive for every source. Preview before
anything executes. Destructive DDL (DROP, ALTER,
TRUNCATE) blocked entirely.
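The gating described above — mutations go through preview → confirm, destructive DDL never runs — can be sketched as a simple statement classifier. This is an illustrative stand-in, not Dinobase's actual implementation; the function name and category labels are invented for the example:

```python
# Destructive DDL keywords rejected outright, per the rule above.
BLOCKED_DDL = ("DROP", "ALTER", "TRUNCATE")

# Mutations that are previewed and must be confirmed before executing.
NEEDS_CONFIRM = ("UPDATE", "INSERT", "DELETE")

def classify(sql: str) -> str:
    """Classify a statement as 'blocked', 'needs_confirm', or 'read_only'."""
    keyword = sql.lstrip().split(None, 1)[0].upper()
    if keyword in BLOCKED_DDL:
        return "blocked"
    if keyword in NEEDS_CONFIRM:
        return "needs_confirm"
    return "read_only"

print(classify("DROP TABLE stripe.customers"))      # blocked
print(classify("UPDATE stripe.customers SET ..."))  # needs_confirm
print(classify("SELECT * FROM stripe.customers"))   # read_only
```

A 'needs_confirm' result corresponds to the preview step in the panel above: the agent sees the mutation_id, row count, and side effects, and nothing executes until the confirm call.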
✓ Python execution layer
# Agents call exec_code(...) with a code snippet
from dinobase.query.engine import QueryEngine
from dinobase.db import DinobaseDB
from dinobase.mcp import call

with DinobaseDB() as db:
    rows = QueryEngine(db).execute("""
        SELECT email, mrr
        FROM stripe.customers
        WHERE status = 'past_due'
        ORDER BY mrr DESC
        LIMIT 10
    """)["rows"]

for row in rows:
    t = call("zendesk.tickets_for_user", email=row["email"])
    row["open_tickets"] = len(t.get("open", []))

result = sorted(rows, key=lambda r: -r["open_tickets"])
SQL + MCP + Python — one tool call, zero ping-pong. Your agent writes the orchestration; Dinobase runs it.
06 — integrations
Native SDKs for LangChain, CrewAI, LlamaIndex, Pydantic AI, and Mastra. MCP server for Claude Desktop, Cursor, and Codex.
Read the docs →

# Claude Desktop
curl -fsSL https://dinobase.ai/install.sh | bash -s -- claude-desktop

# Cursor
curl -fsSL https://dinobase.ai/install.sh | bash -s -- cursor

# Codex
curl -fsSL https://dinobase.ai/install.sh | bash -s -- codex

# Claude Code
curl -fsSL https://dinobase.ai/install.sh | bash -s -- claude-code
pip install dinobase langchain langgraph

from integrations.langchain.toolkit import DinobaseToolkit

toolkit = DinobaseToolkit()
agent = create_react_agent(model, tools=toolkit.get_tools())
pip install dinobase crewai

from integrations.crewai.tools import dinobase_query, dinobase_describe

analyst = Agent(tools=[dinobase_describe, dinobase_query], ...)
pip install dinobase llama-index llama-index-llms-anthropic

from integrations.llamaindex.tool_spec import DinobaseToolSpec

tool_spec = DinobaseToolSpec()
agent = ReActAgent.from_tools(tool_spec.to_tool_list(), llm=llm)
pip install "dinobase[pydantic-ai]"

from dinobase.integrations.pydantic_ai.tools import DinobaseDeps, dinobase_tools

agent = Agent(
    "anthropic:claude-sonnet-4-6",
    deps_type=DinobaseDeps,
    toolsets=[dinobase_tools],
)
npm install @mastra/core @mastra/mcp @ai-sdk/anthropic && pip install dinobase

const mcp = new MCPClient({
  servers: { dinobase: { command: "dinobase", args: ["serve"] } },
});

const agent = new Agent({
  model: anthropic("claude-sonnet-4-6"),
  tools: await mcp.listTools(), // all 8 tools, auto-discovered
});
local or cloud
Dinobase Local
exec_code — Python execution layer combining
SQL, MCP tools, and the full Python ecosystem in one
call
Dinobase Cloud