LLM Resources
Machine-readable endpoints for AI assistants and LLM integrations
Overview
AI Toolkit provides machine-readable text files following the llms.txt convention. These endpoints help AI assistants understand and work with the toolkit.
Available Endpoints
/llms.txt — Condensed Reference
A concise overview of all modules with key export signatures. Ideal for including in an LLM's context window when the full reference is too large.
https://ai-toolkit-docs.vercel.app/llms.txt
/llms-full.txt — Complete API Reference
Every exported function, type, interface, and class across all 17 modules with full TypeScript signatures and working code examples.
https://ai-toolkit-docs.vercel.app/llms-full.txt
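The condensed file exists for cases where /llms-full.txt is too large for a model's context window. As a rough sketch (the ~4 characters-per-token ratio is only a heuristic, and the token budget is an assumption you should replace with your model's actual limit), you can fetch both files and pick whichever fits:

```typescript
// Rough token estimate: ~4 characters per token for English text (heuristic only).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Choose the full reference when it fits the token budget,
// otherwise fall back to the condensed /llms.txt content.
// `condensed` and `full` are the fetched bodies of the two endpoints.
function pickReference(condensed: string, full: string, budgetTokens: number): string {
  return estimateTokens(full) <= budgetTokens ? full : condensed;
}
```

For example, with an assumed 100k-token budget: `pickReference(condensed, full, 100_000)`.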
/prompt.txt — System Prompt
A ready-to-use system prompt for AI assistants that help developers use the toolkit. Includes a module overview, import conventions, common usage patterns, and a quick start.
https://ai-toolkit-docs.vercel.app/prompt.txt
Usage
With Claude
Paste the content of /llms-full.txt into your conversation, or reference it as context when asking questions about the toolkit.
With Custom Agents
```ts
import { createAI } from '@jamaalbuilds/ai-toolkit/ai';

// Fetch the published system prompt
const response = await fetch('https://ai-toolkit-docs.vercel.app/prompt.txt');
const systemPrompt = await response.text();

const ai = createAI();
const result = await ai.generate('How do I set up vector search?', {
  system: systemPrompt,
});
```
With MCP
Use the toolkit's MCP module to expose these files as MCP resources:
```ts
import { McpServerBuilder } from '@jamaalbuilds/ai-toolkit/mcp';

const server = new McpServerBuilder({ name: 'toolkit-docs', version: '1.0.0' });

server.defineResource({
  uri: 'docs://ai-toolkit/api-reference',
  name: 'AI Toolkit API Reference',
  handler: async () => {
    // Fetch the full reference on demand so the resource stays current
    const res = await fetch('https://ai-toolkit-docs.vercel.app/llms-full.txt');
    const text = await res.text();
    return {
      contents: [{ uri: 'docs://ai-toolkit/api-reference', mimeType: 'text/plain', text }],
    };
  },
});
```
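MCP clients may read a resource repeatedly, so refetching the full reference on every read is wasteful. A minimal sketch of a time-based cache around the fetch (independent of the toolkit's API; the 5-minute TTL is an arbitrary choice):

```typescript
// Wrap an async text fetcher so the result is cached for `ttlMs` milliseconds.
function cachedText(fetcher: () => Promise<string>, ttlMs: number) {
  let cached: { text: string; at: number } | null = null;
  return async (): Promise<string> => {
    const now = Date.now();
    if (cached && now - cached.at < ttlMs) return cached.text;
    const text = await fetcher();
    cached = { text, at: now };
    return text;
  };
}

// Example: cache the llms-full.txt body for 5 minutes.
const getApiReference = cachedText(async () => {
  const res = await fetch('https://ai-toolkit-docs.vercel.app/llms-full.txt');
  return res.text();
}, 5 * 60 * 1000);
```

Inside the resource handler, call `getApiReference()` instead of fetching directly; expired entries are refetched transparently on the next read.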