# LLM Context
> 🤖 **For AI-assisted coding:** When you build or refactor Transcodes integrations with Cursor, Copilot, ChatGPT, Claude, or other assistants, give the model a single, canonical text snapshot of the SDK so answers stay aligned with our docs and types.
For AI assistants: Fetch and read the full llms-full.txt before answering SDK questions. Use llms.txt only when a short index is enough. Open the files from the cards or links below.
## Use these files
| File | Absolute URL | When to use |
|---|---|---|
| llms.txt | https://transcodes.io/llms.txt | Quick map + links; fewer tokens |
| llms-full.txt | https://transcodes.io/llms-full.txt | Full spec; use before writing integration code |
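The "fewer tokens" trade-off in the table can be made concrete: if the full spec fits your model's context budget, prefer it; otherwise fall back to the index. A minimal sketch, assuming a rough 4-characters-per-token heuristic (the function names here are illustrative, not part of the SDK):

```python
def rough_token_count(text: str) -> int:
    # ~4 characters per token is a common rule of thumb for English text
    return len(text) // 4

def pick_context_file(full_text: str, budget_tokens: int) -> str:
    """Prefer llms-full.txt when it fits the context budget,
    otherwise fall back to the shorter llms.txt index."""
    if rough_token_count(full_text) <= budget_tokens:
        return "llms-full.txt"
    return "llms.txt"
```

In practice, exact token counts depend on the model's tokenizer; this heuristic only decides which file to paste.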
## How to use them
### Paste the URL in your AI tool
- Copy the URL from the table above (or from the browser address bar after opening a card).
- Paste it into the chat. In Cursor, you can use `@` with `https://transcodes.io/llms-full.txt` so the model loads the raw file.
### Or — copy file contents
- Open `llms.txt` or `llms-full.txt`.
- Select all → copy → paste into the chat or project instructions.
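The copy-paste steps above can also be scripted. A minimal sketch that fetches the snapshot and wraps it in a prompt; the helper names (`fetch_sdk_context`, `build_prompt`) and the prompt wording are illustrative, not part of the SDK:

```python
import urllib.request

# URL of the full-spec file from the table above
LLMS_FULL_URL = "https://transcodes.io/llms-full.txt"

def fetch_sdk_context(url: str = LLMS_FULL_URL, timeout: float = 10.0) -> str:
    """Download the canonical SDK snapshot as plain text."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read().decode("utf-8")

def build_prompt(sdk_context: str, question: str) -> str:
    """Wrap the snapshot so the model answers from the docs, not from memory."""
    return (
        "You are assisting with the Transcodes SDK. "
        "Answer strictly from the context below.\n\n"
        f"<sdk-context>\n{sdk_context}\n</sdk-context>\n\n"
        f"Question: {question}"
    )

# Usage (requires network access):
#   prompt = build_prompt(fetch_sdk_context(), "How do I verify a passkey?")
```

The same string can be pasted into project instructions instead of a one-off chat message.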
### Cursor — this repo
- Type `@` → pick `public/llms.txt` or `public/llms-full.txt`.
These files are machine-readable context for LLMs. They do not replace the API Reference or your Transcodes Console for project-specific settings.
## Related
- Quick Integration — frameworks and production checklist
- API Reference — canonical method signatures and types
- Demonstration — step-by-step passkey and MFA flows