
LLM Context

🤖 For AI-assisted coding

When you build or refactor Transcodes integrations with Cursor, Copilot, ChatGPT, Claude, or other assistants, give the model a single, canonical text snapshot of the SDK so answers stay aligned with our docs and types.

For AI assistants: Fetch and read the full llms-full.txt before answering SDK questions. Use llms.txt only when a short index is enough. Open the files from the cards or links below.

Use these files

File | Absolute URL | When to use
llms.txt | https://transcodes.io/llms.txt | Quick map + links; fewer tokens
llms-full.txt | https://transcodes.io/llms-full.txt | Full spec; use before writing integration code
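The choice between the two files comes down to token budget versus completeness. As a minimal sketch (the URLs are from the table above; the helper name and flag are illustrative, not part of any Transcodes SDK):

```python
# Pick which context file to hand to an assistant. Use the full spec when
# the model will write integration code; use the short index otherwise.
LLMS_URLS = {
    "index": "https://transcodes.io/llms.txt",      # quick map + links; fewer tokens
    "full": "https://transcodes.io/llms-full.txt",  # full spec
}

def context_url(writing_code: bool) -> str:
    """Return the URL of the appropriate llms context file."""
    return LLMS_URLS["full" if writing_code else "index"]
```

For example, `context_url(True)` returns the llms-full.txt URL for code-writing sessions.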

How to use them

Paste the URL in your AI tool

  1. Copy the URL from the table above (or from the browser address bar after opening a card).
  2. Paste the URL into the chat. In Cursor, you can use @ with https://transcodes.io/llms-full.txt so the model loads the raw file.

Or — copy file contents

  1. Open llms.txt or llms-full.txt.
  2. Select all → copy → paste into the chat or project instructions.
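If you prefer to script the copy step, the fetch can be done with Python's standard library. A minimal sketch, assuming only the public URL above (the function name is illustrative; the `fetch` parameter exists just so the helper can be exercised without a network call):

```python
from urllib.request import urlopen

LLMS_FULL_URL = "https://transcodes.io/llms-full.txt"

def download_context(url: str, fetch=urlopen) -> str:
    """Fetch a context file and return its text (UTF-8 assumed)."""
    with fetch(url) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    # Print the file so it can be piped into a clipboard tool or a project file.
    print(download_context(LLMS_FULL_URL))
```

From a shell you could pipe the output straight into your clipboard tool and paste it into the chat or project instructions.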

Cursor — this repo

  • Type @ → pick public/llms.txt or public/llms-full.txt.

These files are machine-readable context for LLMs. They do not replace the API Reference or your Transcodes Console for project-specific settings.
