- llms.txt: a compact, link-heavy index of the most important docs
- llms-full.txt: a larger, self-contained digest that includes the core content inline
The sections below show how to add llms.txt and llms-full.txt to popular IDEs and assistants.
Cursor
Cursor can index external docs and let you call them with @docs in chat.
Steps
- Open Cursor → Settings → Features → Docs.
- Click Add new doc and add:
https://autonomy.computer/docs/llms.txt
https://autonomy.computer/docs/llms-full.txt
- Wait for indexing to finish (check Settings → Indexing & Docs).
- In a chat or Composer, use:
@docs Autonomy …
and ask your question.
Tip: explicitly tag context with @docs; indexing progress appears under Settings. Some sites only index the entry page, so adding specific URLs like llms-full.txt improves recall.
VS Code with Continue
Continue lets you add documentation sources and then pull them into a chat via + Add Context or @docs.
Steps
- Install the Continue extension.
- Open the Continue side panel → + Add Context → Docs → Add doc.
- Add:
https://autonomy.computer/docs/llms.txt
https://autonomy.computer/docs/llms-full.txt
- (Optional) In config.yaml, you can pre-seed docs for your team under the docs: (context provider) section.
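A minimal sketch of such a pre-seeded entry, assuming Continue's docs: block with name and startUrl keys (verify the exact schema against your installed Continue version):

```yaml
# config.yaml: pre-seed the Autonomy docs for the whole team.
# Assumed keys: name, startUrl (check your Continue version's schema).
docs:
  - name: Autonomy
    startUrl: https://autonomy.computer/docs/llms.txt
  - name: Autonomy (full)
    startUrl: https://autonomy.computer/docs/llms-full.txt
```

With this in place, teammates get the Autonomy docs in + Add Context → Docs without adding them by hand.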
Claude Code
Claude Code works best when paired with a Claude Project (or local “Claude files” in your repo) that contains your authoritative docs.
Option A: Claude Projects (cloud)
- In the Claude web app, create a Project for your codebase.
- Add knowledge sources:
- Add URLs:
https://autonomy.computer/docs/llms.txt
https://autonomy.computer/docs/llms-full.txt
- (Optional) Upload key markdowns from your Autonomy repo.
- In Claude Code, select the Project when you start a session so it can retrieve from Project knowledge.
Option B: CLAUDE.md (local)
- Create CLAUDE.md at your repo root.
- Inside it, add pointers to the docs you want prioritized, e.g.:
- Keep additional design/usage notes in /docs/*.md and reference them from CLAUDE.md.
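Those CLAUDE.md pointers might look like the sketch below; the section heading and the local-notes layout are illustrative assumptions, not confirmed Autonomy conventions:

```markdown
# CLAUDE.md

## Documentation sources (consult these first)
<!-- Heading and ordering are illustrative, not required. -->
- Autonomy docs index: https://autonomy.computer/docs/llms.txt
- Autonomy full digest: https://autonomy.computer/docs/llms-full.txt
- Local design/usage notes: /docs/*.md
```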
JetBrains IDEs (IntelliJ, WebStorm, PyCharm)
JetBrains’ AI Assistant doesn’t (yet) have first-class doc crawling like Cursor, but you have two reliable paths:
Option A: Use Continue for JetBrains
Install the Continue plugin and add the same doc URLs as above; then pull context into the AI chat when needed.
Option B: Prompt Library “starter” prompts
- Go to Settings → Tools → AI Assistant → Prompt Library.
- Create a prompt template that always instructs the assistant to consult the Autonomy docs, e.g.:
- Trigger this prompt from the AI Actions menu when working on related files.
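One way such a template could read (the wording is illustrative, not a prescribed JetBrains format):

```
You are assisting on a codebase built with Autonomy.
Before answering, consult the Autonomy documentation:
- Index: https://autonomy.computer/docs/llms.txt
- Full digest: https://autonomy.computer/docs/llms-full.txt
Prefer patterns from those docs over generic suggestions, and mention
which doc section you relied on.
```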
GitHub Copilot & vanilla VS Code
Copilot doesn’t currently index arbitrary external URLs. Two workarounds:
- Check the docs into your repo (or mirror the most relevant pages as Markdown in /docs/autonomy/…). Copilot and chat will consider open files and workspace context.
- Use an assistant extension that supports doc indexing alongside Copilot (e.g., Continue), then reference that context during Copilot chats.
Make it “sticky” in your repo (applies to any IDE)
Add small, human/LLM-readable control files at the repo root so assistants discover your preferred sources automatically:
- llms.txt (or llm.txt) with a short blurb and top links (keep it under 10–20 links).
- .cursorrules or .cursor/rules/*.md for Cursor to bias generation (e.g., “Prefer Autonomy PrivateLinks when connecting services; link to examples”).
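A minimal .cursorrules sketch along those lines (the rule text is illustrative):

```
# .cursorrules (illustrative)
Prefer Autonomy PrivateLinks when connecting services; link to examples.
Before proposing integration code, consult the Autonomy docs index at
https://autonomy.computer/docs/llms.txt and follow its patterns.
```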
llms.txt snippet to include in your repo
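For example, a repo-root llms.txt could look like the sketch below. The section names and local-notes path are illustrative; the two doc URLs are the real entrypoints referenced throughout this guide:

```markdown
# Autonomy

> Compact index of the most important Autonomy docs for LLM assistants.

## Docs
- [Docs index](https://autonomy.computer/docs/llms.txt)
- [Full digest](https://autonomy.computer/docs/llms-full.txt)

## Local
- Design and usage notes: /docs/*.md
```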
Troubleshooting & tips
- Only the landing page indexed? Add both llms.txt and llms-full.txt, and, if your tool allows, add a few high-value deep links (API reference, PrivateLinks how-to). Some Cursor builds only pull the entrypoint unless nudged.
- Doc not being used in answers? In Cursor/Continue, explicitly tag context with @docs / + Add Context → Docs in the chat that generates code.
- Keep noise low. llms.txt is meant to be concise; llms-full.txt can be large. Provide both and let tools choose based on token budget.
- Version drift. Re-index after major doc changes; not all tools auto-refresh. Cursor shows indexing status in Settings.