# AI Integration
Signals provides a dedicated endpoint for Large Language Models (LLMs) to consume the documentation. This is useful for adding the entire documentation set as context for your AI coding assistant.
## Endpoint

```
https://dartsignals.dev/llms.txt
```

This endpoint returns all documentation pages concatenated into a single Markdown file.
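For a quick look at what the endpoint serves, you can fetch it from the command line. This is only a sketch; the offline fallback note is for illustration, not part of the endpoint's behavior:

```shell
# Download the concatenated docs; write a placeholder note
# if the network is unavailable.
curl -fsSL https://dartsignals.dev/llms.txt -o llms.txt ||
  echo "offline: could not reach dartsignals.dev" > llms.txt

# Preview the first few lines of the Markdown.
head -n 5 llms.txt
```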
## Antigravity

You can ask Antigravity to read the documentation by providing the URL in your prompt:

```
Read https://dartsignals.dev/llms.txt and explain how to use computed signals.
```
## Gemini CLI

You can use the Gemini CLI to query the documentation. Create a `GEMINI.md` file in the root of your project with the following content:

```
https://dartsignals.dev/llms.txt
```

Then you can query the CLI:

```shell
gemini "How do I use signals?"
```

## Firebase Studio
In Firebase Studio, you can download the file and reference it locally by adding it as context to the chat:

```shell
curl https://dartsignals.dev/llms.txt > signals.md
```

## VS Code

If you are using GitHub Copilot or other AI extensions in VS Code:

- Download the `llms.txt` file:

  ```shell
  curl https://dartsignals.dev/llms.txt > signals.md
  ```

- Open the file in VS Code.
- Reference it in your chat (e.g., `@signals.md` or by having it open).
## Zed

In the Zed editor, you can add the documentation to the assistant's context:

- Open the Assistant panel (`Cmd-?`).
- Type `/file` and select the downloaded `signals.md` (or paste the content).
- Ask your question.
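Saving the file first can be scripted (a sketch, assuming `curl` is available; the offline fallback only ensures the later steps still have a file to point at):

```shell
# Save the docs locally so the assistant's /file command can reference them;
# write a placeholder when the network is unavailable.
curl -fsSL https://dartsignals.dev/llms.txt -o signals.md ||
  echo "offline: paste the content of https://dartsignals.dev/llms.txt here" > signals.md
```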
## Claude Code

For Claude Code, you can add the documentation to your context:

```shell
claude --context https://dartsignals.dev/llms.txt
```

## Codex

For tools powered by OpenAI Codex, you can provide the documentation as context by pasting the content or referencing the file if the tool supports it.
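One way to hand the documentation to such a tool is a local context file. A minimal sketch, with two assumptions to note: the `AGENTS.md` filename is a guess (check which context file your tool actually reads), and the offline fallback is only so the steps remain demonstrable without network access:

```shell
# Download the docs next to the project (placeholder note when offline).
curl -fsSL https://dartsignals.dev/llms.txt -o signals.md ||
  echo "offline placeholder for https://dartsignals.dev/llms.txt" > signals.md

# Point the tool's context file at the local copy.
# NOTE: AGENTS.md is an assumed filename, not confirmed by the Signals docs.
printf 'Signals documentation: see ./signals.md\n' > AGENTS.md
cat AGENTS.md
```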