LLMs won't have training data on Vary, but AI coding tools can use the skill we ship for them.
The easiest way to set up the LLM skill is `vary new` with the `--agent` flag:
```shell
vary new my-app --agent claude   # default
vary new my-app --agent codex
vary new my-app --agent cursor
vary new my-app --agent opencode
vary new my-app --agent kiro
```
This creates a new project with the skill pack already copied into the correct directory for your agent. See the installing agent skills article for details.
The skill gives AI coding assistants the full language reference: grammar, types, stdlib, toolchain commands, and a list of features that Vary deliberately excludes. With the skill loaded, LLMs write valid Vary code instead of falling back to Python habits.
The skill is included in every Vary installation. Its location depends on how you installed Vary:
| Installation method | Skill path |
|---|---|
| varyup | ~/.varyup/toolchains/<version>/llm-skill/ |
| Linux tar.gz | <extract-dir>/llm-skill/ |
| macOS tar.gz | <extract-dir>/llm-skill/ |
| Windows ZIP | <extract-dir>\llm-skill\ |
| Docker image | /opt/vary/llm-skill/ |
If you installed via varyup, you can find the active toolchain path with:
```shell
varyup which vary
```
The skill directory is alongside the vary.jar in that toolchain.
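Putting those two facts together, the skill path can be derived from the toolchain path. A minimal sketch, assuming the varyup layout from the table above; the path and version below are illustrative stand-ins, so substitute the real output of `varyup which vary`:

```shell
# Illustrative stand-in for the output of `varyup which vary`;
# "1.4.0" is a made-up version, not a real release.
VARY_BIN="$HOME/.varyup/toolchains/1.4.0/vary.jar"

# The skill directory sits next to vary.jar in the toolchain.
SKILL_DIR="$(dirname "$VARY_BIN")/llm-skill"
echo "$SKILL_DIR"
```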
The skill contains:
| File | Contents |
|---|---|
| SKILL.md | Core syntax, types, control flow, pattern matching, null safety, testing, web framework |
| grammar-reference.md | Full EBNF grammar, operator precedence, all pattern types |
| stdlib-reference.md | Built-in functions, collection methods, JSON API, filesystem, HTTP client |
| toolchain-reference.md | CLI commands, mutation testing flags, artifact caching, project config |
| contracts-reference.md | Preconditions, postconditions, invariants, pure functions |
To use it, add the skill directory to your AI coding tool's configuration, or copy the contents into your system prompt. The skill files are plain markdown and work with any LLM.
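For tools that take a single system prompt, the copy step can be sketched as a small helper that concatenates whichever skill files are present. The function name and output path here are hypothetical, not part of the Vary toolchain:

```shell
# Hypothetical helper: bundle the skill files under <skill-dir> into
# one markdown file at <out-file>, skipping any that are missing.
build_prompt() {
  skill_dir="$1"; out="$2"
  : > "$out"                      # start with an empty output file
  for name in SKILL.md grammar-reference.md stdlib-reference.md \
              toolchain-reference.md contracts-reference.md; do
    if [ -f "$skill_dir/$name" ]; then
      cat "$skill_dir/$name" >> "$out"
    fi
  done
}

# Usage (varyup layout from the table above):
# build_prompt ~/.varyup/toolchains/<version>/llm-skill vary-prompt.md
```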