
A character-level string parser for building tokenizers:

```vary
from scanner import Scanner

let s = Scanner("key=value;rest")
let key = s.scan_until("=")    # "key"
s.next()                        # skip "="
let value = s.scan_until(";")  # "value"
```

| **Method** | **Returns** | **Description** |
|--------|---------|-------------|
| `Scanner(text)` | `Scanner` | Create a scanner over `text` |
| `.peek()` | `Str` | Current character, without advancing |
| `.next()` | `Str` | Current character, then advance |
| `.eof()` | `Bool` | True if at end of input |
| `.scan_while(chars)` | `Str` | Consume characters while they are in the set |
| `.scan_until(chars)` | `Str` | Consume characters until one in the set is reached |
| `.match_str(token)` | `Bool` | Advance past `token` if the input starts with it |
| `.skip_whitespace()` | `None` | Skip spaces, tabs, newlines |
| `.rest()` | `Str` | Remaining input |
| `.mark()` | `Int` | Save current position |
| `.restore(pos)` | `None` | Restore saved position |

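The `mark()`/`restore(pos)` pair supports backtracking: save a position, try a scan, and rewind if it wasn't what you wanted. A sketch using only the methods listed above (the input string and character sets are illustrative):

```vary
from scanner import Scanner

let s = Scanner("abc123")
let pos = s.mark()              # save position at the start
let letters = s.scan_while("abcdefghijklmnopqrstuvwxyz")  # "abc"
s.restore(pos)                  # rewind to the saved position
let all = s.rest()              # "abc123" again
```

This pattern is the basis for speculative parsing: attempt one token shape, and if it fails, restore the mark and try another.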