Compare commits: `98605d2b70...v0.1.3` (26 commits)
Commits:

- 6a2e4a7ac1
- 3d706cb32b
- 7c3bfa9301
- b56c5461f1
- 61e1469845
- bb0a288210
- 5d7f4633e1
- d05b13d840
- 0ee3050704
- 80b1276f9f
- bd843d2219
- d76aa17b38
- c23d9c7078
- fffacd2467
- 2ae2c132e5
- 4909ff9fff
- 8e788c8a9f
- dbdd3cca57
- 3ac022c04a
- 6bedd37ac7
- 2909bf14b6
- d8871acf7e
- 73b5eee664
- 542255780d
- bac63bab2a
- db82ca1a1c
`.gitignore` (vendored, +5)

```diff
@@ -4,6 +4,11 @@
 # Claude Code project instructions
 CLAUDE.md
 
+# Build output
+_site/
+docs/*.html
+docs/*.css
+
 # Test binaries
 hello
 test_rc
```
`CLAUDE.md` (78 changes)

````diff
@@ -42,15 +42,46 @@ When making changes:
 7. **Fix language limitations**: If you encounter parser/type system limitations, fix them (without regressions on guarantees or speed)
 8. **Git commits**: Always use `--no-gpg-sign` flag
 
-### Post-work checklist (run after each major piece of work)
+### Post-work checklist (run after each committable change)
 
+**MANDATORY: Run the full validation script after every committable change:**
+
+```bash
+./scripts/validate.sh
+```
+
+This script runs ALL of the following checks and will fail if any regress:
+1. `cargo check` — no Rust compilation errors
+2. `cargo test` — all Rust tests pass (currently 387)
+3. `cargo build --release` — release binary builds
+4. `lux test` on every package (path, frontmatter, xml, rss, markdown) — all 286 package tests pass
+5. `lux check` on every package — type checking + lint passes
+
+If `validate.sh` is not available or you need to run manually:
 ```bash
 nix develop --command cargo check            # No Rust errors
-nix develop --command cargo test             # All tests pass (currently 381)
+nix develop --command cargo test             # All Rust tests pass
-./target/release/lux check                   # Type check + lint all .lux files
+nix develop --command cargo build --release  # Build release binary
-./target/release/lux fmt                     # Format all .lux files
+cd ../packages/path && ../../lang/target/release/lux test   # Package tests
-./target/release/lux lint                    # Standalone lint pass
+cd ../packages/frontmatter && ../../lang/target/release/lux test
+cd ../packages/xml && ../../lang/target/release/lux test
+cd ../packages/rss && ../../lang/target/release/lux test
+cd ../packages/markdown && ../../lang/target/release/lux test
 ```
 
+**Do NOT commit if any check fails.** Fix the issue first.
+
+### Commit after every piece of work
+
+**After completing each logical unit of work, commit immediately.** This is NOT optional — every fix, feature, or change MUST be committed right away. Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
+
+**Commit workflow:**
+1. Make the change
+2. Run `./scripts/validate.sh` (all 13 checks must pass)
+3. `git add` the relevant files
+4. `git commit --no-gpg-sign -m "type: description"` (use conventional commits: fix/feat/chore/docs)
+5. Move on to the next task
+
+**Never skip committing.** If you fixed a bug, commit it. If you added a feature, commit it. If you updated docs, commit it. Do not batch unrelated changes into one commit.
+
 **IMPORTANT: Always verify Lux code you write:**
 - Run with interpreter: `./target/release/lux file.lux`
 - Compile to binary: `./target/release/lux compile file.lux`
@@ -68,10 +99,45 @@ nix develop --command cargo test             # All tests pass (currently 381)
 | `lux serve` | `lux s` | Static file server |
 | `lux compile` | `lux c` | Compile to binary |
 
+## Documenting Lux Language Errors
+
+When working on any major task that involves writing Lux code, **document every language error, limitation, or surprising behavior** you encounter. This log is optimized for LLM consumption so future sessions can avoid repeating mistakes.
+
+**File:** Maintain an `ISSUES.md` in the relevant project directory (e.g., `~/src/blu-site/ISSUES.md`).
+
+**Format for each entry:**
+
+```markdown
+## Issue N: <Short descriptive title>
+
+**Category**: Parser limitation | Type checker gap | Missing feature | Runtime error | Documentation gap
+**Severity**: High | Medium | Low
+**Status**: Open | **Fixed** (commit hash or version)
+
+<1-2 sentence description of the problem>
+
+**Reproduction:**
+```lux
+// Minimal code that triggers the issue
+```
+
+**Error message:** `<exact error text>`
+
+**Workaround:** <how to accomplish the goal despite the limitation>
+
+**Fix:** <if fixed, what was changed and where>
+```
+
+**Rules:**
+- Add new issues as you encounter them during any task
+- When a previously documented issue gets fixed, update its status to **Fixed** and note the commit/version
+- Remove entries that are no longer relevant (e.g., the feature was redesigned entirely)
+- Keep the summary table at the bottom of ISSUES.md in sync with the entries
+- Do NOT duplicate issues already documented -- check existing entries first
+
 ## Code Quality
 
 - Fix all compiler warnings before committing
-- Ensure all tests pass (currently 381 tests)
+- Ensure all tests pass (currently 387 tests)
 - Add new tests when adding features
 - Keep examples and documentation in sync
````
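The commit gate above (run every check, commit only if all pass) can be sketched as a small runner. This is a hedged illustration, not the actual `scripts/validate.sh` (whose contents are not shown in the diff); the command lists in the trailing comment are assumptions taken from the checklist text.

```python
import subprocess

def run_checks(checks):
    """Run each command in order; return the first failing command
    as a string, or None if every check passed."""
    for cmd in checks:
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            return " ".join(cmd)   # stop at the first failure
    return None

# A real gate would use the commands from the checklist, e.g.:
# run_checks([["cargo", "check"], ["cargo", "test"],
#             ["cargo", "build", "--release"]])
```

Stopping at the first failure mirrors the "Do NOT commit if any check fails" rule: later checks are not worth running once the build is already broken.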
`Cargo.lock` (generated, 216 changes)

```diff
@@ -135,16 +135,6 @@ dependencies = [
  "libc",
 ]
 
-[[package]]
-name = "core-foundation"
-version = "0.10.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b2a6cd9ae233e7f62ba4e9353e81a88df7fc8a5987b8d445b4d90c879bd156f6"
-dependencies = [
- "core-foundation-sys",
- "libc",
-]
-
 [[package]]
 name = "core-foundation-sys"
 version = "0.8.7"
@@ -235,7 +225,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "39cab71617ae0d63f51a36d69f866391735b51691dbda63cf6f96d042b63efeb"
 dependencies = [
  "libc",
- "windows-sys 0.61.2",
+ "windows-sys 0.59.0",
 ]
 
 [[package]]
@@ -297,21 +287,6 @@ version = "0.1.5"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
 
-[[package]]
-name = "foreign-types"
-version = "0.3.2"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
-dependencies = [
- "foreign-types-shared",
-]
-
-[[package]]
-name = "foreign-types-shared"
-version = "0.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
-
 [[package]]
 name = "form_urlencoded"
 version = "1.2.2"
@@ -552,16 +527,17 @@ dependencies = [
 ]
 
 [[package]]
-name = "hyper-tls"
-version = "0.5.0"
+name = "hyper-rustls"
+version = "0.24.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d6183ddfa99b85da61a140bea0efc93fdf56ceaa041b37d553518030827f9905"
+checksum = "ec3efd23720e2049821a693cbc7e65ea87c72f1c58ff2f9522ff332b1491e590"
 dependencies = [
- "bytes",
+ "futures-util",
+ "http",
  "hyper",
- "native-tls",
+ "rustls",
  "tokio",
- "tokio-native-tls",
+ "tokio-rustls",
 ]
 
 [[package]]
@@ -794,7 +770,7 @@ dependencies = [
 
 [[package]]
 name = "lux"
-version = "0.1.0"
+version = "0.1.2"
 dependencies = [
  "lsp-server",
  "lsp-types",
@@ -843,23 +819,6 @@ dependencies = [
  "windows-sys 0.61.2",
 ]
 
-[[package]]
-name = "native-tls"
-version = "0.2.16"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9d5d26952a508f321b4d3d2e80e78fc2603eaefcdf0c30783867f19586518bdc"
-dependencies = [
- "libc",
- "log",
- "openssl",
- "openssl-probe",
- "openssl-sys",
- "schannel",
- "security-framework",
- "security-framework-sys",
- "tempfile",
-]
-
 [[package]]
 name = "nibble_vec"
 version = "0.1.0"
@@ -905,50 +864,6 @@ version = "1.21.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "42f5e15c9953c5e4ccceeb2e7382a716482c34515315f7b03532b8b4e8393d2d"
 
-[[package]]
-name = "openssl"
-version = "0.10.75"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "08838db121398ad17ab8531ce9de97b244589089e290a384c900cb9ff7434328"
-dependencies = [
- "bitflags 2.10.0",
- "cfg-if",
- "foreign-types",
- "libc",
- "once_cell",
- "openssl-macros",
- "openssl-sys",
-]
-
-[[package]]
-name = "openssl-macros"
-version = "0.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
-dependencies = [
- "proc-macro2",
- "quote",
- "syn",
-]
-
-[[package]]
-name = "openssl-probe"
-version = "0.2.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7c87def4c32ab89d880effc9e097653c8da5d6ef28e6b539d313baaacfbafcbe"
-
-[[package]]
-name = "openssl-sys"
-version = "0.9.111"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "82cab2d520aa75e3c58898289429321eb788c3106963d0dc886ec7a5f4adc321"
-dependencies = [
- "cc",
- "libc",
- "pkg-config",
- "vcpkg",
-]
-
 [[package]]
 name = "parking_lot"
 version = "0.12.5"
@@ -1203,15 +1118,15 @@ dependencies = [
  "http",
  "http-body",
  "hyper",
- "hyper-tls",
+ "hyper-rustls",
  "ipnet",
  "js-sys",
  "log",
  "mime",
- "native-tls",
  "once_cell",
  "percent-encoding",
  "pin-project-lite",
+ "rustls",
  "rustls-pemfile",
  "serde",
  "serde_json",
@@ -1219,15 +1134,30 @@ dependencies = [
  "sync_wrapper",
  "system-configuration",
  "tokio",
- "tokio-native-tls",
+ "tokio-rustls",
  "tower-service",
  "url",
  "wasm-bindgen",
  "wasm-bindgen-futures",
  "web-sys",
+ "webpki-roots",
  "winreg",
 ]
 
+[[package]]
+name = "ring"
+version = "0.17.14"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "a4689e6c2294d81e88dc6261c768b63bc4fcdb852be6d1352498b114f61383b7"
+dependencies = [
+ "cc",
+ "cfg-if",
+ "getrandom 0.2.17",
+ "libc",
+ "untrusted",
+ "windows-sys 0.52.0",
+]
+
 [[package]]
 name = "rusqlite"
 version = "0.31.0"
@@ -1252,7 +1182,19 @@ dependencies = [
  "errno",
  "libc",
  "linux-raw-sys",
- "windows-sys 0.61.2",
+ "windows-sys 0.59.0",
+]
+
+[[package]]
+name = "rustls"
+version = "0.21.12"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "3f56a14d1f48b391359b22f731fd4bd7e43c97f3c50eee276f3aa09c94784d3e"
+dependencies = [
+ "log",
+ "ring",
+ "rustls-webpki",
+ "sct",
 ]
 
 [[package]]
@@ -1264,6 +1206,16 @@ dependencies = [
  "base64 0.21.7",
 ]
 
+[[package]]
+name = "rustls-webpki"
+version = "0.101.7"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8b6275d1ee7a1cd780b64aca7726599a1dbc893b1e64144529e55c3c2f745765"
+dependencies = [
+ "ring",
+ "untrusted",
+]
+
 [[package]]
 name = "rustversion"
 version = "1.0.22"
@@ -1298,15 +1250,6 @@ version = "1.0.23"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "9774ba4a74de5f7b1c1451ed6cd5285a32eddb5cccb8cc655a4e50009e06477f"
 
-[[package]]
-name = "schannel"
-version = "0.1.28"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
-dependencies = [
- "windows-sys 0.61.2",
-]
-
 [[package]]
 name = "scopeguard"
 version = "1.2.0"
@@ -1314,26 +1257,13 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
 
 [[package]]
-name = "security-framework"
-version = "3.6.0"
+name = "sct"
+version = "0.7.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d17b898a6d6948c3a8ee4372c17cb384f90d2e6e912ef00895b14fd7ab54ec38"
+checksum = "da046153aa2352493d6cb7da4b6e5c0c057d8a1d0a9aa8560baffdd945acd414"
 dependencies = [
- "bitflags 2.10.0",
- "core-foundation 0.10.1",
- "core-foundation-sys",
- "libc",
- "security-framework-sys",
-]
-
-[[package]]
-name = "security-framework-sys"
-version = "2.16.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "321c8673b092a9a42605034a9879d73cb79101ed5fd117bc9a597b89b4e9e61a"
-dependencies = [
- "core-foundation-sys",
- "libc",
+ "ring",
+ "untrusted",
 ]
 
 [[package]]
@@ -1521,7 +1451,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ba3a3adc5c275d719af8cb4272ea1c4a6d668a777f37e115f6d11ddbc1c8e0e7"
 dependencies = [
  "bitflags 1.3.2",
- "core-foundation 0.9.4",
+ "core-foundation",
  "system-configuration-sys",
 ]
 
@@ -1545,7 +1475,7 @@ dependencies = [
  "getrandom 0.4.1",
  "once_cell",
  "rustix",
- "windows-sys 0.61.2",
+ "windows-sys 0.59.0",
 ]
 
 [[package]]
@@ -1619,16 +1549,6 @@ dependencies = [
  "windows-sys 0.61.2",
 ]
 
-[[package]]
-name = "tokio-native-tls"
-version = "0.3.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
-dependencies = [
- "native-tls",
- "tokio",
-]
-
 [[package]]
 name = "tokio-postgres"
 version = "0.7.16"
@@ -1655,6 +1575,16 @@ dependencies = [
  "whoami",
 ]
 
+[[package]]
+name = "tokio-rustls"
+version = "0.24.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "c28327cf380ac148141087fbfb9de9d7bd4e84ab5d2c28fbc911d753de8a7081"
+dependencies = [
+ "rustls",
+ "tokio",
+]
+
 [[package]]
 name = "tokio-util"
 version = "0.7.18"
@@ -1750,6 +1680,12 @@ version = "0.2.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
 
+[[package]]
+name = "untrusted"
+version = "0.9.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8ecb6da28b8a351d773b68d5825ac39017e680750f980f3a1a85cd8dd28a47c1"
+
 [[package]]
 name = "url"
 version = "2.5.8"
@@ -1941,6 +1877,12 @@ dependencies = [
 "wasm-bindgen",
 ]
 
+[[package]]
+name = "webpki-roots"
+version = "0.25.4"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "5f20c57d8d7db6d3b86154206ae5d8fba62dd39573114de97c2cb0578251f8e1"
+
 [[package]]
 name = "whoami"
 version = "2.1.1"
```
`Cargo.toml`

```diff
@@ -1,6 +1,6 @@
 [package]
 name = "lux"
-version = "0.1.0"
+version = "0.1.3"
 edition = "2021"
 description = "A functional programming language with first-class effects, schema evolution, and behavioral types"
 license = "MIT"
@@ -13,7 +13,7 @@ lsp-types = "0.94"
 serde = { version = "1", features = ["derive"] }
 serde_json = "1"
 rand = "0.8"
-reqwest = { version = "0.11", features = ["blocking", "json"] }
+reqwest = { version = "0.11", default-features = false, features = ["blocking", "json", "rustls-tls"] }
 tiny_http = "0.12"
 rusqlite = { version = "0.31", features = ["bundled"] }
 postgres = "0.19"
```
`PACKAGES.md` (new file, +367 lines)

# Lux Package Ecosystem Plan

## Current State

### Stdlib (built-in)

| Module | Coverage |
|--------|----------|
| String | Comprehensive (split, join, trim, indexOf, replace, etc.) |
| List | Good (map, filter, fold, head, tail, concat, range, find, any, all, take, drop) |
| Option | Basic (map, flatMap, getOrElse, isSome, isNone) |
| Result | Basic (map, flatMap, getOrElse, isOk, isErr) |
| Math | Basic (abs, min, max, sqrt, pow, floor, ceil, round) |
| Json | Comprehensive (parse, stringify, get, typed extractors, constructors) |
| File | Good (read, write, append, exists, delete, readDir, isDir, mkdir) |
| Console | Good (print, read, readLine, readInt) |
| Process | Good (exec, execStatus, env, args, exit, cwd) |
| Http | Basic (get, post, put, delete, setHeader) |
| HttpServer | Basic (listen, accept, respond) |
| Time | Minimal (now, sleep) |
| Random | Basic (int, float, bool) |
| Sql | Good (SQLite: open, query, execute, transactions) |
| Postgres | Good (connect, query, execute, transactions) |
| Schema | Niche (versioned data migration) |
| Test | Good (assert, assertEqual, assertTrue) |
| Concurrent | Experimental (spawn, await, yield, cancel) |
| Channel | Experimental (create, send, receive) |

### Registry (pkgs.lux) - 3 packages

| Package | Version | Notes |
|---------|---------|-------|
| json | 1.0.0 | Wraps stdlib Json with convenience functions (getPath, getString, etc.) |
| http-client | 0.1.0 | Wraps stdlib Http with JSON helpers, URL encoding |
| testing | 0.1.0 | Wraps stdlib Test with describe/it structure |

---

## Gap Analysis

### What's Missing vs Other Languages

Compared to ecosystems like Rust/cargo, Go, Python, Elm, Gleam:

| Category | Gap | Impact | Notes |
|----------|-----|--------|-------|
| **Collections** | No HashMap, Set, Queue, Stack | Critical | List-of-pairs with O(n) lookup is the only option |
| **Sorting** | No List.sort or List.sortBy | High | Must implement insertion sort manually |
| **Date/Time** | Only `Time.now()` (epoch ms), no parsing/formatting | High | blu-site does string-based date formatting manually |
| **Markdown** | No markdown parser | High | blu-site has 300+ lines of hand-rolled markdown |
| **XML/RSS** | No XML generation | High | Can't generate RSS feeds or sitemaps |
| **Regex** | No pattern matching on strings | High | Character-by-character scanning required |
| **Path** | No file path utilities | Medium | basename/dirname manually reimplemented |
| **YAML/TOML** | No config file parsing (beyond JSON) | Medium | Frontmatter parsing is manual |
| **Template** | No string templating | Medium | HTML built via raw string concatenation |
| **URL** | No URL parsing/encoding | Medium | http-client has basic urlEncode but no parser |
| **Crypto** | No hashing (SHA256, etc.) | Medium | Can't do checksums, content hashing |
| **Base64** | No encoding/decoding | Low | Needed for data URIs, some auth |
| **CSV** | No CSV parsing | Low | Common data format |
| **UUID** | No UUID generation | Low | Useful for IDs |
| **Logging** | No structured logging | Low | Just Console.print |
| **CLI** | No argument parsing library | Low | Manual arg handling |

### What Should Be Stdlib vs Package

**Should be stdlib additions** (too fundamental to be packages):
- HashMap / Map type (requires runtime support)
- List.sort / List.sortBy (fundamental operation)
- Better Time module (date parsing, formatting)
- Regex (needs runtime/C support for performance)
- Path module (cross-platform file path handling)

**Should be packages** (application-level, opinionated, composable):
- markdown
- xml
- rss/atom
- frontmatter
- template
- csv
- crypto
- ssg (static site generator framework)

---

## Priority Package Plans

Ordered by what unblocks blu-site fixes first, then general ecosystem value.

---

### Package 1: `markdown` (Priority: HIGHEST)

**Why:** The 300-line markdown parser in blu-site's main.lux is general-purpose code that belongs in a reusable package. It's also the most complex part of blu-site and has known bugs (e.g., `### ` inside list items renders literally).

**Scope:**
```
markdown/
  lux.toml
  lib.lux        # Public API: parse, parseInline
  src/
    inline.lux   # Inline parsing (bold, italic, links, images, code)
    block.lux    # Block parsing (headings, lists, code blocks, blockquotes, hr)
    types.lux    # AST types (optional - could emit HTML directly)
```

**Public API:**
```lux
// Convert markdown string to HTML string
pub fn toHtml(markdown: String): String

// Convert inline markdown only (no blocks)
pub fn inlineToHtml(text: String): String

// Escape HTML entities
pub fn escapeHtml(s: String): String
```

**Improvements over current blu-site code:**
- Fix heading-inside-list-item rendering (`- ### Title` should work)
- Support nested lists (currently flat only)
- Support reference-style links `[text][ref]`
- Handle edge cases (empty lines in code blocks, nested blockquotes)
- Proper HTML entity escaping in more contexts

**Depends on:** Nothing (pure string processing)

**Estimated size:** ~400-500 lines of Lux

---
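The `inlineToHtml` contract (escape HTML entities first, then rewrite inline spans) can be sketched in Python; the package itself would be written in Lux, and a real implementation also needs links, images, and code spans. This is an illustration of the approach, not the package's code.

```python
import re
from html import escape

def inline_to_html(text):
    """Escape HTML, then rewrite bold and italic spans only."""
    out = escape(text)                                           # & < > -> entities
    out = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", out)  # **bold**
    out = re.sub(r"\*(.+?)\*", r"<em>\1</em>", out)              # *italic*
    return out
```

Escaping before span rewriting matters: doing it afterwards would mangle the `<strong>` and `<em>` tags the rewriter just produced.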
### Package 2: `xml` (Priority: HIGH)

**Why:** Needed for RSS/Atom feed generation, sitemap.xml, and robots.txt generation. General-purpose XML builder that doesn't try to parse XML (which would need regex), just emits it.

**Scope:**
```
xml/
  lux.toml
  lib.lux   # Public API: element, document, serialize
```

**Public API:**
```lux
type XmlNode =
  | Element(String, List<XmlAttr>, List<XmlNode>)
  | Text(String)
  | CData(String)
  | Comment(String)
  | Declaration(String, String)  // version, encoding

type XmlAttr =
  | Attr(String, String)

// Build an XML element
pub fn element(tag: String, attrs: List<XmlAttr>, children: List<XmlNode>): XmlNode

// Build a text node (auto-escapes)
pub fn text(content: String): XmlNode

// Build a CDATA section
pub fn cdata(content: String): XmlNode

// Serialize XML tree to string
pub fn serialize(node: XmlNode): String

// Serialize with XML declaration header
pub fn document(version: String, encoding: String, root: XmlNode): String

// Convenience: self-closing element
pub fn selfClosing(tag: String, attrs: List<XmlAttr>): XmlNode
```

**Depends on:** Nothing

**Estimated size:** ~150-200 lines

---
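A recursive serializer over the `XmlNode` shape above is small enough to sketch. This is Python rather than Lux, with nodes modeled as tagged tuples; it covers `Element`, `Text`, and `CData` and omits `Comment` and `Declaration` for brevity.

```python
from html import escape

def serialize(node):
    """Serialize a node tuple: ("element", tag, attrs, children),
    ("text", s), or ("cdata", s)."""
    kind = node[0]
    if kind == "text":
        return escape(node[1])                  # text nodes auto-escape
    if kind == "cdata":
        return "<![CDATA[" + node[1] + "]]>"    # CDATA passes through verbatim
    _, tag, attrs, children = node
    attr_str = "".join(' %s="%s"' % (k, escape(v, quote=True)) for k, v in attrs)
    if not children:
        return "<%s%s/>" % (tag, attr_str)      # self-closing when empty
    inner = "".join(serialize(c) for c in children)
    return "<%s%s>%s</%s>" % (tag, attr_str, inner, tag)
```

Auto-escaping in the text case is what makes the builder safe to feed user content; only `cdata` bypasses it.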
|
### Package 3: `rss` (Priority: HIGH)
|
||||||
|
|
||||||
|
**Why:** Directly needed for blu-site's #6 priority fix (add RSS feed). Builds on `xml` package.
|
||||||
|
|
||||||
|
**Scope:**
|
||||||
|
```
|
||||||
|
rss/
|
||||||
|
lux.toml # depends on xml
|
||||||
|
lib.lux # Public API: feed, item, toXml, toAtom
|
||||||
|
```
|
||||||
|
|
||||||
|
**Public API:**
|
||||||
|
```lux
|
||||||
|
type FeedInfo =
|
||||||
|
| FeedInfo(String, String, String, String, String)
|
||||||
|
// title, link, description, language, lastBuildDate
|
||||||
|
|
||||||
|
type FeedItem =
|
||||||
|
| FeedItem(String, String, String, String, String, String)
|
||||||
|
// title, link, description, pubDate, guid, categories (comma-separated)
|
||||||
|
|
||||||
|
// Generate RSS 2.0 XML string
|
||||||
|
pub fn toRss(info: FeedInfo, items: List<FeedItem>): String
|
||||||
|
|
||||||
|
// Generate Atom 1.0 XML string
|
||||||
|
pub fn toAtom(info: FeedInfo, items: List<FeedItem>): String
|
||||||
|
```

**Depends on:** `xml`

**Estimated size:** ~100-150 lines

---

### Package 4: `frontmatter` (Priority: HIGH)

**Why:** blu-site has ~50 lines of fragile frontmatter parsing. This is a common need for any content-driven Lux project. The current parser uses `String.indexOf(line, ": ")` which breaks on values containing `: `.

**Scope:**

```
frontmatter/
  lux.toml
  lib.lux    # Public API: parse
```

**Public API:**

```lux
type FrontmatterResult =
| FrontmatterResult(List<(String, String)>, String)
// key-value pairs, remaining body

// Parse frontmatter from a string (--- delimited YAML-like header)
pub fn parse(content: String): FrontmatterResult

// Get a value by key from parsed frontmatter
pub fn get(pairs: List<(String, String)>, key: String): Option<String>

// Get a value or default
pub fn getOrDefault(pairs: List<(String, String)>, key: String, default: String): String

// Parse a space-separated tag string into a list
pub fn parseTags(tagString: String): List<String>
```
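A usage sketch (the input document is invented; note that a value containing `: ` survives a first-colon-only split):

```lux
let doc = "---\ntitle: Note: a colon\ntags: lux blog\n---\nBody text"
match parse(doc) {
  FrontmatterResult(pairs, body) => {
    getOrDefault(pairs, "title", "Untitled")   // "Note: a colon"
    parseTags(getOrDefault(pairs, "tags", "")) // ["lux", "blog"]
  }
}
```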

**Improvements over current blu-site code:**

- Handle values with `: ` in them (only split on first `: `)
- Handle multi-line values (indented continuation)
- Handle quoted values with embedded newlines
- Strip quotes from values consistently

**Depends on:** Nothing

**Estimated size:** ~100-150 lines

---

### Package 5: `path` (Priority: MEDIUM)

**Why:** blu-site manually implements `basename` and `dirname`. Any file-processing Lux program needs these. Tiny but universally useful.

**Scope:**

```
path/
  lux.toml
  lib.lux
```

**Public API:**

```lux
// Get filename from path: "/foo/bar.txt" -> "bar.txt"
pub fn basename(p: String): String

// Get directory from path: "/foo/bar.txt" -> "/foo"
pub fn dirname(p: String): String

// Get file extension: "file.txt" -> "txt", "file" -> ""
pub fn extension(p: String): String

// Remove file extension: "file.txt" -> "file"
pub fn stem(p: String): String

// Join path segments: join("foo", "bar") -> "foo/bar"
pub fn join(a: String, b: String): String

// Normalize path: "foo//bar/../baz" -> "foo/baz"
pub fn normalize(p: String): String

// Check if path is absolute
pub fn isAbsolute(p: String): Bool
```
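One possible shape for `normalize`, shown as a sketch only (stdlib helper names such as `String.split`, `List.fold`, `List.dropLast`, and `List.append` are assumptions, not confirmed APIs):

```lux
fn normalize(p: String): String = {
  // Split on "/", drop empty and "." segments, and let ".." pop the previous segment.
  let keep = List.fold(String.split(p, "/"), [], fn(acc: List<String>, seg: String): List<String> =>
    match seg {
      "" => acc
      "." => acc
      ".." => List.dropLast(acc)
      _ => List.append(acc, seg)
    })
  String.join(keep, "/")
}
```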

**Depends on:** Nothing

**Estimated size:** ~80-120 lines

---

### Package 6: `sitemap` (Priority: MEDIUM)

**Why:** Directly needed for blu-site's #9 priority fix. Simple package that generates sitemap.xml.

**Scope:**

```
sitemap/
  lux.toml   # depends on xml
  lib.lux
```

**Public API:**

```lux
type SitemapEntry =
| SitemapEntry(String, String, String, String)
// url, lastmod (ISO date), changefreq, priority

// Generate sitemap.xml string
pub fn generate(entries: List<SitemapEntry>): String

// Generate a simple robots.txt pointing to the sitemap
pub fn robotsTxt(sitemapUrl: String): String
```
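A usage sketch (entry values are invented; the positional fields follow the comment order above):

```lux
let entry = SitemapEntry("https://example.com/", "2025-01-15", "weekly", "0.8")
let sitemapXml = generate([entry])
let robots = robotsTxt("https://example.com/sitemap.xml")
```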

**Depends on:** `xml`

**Estimated size:** ~50-70 lines

---

### Package 7: `ssg` (Priority: LOW - future)

**Why:** Once markdown, frontmatter, rss, sitemap, and path packages exist, the remaining logic in blu-site's main.lux is generic SSG framework code: read content dirs, parse posts, sort by date, generate section indexes, generate tag pages, copy static assets. This could be extracted into a framework package that other Lux users could use to build their own static sites.

**This should wait** until the foundation packages above are stable and battle-tested through blu-site usage.

---

## Non-Package Stdlib Improvements Needed

These gaps are too fundamental to be packages and should be added to the Lux language itself:

### HashMap (Critical)

Every package above that needs key-value lookups (frontmatter, xml attributes, etc.) is working around the lack of HashMap with `List<(String, String)>`. This is O(n) per lookup and makes code verbose. A stdlib `Map` module would transform the ecosystem.
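A sketch of what a stdlib `Map` could look like (this module does not exist yet; all names here are illustrative):

```lux
let meta = Map.fromList([("title", "Hello"), ("layout", "post")])
Map.get(meta, "title")                       // Some("Hello"), expected O(1) instead of O(n)
Map.getOrDefault(meta, "layout", "default")
```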

### List.sort / List.sortBy (High)

blu-site implements insertion sort manually. Every content-driven app needs sorting. This should be a stdlib function.
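A proposal sketch for the signature and a typical call (not an existing API; `Post` and its `date` field are invented):

```lux
// Proposed signature (illustrative):
pub fn sortBy<T, K>(items: List<T>, key: fn(T): K): List<T>

// Typical call: newest posts first
let ordered = List.reverse(sortBy(posts, fn(p: Post): String => p.date))
```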
### Time.format / Time.parse (High)

blu-site manually parses "2025-01-15" by substring extraction and maps month numbers to names. A proper date/time library (even just ISO 8601 parsing and basic formatting) would help every package above.

---

## Implementation Order

```
Phase 1 (unblock blu-site fixes):
  1. markdown    - extract from blu-site, fix bugs, publish
  2. frontmatter - extract from blu-site, improve robustness
  3. path        - tiny, universally useful
  4. xml         - needed by rss and sitemap

Phase 2 (complete blu-site features):
  5. rss     - depends on xml
  6. sitemap - depends on xml

Phase 3 (ecosystem growth):
  7. template - string templating (mustache-like)
  8. csv      - data processing
  9. cli      - argument parsing
  10. ssg     - framework extraction from blu-site
```

Each package should be developed in its own directory under `~/src/`, published to the git.qrty.ink registry, and tested by integrating it into blu-site.
19 README.md
@@ -2,15 +2,22 @@
 A functional programming language with first-class effects, schema evolution, and behavioral types.

-## Vision
+## Philosophy

-Most programming languages treat three critical concerns as afterthoughts:
+**Make the important things visible.**

-1. **Effects** — What can this code do? (Hidden, untraceable, untestable)
-2. **Data Evolution** — Types change, data persists. (Manual migrations, runtime failures)
-3. **Behavioral Properties** — Is this idempotent? Does it terminate? (Comments and hope)
-
-Lux makes these first-class language features. The compiler knows what your code does, how your data evolves, and what properties your functions guarantee.
+Most languages hide what matters most: what code can do (effects), how data changes over time (schema evolution), and what guarantees functions provide (behavioral properties). Lux makes all three first-class, compiler-checked language features.
+
+| Principle | What it means |
+|-----------|--------------|
+| **Explicit over implicit** | Effects in types — see what code does |
+| **Composition over configuration** | No DI frameworks — effects compose naturally |
+| **Safety without ceremony** | Type inference + explicit signatures where they matter |
+| **Practical over academic** | Familiar syntax, ML semantics, no monads |
+| **One right way** | Opinionated formatter, integrated tooling, built-in test framework |
+| **Tools are the language** | `lux fmt/lint/check/test/compile` — one binary, not seven tools |
+
+See [docs/PHILOSOPHY.md](./docs/PHILOSOPHY.md) for the full philosophy with language comparisons and design rationale.

 ## Core Principles
38 build.rs Normal file
@@ -0,0 +1,38 @@
use std::path::PathBuf;

fn main() {
    // Capture the absolute C compiler path at build time so the binary is self-contained.
    // This is critical for Nix builds where cc/gcc live in /nix/store paths.
    let cc_path = std::env::var("CC").ok()
        .filter(|s| !s.is_empty())
        .and_then(|s| resolve_absolute(&s))
        .or_else(|| find_in_path("cc"))
        .or_else(|| find_in_path("gcc"))
        .or_else(|| find_in_path("clang"))
        .unwrap_or_default();

    println!("cargo:rustc-env=LUX_CC_PATH={}", cc_path);
    println!("cargo:rerun-if-env-changed=CC");
    println!("cargo:rerun-if-env-changed=PATH");
}

/// Resolve a command name to its absolute path by searching PATH.
fn find_in_path(cmd: &str) -> Option<String> {
    let path_var = std::env::var("PATH").ok()?;
    for dir in path_var.split(':') {
        let candidate = PathBuf::from(dir).join(cmd);
        if candidate.is_file() {
            return Some(candidate.to_string_lossy().into_owned());
        }
    }
    None
}

/// If the path is already absolute and exists, return it. Otherwise search PATH.
fn resolve_absolute(cmd: &str) -> Option<String> {
    let p = PathBuf::from(cmd);
    if p.is_absolute() && p.is_file() {
        return Some(cmd.to_string());
    }
    find_in_path(cmd)
}
449 docs/PHILOSOPHY.md Normal file
@@ -0,0 +1,449 @@
# The Lux Philosophy

## In One Sentence

**Make the important things visible.**

## The Three Pillars

Most programming languages hide the things that matter most in production:

1. **What can this code do?** — Side effects are invisible in function signatures
2. **How does data change over time?** — Schema evolution is a deployment problem, not a language one
3. **What guarantees does this code provide?** — Properties like idempotency live in comments and hope

Lux makes all three first-class, compiler-checked language features.

---

## Core Principles

### 1. Explicit Over Implicit

Every function signature tells you what it does:

```lux
fn processOrder(order: Order): Receipt with {Database, Email, Logger}
```

You don't need to read the body, trace call chains, or check documentation. The signature *is* the documentation. Code review becomes: "should this function really send emails?"

**What this means in practice:**
- Effects are declared in types, not hidden behind interfaces
- No dependency injection frameworks — just swap handlers
- No mocking libraries — test with different effect implementations
- No "spooky action at a distance" — if a function can fail, its type says so

**How this compares:**

| Language | Side effects | Lux equivalent |
|----------|-------------|----------------|
| JavaScript | Anything, anywhere, silently | `with {Console, Http, File}` |
| Python | Implicit, discovered by reading code | Effect declarations in signature |
| Java | Checked exceptions (partial), DI frameworks | Effects + handlers |
| Go | Return error values (partial) | `with {Fail}` or `Result` |
| Rust | `unsafe` blocks, `Result`/`Option` | Effects for I/O, Result for values |
| Haskell | Monad transformers (explicit but heavy) | Effects (explicit and lightweight) |
| Koka | Algebraic effects (similar) | Same family, more familiar syntax |
### 2. Composition Over Configuration

Things combine naturally without glue code:

```lux
// Multiple effects compose by listing them
fn sync(id: UserId): User with {Database, Http, Logger} = ...

// Handlers compose by providing them
run sync(id) with {
  Database = postgres(conn),
  Http = realHttp,
  Logger = consoleLogger
}
```

No monad transformers. No middleware stacks. No factory factories. Effects are sets; they union naturally.

**What this means in practice:**
- Functions compose with `|>` (pipes)
- Effects compose by set union
- Types compose via generics and ADTs
- Tests compose by handler substitution
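As a sketch of what testing by handler substitution could look like (the handler constructors and fixture names here are invented for illustration; only `run ... with` is from the example above):

```lux
// Hypothetical in-memory handlers; no mocking library involved
run sync(id) with {
  Database = inMemoryDb([testUser]),
  Http = stubHttp(cannedResponses),
  Logger = silentLogger
}
```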
### 3. Safety Without Ceremony

The type system catches errors at compile time, but doesn't make you fight it:

```lux
// Type inference keeps code clean
let x = 42                    // Int, inferred
let names = ["Alice", "Bob"]  // List<String>, inferred

// But function signatures are always explicit
fn greet(name: String): String = "Hello, {name}"
```

**The balance:**
- Function signatures: always annotated (documentation + API contract)
- Local bindings: inferred (reduces noise in implementation)
- Effects: declared or inferred (explicit at boundaries, lightweight inside)
- Behavioral properties: opt-in (`is pure`, `is total` — add when valuable)
### 4. Practical Over Academic

Lux borrows from the best of programming language research, but wraps it in familiar syntax:

```lux
// This is algebraic effects. But it reads like normal code.
fn main(): Unit with {Console} = {
  Console.print("What's your name?")
  let name = Console.readLine()
  Console.print("Hello, {name}!")
}
```

Compare with Haskell's equivalent:

```haskell
main :: IO ()
main = do
  putStrLn "What's your name?"
  name <- getLine
  putStrLn ("Hello, " ++ name ++ "!")
```

Both are explicit about effects. Lux chooses syntax that reads like imperative code while maintaining the same guarantees.

**What this means in practice:**
- ML-family semantics, C-family appearance
- No monads to learn (effects replace them)
- No category theory prerequisites
- The learning curve is: functions → types → effects (days, not months)
### 5. One Right Way

Like Go and Python, Lux favors having one obvious way to do things:

- **One formatter** (`lux fmt`) — opinionated, not configurable, ends all style debates
- **One test framework** (built-in `Test` effect) — no framework shopping
- **One way to handle effects** — declare, handle, compose
- **One package manager** (`lux pkg`) — integrated, not bolted on

This is a deliberate rejection of the JavaScript/Ruby approach where every project assembles its own stack from dozens of competing libraries.
### 6. Tools Are Part of the Language

The compiler, linter, formatter, LSP, package manager, and test runner are one thing, not seven:

```bash
lux fmt      # Format
lux lint     # Lint (with --explain for education)
lux check    # Type check + lint
lux test     # Run tests
lux compile  # Build a binary
lux serve    # Serve files
lux --lsp    # Editor integration
```

This follows Go's philosophy: a language is its toolchain. The formatter knows the AST. The linter knows the type system. The LSP knows the effects. They're not afterthoughts.

---
## Design Decisions and Their Reasons

### Why algebraic effects instead of monads?

Monads are powerful but have poor ergonomics for composition. Combining `IO`, `State`, and `Error` in Haskell requires monad transformers — a notoriously difficult concept. Effects compose naturally:

```lux
// Just list the effects you need. No transformers.
fn app(): Unit with {Console, File, Http, Time} = ...
```

### Why not just `async/await`?

`async/await` solves one effect (concurrency). Effects solve all of them: I/O, state, randomness, failure, concurrency, logging, databases. One mechanism, universally applicable.

### Why require function type annotations?

Three reasons:
1. **Documentation**: Every function signature is self-documenting
2. **Error messages**: Inference failures produce confusing errors; annotations localize them
3. **API stability**: Changing a function body shouldn't silently change its type

### Why an opinionated formatter?

Style debates waste engineering time. `gofmt` proved that an opinionated, non-configurable formatter eliminates an entire category of bikeshedding. `lux fmt` does the same.

### Why immutable by default?

Mutable state is the root of most concurrency bugs and many logic bugs. Immutability makes code easier to reason about. When you need state, the `State` effect makes it explicit and trackable.

### Why behavioral types?

Properties like "this function is idempotent" or "this function always terminates" are critical for correctness but typically live in comments. Making them part of the type system means:
- The compiler can verify them (or generate property tests)
- Callers can require them (`where F is idempotent`)
- They serve as machine-readable documentation

---
## Comparison with Popular Languages

### JavaScript / TypeScript (SO #1 / #6 by usage)

| Aspect | JavaScript/TypeScript | Lux |
|--------|----------------------|-----|
| **Type system** | Optional/gradual (TS) | Required, Hindley-Milner |
| **Side effects** | Anywhere, implicit | Declared in types |
| **Testing** | Mock libraries (Jest, etc.) | Swap effect handlers |
| **Formatting** | Prettier (configurable) | `lux fmt` (opinionated) |
| **Package management** | npm (massive ecosystem) | `lux pkg` (small ecosystem) |
| **Paradigm** | Multi-paradigm | Functional-first |
| **Null safety** | Optional chaining (partial) | `Option<T>`, no null |
| **Error handling** | try/catch (unchecked) | `Result<T, E>` + `Fail` effect |

**Shared:** Familiar syntax, first-class functions, closures, string interpolation.

**What Lux learns from JS/TS:** Familiar syntax matters. String interpolation, arrow functions, and readable code lower the barrier to entry.

**What Lux rejects:** Implicit `any`, unchecked exceptions, the "pick your own adventure" toolchain.
### Python (SO #4 by usage, #1 most desired)

| Aspect | Python | Lux |
|--------|--------|-----|
| **Type system** | Optional (type hints) | Required, static |
| **Side effects** | Implicit | Explicit |
| **Performance** | Slow (interpreted) | Faster (compiled to C) |
| **Syntax** | Whitespace-significant | Braces/keywords |
| **Immutability** | Mutable by default | Immutable by default |
| **Tooling** | Fragmented (black, ruff, mypy, pytest...) | Unified (`lux` binary) |

**Shared:** Clean syntax philosophy, "one way to do it", readability focus.

**What Lux learns from Python:** Readability counts. The Zen of Python's emphasis on one obvious way to do things resonates with Lux's design.

**What Lux rejects:** Dynamic typing, mutable-by-default, fragmented tooling.
### Rust (SO #1 most admired)

| Aspect | Rust | Lux |
|--------|------|-----|
| **Memory** | Ownership/borrowing (manual) | Reference counting (automatic) |
| **Type system** | Traits, generics, lifetimes | ADTs, effects, generics |
| **Side effects** | Implicit (except `unsafe`) | Explicit (effect system) |
| **Error handling** | `Result<T, E>` + `?` | `Result<T, E>` + `Fail` effect |
| **Performance** | Zero-cost, systems-level | Good, not systems-level |
| **Learning curve** | Steep (ownership) | Moderate (effects) |
| **Pattern matching** | Excellent, exhaustive | Excellent, exhaustive |

**Shared:** ADTs, pattern matching, `Option`/`Result`, no null, immutable by default, strong type system.

**What Lux learns from Rust:** ADTs with exhaustive matching, `Option`/`Result` instead of null/exceptions, excellent error messages, integrated tooling (cargo model).

**What Lux rejects:** Ownership complexity (Lux uses GC/RC instead), lifetimes, `unsafe`.
### Go (SO #13 by usage, #11 most admired)

| Aspect | Go | Lux |
|--------|-----|-----|
| **Type system** | Structural, simple | HM inference, ADTs |
| **Side effects** | Implicit | Explicit |
| **Error handling** | Multiple returns (`val, err`) | `Result<T, E>` + effects |
| **Formatting** | `gofmt` (opinionated) | `lux fmt` (opinionated) |
| **Tooling** | All-in-one (`go` binary) | All-in-one (`lux` binary) |
| **Concurrency** | Goroutines + channels | `Concurrent` + `Channel` effects |
| **Generics** | Added late, limited | First-class from day one |

**Shared:** Opinionated formatter, unified tooling, practical philosophy.

**What Lux learns from Go:** Unified toolchain, opinionated formatting, simplicity as a feature, fast compilation.

**What Lux rejects:** Verbose error handling (`if err != nil`), no ADTs, no generics (historically), nil.
### Java / C# (SO #7 / #8 by usage)

| Aspect | Java/C# | Lux |
|--------|---------|-----|
| **Paradigm** | OOP-first | FP-first |
| **Effects** | DI frameworks (Spring, etc.) | Language-level effects |
| **Testing** | Mock frameworks (Mockito, etc.) | Handler swapping |
| **Null safety** | Nullable (Java), nullable ref types (C#) | `Option<T>` |
| **Boilerplate** | High (getters, setters, factories) | Low (records, inference) |

**Shared:** Static typing, generics, pattern matching (recent), established ecosystems.

**What Lux learns from Java/C#:** Enterprise needs (database effects, HTTP, serialization) matter. Testability is a first-class concern.

**What Lux rejects:** OOP ceremony, DI frameworks, null, boilerplate.
### Haskell / OCaml / Elm (FP family)

| Aspect | Haskell | Elm | Lux |
|--------|---------|-----|-----|
| **Effects** | Monads + transformers | Cmd/Sub (Elm Architecture) | Algebraic effects |
| **Learning curve** | Steep | Moderate | Moderate |
| **Error messages** | Improving | Excellent | Good (aspiring to Elm-quality) |
| **Practical focus** | Academic-leaning | Web-focused | General-purpose |
| **Syntax** | Unique | Unique | Familiar (C-family feel) |

**Shared:** Immutability, ADTs, pattern matching, type inference, no null.

**What Lux learns from Haskell:** Effects must be explicit. Types must be powerful. Purity matters.

**What Lux learns from Elm:** Error messages should teach. Tooling should be integrated. Simplicity beats power.

**What Lux rejects (from Haskell):** Monad transformers, academic syntax, steep learning curve.
### Gleam / Elixir (SO #2 / #3 most admired, 2025)

| Aspect | Gleam | Elixir | Lux |
|--------|-------|--------|-----|
| **Type system** | Static, HM | Dynamic | Static, HM |
| **Effects** | No special tracking | Implicit | First-class |
| **Concurrency** | BEAM (built-in) | BEAM (built-in) | Effect-based |
| **Error handling** | `Result` | Pattern matching on tuples | `Result` + `Fail` effect |

**Shared:** Friendly errors, pipe operator, functional style, immutability.

**What Lux learns from Gleam:** Friendly developer experience, clear error messages, and pragmatic FP resonate with developers.

---
## Tooling Philosophy Audit

### Does the linter follow the philosophy?

**Yes, strongly.** The linter embodies "make the important things visible":

- `could-be-pure`: Nudges users toward declaring purity — making guarantees visible
- `could-be-total`: Same for termination
- `unnecessary-effect-decl`: Keeps effect signatures honest — don't claim effects you don't use
- `unused-variable/import/function`: Keeps code focused — everything visible should be meaningful
- `single-arm-match` / `manual-map-option`: Teaches idiomatic patterns

The category system (correctness > suspicious > idiom > style > pedantic) reflects the philosophy of being practical, not academic: real bugs are errors, style preferences are opt-in.

### Does the formatter follow the philosophy?

**Yes, with one gap.** The formatter is opinionated and non-configurable, matching the "one right way" principle. It enforces consistent style across all Lux code.

**Gap:** `max_width` and `trailing_commas` are declared in `FormatConfig` but never used. This is harmless but inconsistent — either remove the unused config or implement line wrapping.

### Does the type checker follow the philosophy?

**Yes.** The type checker embodies every core principle:
- Effects are tracked and verified in function types
- Behavioral properties are checked where possible
- Error messages include context and suggestions
- Type inference reduces ceremony while maintaining safety

---
## What Could Be Improved

### High-value additions (improve experience significantly, low verbosity cost)

1. **Pipe-friendly standard library**
   - Currently: `List.map(myList, fn(x: Int): Int => x * 2)`
   - Better: Allow `myList |> List.map(fn(x: Int): Int => x * 2)`
   - Many languages (Elixir, F#, Gleam) make the pipe operator the primary composition tool. If the first argument of stdlib functions is always the data, pipes become natural. This is a **library convention**, not a language change.
   - **LLM impact:** Pipe chains are easier for LLMs to generate and read — linear data flow with no nesting.
   - **Human impact:** Reduces cognitive load. Reading left-to-right matches how humans think about data transformation.

2. **Exhaustive `match` warnings for non-enum types**
   - The linter warns about `wildcard-on-small-enum`, but could also warn when a match on `Option` or `Result` uses a wildcard instead of handling both cases explicitly.
   - **Both audiences:** Prevents subtle bugs where new variants are silently caught by `_`.
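To illustrate item 2, a sketch of code the proposed lint would flag versus accept (the variable `maybe` is hypothetical):

```lux
// Would warn: the wildcard silently absorbs the None case
match maybe {
  Some(x) => x
  _ => 0
}

// Preferred: every variant handled explicitly
match maybe {
  Some(x) => x
  None => 0
}
```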

3. **Error message improvements toward Elm quality**
   - Current errors show the right information but could be more conversational and suggest fixes more consistently.
   - Example improvement: When a function is called with the wrong argument count, show the expected signature and highlight which argument is wrong.
   - **LLM impact:** Structured error messages with clear "expected X, got Y" patterns are easier for LLMs to parse and fix.
   - **Human impact:** Friendly errors reduce frustration, especially for beginners.

4. **`let ... else` for fallible destructuring**
   - Rust's `let ... else` pattern handles the "unwrap or bail" case elegantly:
     ```lux
     let Some(value) = maybeValue else return defaultValue
     ```
   - Currently requires a full `match` expression for this common pattern.
   - **Both audiences:** Reduces boilerplate for the most common Option/Result handling pattern.

5. **Trait/typeclass system for overloading**
   - Currently `toString`, `==`, and similar operations are built-in. A trait system would let users define their own:
     ```lux
     trait Show<T> { fn show(value: T): String }
     impl Show<User> { fn show(u: User): String = "User({u.name})" }
     ```
   - **Note:** This exists partially. Expanding it would enable more generic programming without losing explicitness.
   - **LLM impact:** Traits provide clear, greppable contracts. LLMs can generate trait impls from examples.
### Medium-value additions (good improvements, some verbosity cost)

6. **Named arguments or builder pattern for records**
   - When functions take many parameters, the linter already warns at 5+. Named arguments or record-punning would help:
     ```lux
     fn createUser({ name, email, age }: UserConfig): User = ...
     createUser({ name: "Alice", email: "alice@ex.com", age: 30 })
     ```
   - **Trade-off:** Adds syntax, but the linter already pushes users toward records for many params.

7. **Async/concurrent effect sugar**
   - The `Concurrent` effect exists but could benefit from syntactic sugar:
     ```lux
     let (a, b) = concurrent {
       fetch("/api/users"),
       fetch("/api/posts")
     }
     ```
   - **Trade-off:** Adds syntax, but concurrent code is important enough to warrant it.

8. **Module-level documentation with `///` doc comments**
   - The `missing-doc-comment` lint exists, but the doc generation system could be enhanced with richer doc comments that include examples, parameter descriptions, and effect documentation.
   - **LLM impact:** Structured documentation is the single highest-value feature for LLM code understanding.
### Lower-value or risky additions (consider carefully)
|
||||||
|
|
||||||
|
9. **Type inference for function return types**

- Would reduce ceremony: `fn double(x: Int) = x * 2` instead of `fn double(x: Int): Int = x * 2`
- **Risk:** Violates the "function signatures are documentation" principle. A body change could silently change the API. Current approach is the right trade-off.

10. **Operator overloading**

- Tempting for numeric types, but quickly leads to the C++ problem where `+` could mean anything.
- **Risk:** Violates "make the important things visible" — you can't tell what `a + b` does.
- **Better:** Keep operators for built-in numeric types. Use named functions for everything else.

11. **Macros**

- Powerful but drastically complicate tooling, error messages, and readability.
- **Risk:** Rust's macro system is powerful but produces some of the worst error messages in the language.
- **Better:** Solve specific problems with language features (effects, generics) rather than a general metaprogramming escape hatch.

---
## The LLM Perspective

Lux has several properties that make it unusually well-suited for LLM-assisted programming:

1. **Effect signatures are machine-readable contracts.** An LLM reading `fn f(): T with {Database, Logger}` knows exactly what capabilities to provide when generating handler code.
2. **Behavioral properties are verifiable assertions.** `is pure` and `is idempotent` give LLMs clear constraints to check their own output against.
3. **The opinionated formatter eliminates style ambiguity.** LLMs don't need to guess indentation, brace style, or naming conventions — `lux fmt` handles it.
4. **Exhaustive pattern matching forces completeness.** LLMs that generate `match` expressions are reminded by the compiler when they miss cases.
5. **Small, consistent standard library.** `List.map`, `String.split`, `Option.map` — the uniform `Module.function` convention is easy to learn from few examples.
6. **Effect-based testing needs no framework knowledge.** An LLM doesn't need to know Jest, pytest, or JUnit — just swap handlers.

**What would help LLMs more:**

- Structured error output (JSON mode) for programmatic error fixing
- Example-rich documentation that LLMs can learn patterns from
- A canonical set of "Lux patterns" (like Go's proverbs) that encode best practices in memorable form
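As an illustration of the structured-error bullet, a machine-readable diagnostic might serialize like the hypothetical sketch below. The field names (`code`, `message`, `line`, `column`) are illustrative assumptions, not an existing Lux interface:

```rust
// Hypothetical diagnostic record for a future `--json` error mode.
struct Diagnostic {
    code: &'static str,
    message: String,
    line: usize,
    column: usize,
}

// Manual JSON serialization to keep the sketch dependency-free.
fn to_json(d: &Diagnostic) -> String {
    format!(
        r#"{{"code":"{}","message":"{}","line":{},"column":{}}}"#,
        d.code, d.message, d.line, d.column
    )
}

fn main() {
    let d = Diagnostic {
        code: "E0001",
        message: "missing effect handler for Database".to_string(),
        line: 12,
        column: 5,
    };
    println!("{}", to_json(&d));
}
```

A tool (or an LLM) could parse records like this and patch the offending span programmatically instead of scraping human-oriented error text.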
---

## Summary

Lux's philosophy can be compressed to five words: **Make the important things visible.**

This manifests as:

- **Effects in types** — see what code does
- **Properties in types** — see what code guarantees
- **Versions in types** — see how data evolves
- **One tool for everything** — see how to build
- **One format for all** — see consistent style

The language is in the sweet spot between Haskell's rigor and Python's practicality, with Go's tooling philosophy and Elm's developer experience aspirations. It doesn't try to be everything — it tries to make the things that matter most in real software visible, composable, and verifiable.
46 flake.nix

@@ -14,6 +14,7 @@
       pkgs = import nixpkgs { inherit system overlays; };
       rustToolchain = pkgs.rust-bin.stable.latest.default.override {
         extensions = [ "rust-src" "rust-analyzer" ];
+        targets = [ "x86_64-unknown-linux-musl" ];
       };
     in
     {
@@ -22,8 +23,8 @@
             rustToolchain
             cargo-watch
             cargo-edit
-            pkg-config
-            openssl
+            # Static builds
+            pkgsStatic.stdenv.cc
             # Benchmark tools
             hyperfine
             poop
@@ -43,7 +44,7 @@
             printf "\n"
             printf " \033[1;35m╦ ╦ ╦╦ ╦\033[0m\n"
             printf " \033[1;35m║ ║ ║╔╣\033[0m\n"
-            printf " \033[1;35m╩═╝╚═╝╩ ╩\033[0m v0.1.0\n"
+            printf " \033[1;35m╩═╝╚═╝╩ ╩\033[0m v0.1.3\n"
             printf "\n"
             printf " Functional language with first-class effects\n"
             printf "\n"
@@ -61,18 +62,47 @@

         packages.default = pkgs.rustPlatform.buildRustPackage {
           pname = "lux";
-          version = "0.1.0";
+          version = "0.1.3";
           src = ./.;
           cargoLock.lockFile = ./Cargo.lock;
-
-          nativeBuildInputs = [ pkgs.pkg-config ];
-          buildInputs = [ pkgs.openssl ];
-
           doCheck = false;
         };

-        # Benchmark scripts
+        packages.static = let
+          muslPkgs = import nixpkgs {
+            inherit system;
+            crossSystem = {
+              config = "x86_64-unknown-linux-musl";
+              isStatic = true;
+            };
+          };
+        in muslPkgs.rustPlatform.buildRustPackage {
+          pname = "lux";
+          version = "0.1.3";
+          src = ./.;
+          cargoLock.lockFile = ./Cargo.lock;
+
+          CARGO_BUILD_TARGET = "x86_64-unknown-linux-musl";
+          CARGO_BUILD_RUSTFLAGS = "-C target-feature=+crt-static";
+
+          doCheck = false;
+
+          postInstall = ''
+            $STRIP $out/bin/lux 2>/dev/null || true
+          '';
+        };

         apps = {
+          # Release automation
+          release = {
+            type = "app";
+            program = toString (pkgs.writeShellScript "lux-release" ''
+              exec ${self}/scripts/release.sh "$@"
+            '');
+          };
+
+          # Benchmark scripts
           # Run hyperfine benchmark comparison
           bench = {
             type = "app";
213 scripts/release.sh (new executable file)

@@ -0,0 +1,213 @@
#!/usr/bin/env bash
set -euo pipefail

# Lux Release Script
# Builds a static binary, generates changelog, and creates a Gitea release.
#
# Usage:
#   ./scripts/release.sh          # auto-bump patch (0.2.0 → 0.2.1)
#   ./scripts/release.sh patch    # same as above
#   ./scripts/release.sh minor    # bump minor (0.2.0 → 0.3.0)
#   ./scripts/release.sh major    # bump major (0.2.0 → 1.0.0)
#   ./scripts/release.sh v1.2.3   # explicit version
#
# Environment:
#   GITEA_TOKEN - API token for git.qrty.ink (prompted if not set)
#   GITEA_URL   - Gitea instance URL (default: https://git.qrty.ink)

# cd to repo root (directory containing this script's parent)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."

GITEA_URL="${GITEA_URL:-https://git.qrty.ink}"
REPO_OWNER="blu"
REPO_NAME="lux"
API_BASE="$GITEA_URL/api/v1"

# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'

info() { printf "${CYAN}::${NC} %s\n" "$1"; }
ok()   { printf "${GREEN}ok${NC} %s\n" "$1"; }
warn() { printf "${YELLOW}!!${NC} %s\n" "$1"; }
err()  { printf "${RED}error:${NC} %s\n" "$1" >&2; exit 1; }

# --- Determine version ---
CURRENT=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
BUMP="${1:-patch}"

bump_version() {
  local ver="$1" part="$2"
  IFS='.' read -r major minor patch <<< "$ver"
  case "$part" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "$major.$((minor + 1)).0" ;;
    patch) echo "$major.$minor.$((patch + 1))" ;;
    *)     echo "$part" ;;  # treat as explicit version
  esac
}

case "$BUMP" in
  major|minor|patch)
    VERSION=$(bump_version "$CURRENT" "$BUMP")
    info "Bumping $BUMP: $CURRENT → $VERSION"
    ;;
  *)
    # Explicit version — strip v prefix if present
    VERSION="${BUMP#v}"
    info "Explicit version: $VERSION"
    ;;
esac

TAG="v$VERSION"

# --- Check for clean working tree ---
if [ -n "$(git status --porcelain)" ]; then
  warn "Working tree has uncommitted changes:"
  git status --short
  printf "\n"
  read -rp "Continue anyway? [y/N] " confirm
  [[ "$confirm" =~ ^[Yy]$ ]] || exit 1
fi

# --- Check if tag already exists ---
if git rev-parse "$TAG" >/dev/null 2>&1; then
  err "Tag $TAG already exists. Choose a different version."
fi

# --- Update version in source files ---
if [ "$VERSION" != "$CURRENT" ]; then
  info "Updating version in Cargo.toml and flake.nix..."
  sed -i "0,/^version = \"$CURRENT\"/s//version = \"$VERSION\"/" Cargo.toml
  sed -i "s/version = \"$CURRENT\";/version = \"$VERSION\";/g" flake.nix
  sed -i "s/v$CURRENT/v$VERSION/g" flake.nix
  git add Cargo.toml flake.nix
  git commit --no-gpg-sign -m "chore: bump version to $VERSION"
  ok "Version updated and committed"
fi

# --- Generate changelog ---
info "Generating changelog..."
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "")
if [ -n "$LAST_TAG" ]; then
  RANGE="$LAST_TAG..HEAD"
  info "Changes since $LAST_TAG:"
else
  RANGE="HEAD"
  info "First release — summarizing recent commits:"
fi

CHANGELOG=$(git log "$RANGE" --pretty=format:"- %s" --no-merges 2>/dev/null | head -50 || true)
if [ -z "$CHANGELOG" ]; then
  CHANGELOG="- Initial release"
fi

# --- Build static binary ---
info "Building static binary (nix build .#static)..."
nix build .#static
BINARY="result/bin/lux"
if [ ! -f "$BINARY" ]; then
  err "Static binary not found at $BINARY"
fi

BINARY_SIZE=$(ls -lh "$BINARY" | awk '{print $5}')
BINARY_TYPE=$(file "$BINARY" | sed 's/.*: //')
ok "Binary: $BINARY_SIZE, $BINARY_TYPE"

# --- Prepare release artifact ---
ARTIFACT="/tmp/lux-${TAG}-linux-x86_64"
cp "$BINARY" "$ARTIFACT"
chmod +x "$ARTIFACT"

# --- Show release summary ---
printf "\n"
printf "${BOLD}═══ Release Summary ═══${NC}\n"
printf "\n"
printf "  ${BOLD}Tag:${NC}     %s\n" "$TAG"
printf "  ${BOLD}Binary:${NC}  %s (%s)\n" "lux-${TAG}-linux-x86_64" "$BINARY_SIZE"
printf "  ${BOLD}Commit:${NC}  %s\n" "$(git rev-parse --short HEAD)"
printf "\n"
printf "${BOLD}Changelog:${NC}\n"
printf "%s\n" "$CHANGELOG"
printf "\n"

# --- Confirm ---
read -rp "Create release $TAG? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || { info "Aborted."; exit 0; }

# --- Get Gitea token ---
if [ -z "${GITEA_TOKEN:-}" ]; then
  printf "\n"
  info "Gitea API token required (create at $GITEA_URL/user/settings/applications)"
  read -rsp "Token: " GITEA_TOKEN
  printf "\n"
fi

if [ -z "$GITEA_TOKEN" ]; then
  err "No token provided"
fi

# --- Create and push tag ---
info "Creating tag $TAG..."
git tag -a "$TAG" -m "Release $TAG" --no-sign
ok "Tag created"

info "Pushing tag to origin..."
git push origin "$TAG"
ok "Tag pushed"

# --- Create Gitea release ---
info "Creating release on Gitea..."

RELEASE_BODY=$(printf "## Lux %s\n\n### Changes\n\n%s\n\n### Installation\n\n\`\`\`bash\ncurl -Lo lux %s/%s/%s/releases/download/%s/lux-linux-x86_64\nchmod +x lux\n./lux --version\n\`\`\`" \
  "$TAG" "$CHANGELOG" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG")

RELEASE_JSON=$(jq -n \
  --arg tag "$TAG" \
  --arg name "Lux $TAG" \
  --arg body "$RELEASE_BODY" \
  '{tag_name: $tag, name: $name, body: $body, draft: false, prerelease: false}')

RELEASE_RESPONSE=$(curl -s -X POST \
  "$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/json" \
  -d "$RELEASE_JSON")

RELEASE_ID=$(echo "$RELEASE_RESPONSE" | jq -r '.id // empty')
if [ -z "$RELEASE_ID" ]; then
  echo "$RELEASE_RESPONSE" | jq . 2>/dev/null || echo "$RELEASE_RESPONSE"
  err "Failed to create release"
fi
ok "Release created (id: $RELEASE_ID)"

# --- Upload binary ---
info "Uploading binary..."
UPLOAD_RESPONSE=$(curl -s -X POST \
  "$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases/$RELEASE_ID/assets?name=lux-linux-x86_64" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary "@$ARTIFACT")

ASSET_NAME=$(echo "$UPLOAD_RESPONSE" | jq -r '.name // empty')
if [ -z "$ASSET_NAME" ]; then
  echo "$UPLOAD_RESPONSE" | jq . 2>/dev/null || echo "$UPLOAD_RESPONSE"
  err "Failed to upload binary"
fi
ok "Binary uploaded: $ASSET_NAME"

# --- Done ---
printf "\n"
printf "${GREEN}${BOLD}Release $TAG published!${NC}\n"
printf "\n"
printf "  ${BOLD}URL:${NC}      %s/%s/%s/releases/tag/%s\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "  ${BOLD}Download:${NC} %s/%s/%s/releases/download/%s/lux-linux-x86_64\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "\n"

# Cleanup
rm -f "$ARTIFACT"
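The version-bump rule in release.sh (`bump_version` plus the explicit-version fallback that strips a leading `v`) can be sketched in Rust for clarity. `bump` below is a hypothetical helper that folds both steps into one function; it is an illustration of the script's behavior, not part of the compiler:

```rust
// Semver bump: patch/minor/major increment the respective component
// and zero the ones below; anything else is taken as an explicit version.
fn bump(ver: &str, part: &str) -> String {
    let mut n: Vec<u64> = ver.split('.').map(|p| p.parse().unwrap()).collect();
    match part {
        "major" => { n[0] += 1; n[1] = 0; n[2] = 0; }
        "minor" => { n[1] += 1; n[2] = 0; }
        "patch" => { n[2] += 1; }
        other => return other.trim_start_matches('v').to_string(), // explicit version
    }
    format!("{}.{}.{}", n[0], n[1], n[2])
}

fn main() {
    assert_eq!(bump("0.1.2", "patch"), "0.1.3");
    assert_eq!(bump("0.2.0", "minor"), "0.3.0");
    assert_eq!(bump("0.2.0", "major"), "1.0.0");
    assert_eq!(bump("1.2.3", "v2.0.0"), "2.0.0");
    println!("ok");
}
```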
73 scripts/validate.sh (new executable file)

@@ -0,0 +1,73 @@
#!/usr/bin/env bash
set -euo pipefail

# Lux Full Validation Script
# Runs all checks: Rust tests, package tests, type checking, formatting, linting.
# Run after every committable change to ensure no regressions.

# cd to repo root (directory containing this script's parent)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."

LUX="$(pwd)/target/release/lux"
PACKAGES_DIR="$(pwd)/../packages"
RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'

FAILED=0
TOTAL=0

step() {
  TOTAL=$((TOTAL + 1))
  printf "${CYAN}[%d]${NC} %s... " "$TOTAL" "$1"
}

ok()   { printf "${GREEN}ok${NC} %s\n" "${1:-}"; }
fail() { printf "${RED}FAIL${NC} %s\n" "${1:-}"; FAILED=$((FAILED + 1)); }

# --- Rust checks ---
step "cargo check"
if nix develop --command cargo check 2>&1 | grep -q "Finished"; then ok; else fail; fi

step "cargo test"
OUTPUT=$(nix develop --command cargo test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "test result:" || echo "no result")
if echo "$RESULT" | grep -q "0 failed"; then ok "$RESULT"; else fail "$RESULT"; fi

# --- Build release binary ---
step "cargo build --release"
if nix develop --command cargo build --release 2>&1 | grep -q "Finished"; then ok; else fail; fi

# --- Package tests ---
for pkg in path frontmatter xml rss markdown; do
  PKG_DIR="$PACKAGES_DIR/$pkg"
  if [ -d "$PKG_DIR" ]; then
    step "lux test ($pkg)"
    OUTPUT=$(cd "$PKG_DIR" && "$LUX" test 2>&1 || true)
    RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
    if echo "$RESULT" | grep -q "passed"; then ok "$RESULT"; else fail "$RESULT"; fi
  fi
done

# --- Lux check on packages ---
for pkg in path frontmatter xml rss markdown; do
  PKG_DIR="$PACKAGES_DIR/$pkg"
  if [ -d "$PKG_DIR" ]; then
    step "lux check ($pkg)"
    OUTPUT=$(cd "$PKG_DIR" && "$LUX" check 2>&1 || true)
    RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
    if echo "$RESULT" | grep -q "passed"; then ok; else fail "$RESULT"; fi
  fi
done

# --- Summary ---
printf "\n${BOLD}═══ Validation Summary ═══${NC}\n"
if [ $FAILED -eq 0 ]; then
  printf "${GREEN}All %d checks passed.${NC}\n" "$TOTAL"
else
  printf "${RED}%d/%d checks failed.${NC}\n" "$FAILED" "$TOTAL"
  exit 1
fi
13 src/ast.rs

@@ -499,6 +499,12 @@ pub enum Expr {
         field: Ident,
         span: Span,
     },
+    /// Tuple index access: tuple.0, tuple.1
+    TupleIndex {
+        object: Box<Expr>,
+        index: usize,
+        span: Span,
+    },
     /// Lambda: fn(x, y) => x + y or fn(x: Int): Int => x + 1
     Lambda {
         params: Vec<Parameter>,
@@ -535,7 +541,9 @@ pub enum Expr {
         span: Span,
     },
     /// Record literal: { name: "Alice", age: 30 }
+    /// With optional spread: { ...base, name: "Bob" }
     Record {
+        spread: Option<Box<Expr>>,
         fields: Vec<(Ident, Expr)>,
         span: Span,
     },
@@ -563,6 +571,7 @@ impl Expr {
             Expr::Call { span, .. } => *span,
             Expr::EffectOp { span, .. } => *span,
             Expr::Field { span, .. } => *span,
+            Expr::TupleIndex { span, .. } => *span,
             Expr::Lambda { span, .. } => *span,
             Expr::Let { span, .. } => *span,
             Expr::If { span, .. } => *span,
@@ -614,7 +623,8 @@ pub enum BinaryOp {
     And,
     Or,
     // Other
     Pipe,   // |>
+    Concat, // ++
 }

 impl fmt::Display for BinaryOp {
@@ -634,6 +644,7 @@ impl fmt::Display for BinaryOp {
             BinaryOp::And => write!(f, "&&"),
             BinaryOp::Or => write!(f, "||"),
             BinaryOp::Pipe => write!(f, "|>"),
+            BinaryOp::Concat => write!(f, "++"),
         }
     }
 }
(File diff suppressed because it is too large)
@@ -909,13 +909,16 @@ impl JsBackend {
         let val = self.emit_expr(&let_decl.value)?;
         let var_name = &let_decl.name.name;

-        // Check if this is a run expression (often results in undefined)
-        // We still want to execute it for its side effects
-        self.writeln(&format!("const {} = {};", var_name, val));
+        if var_name == "_" {
+            // Wildcard binding: just execute for side effects
+            self.writeln(&format!("{};", val));
+        } else {
+            self.writeln(&format!("const {} = {};", var_name, val));

-        // Register the variable for future use
-        self.var_substitutions
-            .insert(var_name.clone(), var_name.clone());
+            // Register the variable for future use
+            self.var_substitutions
+                .insert(var_name.clone(), var_name.clone());
+        }

         Ok(())
     }
@@ -954,12 +957,17 @@
         let r = self.emit_expr(right)?;

         // Check for string concatenation
-        if matches!(op, BinaryOp::Add) {
+        if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
             if self.is_string_expr(left) || self.is_string_expr(right) {
                 return Ok(format!("({} + {})", l, r));
             }
         }

+        // ++ on lists: use .concat()
+        if matches!(op, BinaryOp::Concat) {
+            return Ok(format!("{}.concat({})", l, r));
+        }
+
         let op_str = match op {
             BinaryOp::Add => "+",
             BinaryOp::Sub => "-",
@@ -974,6 +982,7 @@
             BinaryOp::Ge => ">=",
             BinaryOp::And => "&&",
             BinaryOp::Or => "||",
+            BinaryOp::Concat => unreachable!("handled above"),
             BinaryOp::Pipe => {
                 // Pipe operator: x |> f becomes f(x)
                 return Ok(format!("{}({})", r, l));
@@ -1034,18 +1043,26 @@
             name, value, body, ..
         } => {
             let val = self.emit_expr(value)?;
-            let var_name = format!("{}_{}", name.name, self.fresh_name());

-            self.writeln(&format!("const {} = {};", var_name, val));
+            if name.name == "_" {
+                // Wildcard binding: just execute for side effects
+                self.writeln(&format!("{};", val));
+            } else {
+                let var_name = format!("{}_{}", name.name, self.fresh_name());

-            // Add substitution
-            self.var_substitutions
-                .insert(name.name.clone(), var_name.clone());
+                self.writeln(&format!("const {} = {};", var_name, val));
+
+                // Add substitution
+                self.var_substitutions
+                    .insert(name.name.clone(), var_name.clone());
+            }

             let body_result = self.emit_expr(body)?;

             // Remove substitution
-            self.var_substitutions.remove(&name.name);
+            if name.name != "_" {
+                self.var_substitutions.remove(&name.name);
+            }

             Ok(body_result)
         }
@@ -1066,6 +1083,10 @@
                 let arg = self.emit_expr(&args[0])?;
                 return Ok(format!("String({})", arg));
             }
+            if ident.name == "print" {
+                let arg = self.emit_expr(&args[0])?;
+                return Ok(format!("console.log({})", arg));
+            }
         }

         let arg_strs: Result<Vec<_>, _> = args.iter().map(|a| self.emit_expr(a)).collect();
@@ -1228,10 +1249,15 @@
             }
             Statement::Let { name, value, .. } => {
                 let val = self.emit_expr(value)?;
-                let var_name = format!("{}_{}", name.name, self.fresh_name());
-                self.writeln(&format!("const {} = {};", var_name, val));
-                self.var_substitutions
-                    .insert(name.name.clone(), var_name.clone());
+                if name.name == "_" {
+                    self.writeln(&format!("{};", val));
+                } else {
+                    let var_name =
+                        format!("{}_{}", name.name, self.fresh_name());
+                    self.writeln(&format!("const {} = {};", var_name, val));
+                    self.var_substitutions
+                        .insert(name.name.clone(), var_name.clone());
+                }
             }
         }
     }
@@ -1240,15 +1266,19 @@
             self.emit_expr(result)
         }

-        Expr::Record { fields, .. } => {
-            let field_strs: Result<Vec<_>, _> = fields
-                .iter()
-                .map(|(name, expr)| {
-                    let val = self.emit_expr(expr)?;
-                    Ok(format!("{}: {}", name.name, val))
-                })
-                .collect();
-            Ok(format!("{{ {} }}", field_strs?.join(", ")))
+        Expr::Record {
+            spread, fields, ..
+        } => {
+            let mut parts = Vec::new();
+            if let Some(spread_expr) = spread {
+                let spread_code = self.emit_expr(spread_expr)?;
+                parts.push(format!("...{}", spread_code));
+            }
+            for (name, expr) in fields {
+                let val = self.emit_expr(expr)?;
+                parts.push(format!("{}: {}", name.name, val));
+            }
+            Ok(format!("{{ {} }}", parts.join(", ")))
         }

         Expr::Tuple { elements, .. } => {
@@ -1268,6 +1298,11 @@
             Ok(format!("{}.{}", obj, field.name))
         }

+        Expr::TupleIndex { object, index, .. } => {
+            let obj = self.emit_expr(object)?;
+            Ok(format!("{}[{}]", obj, index))
+        }
+
         Expr::Run {
             expr, handlers, ..
         } => {
@@ -2333,7 +2368,7 @@
             }
         }
         Expr::BinaryOp { op, left, right, .. } => {
-            matches!(op, BinaryOp::Add)
+            matches!(op, BinaryOp::Add | BinaryOp::Concat)
                 && (self.is_string_expr(left) || self.is_string_expr(right))
         }
         _ => false,
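The record-spread code the backend emits (`{ ...base, name: "Bob" }`) relies on JS object-spread precedence: spread fields are copied first, then explicitly listed fields overwrite them. A small Rust sketch of that precedence rule, using a `HashMap` as a stand-in record (an illustration of the semantics, not compiler code):

```rust
use std::collections::HashMap;

// `...base` copies every field; later explicit fields win, as in JS spread.
fn record_with_spread(
    base: &HashMap<String, i64>,
    fields: &[(&str, i64)],
) -> HashMap<String, i64> {
    let mut out = base.clone();
    for (k, v) in fields {
        out.insert(k.to_string(), *v);
    }
    out
}

fn main() {
    let mut base = HashMap::new();
    base.insert("name".to_string(), 1);
    base.insert("age".to_string(), 30);
    // Equivalent of { ...base, name: 2 }
    let updated = record_with_spread(&base, &[("name", 2)]);
    assert_eq!(updated["name"], 2);
    assert_eq!(updated["age"], 30);
    println!("ok");
}
```

This is why the emitter pushes the spread part before the named fields when building the JS object literal.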
@@ -598,6 +598,9 @@ impl Formatter {
             Expr::Field { object, field, .. } => {
                 format!("{}.{}", self.format_expr(object), field.name)
             }
+            Expr::TupleIndex { object, index, .. } => {
+                format!("{}.{}", self.format_expr(object), index)
+            }
             Expr::If { condition, then_branch, else_branch, .. } => {
                 format!(
                     "if {} then {} else {}",
@@ -685,15 +688,17 @@
                     .join(", ")
                 )
             }
-            Expr::Record { fields, .. } => {
-                format!(
-                    "{{ {} }}",
-                    fields
-                        .iter()
-                        .map(|(name, val)| format!("{}: {}", name.name, self.format_expr(val)))
-                        .collect::<Vec<_>>()
-                        .join(", ")
-                )
+            Expr::Record {
+                spread, fields, ..
+            } => {
+                let mut parts = Vec::new();
+                if let Some(spread_expr) = spread {
+                    parts.push(format!("...{}", self.format_expr(spread_expr)));
+                }
+                for (name, val) in fields {
+                    parts.push(format!("{}: {}", name.name, self.format_expr(val)));
+                }
+                format!("{{ {} }}", parts.join(", "))
             }
             Expr::EffectOp { effect, operation, args, .. } => {
                 format!(
@@ -728,7 +733,7 @@
         match &lit.kind {
             LiteralKind::Int(n) => n.to_string(),
             LiteralKind::Float(f) => format!("{}", f),
-            LiteralKind::String(s) => format!("\"{}\"", s.replace('\\', "\\\\").replace('"', "\\\"")),
+            LiteralKind::String(s) => format!("\"{}\"", s.replace('\\', "\\\\").replace('"', "\\\"").replace('{', "\\{").replace('}', "\\}")),
             LiteralKind::Char(c) => format!("'{}'", c),
             LiteralKind::Bool(b) => b.to_string(),
             LiteralKind::Unit => "()".to_string(),
@@ -750,6 +755,7 @@
             BinaryOp::Ge => ">=",
             BinaryOp::And => "&&",
             BinaryOp::Or => "||",
+            BinaryOp::Concat => "++",
             BinaryOp::Pipe => "|>",
         }
     }
@@ -74,6 +74,9 @@ pub enum BuiltinFn {
 MathFloor,
 MathCeil,
 MathRound,
+MathSin,
+MathCos,
+MathAtan2,

 // Additional List operations
 ListIsEmpty,
@@ -95,6 +98,10 @@ pub enum BuiltinFn {
 StringLastIndexOf,
 StringRepeat,
+
+// Int/Float operations
+IntToString,
+FloatToString,

 // JSON operations
 JsonParse,
 JsonStringify,
@@ -1068,9 +1075,24 @@ impl Interpreter {
     ("floor".to_string(), Value::Builtin(BuiltinFn::MathFloor)),
     ("ceil".to_string(), Value::Builtin(BuiltinFn::MathCeil)),
     ("round".to_string(), Value::Builtin(BuiltinFn::MathRound)),
+    ("sin".to_string(), Value::Builtin(BuiltinFn::MathSin)),
+    ("cos".to_string(), Value::Builtin(BuiltinFn::MathCos)),
+    ("atan2".to_string(), Value::Builtin(BuiltinFn::MathAtan2)),
 ]));
 env.define("Math", math_module);
+
+// Int module
+let int_module = Value::Record(HashMap::from([
+    ("toString".to_string(), Value::Builtin(BuiltinFn::IntToString)),
+]));
+env.define("Int", int_module);
+
+// Float module
+let float_module = Value::Record(HashMap::from([
+    ("toString".to_string(), Value::Builtin(BuiltinFn::FloatToString)),
+]));
+env.define("Float", float_module);

 // JSON module
 let json_module = Value::Record(HashMap::from([
     ("parse".to_string(), Value::Builtin(BuiltinFn::JsonParse)),
@@ -1099,11 +1121,50 @@ impl Interpreter {
 /// Execute a program
 pub fn run(&mut self, program: &Program) -> Result<Value, RuntimeError> {
     let mut last_value = Value::Unit;
+    let mut has_main_let = false;
+
     for decl in &program.declarations {
+        // Track if there's a top-level `let main = ...`
+        if let Declaration::Let(let_decl) = decl {
+            if let_decl.name.name == "main" {
+                has_main_let = true;
+            }
+        }
         last_value = self.eval_declaration(decl)?;
     }
+
+    // Auto-invoke main if it was defined as a let binding with a function value
+    if has_main_let {
+        if let Some(main_val) = self.global_env.get("main") {
+            if let Value::Function(ref closure) = main_val {
+                if closure.params.is_empty() {
+                    let span = Span { start: 0, end: 0 };
+                    let mut result = self.eval_call(main_val.clone(), vec![], span)?;
+                    // Trampoline loop
+                    loop {
+                        match result {
+                            EvalResult::Value(v) => {
+                                last_value = v;
+                                break;
+                            }
+                            EvalResult::Effect(req) => {
+                                last_value = self.handle_effect(req)?;
+                                break;
+                            }
+                            EvalResult::TailCall { func, args, span } => {
+                                result = self.eval_call(func, args, span)?;
+                            }
+                            EvalResult::Resume(v) => {
+                                last_value = v;
+                                break;
+                            }
+                        }
+                    }
+                }
+            }
+        }
+    }
+
     Ok(last_value)
 }

@@ -1415,6 +1476,34 @@ impl Interpreter {
 }
 }
+
+Expr::TupleIndex {
+    object,
+    index,
+    span,
+} => {
+    let obj_val = self.eval_expr(object, env)?;
+    match obj_val {
+        Value::Tuple(elements) => {
+            if *index < elements.len() {
+                Ok(EvalResult::Value(elements[*index].clone()))
+            } else {
+                Err(RuntimeError {
+                    message: format!(
+                        "Tuple index {} out of bounds for tuple with {} elements",
+                        index,
+                        elements.len()
+                    ),
+                    span: Some(*span),
+                })
+            }
+        }
+        _ => Err(RuntimeError {
+            message: format!("Cannot use tuple index on {}", obj_val.type_name()),
+            span: Some(*span),
+        }),
+    }
+}
+
 Expr::Lambda { params, body, .. } => {
 let closure = Closure {
     params: params.iter().map(|p| p.name.name.clone()).collect(),
@@ -1481,8 +1570,28 @@ impl Interpreter {
 self.eval_expr_tail(result, &block_env, tail)
 }

-Expr::Record { fields, .. } => {
+Expr::Record {
+    spread, fields, ..
+} => {
 let mut record = HashMap::new();
+
+// If there's a spread, evaluate it and start with its fields
+if let Some(spread_expr) = spread {
+    let spread_val = self.eval_expr(spread_expr, env)?;
+    if let Value::Record(spread_fields) = spread_val {
+        record = spread_fields;
+    } else {
+        return Err(RuntimeError {
+            message: format!(
+                "Spread expression must evaluate to a record, got {}",
+                spread_val.type_name()
+            ),
+            span: Some(expr.span()),
+        });
+    }
+}
+
+// Override with explicit fields
 for (name, expr) in fields {
     let val = self.eval_expr(expr, env)?;
     record.insert(name.name.clone(), val);
@@ -1555,6 +1664,18 @@ impl Interpreter {
     span: Some(span),
 }),
 },
+BinaryOp::Concat => match (left, right) {
+    (Value::String(a), Value::String(b)) => Ok(Value::String(a + &b)),
+    (Value::List(a), Value::List(b)) => {
+        let mut result = a;
+        result.extend(b);
+        Ok(Value::List(result))
+    }
+    (l, r) => Err(RuntimeError {
+        message: format!("Cannot concatenate {} and {}", l.type_name(), r.type_name()),
+        span: Some(span),
+    }),
+},
 BinaryOp::Sub => match (left, right) {
     (Value::Int(a), Value::Int(b)) => Ok(Value::Int(a - b)),
     (Value::Float(a), Value::Float(b)) => Ok(Value::Float(a - b)),
@@ -2223,6 +2344,26 @@ impl Interpreter {
 Ok(EvalResult::Value(Value::String(result)))
 }
+
+BuiltinFn::IntToString => {
+    if args.len() != 1 {
+        return Err(err("Int.toString requires 1 argument"));
+    }
+    match &args[0] {
+        Value::Int(n) => Ok(EvalResult::Value(Value::String(format!("{}", n)))),
+        v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
+    }
+}
+
+BuiltinFn::FloatToString => {
+    if args.len() != 1 {
+        return Err(err("Float.toString requires 1 argument"));
+    }
+    match &args[0] {
+        Value::Float(f) => Ok(EvalResult::Value(Value::String(format!("{}", f)))),
+        v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
+    }
+}
+
 BuiltinFn::TypeOf => {
     if args.len() != 1 {
         return Err(err("typeOf requires 1 argument"));
@@ -2399,6 +2540,45 @@ impl Interpreter {
 }
 }
+
+BuiltinFn::MathSin => {
+    if args.len() != 1 {
+        return Err(err("Math.sin requires 1 argument"));
+    }
+    match &args[0] {
+        Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.sin()))),
+        Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).sin()))),
+        v => Err(err(&format!("Math.sin expects number, got {}", v.type_name()))),
+    }
+}
+
+BuiltinFn::MathCos => {
+    if args.len() != 1 {
+        return Err(err("Math.cos requires 1 argument"));
+    }
+    match &args[0] {
+        Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.cos()))),
+        Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).cos()))),
+        v => Err(err(&format!("Math.cos expects number, got {}", v.type_name()))),
+    }
+}
+
+BuiltinFn::MathAtan2 => {
+    if args.len() != 2 {
+        return Err(err("Math.atan2 requires 2 arguments: y, x"));
+    }
+    let y = match &args[0] {
+        Value::Float(n) => *n,
+        Value::Int(n) => *n as f64,
+        v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
+    };
+    let x = match &args[1] {
+        Value::Float(n) => *n,
+        Value::Int(n) => *n as f64,
+        v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
+    };
+    Ok(EvalResult::Value(Value::Float(y.atan2(x))))
+}
+
 // Additional List operations
 BuiltinFn::ListIsEmpty => {
     let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.isEmpty", span)?;
@@ -3828,6 +4008,26 @@ impl Interpreter {
 }
 Ok(Value::Unit)
 }
+("Test", "assertEqualMsg") => {
+    let expected = request.args.first().cloned().unwrap_or(Value::Unit);
+    let actual = request.args.get(1).cloned().unwrap_or(Value::Unit);
+    let label = match request.args.get(2) {
+        Some(Value::String(s)) => s.clone(),
+        _ => "Values not equal".to_string(),
+    };
+
+    if Value::values_equal(&expected, &actual) {
+        self.test_results.borrow_mut().passed += 1;
+    } else {
+        self.test_results.borrow_mut().failed += 1;
+        self.test_results.borrow_mut().failures.push(TestFailure {
+            message: label,
+            expected: Some(format!("{}", expected)),
+            actual: Some(format!("{}", actual)),
+        });
+    }
+    Ok(Value::Unit)
+}
 ("Test", "assertNotEqual") => {
     let a = request.args.first().cloned().unwrap_or(Value::Unit);
     let b = request.args.get(1).cloned().unwrap_or(Value::Unit);
@@ -4960,6 +5160,7 @@ mod tests {
 // Create a simple migration that adds a field
 // Migration: old.name -> { name: old.name, email: "unknown" }
 let migration_body = Expr::Record {
+    spread: None,
     fields: vec![
         (
             Ident::new("name", Span::default()),
src/lexer.rs | 32
@@ -70,6 +70,7 @@ pub enum TokenKind {

 // Operators
 Plus, // +
+PlusPlus, // ++
 Minus, // -
 Star, // *
 Slash, // /
@@ -89,6 +90,7 @@ pub enum TokenKind {
 Arrow, // =>
 ThinArrow, // ->
 Dot, // .
+DotDotDot, // ...
 Colon, // :
 ColonColon, // ::
 Comma, // ,
@@ -160,6 +162,7 @@ impl fmt::Display for TokenKind {
 TokenKind::True => write!(f, "true"),
 TokenKind::False => write!(f, "false"),
 TokenKind::Plus => write!(f, "+"),
+TokenKind::PlusPlus => write!(f, "++"),
 TokenKind::Minus => write!(f, "-"),
 TokenKind::Star => write!(f, "*"),
 TokenKind::Slash => write!(f, "/"),
@@ -179,6 +182,7 @@ impl fmt::Display for TokenKind {
 TokenKind::Arrow => write!(f, "=>"),
 TokenKind::ThinArrow => write!(f, "->"),
 TokenKind::Dot => write!(f, "."),
+TokenKind::DotDotDot => write!(f, "..."),
 TokenKind::Colon => write!(f, ":"),
 TokenKind::ColonColon => write!(f, "::"),
 TokenKind::Comma => write!(f, ","),
@@ -268,7 +272,14 @@ impl<'a> Lexer<'a> {

 let kind = match c {
 // Single-character tokens
-'+' => TokenKind::Plus,
+'+' => {
+    if self.peek() == Some('+') {
+        self.advance();
+        TokenKind::PlusPlus
+    } else {
+        TokenKind::Plus
+    }
+}
 '*' => TokenKind::Star,
 '%' => TokenKind::Percent,
 '(' => TokenKind::LParen,
@@ -364,7 +375,22 @@ impl<'a> Lexer<'a> {
 TokenKind::Pipe
 }
 }
-'.' => TokenKind::Dot,
+'.' => {
+    if self.peek() == Some('.') {
+        // Check for ... (need to peek past second dot)
+        // We look at source directly since we can only peek one ahead
+        let next_next = self.source[self.pos..].chars().nth(1);
+        if next_next == Some('.') {
+            self.advance(); // consume second '.'
+            self.advance(); // consume third '.'
+            TokenKind::DotDotDot
+        } else {
+            TokenKind::Dot
+        }
+    } else {
+        TokenKind::Dot
+    }
+}
 ':' => {
     if self.peek() == Some(':') {
         self.advance();
@@ -493,6 +519,8 @@ impl<'a> Lexer<'a> {
 Some('"') => '"',
 Some('0') => '\0',
 Some('\'') => '\'',
+Some('{') => '{',
+Some('}') => '}',
 Some('x') => {
 // Hex escape \xNN
 let h1 = self.advance().and_then(|c| c.to_digit(16));
@@ -510,10 +510,13 @@ impl Linter {
 self.collect_refs_expr(&arm.body);
 }
 }
-Expr::Field { object, .. } => {
+Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
 self.collect_refs_expr(object);
 }
-Expr::Record { fields, .. } => {
+Expr::Record { spread, fields, .. } => {
+    if let Some(spread_expr) = spread {
+        self.collect_refs_expr(spread_expr);
+    }
 for (_, val) in fields {
     self.collect_refs_expr(val);
 }
src/lsp.rs | 294
@@ -317,66 +317,227 @@ impl LspServer {
 let doc = self.documents.get(&uri)?;
 let source = &doc.text;

-// Try to get info from symbol table first
+// Try to get info from symbol table first (position-based lookup)
 if let Some(ref table) = doc.symbol_table {
     let offset = self.position_to_offset(source, position);
     if let Some(symbol) = table.definition_at_position(offset) {
-        let signature = symbol.type_signature.as_ref()
-            .map(|s| s.as_str())
-            .unwrap_or(&symbol.name);
-        let kind_str = match symbol.kind {
-            SymbolKind::Function => "function",
-            SymbolKind::Variable => "variable",
-            SymbolKind::Parameter => "parameter",
-            SymbolKind::Type => "type",
-            SymbolKind::TypeParameter => "type parameter",
-            SymbolKind::Variant => "variant",
-            SymbolKind::Effect => "effect",
-            SymbolKind::EffectOperation => "effect operation",
-            SymbolKind::Field => "field",
-            SymbolKind::Module => "module",
-        };
-        let doc_str = symbol.documentation.as_ref()
-            .map(|d| format!("\n\n{}", d))
-            .unwrap_or_default();
-
-        // Format signature: wrap long signatures onto multiple lines
-        let formatted_sig = format_signature_for_hover(signature);
-
-        // Add behavioral property documentation if present
-        let property_docs = extract_property_docs(signature);
-
-        return Some(Hover {
-            contents: HoverContents::Markup(MarkupContent {
-                kind: MarkupKind::Markdown,
-                value: format!("```lux\n{}\n```\n\n*{}*{}{}", formatted_sig, kind_str, property_docs, doc_str),
-            }),
-            range: None,
-        });
+        return Some(self.format_symbol_hover(symbol));
     }
 }

-// Fall back to hardcoded info
+// Get the word under cursor

-// Extract the word at the cursor position
 let word = self.get_word_at_position(source, position)?;

-// Look up rich documentation for known symbols
-let info = self.get_rich_symbol_info(&word)
-    .or_else(|| self.get_symbol_info(&word).map(|(s, d)| (s.to_string(), d.to_string())));
+// When hovering on a keyword like 'fn', 'type', 'effect', 'let', 'trait',
+// look ahead to find the declaration name and show that symbol's info
+if let Some(ref table) = doc.symbol_table {
+    if matches!(word.as_str(), "fn" | "type" | "effect" | "let" | "trait" | "handler" | "impl") {
+        let offset = self.position_to_offset(source, position);
+        if let Some(name) = self.find_next_ident(source, offset + word.len()) {
+            for sym in table.global_symbols() {
+                if sym.name == name {
+                    return Some(self.format_symbol_hover(sym));
+                }
+            }
+        }
+    }

-if let Some((signature, doc)) = info {
-    let formatted_sig = format_signature_for_hover(&signature);
-    Some(Hover {
+    // Try name-based lookup in symbol table (for usage sites)
+    for sym in table.global_symbols() {
+        if sym.name == word {
+            return Some(self.format_symbol_hover(sym));
+        }
+    }
+}
+
+// Check for module names (Console, List, String, etc.)
+if let Some(hover) = self.get_module_hover(&word) {
+    return Some(hover);
+}
+
+// Rich documentation for behavioral property keywords
+if let Some((signature, doc_text)) = self.get_rich_symbol_info(&word) {
+    return Some(Hover {
         contents: HoverContents::Markup(MarkupContent {
             kind: MarkupKind::Markdown,
-            value: format!("```lux\n{}\n```\n\n{}", formatted_sig, doc),
+            value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
         }),
         range: None,
-    })
-} else {
-    None
-}
+    });
 }

+// Builtin keyword/function info
+if let Some((signature, doc_text)) = self.get_symbol_info(&word) {
+    return Some(Hover {
+        contents: HoverContents::Markup(MarkupContent {
+            kind: MarkupKind::Markdown,
+            value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
+        }),
+        range: None,
+    });
+}
+
+None
+}
+
+/// Format a symbol into a hover response
+fn format_symbol_hover(&self, symbol: &crate::symbol_table::Symbol) -> Hover {
+    let signature = symbol.type_signature.as_ref()
+        .map(|s| s.as_str())
+        .unwrap_or(&symbol.name);
+    let kind_str = match symbol.kind {
+        SymbolKind::Function => "function",
+        SymbolKind::Variable => "variable",
+        SymbolKind::Parameter => "parameter",
+        SymbolKind::Type => "type",
+        SymbolKind::TypeParameter => "type parameter",
+        SymbolKind::Variant => "variant",
+        SymbolKind::Effect => "effect",
+        SymbolKind::EffectOperation => "effect operation",
+        SymbolKind::Field => "field",
+        SymbolKind::Module => "module",
+    };
+    let doc_str = symbol.documentation.as_ref()
+        .map(|d| format!("\n\n{}", d))
+        .unwrap_or_default();
+    let formatted_sig = format_signature_for_hover(signature);
+    let property_docs = extract_property_docs(signature);
+
+    Hover {
+        contents: HoverContents::Markup(MarkupContent {
+            kind: MarkupKind::Markdown,
+            value: format!(
+                "```lux\n{}\n```\n*{}*{}{}",
+                formatted_sig, kind_str, property_docs, doc_str
+            ),
+        }),
+        range: None,
+    }
+}
+
+/// Get hover info for built-in module names
+fn get_module_hover(&self, name: &str) -> Option<Hover> {
+    let (sig, doc) = match name {
+        "Console" => (
+            "effect Console",
+            "**Console I/O**\n\n\
+            - `Console.print(msg: String): Unit` — print to stdout\n\
+            - `Console.readLine(): String` — read a line from stdin\n\
+            - `Console.readInt(): Int` — read an integer from stdin",
+        ),
+        "File" => (
+            "effect File",
+            "**File System**\n\n\
+            - `File.read(path: String): String` — read file contents\n\
+            - `File.write(path: String, content: String): Unit` — write to file\n\
+            - `File.append(path: String, content: String): Unit` — append to file\n\
+            - `File.exists(path: String): Bool` — check if file exists\n\
+            - `File.delete(path: String): Unit` — delete a file\n\
+            - `File.list(path: String): List<String>` — list directory",
+        ),
+        "Http" => (
+            "effect Http",
+            "**HTTP Client**\n\n\
+            - `Http.get(url: String): String` — GET request\n\
+            - `Http.post(url: String, body: String): String` — POST request\n\
+            - `Http.put(url: String, body: String): String` — PUT request\n\
+            - `Http.delete(url: String): String` — DELETE request",
+        ),
+        "Sql" => (
+            "effect Sql",
+            "**SQL Database**\n\n\
+            - `Sql.open(path: String): Connection` — open database\n\
+            - `Sql.execute(conn: Connection, sql: String): Unit` — execute SQL\n\
+            - `Sql.query(conn: Connection, sql: String): List<Row>` — query rows\n\
+            - `Sql.close(conn: Connection): Unit` — close connection",
+        ),
+        "Random" => (
+            "effect Random",
+            "**Random Number Generation**\n\n\
+            - `Random.int(min: Int, max: Int): Int` — random integer\n\
+            - `Random.float(): Float` — random float 0.0–1.0\n\
+            - `Random.bool(): Bool` — random boolean",
+        ),
+        "Time" => (
+            "effect Time",
+            "**Time**\n\n\
+            - `Time.now(): Int` — current Unix timestamp (ms)\n\
+            - `Time.sleep(ms: Int): Unit` — sleep for milliseconds",
+        ),
+        "Process" => (
+            "effect Process",
+            "**Process / System**\n\n\
+            - `Process.exec(cmd: String): String` — run shell command\n\
+            - `Process.env(name: String): String` — get env variable\n\
+            - `Process.args(): List<String>` — command-line arguments\n\
+            - `Process.exit(code: Int): Unit` — exit with code",
+        ),
+        "Math" => (
+            "module Math",
+            "**Math Functions**\n\n\
+            - `Math.abs(n: Int): Int` — absolute value\n\
+            - `Math.min(a: Int, b: Int): Int` — minimum\n\
+            - `Math.max(a: Int, b: Int): Int` — maximum\n\
+            - `Math.sqrt(n: Float): Float` — square root\n\
+            - `Math.pow(base: Float, exp: Float): Float` — power\n\
+            - `Math.floor(n: Float): Int` — round down\n\
+            - `Math.ceil(n: Float): Int` — round up",
+        ),
+        "List" => (
+            "module List",
+            "**List Operations**\n\n\
+            - `List.map(list, f)` — transform each element\n\
+            - `List.filter(list, p)` — keep matching elements\n\
+            - `List.fold(list, init, f)` — reduce to single value\n\
+            - `List.head(list)` — first element (Option)\n\
+            - `List.tail(list)` — all except first (Option)\n\
+            - `List.length(list)` — number of elements\n\
+            - `List.concat(a, b)` — concatenate lists\n\
+            - `List.range(start, end)` — integer range\n\
+            - `List.reverse(list)` — reverse order\n\
+            - `List.get(list, i)` — element at index (Option)",
+        ),
+        "String" => (
+            "module String",
+            "**String Operations**\n\n\
+            - `String.length(s)` — string length\n\
+            - `String.split(s, delim)` — split by delimiter\n\
+            - `String.join(list, delim)` — join with delimiter\n\
+            - `String.trim(s)` — trim whitespace\n\
+            - `String.contains(s, sub)` — check substring\n\
+            - `String.replace(s, from, to)` — replace occurrences\n\
+            - `String.startsWith(s, prefix)` — check prefix\n\
+            - `String.endsWith(s, suffix)` — check suffix\n\
+            - `String.substring(s, start, end)` — extract range\n\
+            - `String.chars(s)` — list of characters",
+        ),
+        "Option" => (
+            "type Option<A> = Some(A) | None",
+            "**Optional Value**\n\n\
+            - `Option.isSome(opt)` — has a value?\n\
+            - `Option.isNone(opt)` — is empty?\n\
+            - `Option.getOrElse(opt, default)` — unwrap or default\n\
+            - `Option.map(opt, f)` — transform if present\n\
+            - `Option.flatMap(opt, f)` — chain operations",
+        ),
+        "Result" => (
+            "type Result<A, E> = Ok(A) | Err(E)",
+            "**Result of Fallible Operation**\n\n\
+            - `Result.isOk(r)` — succeeded?\n\
+            - `Result.isErr(r)` — failed?\n\
+            - `Result.map(r, f)` — transform success value\n\
+            - `Result.mapErr(r, f)` — transform error value",
+        ),
+        _ => return None,
+    };
+
+    Some(Hover {
+        contents: HoverContents::Markup(MarkupContent {
+            kind: MarkupKind::Markdown,
+            value: format!("```lux\n{}\n```\n{}", sig, doc),
+        }),
+        range: None,
+    })
 }

 fn get_word_at_position(&self, source: &str, position: Position) -> Option<String> {
@@ -402,6 +563,26 @@ impl LspServer {
 }
 }
+
+/// Find the next identifier in source after the given offset (skipping whitespace)
+fn find_next_ident(&self, source: &str, start: usize) -> Option<String> {
+    let chars: Vec<char> = source.chars().collect();
+    let mut pos = start;
+    // Skip whitespace
+    while pos < chars.len() && (chars[pos] == ' ' || chars[pos] == '\t' || chars[pos] == '\n' || chars[pos] == '\r') {
+        pos += 1;
+    }
+    // Collect identifier
+    let ident_start = pos;
+    while pos < chars.len() && (chars[pos].is_alphanumeric() || chars[pos] == '_') {
+        pos += 1;
+    }
+    if pos > ident_start {
+        Some(chars[ident_start..pos].iter().collect())
+    } else {
+        None
+    }
+}
+
 fn get_symbol_info(&self, word: &str) -> Option<(&'static str, &'static str)> {
 match word {
 // Keywords
@@ -607,17 +788,11 @@ impl LspServer {

 fn position_to_offset(&self, source: &str, position: Position) -> usize {
 let mut offset = 0;
-let mut line = 0u32;
-
-for (i, c) in source.char_indices() {
-    if line == position.line {
-        let col = i - offset;
-        return offset + (position.character as usize).min(col + 1);
-    }
-    if c == '\n' {
-        line += 1;
-        offset = i + 1;
-    }
+for (line_idx, line) in source.lines().enumerate() {
+    if line_idx == position.line as usize {
+        return offset + (position.character as usize).min(line.len());
+    }
+    offset += line.len() + 1; // +1 for newline
 }
 source.len()
 }
@@ -1396,12 +1571,15 @@ fn collect_call_site_hints(
 collect_call_site_hints(source, e, param_names, hints);
 }
 }
-Expr::Record { fields, .. } => {
+Expr::Record { spread, fields, .. } => {
+    if let Some(spread_expr) = spread {
+        collect_call_site_hints(source, spread_expr, param_names, hints);
+    }
 for (_, e) in fields {
     collect_call_site_hints(source, e, param_names, hints);
 }
 }
-Expr::Field { object, .. } => {
+Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
 collect_call_site_hints(source, object, param_names, hints);
 }
 Expr::Run { expr, handlers, .. } => {
|||||||
src/main.rs (195 changed lines)

@@ -1,4 +1,7 @@
-//! Lux - A functional programming language with first-class effects
+//! Lux — Make the important things visible.
+//!
+//! A functional programming language with first-class effects, schema evolution,
+//! and behavioral types. See `lux philosophy` or docs/PHILOSOPHY.md.
 
 mod analysis;
 mod ast;
@@ -34,7 +37,7 @@ use std::borrow::Cow;
 use std::collections::HashSet;
 use typechecker::TypeChecker;
 
-const VERSION: &str = "0.1.0";
+const VERSION: &str = env!("CARGO_PKG_VERSION");
 
 const HELP: &str = r#"
 Lux - A functional language with first-class effects
@@ -171,9 +174,14 @@ fn main() {
                 .and_then(|s| s.parse::<u16>().ok())
                 .unwrap_or(8080);
 
-            let dir = args.get(2)
-                .filter(|a| !a.starts_with('-'))
-                .map(|s| s.as_str())
+            let port_value_idx = args.iter()
+                .position(|a| a == "--port" || a == "-p")
+                .map(|i| i + 1);
+            let dir = args.iter().enumerate()
+                .skip(2)
+                .filter(|(i, a)| !a.starts_with('-') && Some(*i) != port_value_idx)
+                .map(|(_, a)| a.as_str())
+                .next()
                 .unwrap_or(".");
 
             serve_static_files(dir, port);
@@ -205,16 +213,23 @@ fn main() {
                 compile_to_c(&args[2], output_path, run_after, emit_c);
             }
         }
+        "repl" => {
+            // Start REPL
+            run_repl();
+        }
         "doc" => {
             // Generate API documentation
             generate_docs(&args[2..]);
         }
+        "philosophy" => {
+            print_philosophy();
+        }
         cmd => {
            // Check if it looks like a command typo
            if !std::path::Path::new(cmd).exists() && !cmd.starts_with('-') && !cmd.contains('.') && !cmd.contains('/') {
                let known_commands = vec![
                    "fmt", "lint", "test", "watch", "init", "check", "debug",
-                   "pkg", "registry", "serve", "compile", "doc",
+                   "pkg", "registry", "serve", "compile", "doc", "repl", "philosophy",
                ];
                let suggestions = diagnostics::find_similar_names(cmd, known_commands.into_iter(), 2);
                if !suggestions.is_empty() {
@@ -229,18 +244,24 @@ fn main() {
            }
        }
    } else {
-       // Start REPL
-       run_repl();
+       // No arguments — show help
+       print_help();
    }
 }
 
 fn print_help() {
     println!("{}", bc(colors::GREEN, &format!("Lux {}", VERSION)));
-    println!("{}", c(colors::DIM, "A functional language with first-class effects"));
+    println!("{}", c(colors::DIM, "Make the important things visible."));
+    println!();
+    println!("  {} Effects in types — see what code does", c(colors::DIM, "·"));
+    println!("  {} Composition over configuration — no DI frameworks", c(colors::DIM, "·"));
+    println!("  {} Safety without ceremony — inference where it helps", c(colors::DIM, "·"));
+    println!("  {} One right way — opinionated formatter, integrated tools", c(colors::DIM, "·"));
     println!();
     println!("{}", bc("", "Usage:"));
     println!();
-    println!("  {}  Start the REPL", bc(colors::CYAN, "lux"));
+    println!("  {}  Show this help", bc(colors::CYAN, "lux"));
+    println!("  {}  Start the REPL", bc(colors::CYAN, "lux repl"));
     println!("  {} {}  Run a file (interpreter)", bc(colors::CYAN, "lux"), c(colors::YELLOW, "<file.lux>"));
     println!("  {} {} {}  Compile to native binary", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<file.lux>"));
     println!("  {} {} {} {}  Compile with output name", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<f>"), c(colors::YELLOW, "-o app"));
@@ -275,6 +296,8 @@ fn print_help() {
        c(colors::DIM, "(alias: s)"));
     println!("  {} {} {}  Generate API documentation",
        bc(colors::CYAN, "lux"), bc(colors::CYAN, "doc"), c(colors::YELLOW, "[file] [-o dir]"));
+    println!("  {} {}  Show language philosophy",
+       bc(colors::CYAN, "lux"), bc(colors::CYAN, "philosophy"));
     println!("  {} {}  Start LSP server",
        bc(colors::CYAN, "lux"), c(colors::YELLOW, "--lsp"));
     println!("  {} {}  Show this help",
@@ -283,6 +306,36 @@ fn print_help() {
        bc(colors::CYAN, "lux"), c(colors::YELLOW, "--version"));
 }
 
+fn print_philosophy() {
+    println!("{}", bc(colors::GREEN, &format!("The Lux Philosophy")));
+    println!();
+    println!("  {}", bc("", "Make the important things visible."));
+    println!();
+    println!("  Most languages hide what matters most in production: what code");
+    println!("  can do, how data changes over time, and what guarantees functions");
+    println!("  provide. Lux makes all three first-class, compiler-checked features.");
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "1. Explicit over implicit"), c(colors::DIM, "— effects in types, not hidden behind interfaces"));
+    println!("     fn processOrder(order: Order): Receipt {} {}", c(colors::YELLOW, "with {Database, Email}"), c(colors::DIM, "// signature IS documentation"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "2. Composition over configuration"), c(colors::DIM, "— no DI frameworks, no monad transformers"));
+    println!("     run app() {} {}", c(colors::YELLOW, "with { Database = mock, Http = mock }"), c(colors::DIM, "// swap handlers, not libraries"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "3. Safety without ceremony"), c(colors::DIM, "— type inference where it helps, annotations where they document"));
+    println!("     let x = 42 {}", c(colors::DIM, "// inferred"));
+    println!("     fn f(x: Int): Int = x * 2 {}", c(colors::DIM, "// annotated: API contract"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "4. Practical over academic"), c(colors::DIM, "— ML semantics in C-family syntax, no monads to learn"));
+    println!("     {} {} {}", c(colors::DIM, "fn main(): Unit"), c(colors::YELLOW, "with {Console}"), c(colors::DIM, "= Console.print(\"Hello!\")"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "5. One right way"), c(colors::DIM, "— opinionated formatter, integrated tooling, built-in testing"));
+    println!("     lux fmt | lux lint | lux check | lux test | lux compile");
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "6. Tools are the language"), c(colors::DIM, "— formatter knows the AST, linter knows the types, LSP knows the effects"));
+    println!();
+    println!("  See {} for the full philosophy with language comparisons.", c(colors::CYAN, "docs/PHILOSOPHY.md"));
+}
+
 fn format_files(args: &[String]) {
     use formatter::{format, FormatConfig};
     use std::path::Path;
@@ -721,6 +774,36 @@ fn collect_lux_files_nonrecursive(dir: &str, pattern: Option<&str>, files: &mut
     }
 }
 
+/// Find a C compiler. Priority: $CC env var, build-time embedded path, PATH search.
+fn find_c_compiler() -> String {
+    // 1. Explicit env var
+    if let Ok(cc) = std::env::var("CC") {
+        if !cc.is_empty() {
+            return cc;
+        }
+    }
+    // 2. Path captured at build time (e.g. absolute nix store path)
+    let built_in = env!("LUX_CC_PATH");
+    if !built_in.is_empty() && std::path::Path::new(built_in).exists() {
+        return built_in.to_string();
+    }
+    // 3. Search PATH
+    for name in &["cc", "gcc", "clang"] {
+        if let Ok(output) = std::process::Command::new("which").arg(name).output() {
+            if output.status.success() {
+                if let Ok(p) = String::from_utf8(output.stdout) {
+                    let p = p.trim();
+                    if !p.is_empty() {
+                        return p.to_string();
+                    }
+                }
+            }
+        }
+    }
+    // 4. Last resort
+    "cc".to_string()
+}
+
 fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c: bool) {
     use codegen::c_backend::CBackend;
     use modules::ModuleLoader;
@@ -764,7 +847,7 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
 
     // Generate C code
     let mut backend = CBackend::new();
-    let c_code = match backend.generate(&program) {
+    let c_code = match backend.generate(&program, loader.module_cache()) {
        Ok(code) => code,
        Err(e) => {
            eprintln!("{} C codegen: {}", c(colors::RED, "error:"), e);
@@ -812,13 +895,14 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
        std::process::exit(1);
    }
 
-    // Find C compiler
-    let cc = std::env::var("CC").unwrap_or_else(|_| "cc".to_string());
+    // Find C compiler: $CC env var > embedded build-time path > PATH search
+    let cc = find_c_compiler();
 
     let compile_result = Command::new(&cc)
        .args(["-O2", "-o"])
        .arg(&output_bin)
        .arg(&temp_c)
+       .arg("-lm")
        .output();
 
     match compile_result {
@@ -1002,7 +1086,7 @@ fn run_tests(args: &[String]) {
     for test_file in &test_files {
        let path_str = test_file.to_string_lossy().to_string();
 
-       // Read and parse the file
+       // Read and parse the file (with module loading)
        let source = match fs::read_to_string(test_file) {
            Ok(s) => s,
            Err(e) => {
@@ -1012,7 +1096,13 @@ fn run_tests(args: &[String]) {
            }
        };
 
-       let program = match Parser::parse_source(&source) {
+       use modules::ModuleLoader;
+       let mut loader = ModuleLoader::new();
+       if let Some(parent) = test_file.parent() {
+           loader.add_search_path(parent.to_path_buf());
+       }
+
+       let program = match loader.load_source(&source, Some(test_file.as_path())) {
            Ok(p) => p,
            Err(e) => {
                println!("  {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, &format!("parse error: {}", e)));
@@ -1021,9 +1111,9 @@ fn run_tests(args: &[String]) {
            }
        };
 
-       // Type check
+       // Type check with module support
        let mut checker = typechecker::TypeChecker::new();
-       if let Err(errors) = checker.check_program(&program) {
+       if let Err(errors) = checker.check_program_with_modules(&program, &loader) {
            println!("  {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, "type error"));
            for err in errors {
                eprintln!("    {}", err);
@@ -1051,7 +1141,7 @@ fn run_tests(args: &[String]) {
        interp.register_auto_migrations(&auto_migrations);
        interp.reset_test_results();
 
-       match interp.run(&program) {
+       match interp.run_with_modules(&program, &loader) {
            Ok(_) => {
                let results = interp.get_test_results();
                if results.failed == 0 && results.passed == 0 {
@@ -1085,8 +1175,8 @@ fn run_tests(args: &[String]) {
        interp.register_auto_migrations(&auto_migrations);
        interp.reset_test_results();
 
-       // First run the file to define all functions
-       if let Err(e) = interp.run(&program) {
+       // First run the file to define all functions and load imports
+       if let Err(e) = interp.run_with_modules(&program, &loader) {
            println!("  {} {} {}", c(colors::RED, "\u{2717}"), test_name, c(colors::RED, &e.to_string()));
            total_failed += 1;
            continue;
@@ -4831,6 +4921,71 @@ c")"#;
     }
 }
 
+// ============ Multi-line Arguments Tests ============
+
+#[test]
+fn test_multiline_function_args() {
+    let source = r#"
+fn add(a: Int, b: Int): Int = a + b
+let result = add(
+    1,
+    2
+)
+"#;
+    assert_eq!(eval(source).unwrap(), "3");
+}
+
+#[test]
+fn test_multiline_function_args_with_lambda() {
+    let source = r#"
+let xs = List.map(
+    [1, 2, 3],
+    fn(x) => x * 2
+)
+"#;
+    assert_eq!(eval(source).unwrap(), "[2, 4, 6]");
+}
+
+// ============ Tuple Index Tests ============
+
+#[test]
+fn test_tuple_index_access() {
+    let source = r#"
+let pair = (42, "hello")
+let first = pair.0
+"#;
+    assert_eq!(eval(source).unwrap(), "42");
+}
+
+#[test]
+fn test_tuple_index_access_second() {
+    let source = r#"
+let pair = (42, "hello")
+let second = pair.1
+"#;
+    assert_eq!(eval(source).unwrap(), "\"hello\"");
+}
+
+#[test]
+fn test_tuple_index_triple() {
+    let source = r#"
+let triple = (1, 2, 3)
+let sum = triple.0 + triple.1 + triple.2
+"#;
+    assert_eq!(eval(source).unwrap(), "6");
+}
+
+#[test]
+fn test_tuple_index_in_function() {
+    let source = r#"
+fn first(pair: (Int, String)): Int = pair.0
+fn second(pair: (Int, String)): String = pair.1
+let p = (42, "hello")
+let result = first(p)
+"#;
+    assert_eq!(eval(source).unwrap(), "42");
+}
+
 // Exhaustiveness checking tests
 mod exhaustiveness_tests {
     use super::*;
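The `lux serve` hunk above replaces `args.get(2)` because a bare positional lookup mistakes the value of `--port`/`-p` for the directory argument. A standalone sketch of the same filtering idea (with a hypothetical `serve_dir` helper; not the project's actual function):

```rust
// Sketch of flag-value-aware positional-argument extraction: skip the token
// that is the port flag's value so `lux serve --port 9000 site/` picks `site/`.
fn to_args(xs: &[&str]) -> Vec<String> {
    xs.iter().map(|s| s.to_string()).collect()
}

fn serve_dir(args: &[String]) -> String {
    // Index of the token that is the port flag's value, if any.
    let port_value_idx = args
        .iter()
        .position(|a| a == "--port" || a == "-p")
        .map(|i| i + 1);
    args.iter()
        .enumerate()
        .skip(2) // skip the binary name and the `serve` subcommand
        .filter(|(i, a)| !a.starts_with('-') && Some(*i) != port_value_idx)
        .map(|(_, a)| a.clone())
        .next()
        .unwrap_or_else(|| ".".to_string())
}

fn main() {
    let args = to_args(&["lux", "serve", "--port", "9000", "site/"]);
    println!("{}", serve_dir(&args)); // prints "site/"
}
```

With the old `args.get(2)` logic the same invocation would have matched `--port`, failed the `starts_with('-')` filter, and silently fallen back to `"."`.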
@@ -305,6 +305,11 @@ impl ModuleLoader {
         self.cache.iter()
     }
 
+    /// Get the module cache (for passing to C backend)
+    pub fn module_cache(&self) -> &HashMap<String, Module> {
+        &self.cache
+    }
+
     /// Clear the module cache
     pub fn clear_cache(&mut self) {
         self.cache.clear();
@@ -1558,6 +1558,7 @@ impl Parser {
         loop {
             let op = match self.peek_kind() {
                 TokenKind::Plus => BinaryOp::Add,
+                TokenKind::PlusPlus => BinaryOp::Concat,
                 TokenKind::Minus => BinaryOp::Sub,
                 _ => break,
             };
@@ -1646,6 +1647,20 @@ impl Parser {
         } else if self.check(TokenKind::Dot) {
             let start = expr.span();
             self.advance();
+
+            // Check for tuple index access: expr.0, expr.1, etc.
+            if let TokenKind::Int(n) = self.peek_kind() {
+                let index = n as usize;
+                self.advance();
+                let span = start.merge(self.previous_span());
+                expr = Expr::TupleIndex {
+                    object: Box::new(expr),
+                    index,
+                    span,
+                };
+                continue;
+            }
+
             let field = self.parse_ident()?;
 
             // Check if this is an effect operation: Effect.operation(args)
@@ -1681,11 +1696,14 @@ impl Parser {
 
     fn parse_args(&mut self) -> Result<Vec<Expr>, ParseError> {
         let mut args = Vec::new();
+        self.skip_newlines();
 
         while !self.check(TokenKind::RParen) {
             args.push(self.parse_expr()?);
+            self.skip_newlines();
             if !self.check(TokenKind::RParen) {
                 self.expect(TokenKind::Comma)?;
+                self.skip_newlines();
             }
         }
 
@@ -2190,6 +2208,11 @@ impl Parser {
             }));
         }
 
+        // Check for record spread: { ...expr, field: val }
+        if matches!(self.peek_kind(), TokenKind::DotDotDot) {
+            return self.parse_record_expr_rest(start);
+        }
+
         // Check if it's a record (ident: expr) or block
         if matches!(self.peek_kind(), TokenKind::Ident(_)) {
             let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
@@ -2204,6 +2227,20 @@ impl Parser {
 
     fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
         let mut fields = Vec::new();
+        let mut spread = None;
 
+        // Check for spread: { ...expr, ... }
+        if self.check(TokenKind::DotDotDot) {
+            self.advance(); // consume ...
+            let spread_expr = self.parse_expr()?;
+            spread = Some(Box::new(spread_expr));
+
+            self.skip_newlines();
+            if self.check(TokenKind::Comma) {
+                self.advance();
+            }
+            self.skip_newlines();
+        }
+
         while !self.check(TokenKind::RBrace) {
             let name = self.parse_ident()?;
@@ -2220,7 +2257,11 @@ impl Parser {
 
         self.expect(TokenKind::RBrace)?;
         let span = start.merge(self.previous_span());
-        Ok(Expr::Record { fields, span })
+        Ok(Expr::Record {
+            spread,
+            fields,
+            span,
+        })
     }
 
     fn parse_block_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
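The tuple-index hunk above slots into the parser's postfix loop: after consuming a `.`, an integer token produces an `Expr::TupleIndex` and the loop continues, so chains like `pair.0.1` nest naturally before falling through to ordinary field access. A minimal sketch of that shape, using simplified hypothetical token and AST types rather than the project's own:

```rust
// Simplified postfix loop: `.N` builds TupleIndex, `.name` builds Field.
#[derive(Debug, PartialEq)]
enum Tok { Ident(String), Dot, Int(usize) }

#[derive(Debug, PartialEq)]
enum Expr {
    Var(String),
    TupleIndex { object: Box<Expr>, index: usize },
    Field { object: Box<Expr>, name: String },
}

fn parse_postfix(toks: &[Tok]) -> Expr {
    let mut i = 0;
    let mut expr = match &toks[i] {
        Tok::Ident(n) => { i += 1; Expr::Var(n.clone()) }
        _ => panic!("expected identifier"),
    };
    // Keep consuming `.something` suffixes; each wraps the expression so far.
    while i < toks.len() && toks[i] == Tok::Dot {
        i += 1;
        match &toks[i] {
            // `.0`, `.1`, ... → tuple index; chains nest left-to-right
            Tok::Int(n) => {
                expr = Expr::TupleIndex { object: Box::new(expr), index: *n };
                i += 1;
            }
            Tok::Ident(name) => {
                expr = Expr::Field { object: Box::new(expr), name: name.clone() };
                i += 1;
            }
            _ => panic!("expected field name or index after '.'"),
        }
    }
    expr
}

fn main() {
    let e = parse_postfix(&[Tok::Ident("pair".to_string()), Tok::Dot, Tok::Int(0)]);
    println!("{:?}", e);
}
```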
@@ -228,13 +228,14 @@ impl SymbolTable {
             Declaration::Let(let_decl) => {
                 let is_public = matches!(let_decl.visibility, Visibility::Public);
                 let type_sig = let_decl.typ.as_ref().map(|t| self.type_expr_to_string(t));
-                let symbol = self.new_symbol(
+                let mut symbol = self.new_symbol(
                     let_decl.name.name.clone(),
                     SymbolKind::Variable,
                     let_decl.span,
                     type_sig,
                     is_public,
                 );
+                symbol.documentation = let_decl.doc.clone();
                 let id = self.add_symbol(scope_idx, symbol);
                 self.add_reference(id, let_decl.name.span, true, true);
 
@@ -279,13 +280,14 @@ impl SymbolTable {
                 };
                 let type_sig = format!("fn {}({}): {}{}{}", f.name.name, param_types.join(", "), return_type, properties, effects);
 
-                let symbol = self.new_symbol(
+                let mut symbol = self.new_symbol(
                     f.name.name.clone(),
                     SymbolKind::Function,
                     f.name.span,
                     Some(type_sig),
                     is_public,
                 );
+                symbol.documentation = f.doc.clone();
                 let fn_id = self.add_symbol(scope_idx, symbol);
                 self.add_reference(fn_id, f.name.span, true, false);
 
@@ -326,13 +328,14 @@ impl SymbolTable {
                 let is_public = matches!(t.visibility, Visibility::Public);
                 let type_sig = format!("type {}", t.name.name);
 
-                let symbol = self.new_symbol(
+                let mut symbol = self.new_symbol(
                     t.name.name.clone(),
                     SymbolKind::Type,
                     t.name.span,
                     Some(type_sig),
                     is_public,
                 );
+                symbol.documentation = t.doc.clone();
                 let type_id = self.add_symbol(scope_idx, symbol);
                 self.add_reference(type_id, t.name.span, true, false);
 
@@ -372,13 +375,14 @@ impl SymbolTable {
                 let is_public = true; // Effects are typically public
                 let type_sig = format!("effect {}", e.name.name);
 
-                let symbol = self.new_symbol(
+                let mut symbol = self.new_symbol(
                     e.name.name.clone(),
                     SymbolKind::Effect,
                     e.name.span,
                     Some(type_sig),
                     is_public,
                 );
+                symbol.documentation = e.doc.clone();
                 let effect_id = self.add_symbol(scope_idx, symbol);
 
                 // Add operations
@@ -409,13 +413,14 @@ impl SymbolTable {
                 let is_public = matches!(t.visibility, Visibility::Public);
                 let type_sig = format!("trait {}", t.name.name);
 
-                let symbol = self.new_symbol(
+                let mut symbol = self.new_symbol(
                     t.name.name.clone(),
                     SymbolKind::Type, // Traits are like types
                     t.name.span,
                     Some(type_sig),
                     is_public,
                 );
+                symbol.documentation = t.doc.clone();
                 self.add_symbol(scope_idx, symbol);
             }
 
@@ -479,7 +484,7 @@ impl SymbolTable {
                 self.visit_expr(arg, scope_idx);
             }
         }
-        Expr::Field { object, .. } => {
+        Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
             self.visit_expr(object, scope_idx);
         }
         Expr::If { condition, then_branch, else_branch, .. } => {
@@ -522,7 +527,10 @@ impl SymbolTable {
                 self.visit_expr(e, scope_idx);
             }
         }
-        Expr::Record { fields, .. } => {
+        Expr::Record { spread, fields, .. } => {
+            if let Some(spread_expr) = spread {
+                self.visit_expr(spread_expr, scope_idx);
+            }
             for (_, e) in fields {
                 self.visit_expr(e, scope_idx);
             }
@@ -335,11 +335,14 @@ fn references_params(expr: &Expr, params: &[&str]) -> bool {
|
|||||||
Statement::Expr(e) => references_params(e, params),
|
Statement::Expr(e) => references_params(e, params),
|
||||||
}) || references_params(result, params)
|
}) || references_params(result, params)
|
||||||
}
|
}
|
||||||
Expr::Field { object, .. } => references_params(object, params),
|
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => references_params(object, params),
|
||||||
Expr::Lambda { body, .. } => references_params(body, params),
|
Expr::Lambda { body, .. } => references_params(body, params),
|
||||||
Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
|
Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
|
||||||
Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
|
Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
|
||||||
Expr::Record { fields, .. } => fields.iter().any(|(_, e)| references_params(e, params)),
|
Expr::Record { spread, fields, .. } => {
|
||||||
|
spread.as_ref().is_some_and(|s| references_params(s, params))
|
||||||
|
|| fields.iter().any(|(_, e)| references_params(e, params))
|
||||||
|
}
|
||||||
Expr::Match { scrutinee, arms, .. } => {
|
Expr::Match { scrutinee, arms, .. } => {
|
||||||
references_params(scrutinee, params)
|
references_params(scrutinee, params)
|
||||||
|| arms.iter().any(|a| references_params(&a.body, params))
|
|| arms.iter().any(|a| references_params(&a.body, params))
|
||||||
@@ -516,10 +519,11 @@ fn has_recursive_calls(func_name: &str, body: &Expr) -> bool {
|
|||||||
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
|
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
|
||||||
elements.iter().any(|e| has_recursive_calls(func_name, e))
|
elements.iter().any(|e| has_recursive_calls(func_name, e))
|
||||||
}
|
}
|
||||||
Expr::Record { fields, .. } => {
|
Expr::Record { spread, fields, .. } => {
|
||||||
fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
|
spread.as_ref().is_some_and(|s| has_recursive_calls(func_name, s))
|
||||||
|
|| fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
|
||||||
}
|
}
|
||||||
Expr::Field { object, .. } => has_recursive_calls(func_name, object),
|
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => has_recursive_calls(func_name, object),
|
||||||
Expr::Let { value, body, .. } => {
|
Expr::Let { value, body, .. } => {
|
||||||
has_recursive_calls(func_name, value) || has_recursive_calls(func_name, body)
|
has_recursive_calls(func_name, value) || has_recursive_calls(func_name, body)
|
||||||
}
|
}
|
||||||
@@ -672,6 +676,7 @@ fn generate_auto_migration_expr(
|
|||||||
|
|
||||||
// Build the record expression
|
// Build the record expression
|
||||||
Some(Expr::Record {
|
Some(Expr::Record {
|
||||||
|
spread: None,
|
||||||
fields: field_exprs,
|
fields: field_exprs,
|
||||||
span,
|
span,
|
||||||
})
|
})
|
||||||
@@ -1536,7 +1541,7 @@ impl TypeChecker {
|
|||||||
// Use the declared type if present, otherwise use inferred
|
// Use the declared type if present, otherwise use inferred
|
||||||
let final_type = if let Some(ref type_expr) = let_decl.typ {
|
let final_type = if let Some(ref type_expr) = let_decl.typ {
|
||||||
let declared = self.resolve_type(type_expr);
|
let declared = self.resolve_type(type_expr);
|
||||||
if let Err(e) = unify(&inferred, &declared) {
|
if let Err(e) = unify_with_env(&inferred, &declared, &self.env) {
|
||||||
self.errors.push(TypeError {
|
self.errors.push(TypeError {
|
||||||
message: format!(
|
message: format!(
|
||||||
"Variable '{}' has type {}, but declared type is {}: {}",
|
"Variable '{}' has type {}, but declared type is {}: {}",
|
||||||
@@ -1673,6 +1678,42 @@ impl TypeChecker {
|
|||||||
span,
|
span,
|
||||||
} => self.infer_field(object, field, *span),
|
} => self.infer_field(object, field, *span),
|
||||||
|
|
||||||
|
Expr::TupleIndex {
|
||||||
|
object,
|
||||||
|
index,
|
||||||
|
span,
|
||||||
|
} => {
|
||||||
|
let object_type = self.infer_expr(object);
|
||||||
|
match &object_type {
|
||||||
|
Type::Tuple(types) => {
|
||||||
|
if *index < types.len() {
|
||||||
|
types[*index].clone()
|
||||||
|
} else {
|
||||||
|
self.errors.push(TypeError {
|
||||||
|
message: format!(
|
||||||
|
"Tuple index {} out of bounds for tuple with {} elements",
|
||||||
|
index,
|
||||||
|
types.len()
|
||||||
|
),
|
||||||
|
span: *span,
|
||||||
|
});
|
||||||
|
Type::Error
|
||||||
|
}
|
||||||
|
}
|
||||||
|
Type::Var(_) => Type::var(),
|
||||||
|
_ => {
|
||||||
|
self.errors.push(TypeError {
|
||||||
|
message: format!(
|
||||||
|
"Cannot use tuple index on non-tuple type {}",
|
||||||
|
object_type
|
||||||
|
),
|
||||||
|
span: *span,
|
||||||
|
});
|
||||||
|
Type::Error
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
Expr::Lambda {
|
Expr::Lambda {
|
||||||
params,
|
params,
|
||||||
return_type,
|
return_type,
|
||||||
@@ -1708,7 +1749,11 @@ impl TypeChecker {
|
|||||||
span,
|
span,
|
||||||
} => self.infer_block(statements, result, *span),
|
} => self.infer_block(statements, result, *span),
|
||||||
|
|
||||||
Expr::Record { fields, span } => self.infer_record(fields, *span),
|
Expr::Record {
|
||||||
|
spread,
|
||||||
|
fields,
|
||||||
|
span,
|
||||||
|
} => self.infer_record(spread.as_deref(), fields, *span),
|
||||||
|
|
||||||
Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
|
Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
|
||||||
|
|
@@ -1747,7 +1792,7 @@ impl TypeChecker {
             match op {
                 BinaryOp::Add => {
                     // Add supports both numeric types and string concatenation
-                    if let Err(e) = unify(&left_type, &right_type) {
+                    if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!("Operands of '{}' must have same type: {}", op, e),
                             span,
@@ -1768,9 +1813,32 @@ impl TypeChecker {
                     }
                 }
 
+                BinaryOp::Concat => {
+                    // Concat (++) supports strings and lists
+                    if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
+                        self.errors.push(TypeError {
+                            message: format!("Operands of '++' must have same type: {}", e),
+                            span,
+                        });
+                    }
+                    match &left_type {
+                        Type::String | Type::List(_) | Type::Var(_) => left_type,
+                        _ => {
+                            self.errors.push(TypeError {
+                                message: format!(
+                                    "Operator '++' requires String or List operands, got {}",
+                                    left_type
+                                ),
+                                span,
+                            });
+                            Type::Error
+                        }
+                    }
+                }
+
                 BinaryOp::Sub | BinaryOp::Mul | BinaryOp::Div | BinaryOp::Mod => {
                     // Arithmetic: both operands must be same numeric type
-                    if let Err(e) = unify(&left_type, &right_type) {
+                    if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!("Operands of '{}' must have same type: {}", op, e),
                             span,
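The new `BinaryOp::Concat` arm first unifies both operands, then restricts the operator to strings, lists, or a still-unknown type variable. A standalone sketch of that second check (again with a simplified stand-in `Ty` enum, not the project's `Type`):

```rust
// Illustrative sketch only — `Ty` is a simplified stand-in type enum.
#[derive(Clone, Debug, PartialEq)]
enum Ty {
    Int,
    String,
    List(Box<Ty>),
    Var(u32),
    Error,
}

// '++' is valid on String, List, or an unresolved type variable; anything
// else (e.g. `1 ++ 2`) is rejected with an error type.
fn check_concat(left: &Ty) -> Ty {
    match left {
        Ty::String | Ty::List(_) | Ty::Var(_) => left.clone(),
        _ => Ty::Error,
    }
}

fn main() {
    assert_eq!(check_concat(&Ty::String), Ty::String);
    assert_eq!(check_concat(&Ty::List(Box::new(Ty::Int))), Ty::List(Box::new(Ty::Int)));
    assert_eq!(check_concat(&Ty::Int), Ty::Error);
}
```

Accepting `Var(_)` keeps inference flexible: if the operand's type is not known yet, unification of the two operands will pin it down rather than the operator check rejecting it early.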
@@ -1794,7 +1862,7 @@ impl TypeChecker {
 
                 BinaryOp::Eq | BinaryOp::Ne => {
                     // Equality: operands must have same type
-                    if let Err(e) = unify(&left_type, &right_type) {
+                    if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!("Operands of '{}' must have same type: {}", op, e),
                             span,
@@ -1805,7 +1873,7 @@ impl TypeChecker {
 
                 BinaryOp::Lt | BinaryOp::Le | BinaryOp::Gt | BinaryOp::Ge => {
                     // Comparison: operands must be same orderable type
-                    if let Err(e) = unify(&left_type, &right_type) {
+                    if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!("Operands of '{}' must have same type: {}", op, e),
                             span,
@@ -1816,13 +1884,13 @@ impl TypeChecker {
 
                 BinaryOp::And | BinaryOp::Or => {
                     // Logical: both must be Bool
-                    if let Err(e) = unify(&left_type, &Type::Bool) {
+                    if let Err(e) = unify_with_env(&left_type, &Type::Bool, &self.env) {
                         self.errors.push(TypeError {
                             message: format!("Left operand of '{}' must be Bool: {}", op, e),
                             span: left.span(),
                         });
                     }
-                    if let Err(e) = unify(&right_type, &Type::Bool) {
+                    if let Err(e) = unify_with_env(&right_type, &Type::Bool, &self.env) {
                         self.errors.push(TypeError {
                             message: format!("Right operand of '{}' must be Bool: {}", op, e),
                             span: right.span(),
@@ -1836,7 +1904,7 @@ impl TypeChecker {
                     // right must be a function that accepts left's type
                     let result_type = Type::var();
                     let expected_fn = Type::function(vec![left_type.clone()], result_type.clone());
-                    if let Err(e) = unify(&right_type, &expected_fn) {
+                    if let Err(e) = unify_with_env(&right_type, &expected_fn, &self.env) {
                         self.errors.push(TypeError {
                             message: format!(
                                 "Pipe target must be a function accepting {}: {}",
@@ -1919,7 +1987,7 @@ impl TypeChecker {
             self.current_effects.clone(),
         );
 
-        match unify(&func_type, &expected_fn) {
+        match unify_with_env(&func_type, &expected_fn, &self.env) {
             Ok(subst) => result_type.apply(&subst),
             Err(e) => {
                 // Provide more detailed error message based on the type of mismatch
@@ -1996,7 +2064,7 @@ impl TypeChecker {
                 let result_type = Type::var();
                 let expected_fn = Type::function(arg_types, result_type.clone());
 
-                if let Err(e) = unify(field_type, &expected_fn) {
+                if let Err(e) = unify_with_env(field_type, &expected_fn, &self.env) {
                     self.errors.push(TypeError {
                         message: format!(
                             "Type mismatch in {}.{} call: {}",
@@ -2068,7 +2136,7 @@ impl TypeChecker {
                     for (i, (arg_type, (_, param_type))) in
                         arg_types.iter().zip(op.params.iter()).enumerate()
                     {
-                        if let Err(e) = unify(arg_type, param_type) {
+                        if let Err(e) = unify_with_env(arg_type, param_type, &self.env) {
                             self.errors.push(TypeError {
                                 message: format!(
                                     "Argument {} of '{}.{}' has type {}, expected {}: {}",
@@ -2101,6 +2169,7 @@ impl TypeChecker {
 
     fn infer_field(&mut self, object: &Expr, field: &Ident, span: Span) -> Type {
         let object_type = self.infer_expr(object);
+        let object_type = self.env.expand_type_alias(&object_type);
 
         match &object_type {
            Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) {
@@ -2181,7 +2250,7 @@ impl TypeChecker {
         // Check return type if specified
         let ret_type = if let Some(rt) = return_type {
             let declared = self.resolve_type(rt);
-            if let Err(e) = unify(&body_type, &declared) {
+            if let Err(e) = unify_with_env(&body_type, &declared, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Lambda body type {} doesn't match declared {}: {}",
@@ -2247,7 +2316,7 @@ impl TypeChecker {
         span: Span,
     ) -> Type {
         let cond_type = self.infer_expr(condition);
-        if let Err(e) = unify(&cond_type, &Type::Bool) {
+        if let Err(e) = unify_with_env(&cond_type, &Type::Bool, &self.env) {
             self.errors.push(TypeError {
                 message: format!("If condition must be Bool, got {}: {}", cond_type, e),
                 span: condition.span(),
@@ -2257,7 +2326,7 @@ impl TypeChecker {
         let then_type = self.infer_expr(then_branch);
         let else_type = self.infer_expr(else_branch);
 
-        match unify(&then_type, &else_type) {
+        match unify_with_env(&then_type, &else_type, &self.env) {
             Ok(subst) => then_type.apply(&subst),
             Err(e) => {
                 self.errors.push(TypeError {
@@ -2298,7 +2367,7 @@ impl TypeChecker {
             // Check guard if present
             if let Some(ref guard) = arm.guard {
                 let guard_type = self.infer_expr(guard);
-                if let Err(e) = unify(&guard_type, &Type::Bool) {
+                if let Err(e) = unify_with_env(&guard_type, &Type::Bool, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Match guard must be Bool: {}", e),
                         span: guard.span(),
@@ -2314,7 +2383,7 @@ impl TypeChecker {
             match &result_type {
                 None => result_type = Some(body_type),
                 Some(prev) => {
-                    if let Err(e) = unify(prev, &body_type) {
+                    if let Err(e) = unify_with_env(prev, &body_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!(
                                 "Match arm has incompatible type: expected {}, got {}: {}",
@@ -2364,7 +2433,7 @@ impl TypeChecker {
 
             Pattern::Literal(lit) => {
                 let lit_type = self.infer_literal(lit);
-                if let Err(e) = unify(&lit_type, expected) {
+                if let Err(e) = unify_with_env(&lit_type, expected, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Pattern literal type mismatch: {}", e),
                         span: lit.span,
@@ -2378,7 +2447,7 @@ impl TypeChecker {
                 // For now, handle Option specially
                 match name.name.as_str() {
                     "None" => {
-                        if let Err(e) = unify(expected, &Type::Option(Box::new(Type::var()))) {
+                        if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(Type::var())), &self.env) {
                             self.errors.push(TypeError {
                                 message: format!(
                                     "None pattern doesn't match type {}: {}",
@@ -2391,7 +2460,7 @@ impl TypeChecker {
                     }
                     "Some" => {
                         let inner_type = Type::var();
-                        if let Err(e) = unify(expected, &Type::Option(Box::new(inner_type.clone())))
+                        if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(inner_type.clone())), &self.env)
                         {
                             self.errors.push(TypeError {
                                 message: format!(
@@ -2420,7 +2489,7 @@ impl TypeChecker {
 
             Pattern::Tuple { elements, span } => {
                 let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect();
-                if let Err(e) = unify(expected, &Type::Tuple(element_types.clone())) {
+                if let Err(e) = unify_with_env(expected, &Type::Tuple(element_types.clone()), &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Tuple pattern doesn't match type {}: {}", expected, e),
                         span: *span,
@@ -2470,7 +2539,7 @@ impl TypeChecker {
 
         if let Some(type_expr) = typ {
             let declared = self.resolve_type(type_expr);
-            if let Err(e) = unify(&value_type, &declared) {
+            if let Err(e) = unify_with_env(&value_type, &declared, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Variable '{}' has type {}, but declared type is {}: {}",
@@ -2491,12 +2560,46 @@ impl TypeChecker {
         self.infer_expr(result)
     }
 
-    fn infer_record(&mut self, fields: &[(Ident, Expr)], _span: Span) -> Type {
-        let field_types: Vec<(String, Type)> = fields
+    fn infer_record(
+        &mut self,
+        spread: Option<&Expr>,
+        fields: &[(Ident, Expr)],
+        span: Span,
+    ) -> Type {
+        // Start with spread fields if present
+        let mut field_types: Vec<(String, Type)> = if let Some(spread_expr) = spread {
+            let spread_type = self.infer_expr(spread_expr);
+            match spread_type {
+                Type::Record(spread_fields) => spread_fields,
+                _ => {
+                    self.errors.push(TypeError {
+                        message: format!(
+                            "Spread expression must be a record type, got {}",
+                            spread_type
+                        ),
+                        span,
+                    });
+                    Vec::new()
+                }
+            }
+        } else {
+            Vec::new()
+        };
+
+        // Apply explicit field overrides
+        let explicit_types: Vec<(String, Type)> = fields
             .iter()
             .map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
             .collect();
+
+        for (name, typ) in explicit_types {
+            if let Some(existing) = field_types.iter_mut().find(|(n, _)| n == &name) {
+                existing.1 = typ;
+            } else {
+                field_types.push((name, typ));
+            }
+        }
+
         Type::Record(field_types)
     }
 
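The merge rule in the new `infer_record` is: spread fields come first, and each explicit field either overrides a spread field in place (keeping its position) or is appended. A standalone sketch of just that merge, with string placeholders instead of the project's `Type`:

```rust
// Illustrative sketch only — types are represented as &'static str placeholders.
fn merge_fields(
    spread: Vec<(String, &'static str)>,
    explicit: Vec<(String, &'static str)>,
) -> Vec<(String, &'static str)> {
    let mut out = spread;
    for (name, ty) in explicit {
        if let Some(existing) = out.iter_mut().find(|(n, _)| n == &name) {
            existing.1 = ty; // override keeps the original field position
        } else {
            out.push((name, ty)); // new field is appended after spread fields
        }
    }
    out
}

fn main() {
    // `{ ..base, y: "str", z: true }` over base = { x: Int, y: Int }
    let base = vec![("x".to_string(), "Int"), ("y".to_string(), "Int")];
    let merged = merge_fields(
        base,
        vec![("y".to_string(), "String"), ("z".to_string(), "Bool")],
    );
    assert_eq!(merged, vec![
        ("x".to_string(), "Int"),
        ("y".to_string(), "String"),
        ("z".to_string(), "Bool"),
    ]);
}
```

Keeping the overridden field in its original slot means the resulting `Type::Record` preserves the base record's field order, so structurally identical records stay comparable.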
@@ -2513,7 +2616,7 @@ impl TypeChecker {
         let first_type = self.infer_expr(&elements[0]);
         for elem in &elements[1..] {
             let elem_type = self.infer_expr(elem);
-            if let Err(e) = unify(&first_type, &elem_type) {
+            if let Err(e) = unify_with_env(&first_type, &elem_type, &self.env) {
                 self.errors.push(TypeError {
                     message: format!("List elements must have same type: {}", e),
                     span,
@@ -2819,7 +2922,7 @@ impl TypeChecker {
                 // Check return type matches if specified
                 if let Some(ref return_type_expr) = impl_method.return_type {
                     let return_type = self.resolve_type(return_type_expr);
-                    if let Err(e) = unify(&body_type, &return_type) {
+                    if let Err(e) = unify_with_env(&body_type, &return_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!(
                                 "Method '{}' body has type {}, but declared return type is {}: {}",
51 src/types.rs
@@ -1146,6 +1146,15 @@ impl TypeEnv {
                 ],
                 return_type: Type::Unit,
             },
+            EffectOpDef {
+                name: "assertEqualMsg".to_string(),
+                params: vec![
+                    ("expected".to_string(), Type::Var(0)),
+                    ("actual".to_string(), Type::Var(0)),
+                    ("label".to_string(), Type::String),
+                ],
+                return_type: Type::Unit,
+            },
             EffectOpDef {
                 name: "assertNotEqual".to_string(),
                 params: vec![
@@ -1599,6 +1608,14 @@ impl TypeEnv {
                 "parseFloat".to_string(),
                 Type::function(vec![Type::String], Type::Option(Box::new(Type::Float))),
             ),
+            (
+                "indexOf".to_string(),
+                Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
+            ),
+            (
+                "lastIndexOf".to_string(),
+                Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
+            ),
         ]);
         env.bind("String", TypeScheme::mono(string_module_type));
 
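The new `String.indexOf` / `String.lastIndexOf` bindings are typed `(String, String) -> Option<Int>`: a missing needle yields `None` rather than a `-1` sentinel. A sketch of the implied behavior in plain Rust (for ASCII inputs; the actual runtime's index semantics may differ):

```rust
// Illustrative sketch of Option-returning substring search (byte indices).
fn index_of(haystack: &str, needle: &str) -> Option<i64> {
    haystack.find(needle).map(|i| i as i64)
}

fn last_index_of(haystack: &str, needle: &str) -> Option<i64> {
    haystack.rfind(needle).map(|i| i as i64)
}

fn main() {
    assert_eq!(index_of("banana", "an"), Some(1));
    assert_eq!(last_index_of("banana", "an"), Some(3));
    assert_eq!(index_of("banana", "xy"), None);
}
```

Returning `Option<Int>` forces callers to pattern-match on the "not found" case, which fits the checker's `Option`-aware pattern handling shown in the typecheck diff.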
@@ -1870,9 +1887,39 @@ impl TypeEnv {
                 "round".to_string(),
                 Type::function(vec![Type::var()], Type::Int),
             ),
+            (
+                "sin".to_string(),
+                Type::function(vec![Type::Float], Type::Float),
+            ),
+            (
+                "cos".to_string(),
+                Type::function(vec![Type::Float], Type::Float),
+            ),
+            (
+                "atan2".to_string(),
+                Type::function(vec![Type::Float, Type::Float], Type::Float),
+            ),
         ]);
         env.bind("Math", TypeScheme::mono(math_module_type));
 
+        // Int module
+        let int_module_type = Type::Record(vec![
+            (
+                "toString".to_string(),
+                Type::function(vec![Type::Int], Type::String),
+            ),
+        ]);
+        env.bind("Int", TypeScheme::mono(int_module_type));
+
+        // Float module
+        let float_module_type = Type::Record(vec![
+            (
+                "toString".to_string(),
+                Type::function(vec![Type::Float], Type::String),
+            ),
+        ]);
+        env.bind("Float", TypeScheme::mono(float_module_type));
+
         env
     }
 
@@ -2032,7 +2079,9 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
             // Function's required effects (e1) must be a subset of available effects (e2)
             // A pure function (empty effects) can be called anywhere
             // A function requiring {Logger} can be called in context with {Logger} or {Logger, Console}
-            if !e1.is_subset(&e2) {
+            // When expected effects (e2) are empty, it means "no constraint" (e.g., callback parameter)
+            // so we allow any actual effects through
+            if !e2.is_empty() && !e1.is_subset(&e2) {
                 return Err(format!(
                     "Effect mismatch: expected {{{}}}, got {{{}}}",
                     e1, e2
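The relaxed effect rule above can be stated as: unification succeeds when the expected effect set is empty (no constraint) or when the function's required effects are a subset of those available. A standalone sketch using plain `HashSet`s in place of the project's effect-set type:

```rust
use std::collections::HashSet;

// Illustrative sketch of the relaxed subset check from the unify change:
// an empty expected set means "no constraint" rather than "must be pure".
fn effects_ok(required: &HashSet<&str>, available: &HashSet<&str>) -> bool {
    available.is_empty() || required.is_subset(available)
}

fn main() {
    let logger: HashSet<&str> = ["Logger"].into_iter().collect();
    let both: HashSet<&str> = ["Logger", "Console"].into_iter().collect();
    let unconstrained: HashSet<&str> = HashSet::new();

    assert!(effects_ok(&logger, &both));          // {Logger} ⊆ {Logger, Console}
    assert!(!effects_ok(&both, &logger));         // Console is not available
    assert!(effects_ok(&both, &unconstrained));   // empty expected set: allowed
}
```

Without the `is_empty` escape hatch, a callback parameter typed with no declared effects would reject every effectful argument; with it, such parameters accept any function.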