Compare commits
57 Commits
98605d2b70...v0.1.9
| SHA1 |
|---|
| 0f8babfd8b |
| 582d603513 |
| fbb7ddb6c3 |
| 400acc3f35 |
| ea3a7ca2dd |
| 7b40421a6a |
| 26b94935e9 |
| 018a799c05 |
| ec78286165 |
| f2688072ac |
| 746643527d |
| 091ff1e422 |
| 1fc472a54c |
| caabaeeb9c |
| 4e43d3d50d |
| fd5ed53b29 |
| 2800ce4e2d |
| ec365ebb3f |
| 52dcc88051 |
| 1842b668e5 |
| c67e3f31c3 |
| b0ccde749c |
| 4ba7a23ae3 |
| 89741b4a32 |
| 3a2376cd49 |
| 4dfb04a1b6 |
| 3cdde02eb2 |
| a5762d0397 |
| 1132c621c6 |
| a0fff1814e |
| 4e9e823246 |
| 6a2e4a7ac1 |
| 3d706cb32b |
| 7c3bfa9301 |
| b56c5461f1 |
| 61e1469845 |
| bb0a288210 |
| 5d7f4633e1 |
| d05b13d840 |
| 0ee3050704 |
| 80b1276f9f |
| bd843d2219 |
| d76aa17b38 |
| c23d9c7078 |
| fffacd2467 |
| 2ae2c132e5 |
| 4909ff9fff |
| 8e788c8a9f |
| dbdd3cca57 |
| 3ac022c04a |
| 6bedd37ac7 |
| 2909bf14b6 |
| d8871acf7e |
| 73b5eee664 |
| 542255780d |
| bac63bab2a |
| db82ca1a1c |
.gitignore (vendored, 5 changed lines)

```diff
@@ -4,6 +4,11 @@
 # Claude Code project instructions
 CLAUDE.md
 
+# Build output
+_site/
+docs/*.html
+docs/*.css
+
 # Test binaries
 hello
 test_rc
```
CLAUDE.md (78 changed lines)

````diff
@@ -42,15 +42,46 @@ When making changes:
 7. **Fix language limitations**: If you encounter parser/type system limitations, fix them (without regressions on guarantees or speed)
 8. **Git commits**: Always use `--no-gpg-sign` flag
 
-### Post-work checklist (run after each major piece of work)
+### Post-work checklist (run after each committable change)
 
+**MANDATORY: Run the full validation script after every committable change:**
+
+```bash
+./scripts/validate.sh
+```
+
+This script runs ALL of the following checks and will fail if any regress:
+
+1. `cargo check` — no Rust compilation errors
+2. `cargo test` — all Rust tests pass (currently 387)
+3. `cargo build --release` — release binary builds
+4. `lux test` on every package (path, frontmatter, xml, rss, markdown) — all 286 package tests pass
+5. `lux check` on every package — type checking + lint passes
+
+If `validate.sh` is not available or you need to run manually:
 ```bash
 nix develop --command cargo check # No Rust errors
-nix develop --command cargo test # All tests pass (currently 381)
-./target/release/lux check # Type check + lint all .lux files
-./target/release/lux fmt # Format all .lux files
-./target/release/lux lint # Standalone lint pass
+nix develop --command cargo test # All Rust tests pass
+nix develop --command cargo build --release # Build release binary
+cd ../packages/path && ../../lang/target/release/lux test # Package tests
+cd ../packages/frontmatter && ../../lang/target/release/lux test
+cd ../packages/xml && ../../lang/target/release/lux test
+cd ../packages/rss && ../../lang/target/release/lux test
+cd ../packages/markdown && ../../lang/target/release/lux test
 ```
 
+**Do NOT commit if any check fails.** Fix the issue first.
+
+### Commit after every piece of work
+
+**After completing each logical unit of work, commit immediately.** This is NOT optional — every fix, feature, or change MUST be committed right away. Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
+
+**Commit workflow:**
+
+1. Make the change
+2. Run `./scripts/validate.sh` (all 13 checks must pass)
+3. `git add` the relevant files
+4. `git commit --no-gpg-sign -m "type: description"` (use conventional commits: fix/feat/chore/docs)
+5. Move on to the next task
+
+**Never skip committing.** If you fixed a bug, commit it. If you added a feature, commit it. If you updated docs, commit it. Do not batch unrelated changes into one commit.
+
 **IMPORTANT: Always verify Lux code you write:**
 - Run with interpreter: `./target/release/lux file.lux`
 - Compile to binary: `./target/release/lux compile file.lux`
@@ -68,10 +99,45 @@ nix develop --command cargo test # All tests pass (currently 381)
 | `lux serve` | `lux s` | Static file server |
 | `lux compile` | `lux c` | Compile to binary |
 
+## Documenting Lux Language Errors
+
+When working on any major task that involves writing Lux code, **document every language error, limitation, or surprising behavior** you encounter. This log is optimized for LLM consumption so future sessions can avoid repeating mistakes.
+
+**File:** Maintain an `ISSUES.md` in the relevant project directory (e.g., `~/src/blu-site/ISSUES.md`).
+
+**Format for each entry:**
+```markdown
+## Issue N: <Short descriptive title>
+
+**Category**: Parser limitation | Type checker gap | Missing feature | Runtime error | Documentation gap
+**Severity**: High | Medium | Low
+**Status**: Open | **Fixed** (commit hash or version)
+
+<1-2 sentence description of the problem>
+
+**Reproduction:**
+```lux
+// Minimal code that triggers the issue
+```
+
+**Error message:** `<exact error text>`
+
+**Workaround:** <how to accomplish the goal despite the limitation>
+
+**Fix:** <if fixed, what was changed and where>
+```
+
+**Rules:**
+- Add new issues as you encounter them during any task
+- When a previously documented issue gets fixed, update its status to **Fixed** and note the commit/version
+- Remove entries that are no longer relevant (e.g., the feature was redesigned entirely)
+- Keep the summary table at the bottom of ISSUES.md in sync with the entries
+- Do NOT duplicate issues already documented -- check existing entries first
+
 ## Code Quality
 
 - Fix all compiler warnings before committing
-- Ensure all tests pass (currently 381 tests)
+- Ensure all tests pass (currently 387 tests)
 - Add new tests when adding features
 - Keep examples and documentation in sync
````
Cargo.lock (generated, 217 changed lines)

```diff
@@ -135,16 +135,6 @@ dependencies = [
 "libc",
 ]
 
-[[package]]
-name = "core-foundation"
-version = "0.10.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b2a6cd9ae233e7f62ba4e9353e81a88df7fc8a5987b8d445b4d90c879bd156f6"
-dependencies = [
-"core-foundation-sys",
-"libc",
-]
-
 [[package]]
 name = "core-foundation-sys"
 version = "0.8.7"
@@ -297,21 +287,6 @@ version = "0.1.5"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
 
-[[package]]
-name = "foreign-types"
-version = "0.3.2"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
-dependencies = [
-"foreign-types-shared",
-]
-
-[[package]]
-name = "foreign-types-shared"
-version = "0.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
-
 [[package]]
 name = "form_urlencoded"
 version = "1.2.2"
@@ -417,6 +392,12 @@ dependencies = [
 "wasip3",
 ]
 
+[[package]]
+name = "glob"
+version = "0.3.3"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "0cc23270f6e1808e30a928bdc84dea0b9b4136a8bc82338574f23baf47bbd280"
+
 [[package]]
 name = "h2"
 version = "0.3.27"
@@ -552,16 +533,17 @@ dependencies = [
 ]
 
 [[package]]
-name = "hyper-tls"
-version = "0.5.0"
+name = "hyper-rustls"
+version = "0.24.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d6183ddfa99b85da61a140bea0efc93fdf56ceaa041b37d553518030827f9905"
+checksum = "ec3efd23720e2049821a693cbc7e65ea87c72f1c58ff2f9522ff332b1491e590"
 dependencies = [
-"bytes",
+"futures-util",
+"http",
 "hyper",
-"native-tls",
+"rustls",
 "tokio",
-"tokio-native-tls",
+"tokio-rustls",
 ]
@@ -794,8 +776,9 @@ dependencies = [
 
 [[package]]
 name = "lux"
-version = "0.1.0"
+version = "0.1.8"
 dependencies = [
+"glob",
 "lsp-server",
 "lsp-types",
 "postgres",
@@ -843,23 +826,6 @@ dependencies = [
 "windows-sys 0.61.2",
 ]
 
-[[package]]
-name = "native-tls"
-version = "0.2.16"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9d5d26952a508f321b4d3d2e80e78fc2603eaefcdf0c30783867f19586518bdc"
-dependencies = [
-"libc",
-"log",
-"openssl",
-"openssl-probe",
-"openssl-sys",
-"schannel",
-"security-framework",
-"security-framework-sys",
-"tempfile",
-]
-
 [[package]]
 name = "nibble_vec"
 version = "0.1.0"
@@ -905,50 +871,6 @@ version = "1.21.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "42f5e15c9953c5e4ccceeb2e7382a716482c34515315f7b03532b8b4e8393d2d"
 
-[[package]]
-name = "openssl"
-version = "0.10.75"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "08838db121398ad17ab8531ce9de97b244589089e290a384c900cb9ff7434328"
-dependencies = [
-"bitflags 2.10.0",
-"cfg-if",
-"foreign-types",
-"libc",
-"once_cell",
-"openssl-macros",
-"openssl-sys",
-]
-
-[[package]]
-name = "openssl-macros"
-version = "0.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
-dependencies = [
-"proc-macro2",
-"quote",
-"syn",
-]
-
-[[package]]
-name = "openssl-probe"
-version = "0.2.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7c87def4c32ab89d880effc9e097653c8da5d6ef28e6b539d313baaacfbafcbe"
-
-[[package]]
-name = "openssl-sys"
-version = "0.9.111"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "82cab2d520aa75e3c58898289429321eb788c3106963d0dc886ec7a5f4adc321"
-dependencies = [
-"cc",
-"libc",
-"pkg-config",
-"vcpkg",
-]
-
 [[package]]
 name = "parking_lot"
 version = "0.12.5"
@@ -1203,15 +1125,15 @@ dependencies = [
 "http",
 "http-body",
 "hyper",
-"hyper-tls",
+"hyper-rustls",
 "ipnet",
 "js-sys",
 "log",
 "mime",
-"native-tls",
 "once_cell",
 "percent-encoding",
 "pin-project-lite",
+"rustls",
 "rustls-pemfile",
 "serde",
 "serde_json",
@@ -1219,15 +1141,30 @@ dependencies = [
 "sync_wrapper",
 "system-configuration",
 "tokio",
-"tokio-native-tls",
+"tokio-rustls",
 "tower-service",
 "url",
 "wasm-bindgen",
 "wasm-bindgen-futures",
 "web-sys",
+"webpki-roots",
 "winreg",
 ]
 
+[[package]]
+name = "ring"
+version = "0.17.14"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "a4689e6c2294d81e88dc6261c768b63bc4fcdb852be6d1352498b114f61383b7"
+dependencies = [
+"cc",
+"cfg-if",
+"getrandom 0.2.17",
+"libc",
+"untrusted",
+"windows-sys 0.52.0",
+]
+
 [[package]]
 name = "rusqlite"
 version = "0.31.0"
@@ -1255,6 +1192,18 @@ dependencies = [
 "windows-sys 0.61.2",
 ]
 
+[[package]]
+name = "rustls"
+version = "0.21.12"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "3f56a14d1f48b391359b22f731fd4bd7e43c97f3c50eee276f3aa09c94784d3e"
+dependencies = [
+"log",
+"ring",
+"rustls-webpki",
+"sct",
+]
+
 [[package]]
 name = "rustls-pemfile"
 version = "1.0.4"
@@ -1264,6 +1213,16 @@ dependencies = [
 "base64 0.21.7",
 ]
 
+[[package]]
+name = "rustls-webpki"
+version = "0.101.7"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8b6275d1ee7a1cd780b64aca7726599a1dbc893b1e64144529e55c3c2f745765"
+dependencies = [
+"ring",
+"untrusted",
+]
+
 [[package]]
 name = "rustversion"
 version = "1.0.22"
@@ -1298,15 +1257,6 @@ version = "1.0.23"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "9774ba4a74de5f7b1c1451ed6cd5285a32eddb5cccb8cc655a4e50009e06477f"
 
-[[package]]
-name = "schannel"
-version = "0.1.28"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
-dependencies = [
-"windows-sys 0.61.2",
-]
-
 [[package]]
 name = "scopeguard"
 version = "1.2.0"
@@ -1314,26 +1264,13 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
 
 [[package]]
-name = "security-framework"
-version = "3.6.0"
+name = "sct"
+version = "0.7.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d17b898a6d6948c3a8ee4372c17cb384f90d2e6e912ef00895b14fd7ab54ec38"
+checksum = "da046153aa2352493d6cb7da4b6e5c0c057d8a1d0a9aa8560baffdd945acd414"
 dependencies = [
-"bitflags 2.10.0",
-"core-foundation 0.10.1",
-"core-foundation-sys",
-"libc",
-"security-framework-sys",
-]
-
-[[package]]
-name = "security-framework-sys"
-version = "2.16.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "321c8673b092a9a42605034a9879d73cb79101ed5fd117bc9a597b89b4e9e61a"
-dependencies = [
-"core-foundation-sys",
-"libc",
+"ring",
+"untrusted",
 ]
 
 [[package]]
@@ -1521,7 +1458,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ba3a3adc5c275d719af8cb4272ea1c4a6d668a777f37e115f6d11ddbc1c8e0e7"
 dependencies = [
 "bitflags 1.3.2",
-"core-foundation 0.9.4",
+"core-foundation",
 "system-configuration-sys",
 ]
 
@@ -1619,16 +1556,6 @@ dependencies = [
 "windows-sys 0.61.2",
 ]
 
-[[package]]
-name = "tokio-native-tls"
-version = "0.3.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
-dependencies = [
-"native-tls",
-"tokio",
-]
-
 [[package]]
 name = "tokio-postgres"
 version = "0.7.16"
@@ -1655,6 +1582,16 @@ dependencies = [
 "whoami",
 ]
 
+[[package]]
+name = "tokio-rustls"
+version = "0.24.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "c28327cf380ac148141087fbfb9de9d7bd4e84ab5d2c28fbc911d753de8a7081"
+dependencies = [
+"rustls",
+"tokio",
+]
+
 [[package]]
 name = "tokio-util"
 version = "0.7.18"
@@ -1750,6 +1687,12 @@ version = "0.2.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
 
+[[package]]
+name = "untrusted"
+version = "0.9.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8ecb6da28b8a351d773b68d5825ac39017e680750f980f3a1a85cd8dd28a47c1"
+
 [[package]]
 name = "url"
 version = "2.5.8"
@@ -1941,6 +1884,12 @@ dependencies = [
 "wasm-bindgen",
 ]
 
+[[package]]
+name = "webpki-roots"
+version = "0.25.4"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "5f20c57d8d7db6d3b86154206ae5d8fba62dd39573114de97c2cb0578251f8e1"
+
 [[package]]
 name = "whoami"
 version = "2.1.1"
```
Cargo.toml

```diff
@@ -1,6 +1,6 @@
 [package]
 name = "lux"
-version = "0.1.0"
+version = "0.1.9"
 edition = "2021"
 description = "A functional programming language with first-class effects, schema evolution, and behavioral types"
 license = "MIT"
@@ -13,10 +13,11 @@ lsp-types = "0.94"
 serde = { version = "1", features = ["derive"] }
 serde_json = "1"
 rand = "0.8"
-reqwest = { version = "0.11", features = ["blocking", "json"] }
+reqwest = { version = "0.11", default-features = false, features = ["blocking", "json", "rustls-tls"] }
 tiny_http = "0.12"
 rusqlite = { version = "0.31", features = ["bundled"] }
 postgres = "0.19"
+glob = "0.3"
 
 [dev-dependencies]
```
PACKAGES.md (new file, 367 lines)
# Lux Package Ecosystem Plan

## Current State

### Stdlib (built-in)

| Module | Coverage |
|--------|----------|
| String | Comprehensive (split, join, trim, indexOf, replace, etc.) |
| List | Good (map, filter, fold, head, tail, concat, range, find, any, all, take, drop) |
| Option | Basic (map, flatMap, getOrElse, isSome, isNone) |
| Result | Basic (map, flatMap, getOrElse, isOk, isErr) |
| Math | Basic (abs, min, max, sqrt, pow, floor, ceil, round) |
| Json | Comprehensive (parse, stringify, get, typed extractors, constructors) |
| File | Good (read, write, append, exists, delete, readDir, isDir, mkdir) |
| Console | Good (print, read, readLine, readInt) |
| Process | Good (exec, execStatus, env, args, exit, cwd) |
| Http | Basic (get, post, put, delete, setHeader) |
| HttpServer | Basic (listen, accept, respond) |
| Time | Minimal (now, sleep) |
| Random | Basic (int, float, bool) |
| Sql | Good (SQLite: open, query, execute, transactions) |
| Postgres | Good (connect, query, execute, transactions) |
| Schema | Niche (versioned data migration) |
| Test | Good (assert, assertEqual, assertTrue) |
| Concurrent | Experimental (spawn, await, yield, cancel) |
| Channel | Experimental (create, send, receive) |

### Registry (pkgs.lux) - 3 packages

| Package | Version | Notes |
|---------|---------|-------|
| json | 1.0.0 | Wraps stdlib Json with convenience functions (getPath, getString, etc.) |
| http-client | 0.1.0 | Wraps stdlib Http with JSON helpers, URL encoding |
| testing | 0.1.0 | Wraps stdlib Test with describe/it structure |

---

## Gap Analysis

### What's Missing vs Other Languages

Compared to ecosystems like Rust/cargo, Go, Python, Elm, Gleam:

| Category | Gap | Impact | Notes |
|----------|-----|--------|-------|
| **Collections** | No HashMap, Set, Queue, Stack | Critical | List-of-pairs with O(n) lookup is the only option |
| **Sorting** | No List.sort or List.sortBy | High | Must implement insertion sort manually |
| **Date/Time** | Only `Time.now()` (epoch ms), no parsing/formatting | High | blu-site does string-based date formatting manually |
| **Markdown** | No markdown parser | High | blu-site has 300+ lines of hand-rolled markdown |
| **XML/RSS** | No XML generation | High | Can't generate RSS feeds or sitemaps |
| **Regex** | No pattern matching on strings | High | Character-by-character scanning required |
| **Path** | No file path utilities | Medium | basename/dirname manually reimplemented |
| **YAML/TOML** | No config file parsing (beyond JSON) | Medium | Frontmatter parsing is manual |
| **Template** | No string templating | Medium | HTML built via raw string concatenation |
| **URL** | No URL parsing/encoding | Medium | http-client has basic urlEncode but no parser |
| **Crypto** | No hashing (SHA256, etc.) | Medium | Can't do checksums, content hashing |
| **Base64** | No encoding/decoding | Low | Needed for data URIs, some auth |
| **CSV** | No CSV parsing | Low | Common data format |
| **UUID** | No UUID generation | Low | Useful for IDs |
| **Logging** | No structured logging | Low | Just Console.print |
| **CLI** | No argument parsing library | Low | Manual arg handling |

### What Should Be Stdlib vs Package

**Should be stdlib additions** (too fundamental to be packages):
- HashMap / Map type (requires runtime support)
- List.sort / List.sortBy (fundamental operation)
- Better Time module (date parsing, formatting)
- Regex (needs runtime/C support for performance)
- Path module (cross-platform file path handling)

**Should be packages** (application-level, opinionated, composable):
- markdown
- xml
- rss/atom
- frontmatter
- template
- csv
- crypto
- ssg (static site generator framework)

---

## Priority Package Plans

Ordered by what unblocks blu-site fixes first, then general ecosystem value.

---
### Package 1: `markdown` (Priority: HIGHEST)

**Why:** The 300-line markdown parser in blu-site's main.lux is general-purpose code that belongs in a reusable package. It's also the most complex part of blu-site and has known bugs (e.g., `### ` inside list items renders literally).

**Scope:**
```
markdown/
  lux.toml
  lib.lux        # Public API: parse, parseInline
  src/
    inline.lux   # Inline parsing (bold, italic, links, images, code)
    block.lux    # Block parsing (headings, lists, code blocks, blockquotes, hr)
    types.lux    # AST types (optional - could emit HTML directly)
```

**Public API:**
```lux
// Convert markdown string to HTML string
pub fn toHtml(markdown: String): String

// Convert inline markdown only (no blocks)
pub fn inlineToHtml(text: String): String

// Escape HTML entities
pub fn escapeHtml(s: String): String
```

**Improvements over current blu-site code:**
- Fix heading-inside-list-item rendering (`- ### Title` should work)
- Support nested lists (currently flat only)
- Support reference-style links `[text][ref]`
- Handle edge cases (empty lines in code blocks, nested blockquotes)
- Proper HTML entity escaping in more contexts

**Depends on:** Nothing (pure string processing)

**Estimated size:** ~400-500 lines of Lux

---
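The inline pass this package needs (escape, then code spans, then links and images, then emphasis) can be sketched outside Lux. This Python illustration is a hedged sketch of the approach, not the package's implementation; the function names `escape_html` and `inline_to_html` mirror the API above but are my own. The point it demonstrates is ordering: escaping must happen before substitution, and code spans before emphasis, or `<` inside code leaks into the output unescaped.

```python
import re

def escape_html(s: str) -> str:
    """Escape the three characters that matter inside HTML text nodes."""
    return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;")

def inline_to_html(text: str) -> str:
    """Inline-only markdown pass: code spans, images, links, bold, italic.

    Escape first so user text can't inject tags; images before links so
    ![alt](url) is not half-consumed by the link pattern; bold before
    italic so ** is not split into two * matches.
    """
    out = escape_html(text)
    out = re.sub(r"`([^`]+)`", r"<code>\1</code>", out)
    out = re.sub(r"!\[([^\]]*)\]\(([^)]+)\)", r'<img src="\2" alt="\1">', out)
    out = re.sub(r"\[([^\]]+)\]\(([^)]+)\)", r'<a href="\2">\1</a>', out)
    out = re.sub(r"\*\*([^*]+)\*\*", r"<strong>\1</strong>", out)
    out = re.sub(r"\*([^*]+)\*", r"<em>\1</em>", out)
    return out
```

A real implementation would also need to suppress emphasis inside code spans; the regex cascade above is only the skeleton of the pipeline.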
### Package 2: `xml` (Priority: HIGH)

**Why:** Needed for RSS/Atom feed generation, sitemap.xml, and robots.txt generation. General-purpose XML builder that doesn't try to parse XML (which would need regex), just emits it.

**Scope:**
```
xml/
  lux.toml
  lib.lux   # Public API: element, document, serialize
```

**Public API:**
```lux
type XmlNode =
  | Element(String, List<XmlAttr>, List<XmlNode>)
  | Text(String)
  | CData(String)
  | Comment(String)
  | Declaration(String, String)  // version, encoding

type XmlAttr =
  | Attr(String, String)

// Build an XML element
pub fn element(tag: String, attrs: List<XmlAttr>, children: List<XmlNode>): XmlNode

// Build a text node (auto-escapes)
pub fn text(content: String): XmlNode

// Build a CDATA section
pub fn cdata(content: String): XmlNode

// Serialize XML tree to string
pub fn serialize(node: XmlNode): String

// Serialize with XML declaration header
pub fn document(version: String, encoding: String, root: XmlNode): String

// Convenience: self-closing element
pub fn selfClosing(tag: String, attrs: List<XmlAttr>): XmlNode
```

**Depends on:** Nothing

**Estimated size:** ~150-200 lines

---
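The core of `serialize` is a small recursion: text nodes escape themselves, elements concatenate their serialized children, and childless elements self-close. This Python sketch shows that shape under stated assumptions (the `Element` dataclass and plain strings as text nodes stand in for the Lux `XmlNode` constructors; it omits CData, Comment, and Declaration):

```python
from dataclasses import dataclass, field

def escape_xml(s: str) -> str:
    # &, <, > must be escaped in text; quotes additionally in attributes.
    return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;")

@dataclass
class Element:
    tag: str
    attrs: list = field(default_factory=list)      # list of (name, value)
    children: list = field(default_factory=list)   # Element | str (text node)

def serialize(node) -> str:
    """Recursively serialize a tree; plain strings are escaped text nodes."""
    if isinstance(node, str):
        return escape_xml(node)
    attrs = "".join(
        f' {k}="{escape_xml(v).replace(chr(34), "&quot;")}"' for k, v in node.attrs
    )
    if not node.children:
        return f"<{node.tag}{attrs}/>"          # self-closing form
    inner = "".join(serialize(c) for c in node.children)
    return f"<{node.tag}{attrs}>{inner}</{node.tag}>"

def document(version: str, encoding: str, root) -> str:
    """Prefix the serialized root with an XML declaration."""
    return f'<?xml version="{version}" encoding="{encoding}"?>\n' + serialize(root)
```

Because escaping happens only at the text and attribute leaves, callers can nest elements freely without double-escaping, which is the design property the "auto-escapes" note on `text` is pointing at.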
### Package 3: `rss` (Priority: HIGH)
|
||||||
|
|
||||||
|
**Why:** Directly needed for blu-site's #6 priority fix (add RSS feed). Builds on `xml` package.
|
||||||
|
|
||||||
|
**Scope:**
|
||||||
|
```
|
||||||
|
rss/
|
||||||
|
lux.toml # depends on xml
|
||||||
|
lib.lux # Public API: feed, item, toXml, toAtom
|
||||||
|
```
|
||||||
|
|
||||||
|
**Public API:**
|
||||||
|
```lux
|
||||||
|
type FeedInfo =
|
||||||
|
| FeedInfo(String, String, String, String, String)
|
||||||
|
// title, link, description, language, lastBuildDate
|
||||||
|
|
||||||
|
type FeedItem =
|
||||||
|
| FeedItem(String, String, String, String, String, String)
|
||||||
|
// title, link, description, pubDate, guid, categories (comma-separated)
|
||||||
|
|
||||||
|
// Generate RSS 2.0 XML string
|
||||||
|
pub fn toRss(info: FeedInfo, items: List<FeedItem>): String
|
||||||
|
|
||||||
|
// Generate Atom 1.0 XML string
|
||||||
|
pub fn toAtom(info: FeedInfo, items: List<FeedItem>): String
|
||||||
|
```
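A hedged sketch of the output shape `toRss` could produce, following the field order given in the type comments above. A real implementation would build nodes through the `xml` package (and so get escaping for free); plain string assembly is used here only to keep the sketch short:

```python
# Field order follows the FeedInfo/FeedItem comments above; the exact
# output of the real package may differ.

def to_rss(info: tuple, items: list) -> str:
    title, link, desc, lang, build_date = info

    def item_xml(it):
        t, l, d, pub, guid, cats = it
        # categories are comma-separated in FeedItem; emit one tag each
        cat_tags = "".join(f"<category>{c.strip()}</category>"
                           for c in cats.split(",") if c.strip())
        return (f"<item><title>{t}</title><link>{l}</link>"
                f"<description>{d}</description><pubDate>{pub}</pubDate>"
                f"<guid>{guid}</guid>{cat_tags}</item>")

    body = "".join(item_xml(it) for it in items)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<rss version="2.0"><channel>'
            f"<title>{title}</title><link>{link}</link>"
            f"<description>{desc}</description><language>{lang}</language>"
            f"<lastBuildDate>{build_date}</lastBuildDate>{body}"
            "</channel></rss>")
```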

**Depends on:** `xml`

**Estimated size:** ~100-150 lines

---

### Package 4: `frontmatter` (Priority: HIGH)

**Why:** blu-site has ~50 lines of fragile frontmatter parsing. This is a common need for any content-driven Lux project. The current parser uses `String.indexOf(line, ": ")` in a way that breaks on values containing `: `.

**Scope:**

```
frontmatter/
  lux.toml
  lib.lux        # Public API: parse
```

**Public API:**

```lux
type FrontmatterResult =
  | FrontmatterResult(List<(String, String)>, String)
  // key-value pairs, remaining body

// Parse frontmatter from a string (--- delimited YAML-like header)
pub fn parse(content: String): FrontmatterResult

// Get a value by key from parsed frontmatter
pub fn get(pairs: List<(String, String)>, key: String): Option<String>

// Get a value or a default
pub fn getOrDefault(pairs: List<(String, String)>, key: String, default: String): String

// Parse a space-separated tag string into a list
pub fn parseTags(tagString: String): List<String>
```

**Improvements over the current blu-site code:**
- Handle values containing `: ` (split only on the first `: `)
- Handle multi-line values (indented continuation)
- Handle quoted values with embedded newlines
- Strip quotes from values consistently
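The first and last improvements can be sketched in a few lines: peel off the `---` delimited header, split each line on the first `: ` only, and strip surrounding quotes. This Python sketch covers just those cases (continuation lines and quoted multi-line values are omitted for brevity, and it is not the actual package):

```python
def parse(content: str):
    """Return (pairs, body) for a '---' delimited YAML-like header."""
    if not content.startswith("---\n"):
        return [], content
    header, sep, body = content[4:].partition("\n---\n")
    if not sep:                        # No closing delimiter: treat all as body
        return [], content
    pairs = []
    for line in header.splitlines():
        key, found, value = line.partition(": ")   # split on FIRST ": " only
        if found:
            pairs.append((key.strip(), value.strip().strip('"')))
    return pairs, body
```

Splitting with `partition` (rather than finding every `: `) is what makes values like `title: Note: a test` parse correctly.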

**Depends on:** Nothing

**Estimated size:** ~100-150 lines

---

### Package 5: `path` (Priority: MEDIUM)

**Why:** blu-site manually implements `basename` and `dirname`. Any file-processing Lux program needs these. Tiny but universally useful.

**Scope:**

```
path/
  lux.toml
  lib.lux
```

**Public API:**

```lux
// Get filename from path: "/foo/bar.txt" -> "bar.txt"
pub fn basename(p: String): String

// Get directory from path: "/foo/bar.txt" -> "/foo"
pub fn dirname(p: String): String

// Get file extension: "file.txt" -> "txt", "file" -> ""
pub fn extension(p: String): String

// Remove file extension: "file.txt" -> "file"
pub fn stem(p: String): String

// Join path segments: join("foo", "bar") -> "foo/bar"
pub fn join(a: String, b: String): String

// Normalize path: "foo//bar/../baz" -> "foo/baz"
pub fn normalize(p: String): String

// Check if path is absolute
pub fn isAbsolute(p: String): Bool
```
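Most of these are one-liners; `normalize` is the only one with real logic. The standard approach is a segment stack: empty segments and `.` collapse away, and each `..` pops the previous segment. A Python sketch of that algorithm (simplified at the edges, e.g. it does not treat `//` network roots specially):

```python
def normalize(p: str) -> str:
    absolute = p.startswith("/")
    stack = []
    for seg in p.split("/"):
        if seg in ("", "."):
            continue                   # "//" and "./" collapse away
        if seg == "..":
            if stack and stack[-1] != "..":
                stack.pop()            # ".." cancels the previous segment
            elif not absolute:
                stack.append("..")     # keep leading ".." on relative paths
        else:
            stack.append(seg)
    result = ("/" if absolute else "") + "/".join(stack)
    return result or "."
```

Note the asymmetry: on an absolute path a leading `..` is dropped (there is nothing above `/`), while on a relative path it must be preserved.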

**Depends on:** Nothing

**Estimated size:** ~80-120 lines

---

### Package 6: `sitemap` (Priority: MEDIUM)

**Why:** Directly needed for blu-site's #9 priority fix. A simple package that generates sitemap.xml.

**Scope:**

```
sitemap/
  lux.toml       # depends on xml
  lib.lux
```

**Public API:**

```lux
type SitemapEntry =
  | SitemapEntry(String, String, String, String)
  // url, lastmod (ISO date), changefreq, priority

// Generate sitemap.xml string
pub fn generate(entries: List<SitemapEntry>): String

// Generate a simple robots.txt pointing to the sitemap
pub fn robotsTxt(sitemapUrl: String): String
```
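A sketch of what the two functions could emit. The `<urlset>` namespace and per-URL elements follow the sitemaps.org protocol; the entry field order follows the `SitemapEntry` comment above. As with the `rss` sketch, a real implementation would go through the `xml` package rather than raw strings:

```python
def generate(entries: list) -> str:
    # entries: (url, lastmod ISO date, changefreq, priority) tuples
    urls = "".join(
        f"<url><loc>{url}</loc><lastmod>{lastmod}</lastmod>"
        f"<changefreq>{freq}</changefreq><priority>{prio}</priority></url>"
        for url, lastmod, freq, prio in entries)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{urls}</urlset>")

def robots_txt(sitemap_url: str) -> str:
    # Minimal robots.txt: allow everything, advertise the sitemap.
    return f"User-agent: *\nAllow: /\n\nSitemap: {sitemap_url}\n"
```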

**Depends on:** `xml`

**Estimated size:** ~50-70 lines

---

### Package 7: `ssg` (Priority: LOW - future)

**Why:** Once the markdown, frontmatter, rss, sitemap, and path packages exist, the remaining logic in blu-site's main.lux is generic SSG framework code: read content dirs, parse posts, sort by date, generate section indexes, generate tag pages, copy static assets. This could be extracted into a framework package that other Lux users could use to build their own static sites.

**This should wait** until the foundation packages above are stable and battle-tested through blu-site usage.

---

## Non-Package Stdlib Improvements Needed

These gaps are too fundamental to be packages and should be added to the Lux language itself:

### HashMap (Critical)

Every package above that needs key-value lookups (frontmatter, xml attributes, etc.) works around the lack of a HashMap with `List<(String, String)>`. This is O(n) per lookup and makes code verbose. A stdlib `Map` module would transform the ecosystem.

### List.sort / List.sortBy (High)

blu-site implements insertion sort manually. Every content-driven app needs sorting. This should be a stdlib function.

### Time.format / Time.parse (High)

blu-site manually parses "2025-01-15" by substring extraction and maps month numbers to names. A proper date/time library (even just ISO 8601 parsing and basic formatting) would help every package above.

---

## Implementation Order

```
Phase 1 (unblock blu-site fixes):
  1. markdown    - extract from blu-site, fix bugs, publish
  2. frontmatter - extract from blu-site, improve robustness
  3. path        - tiny, universally useful
  4. xml         - needed by rss and sitemap

Phase 2 (complete blu-site features):
  5. rss         - depends on xml
  6. sitemap     - depends on xml

Phase 3 (ecosystem growth):
  7. template    - string templating (mustache-like)
  8. csv         - data processing
  9. cli         - argument parsing
  10. ssg        - framework extraction from blu-site
```

Each package should be developed in its own directory under `~/src/`, published to the git.qrty.ink registry, and tested by integrating it into blu-site.
### README.md (19 lines changed)

```diff
@@ -2,15 +2,22 @@
 A functional programming language with first-class effects, schema evolution, and behavioral types.
 
-## Vision
+## Philosophy
 
-Most programming languages treat three critical concerns as afterthoughts:
+**Make the important things visible.**
 
-1. **Effects** — What can this code do? (Hidden, untraceable, untestable)
-2. **Data Evolution** — Types change, data persists. (Manual migrations, runtime failures)
-3. **Behavioral Properties** — Is this idempotent? Does it terminate? (Comments and hope)
+Most languages hide what matters most: what code can do (effects), how data changes over time (schema evolution), and what guarantees functions provide (behavioral properties). Lux makes all three first-class, compiler-checked language features.
 
-Lux makes these first-class language features. The compiler knows what your code does, how your data evolves, and what properties your functions guarantee.
+| Principle | What it means |
+|-----------|--------------|
+| **Explicit over implicit** | Effects in types — see what code does |
+| **Composition over configuration** | No DI frameworks — effects compose naturally |
+| **Safety without ceremony** | Type inference + explicit signatures where they matter |
+| **Practical over academic** | Familiar syntax, ML semantics, no monads |
+| **One right way** | Opinionated formatter, integrated tooling, built-in test framework |
+| **Tools are the language** | `lux fmt/lint/check/test/compile` — one binary, not seven tools |
+
+See [docs/PHILOSOPHY.md](./docs/PHILOSOPHY.md) for the full philosophy with language comparisons and design rationale.
 
 ## Core Principles
```
### build.rs (new file, 38 lines)

```rust
use std::path::PathBuf;

fn main() {
    // Capture the absolute C compiler path at build time so the binary is self-contained.
    // This is critical for Nix builds where cc/gcc live in /nix/store paths.
    let cc_path = std::env::var("CC").ok()
        .filter(|s| !s.is_empty())
        .and_then(|s| resolve_absolute(&s))
        .or_else(|| find_in_path("cc"))
        .or_else(|| find_in_path("gcc"))
        .or_else(|| find_in_path("clang"))
        .unwrap_or_default();

    println!("cargo:rustc-env=LUX_CC_PATH={}", cc_path);
    println!("cargo:rerun-if-env-changed=CC");
    println!("cargo:rerun-if-env-changed=PATH");
}

/// Resolve a command name to its absolute path by searching PATH.
fn find_in_path(cmd: &str) -> Option<String> {
    let path_var = std::env::var("PATH").ok()?;
    for dir in path_var.split(':') {
        let candidate = PathBuf::from(dir).join(cmd);
        if candidate.is_file() {
            return Some(candidate.to_string_lossy().into_owned());
        }
    }
    None
}

/// If the path is already absolute and exists, return it. Otherwise search PATH.
fn resolve_absolute(cmd: &str) -> Option<String> {
    let p = PathBuf::from(cmd);
    if p.is_absolute() && p.is_file() {
        return Some(cmd.to_string());
    }
    find_in_path(cmd)
}
```
### docs/PHILOSOPHY.md (new file, 449 lines)

# The Lux Philosophy

## In One Sentence

**Make the important things visible.**

## The Three Pillars

Most programming languages hide the things that matter most in production:

1. **What can this code do?** — Side effects are invisible in function signatures
2. **How does data change over time?** — Schema evolution is a deployment problem, not a language one
3. **What guarantees does this code provide?** — Properties like idempotency live in comments and hope

Lux makes all three first-class, compiler-checked language features.

---

## Core Principles

### 1. Explicit Over Implicit

Every function signature tells you what it does:

```lux
fn processOrder(order: Order): Receipt with {Database, Email, Logger}
```

You don't need to read the body, trace call chains, or check documentation. The signature *is* the documentation. Code review becomes: "should this function really send emails?"

**What this means in practice:**
- Effects are declared in types, not hidden behind interfaces
- No dependency injection frameworks — just swap handlers
- No mocking libraries — test with different effect implementations
- No "spooky action at a distance" — if a function can fail, its type says so
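The "test with different effect implementations" point can be sketched outside Lux. In this illustrative Python stand-in (not Lux, and not a real Lux API), the `Email` effect is an explicit value the caller provides, so tests pass a recording handler where production would pass a real one, with no mocking framework involved:

```python
def process_order(order: dict, email) -> str:
    # The "Email" effect is just a handler object the caller supplies.
    email.send(order["customer"], f"Receipt for {order['id']}")
    return f"receipt-{order['id']}"

class RealEmail:                       # production handler (would do I/O)
    def send(self, to, msg): ...

class RecordingEmail:                  # test handler: just records calls
    def __init__(self): self.sent = []
    def send(self, to, msg): self.sent.append((to, msg))

handler = RecordingEmail()
receipt = process_order({"id": 7, "customer": "alice@ex.com"}, handler)
assert receipt == "receipt-7"
assert handler.sent == [("alice@ex.com", "Receipt for 7")]
```

In Lux the handler is supplied through the effect system rather than as an ordinary argument, but the testing discipline is the same: substitute the handler, observe the calls.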

**How this compares:**

| Language | Side effects | Lux equivalent |
|----------|-------------|----------------|
| JavaScript | Anything, anywhere, silently | `with {Console, Http, File}` |
| Python | Implicit, discovered by reading code | Effect declarations in signature |
| Java | Checked exceptions (partial), DI frameworks | Effects + handlers |
| Go | Return error values (partial) | `with {Fail}` or `Result` |
| Rust | `unsafe` blocks, `Result`/`Option` | Effects for I/O, Result for values |
| Haskell | Monad transformers (explicit but heavy) | Effects (explicit and lightweight) |
| Koka | Algebraic effects (similar) | Same family, more familiar syntax |

### 2. Composition Over Configuration

Things combine naturally without glue code:

```lux
// Multiple effects compose by listing them
fn sync(id: UserId): User with {Database, Http, Logger} = ...

// Handlers compose by providing them
run sync(id) with {
  Database = postgres(conn),
  Http = realHttp,
  Logger = consoleLogger
}
```

No monad transformers. No middleware stacks. No factory factories. Effects are sets; they union naturally.

**What this means in practice:**
- Functions compose with `|>` (pipes)
- Effects compose by set union
- Types compose via generics and ADTs
- Tests compose by handler substitution

### 3. Safety Without Ceremony

The type system catches errors at compile time, but doesn't make you fight it:

```lux
// Type inference keeps code clean
let x = 42                    // Int, inferred
let names = ["Alice", "Bob"]  // List<String>, inferred

// But function signatures are always explicit
fn greet(name: String): String = "Hello, {name}"
```

**The balance:**
- Function signatures: always annotated (documentation + API contract)
- Local bindings: inferred (reduces noise in implementation)
- Effects: declared or inferred (explicit at boundaries, lightweight inside)
- Behavioral properties: opt-in (`is pure`, `is total` — add when valuable)

### 4. Practical Over Academic

Lux borrows from the best of programming language research, but wraps it in familiar syntax:

```lux
// This is algebraic effects. But it reads like normal code.
fn main(): Unit with {Console} = {
  Console.print("What's your name?")
  let name = Console.readLine()
  Console.print("Hello, {name}!")
}
```

Compare with Haskell's equivalent:

```haskell
main :: IO ()
main = do
  putStrLn "What's your name?"
  name <- getLine
  putStrLn ("Hello, " ++ name ++ "!")
```

Both are explicit about effects. Lux chooses syntax that reads like imperative code while maintaining the same guarantees.

**What this means in practice:**
- ML-family semantics, C-family appearance
- No monads to learn (effects replace them)
- No category theory prerequisites
- The learning curve is: functions → types → effects (days, not months)

### 5. One Right Way

Like Go and Python, Lux favors having one obvious way to do things:

- **One formatter** (`lux fmt`) — opinionated, not configurable, ends all style debates
- **One test framework** (built-in `Test` effect) — no framework shopping
- **One way to handle effects** — declare, handle, compose
- **One package manager** (`lux pkg`) — integrated, not bolted on

This is a deliberate rejection of the JavaScript/Ruby approach, where every project assembles its own stack from dozens of competing libraries.

### 6. Tools Are Part of the Language

The compiler, linter, formatter, LSP, package manager, and test runner are one thing, not seven:

```bash
lux fmt       # Format
lux lint      # Lint (with --explain for education)
lux check     # Type check + lint
lux test      # Run tests
lux compile   # Build a binary
lux serve     # Serve files
lux --lsp     # Editor integration
```

This follows Go's philosophy: a language is its toolchain. The formatter knows the AST. The linter knows the type system. The LSP knows the effects. They're not afterthoughts.

---

## Design Decisions and Their Reasons

### Why algebraic effects instead of monads?

Monads are powerful but have poor ergonomics for composition. Combining `IO`, `State`, and `Error` in Haskell requires monad transformers — a notoriously difficult concept. Effects compose naturally:

```lux
// Just list the effects you need. No transformers.
fn app(): Unit with {Console, File, Http, Time} = ...
```

### Why not just `async/await`?

`async/await` solves one effect (concurrency). Effects solve all of them: I/O, state, randomness, failure, concurrency, logging, databases. One mechanism, universally applicable.

### Why require function type annotations?

Three reasons:
1. **Documentation**: Every function signature is self-documenting
2. **Error messages**: Inference failures produce confusing errors; annotations localize them
3. **API stability**: Changing a function body shouldn't silently change its type

### Why an opinionated formatter?

Style debates waste engineering time. `gofmt` proved that an opinionated, non-configurable formatter eliminates an entire category of bikeshedding. `lux fmt` does the same.

### Why immutable by default?

Mutable state is the root of most concurrency bugs and many logic bugs. Immutability makes code easier to reason about. When you need state, the `State` effect makes it explicit and trackable.

### Why behavioral types?

Properties like "this function is idempotent" or "this function always terminates" are critical for correctness but typically live in comments. Making them part of the type system means:
- The compiler can verify them (or generate property tests)
- Callers can require them (`where F is idempotent`)
- They serve as machine-readable documentation

---

## Comparison with Popular Languages

### JavaScript / TypeScript (SO #1 / #6 by usage)

| Aspect | JavaScript/TypeScript | Lux |
|--------|----------------------|-----|
| **Type system** | Optional/gradual (TS) | Required, Hindley-Milner |
| **Side effects** | Anywhere, implicit | Declared in types |
| **Testing** | Mock libraries (Jest, etc.) | Swap effect handlers |
| **Formatting** | Prettier (configurable) | `lux fmt` (opinionated) |
| **Package management** | npm (massive ecosystem) | `lux pkg` (small ecosystem) |
| **Paradigm** | Multi-paradigm | Functional-first |
| **Null safety** | Optional chaining (partial) | `Option<T>`, no null |
| **Error handling** | try/catch (unchecked) | `Result<T, E>` + `Fail` effect |
| **Shared** | Familiar syntax, first-class functions, closures, string interpolation | |

**What Lux learns from JS/TS:** Familiar syntax matters. String interpolation, arrow functions, and readable code lower the barrier to entry.

**What Lux rejects:** Implicit `any`, unchecked exceptions, the "pick your own adventure" toolchain.

### Python (SO #4 by usage, #1 most desired)

| Aspect | Python | Lux |
|--------|--------|-----|
| **Type system** | Optional (type hints) | Required, static |
| **Side effects** | Implicit | Explicit |
| **Performance** | Slow (interpreted) | Faster (compiled to C) |
| **Syntax** | Whitespace-significant | Braces/keywords |
| **Immutability** | Mutable by default | Immutable by default |
| **Tooling** | Fragmented (black, ruff, mypy, pytest...) | Unified (`lux` binary) |
| **Shared** | Clean syntax philosophy, "one way to do it", readability focus | |

**What Lux learns from Python:** Readability counts. The Zen of Python's emphasis on one obvious way to do things resonates with Lux's design.

**What Lux rejects:** Dynamic typing, mutable-by-default, fragmented tooling.

### Rust (SO #1 most admired)

| Aspect | Rust | Lux |
|--------|------|-----|
| **Memory** | Ownership/borrowing (manual) | Reference counting (automatic) |
| **Type system** | Traits, generics, lifetimes | ADTs, effects, generics |
| **Side effects** | Implicit (except `unsafe`) | Explicit (effect system) |
| **Error handling** | `Result<T, E>` + `?` | `Result<T, E>` + `Fail` effect |
| **Performance** | Zero-cost, systems-level | Good, not systems-level |
| **Learning curve** | Steep (ownership) | Moderate (effects) |
| **Pattern matching** | Excellent, exhaustive | Excellent, exhaustive |
| **Shared** | ADTs, pattern matching, `Option`/`Result`, no null, immutable by default, strong type system | |

**What Lux learns from Rust:** ADTs with exhaustive matching, `Option`/`Result` instead of null/exceptions, excellent error messages, integrated tooling (the cargo model).

**What Lux rejects:** Ownership complexity (Lux uses GC/RC instead), lifetimes, `unsafe`.

### Go (SO #13 by usage, #11 most admired)

| Aspect | Go | Lux |
|--------|-----|-----|
| **Type system** | Structural, simple | HM inference, ADTs |
| **Side effects** | Implicit | Explicit |
| **Error handling** | Multiple returns (`val, err`) | `Result<T, E>` + effects |
| **Formatting** | `gofmt` (opinionated) | `lux fmt` (opinionated) |
| **Tooling** | All-in-one (`go` binary) | All-in-one (`lux` binary) |
| **Concurrency** | Goroutines + channels | `Concurrent` + `Channel` effects |
| **Generics** | Added late, limited | First-class from day one |
| **Shared** | Opinionated formatter, unified tooling, practical philosophy | |

**What Lux learns from Go:** Unified toolchain, opinionated formatting, simplicity as a feature, fast compilation.

**What Lux rejects:** Verbose error handling (`if err != nil`), no ADTs, no generics (historically), nil.

### Java / C# (SO #7 / #8 by usage)

| Aspect | Java/C# | Lux |
|--------|---------|-----|
| **Paradigm** | OOP-first | FP-first |
| **Effects** | DI frameworks (Spring, etc.) | Language-level effects |
| **Testing** | Mock frameworks (Mockito, etc.) | Handler swapping |
| **Null safety** | Nullable (Java), nullable ref types (C#) | `Option<T>` |
| **Boilerplate** | High (getters, setters, factories) | Low (records, inference) |
| **Shared** | Static typing, generics, pattern matching (recent), established ecosystems | |

**What Lux learns from Java/C#:** Enterprise needs (database effects, HTTP, serialization) matter. Testability is a first-class concern.

**What Lux rejects:** OOP ceremony, DI frameworks, null, boilerplate.

### Haskell / OCaml / Elm (FP family)

| Aspect | Haskell | Elm | Lux |
|--------|---------|-----|-----|
| **Effects** | Monads + transformers | Cmd/Sub (Elm Architecture) | Algebraic effects |
| **Learning curve** | Steep | Moderate | Moderate |
| **Error messages** | Improving | Excellent | Good (aspiring to Elm quality) |
| **Practical focus** | Academic-leaning | Web-focused | General-purpose |
| **Syntax** | Unique | Unique | Familiar (C-family feel) |
| **Shared** | Immutability, ADTs, pattern matching, type inference, no null | | |

**What Lux learns from Haskell:** Effects must be explicit. Types must be powerful. Purity matters.

**What Lux learns from Elm:** Error messages should teach. Tooling should be integrated. Simplicity beats power.

**What Lux rejects (from Haskell):** Monad transformers, academic syntax, the steep learning curve.

### Gleam / Elixir (SO #2 / #3 most admired, 2025)

| Aspect | Gleam | Elixir | Lux |
|--------|-------|--------|-----|
| **Type system** | Static, HM | Dynamic | Static, HM |
| **Effects** | No special tracking | Implicit | First-class |
| **Concurrency** | BEAM (built-in) | BEAM (built-in) | Effect-based |
| **Error handling** | `Result` | Pattern matching on tuples | `Result` + `Fail` effect |
| **Shared** | Friendly errors, pipe operator, functional style, immutability | | |

**What Lux learns from Gleam:** A friendly developer experience, clear error messages, and pragmatic FP resonate with developers.

---

## Tooling Philosophy Audit

### Does the linter follow the philosophy?

**Yes, strongly.** The linter embodies "make the important things visible":

- `could-be-pure`: Nudges users toward declaring purity — making guarantees visible
- `could-be-total`: Same for termination
- `unnecessary-effect-decl`: Keeps effect signatures honest — don't claim effects you don't use
- `unused-variable/import/function`: Keeps code focused — everything visible should be meaningful
- `single-arm-match` / `manual-map-option`: Teaches idiomatic patterns

The category system (correctness > suspicious > idiom > style > pedantic) reflects the philosophy of being practical, not academic: real bugs are errors, style preferences are opt-in.

### Does the formatter follow the philosophy?

**Yes, with one gap.** The formatter is opinionated and non-configurable, matching the "one right way" principle. It enforces consistent style across all Lux code.

**Gap:** `max_width` and `trailing_commas` are declared in `FormatConfig` but never used. This is harmless but inconsistent — either remove the unused config or implement line wrapping.

### Does the type checker follow the philosophy?

**Yes.** The type checker embodies every core principle:
- Effects are tracked and verified in function types
- Behavioral properties are checked where possible
- Error messages include context and suggestions
- Type inference reduces ceremony while maintaining safety

---

## What Could Be Improved

### High-value additions (improve experience significantly, low verbosity cost)

1. **Pipe-friendly standard library**
   - Currently: `List.map(myList, fn(x: Int): Int => x * 2)`
   - Better: allow `myList |> List.map(fn(x: Int): Int => x * 2)`
   - Many languages (Elixir, F#, Gleam) make the pipe operator the primary composition tool. If the first argument of stdlib functions is always the data, pipes become natural. This is a **library convention**, not a language change.
   - **LLM impact:** Pipe chains are easier for LLMs to generate and read — linear data flow with no nesting.
   - **Human impact:** Reduces cognitive load. Reading left-to-right matches how humans think about data transformation.

2. **Exhaustive `match` warnings for non-enum types**
   - The linter warns about `wildcard-on-small-enum`, but could also warn when a match on `Option` or `Result` uses a wildcard instead of handling both cases explicitly.
   - **Both audiences:** Prevents subtle bugs where new variants are silently caught by `_`.

3. **Error message improvements toward Elm quality**
   - Current errors show the right information but could be more conversational and suggest fixes more consistently.
   - Example improvement: when a function is called with the wrong argument count, show the expected signature and highlight which argument is wrong.
   - **LLM impact:** Structured error messages with clear "expected X, got Y" patterns are easier for LLMs to parse and fix.
   - **Human impact:** Friendly errors reduce frustration, especially for beginners.

4. **`let ... else` for fallible destructuring**
   - Rust's `let ... else` pattern handles the "unwrap or bail" case elegantly:
     ```lux
     let Some(value) = maybeValue else return defaultValue
     ```
   - Currently this common pattern requires a full `match` expression.
   - **Both audiences:** Reduces boilerplate for the most common Option/Result handling pattern.

5. **Trait/typeclass system for overloading**
   - Currently `toString`, `==`, and similar operations are built-in. A trait system would let users define their own:
     ```lux
     trait Show<T> { fn show(value: T): String }
     impl Show<User> { fn show(u: User): String = "User({u.name})" }
     ```
   - **Note:** This exists partially. Expanding it would enable more generic programming without losing explicitness.
   - **LLM impact:** Traits provide clear, greppable contracts. LLMs can generate trait impls from examples.

### Medium-value additions (good improvements, some verbosity cost)

6. **Named arguments or builder pattern for records**
   - When functions take many parameters, the linter already warns at 5+. Named arguments or record punning would help:
     ```lux
     fn createUser({ name, email, age }: UserConfig): User = ...
     createUser({ name: "Alice", email: "alice@ex.com", age: 30 })
     ```
   - **Trade-off:** Adds syntax, but the linter already pushes users toward records for many params.

7. **Async/concurrent effect sugar**
   - The `Concurrent` effect exists but could benefit from syntactic sugar:
     ```lux
     let (a, b) = concurrent {
       fetch("/api/users"),
       fetch("/api/posts")
     }
     ```
   - **Trade-off:** Adds syntax, but concurrent code is important enough to warrant it.

8. **Module-level documentation with `///` doc comments**
   - The `missing-doc-comment` lint exists, but the doc generation system could be enhanced with richer doc comments that include examples, parameter descriptions, and effect documentation.
   - **LLM impact:** Structured documentation is the single highest-value feature for LLM code understanding.

### Lower-value or risky additions (consider carefully)

9. **Type inference for function return types**
   - Would reduce ceremony: `fn double(x: Int) = x * 2` instead of `fn double(x: Int): Int = x * 2`.
   - **Risk:** Violates the "function signatures are documentation" principle. A body change could silently change the API. The current approach is the right trade-off.

10. **Operator overloading**
    - Tempting for numeric types, but quickly leads to the C++ problem where `+` could mean anything.
    - **Risk:** Violates "make the important things visible" — you can't tell what `a + b` does.
    - **Better:** Keep operators for built-in numeric types. Use named functions for everything else.

11. **Macros**
    - Powerful, but they drastically complicate tooling, error messages, and readability.
    - **Risk:** Rust's macro system is powerful but produces some of the worst error messages in the language.
    - **Better:** Solve specific problems with language features (effects, generics) rather than a general metaprogramming escape hatch.

---

## The LLM Perspective

Lux has several properties that make it unusually well-suited for LLM-assisted programming:
|
||||||
|
|
||||||
|
1. **Effect signatures are machine-readable contracts.** An LLM reading `fn f(): T with {Database, Logger}` knows exactly what capabilities to provide when generating handler code.
|
||||||
|
|
||||||
|
2. **Behavioral properties are verifiable assertions.** `is pure`, `is idempotent` give LLMs clear constraints to check their own output against.
|
||||||
|
|
||||||
|
3. **The opinionated formatter eliminates style ambiguity.** LLMs don't need to guess indentation, brace style, or naming conventions — `lux fmt` handles it.
|
||||||
|
|
||||||
|
4. **Exhaustive pattern matching forces completeness.** LLMs that generate `match` expressions are reminded by the compiler when they miss cases.
|
||||||
|
|
||||||
|
5. **Small, consistent standard library.** `List.map`, `String.split`, `Option.map` — uniform `Module.function` convention is easy to learn from few examples.
|
||||||
|
|
||||||
|
6. **Effect-based testing needs no framework knowledge.** An LLM doesn't need to know Jest, pytest, or JUnit — just swap handlers.
|
||||||
|
|
||||||
|
**What would help LLMs more:**
|
||||||
|
- Structured error output (JSON mode) for programmatic error fixing
|
||||||
|
- Example-rich documentation that LLMs can learn patterns from
|
||||||
|
- A canonical set of "Lux patterns" (like Go's proverbs) that encode best practices in memorable form
|
||||||
|
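To make the first bullet concrete, here is a sketch of what a structured JSON error mode might emit and how a tool could consume it. This is hypothetical: `lux` has no JSON error flag today, and every field name below is an assumption, not an existing schema.

```shell
# Hypothetical sketch: a structured error object a future JSON error mode
# could emit. Field names (code, expected, found, span) are assumptions.
cat > /tmp/lux_err.json <<'EOF'
{
  "code": "E0017",
  "message": "type mismatch",
  "expected": "Int",
  "found": "String",
  "span": { "file": "main.lux", "line": 12, "col": 8 }
}
EOF

# A programmatic fixer can pull out exactly the "expected X, got Y" pair:
jq -r '"\(.span.file):\(.span.line): expected \(.expected), got \(.found)"' /tmp/lux_err.json
# → main.lux:12: expected Int, got String
```

The point is that the machine-facing representation carries the same "expected X, got Y" structure the human-facing renderer would pretty-print.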

---

## Summary

Lux's philosophy can be compressed to five words: **Make the important things visible.**

This manifests as:

- **Effects in types** — see what code does
- **Properties in types** — see what code guarantees
- **Versions in types** — see how data evolves
- **One tool for everything** — see how to build
- **One format for all** — see consistent style

The language sits in the sweet spot between Haskell's rigor and Python's practicality, with Go's tooling philosophy and Elm's developer-experience aspirations. It doesn't try to be everything — it tries to make the things that matter most in real software visible, composable, and verifiable.
46 flake.nix

```diff
@@ -14,6 +14,7 @@
       pkgs = import nixpkgs { inherit system overlays; };
       rustToolchain = pkgs.rust-bin.stable.latest.default.override {
         extensions = [ "rust-src" "rust-analyzer" ];
+        targets = [ "x86_64-unknown-linux-musl" ];
       };
     in
     {
@@ -22,8 +23,8 @@
             rustToolchain
             cargo-watch
             cargo-edit
-            pkg-config
-            openssl
+            # Static builds
+            pkgsStatic.stdenv.cc
             # Benchmark tools
             hyperfine
             poop
@@ -43,7 +44,7 @@
             printf "\n"
             printf " \033[1;35m╦ ╦ ╦╦ ╦\033[0m\n"
             printf " \033[1;35m║ ║ ║╔╣\033[0m\n"
-            printf " \033[1;35m╩═╝╚═╝╩ ╩\033[0m v0.1.0\n"
+            printf " \033[1;35m╩═╝╚═╝╩ ╩\033[0m v0.1.9\n"
             printf "\n"
             printf " Functional language with first-class effects\n"
             printf "\n"
@@ -61,18 +62,47 @@

       packages.default = pkgs.rustPlatform.buildRustPackage {
         pname = "lux";
-        version = "0.1.0";
+        version = "0.1.9";
         src = ./.;
         cargoLock.lockFile = ./Cargo.lock;
-
-        nativeBuildInputs = [ pkgs.pkg-config ];
-        buildInputs = [ pkgs.openssl ];
-
         doCheck = false;
       };

-      # Benchmark scripts
+      packages.static = let
+        muslPkgs = import nixpkgs {
+          inherit system;
+          crossSystem = {
+            config = "x86_64-unknown-linux-musl";
+            isStatic = true;
+          };
+        };
+      in muslPkgs.rustPlatform.buildRustPackage {
+        pname = "lux";
+        version = "0.1.9";
+        src = ./.;
+        cargoLock.lockFile = ./Cargo.lock;
+
+        CARGO_BUILD_TARGET = "x86_64-unknown-linux-musl";
+        CARGO_BUILD_RUSTFLAGS = "-C target-feature=+crt-static";
+
+        doCheck = false;
+
+        postInstall = ''
+          $STRIP $out/bin/lux 2>/dev/null || true
+        '';
+      };

       apps = {
+        # Release automation
+        release = {
+          type = "app";
+          program = toString (pkgs.writeShellScript "lux-release" ''
+            exec ${self}/scripts/release.sh "$@"
+          '');
+        };
+
+        # Benchmark scripts
         # Run hyperfine benchmark comparison
         bench = {
           type = "app";
```
225 projects/lux-compiler/ast.lux Normal file

@@ -0,0 +1,225 @@

```lux
// Lux AST — Self-hosted Abstract Syntax Tree definitions
//
// Direct translation of src/ast.rs into Lux ADTs.
// These types represent the parsed structure of a Lux program.
//
// Naming conventions to avoid collisions:
//   Ex = Expr variant, Pat = Pattern, Te = TypeExpr
//   Td = TypeDef, Vf = VariantFields, Op = Operator
//   Decl = Declaration, St = Statement

// === Source Location ===

type Span = | Span(Int, Int)

// === Identifiers ===

type Ident = | Ident(String, Span)

// === Visibility ===

type Visibility = | Public | Private

// === Schema Evolution ===

type Version = | Version(Int, Span)

type VersionConstraint =
  | VcExact(Version)
  | VcAtLeast(Version)
  | VcLatest(Span)

// === Behavioral Types ===

type BehavioralProperty =
  | BpPure
  | BpTotal
  | BpIdempotent
  | BpDeterministic
  | BpCommutative

// === Trait Bound (needed before WhereClause) ===

type TraitBound = | TraitBound(Ident, List<TypeExpr>, Span)

// === Trait Constraint (needed before WhereClause) ===

type TraitConstraint = | TraitConstraint(Ident, List<TraitBound>, Span)

// === Where Clauses ===

type WhereClause =
  | WcProperty(Ident, BehavioralProperty, Span)
  | WcResult(Expr, Span)
  | WcTrait(TraitConstraint)

// === Module Path ===

type ModulePath = | ModulePath(List<Ident>, Span)

// === Import ===

// path, alias, items, wildcard, span
type ImportDecl = | ImportDecl(ModulePath, Option<Ident>, Option<List<Ident>>, Bool, Span)

// === Program ===

type Program = | Program(List<ImportDecl>, List<Declaration>)

// === Declarations ===

type Declaration =
  | DeclFunction(FunctionDecl)
  | DeclEffect(EffectDecl)
  | DeclType(TypeDecl)
  | DeclHandler(HandlerDecl)
  | DeclLet(LetDecl)
  | DeclTrait(TraitDecl)
  | DeclImpl(ImplDecl)

// === Parameter ===

type Parameter = | Parameter(Ident, TypeExpr, Span)

// === Effect Operation ===

type EffectOp = | EffectOp(Ident, List<Parameter>, TypeExpr, Span)

// === Record Field ===

type RecordField = | RecordField(Ident, TypeExpr, Span)

// === Variant Fields ===

type VariantFields =
  | VfUnit
  | VfTuple(List<TypeExpr>)
  | VfRecord(List<RecordField>)

// === Variant ===

type Variant = | Variant(Ident, VariantFields, Span)

// === Migration ===

type Migration = | Migration(Version, Expr, Span)

// === Handler Impl ===

// op_name, params, resume, body, span
type HandlerImpl = | HandlerImpl(Ident, List<Ident>, Option<Ident>, Expr, Span)

// === Impl Method ===

// name, params, return_type, body, span
type ImplMethod = | ImplMethod(Ident, List<Parameter>, Option<TypeExpr>, Expr, Span)

// === Trait Method ===

// name, type_params, params, return_type, default_impl, span
type TraitMethod = | TraitMethod(Ident, List<Ident>, List<Parameter>, TypeExpr, Option<Expr>, Span)

// === Type Expressions ===

type TypeExpr =
  | TeNamed(Ident)
  | TeApp(TypeExpr, List<TypeExpr>)
  | TeFunction(List<TypeExpr>, TypeExpr, List<Ident>)
  | TeTuple(List<TypeExpr>)
  | TeRecord(List<RecordField>)
  | TeUnit
  | TeVersioned(TypeExpr, VersionConstraint)

// === Literal ===

type LiteralKind =
  | LitInt(Int)
  | LitFloat(String)
  | LitString(String)
  | LitChar(Char)
  | LitBool(Bool)
  | LitUnit

type Literal = | Literal(LiteralKind, Span)

// === Binary Operators ===

type BinaryOp =
  | OpAdd | OpSub | OpMul | OpDiv | OpMod
  | OpEq | OpNe | OpLt | OpLe | OpGt | OpGe
  | OpAnd | OpOr
  | OpPipe | OpConcat

// === Unary Operators ===

type UnaryOp = | OpNeg | OpNot

// === Statements ===

type Statement =
  | StExpr(Expr)
  | StLet(Ident, Option<TypeExpr>, Expr, Span)

// === Match Arms ===

type MatchArm = | MatchArm(Pattern, Option<Expr>, Expr, Span)

// === Patterns ===

type Pattern =
  | PatWildcard(Span)
  | PatVar(Ident)
  | PatLiteral(Literal)
  | PatConstructor(Ident, List<Pattern>, Span)
  | PatRecord(List<(Ident, Pattern)>, Span)
  | PatTuple(List<Pattern>, Span)

// === Function Declaration ===
// visibility, doc, name, type_params, params, return_type, effects, properties, where_clauses, body, span
type FunctionDecl = | FunctionDecl(Visibility, Option<String>, Ident, List<Ident>, List<Parameter>, TypeExpr, List<Ident>, List<BehavioralProperty>, List<WhereClause>, Expr, Span)

// === Effect Declaration ===
// doc, name, type_params, operations, span
type EffectDecl = | EffectDecl(Option<String>, Ident, List<Ident>, List<EffectOp>, Span)

// === Type Declaration ===
// visibility, doc, name, type_params, version, definition, migrations, span
type TypeDecl = | TypeDecl(Visibility, Option<String>, Ident, List<Ident>, Option<Version>, TypeDef, List<Migration>, Span)

// === Handler Declaration ===
// name, params, effect, implementations, span
type HandlerDecl = | HandlerDecl(Ident, List<Parameter>, Ident, List<HandlerImpl>, Span)

// === Let Declaration ===
// visibility, doc, name, typ, value, span
type LetDecl = | LetDecl(Visibility, Option<String>, Ident, Option<TypeExpr>, Expr, Span)

// === Trait Declaration ===
// visibility, doc, name, type_params, super_traits, methods, span
type TraitDecl = | TraitDecl(Visibility, Option<String>, Ident, List<Ident>, List<TraitBound>, List<TraitMethod>, Span)

// === Impl Declaration ===
// type_params, constraints, trait_name, trait_args, target_type, methods, span
type ImplDecl = | ImplDecl(List<Ident>, List<TraitConstraint>, Ident, List<TypeExpr>, TypeExpr, List<ImplMethod>, Span)

// === Expressions ===

type Expr =
  | ExLiteral(Literal)
  | ExVar(Ident)
  | ExBinaryOp(BinaryOp, Expr, Expr, Span)
  | ExUnaryOp(UnaryOp, Expr, Span)
  | ExCall(Expr, List<Expr>, Span)
  | ExEffectOp(Ident, Ident, List<Expr>, Span)
  | ExField(Expr, Ident, Span)
  | ExTupleIndex(Expr, Int, Span)
  | ExLambda(List<Parameter>, Option<TypeExpr>, List<Ident>, Expr, Span)
  | ExLet(Ident, Option<TypeExpr>, Expr, Expr, Span)
  | ExIf(Expr, Expr, Expr, Span)
  | ExMatch(Expr, List<MatchArm>, Span)
  | ExBlock(List<Statement>, Expr, Span)
  | ExRecord(Option<Expr>, List<(Ident, Expr)>, Span)
  | ExTuple(List<Expr>, Span)
  | ExList(List<Expr>, Span)
  | ExRun(Expr, List<(Ident, Expr)>, Span)
  | ExResume(Expr, Span)
```
213 scripts/release.sh Executable file

@@ -0,0 +1,213 @@

```bash
#!/usr/bin/env bash
set -euo pipefail

# Lux Release Script
# Builds a static binary, generates changelog, and creates a Gitea release.
#
# Usage:
#   ./scripts/release.sh          # auto-bump patch (0.2.0 → 0.2.1)
#   ./scripts/release.sh patch    # same as above
#   ./scripts/release.sh minor    # bump minor (0.2.0 → 0.3.0)
#   ./scripts/release.sh major    # bump major (0.2.0 → 1.0.0)
#   ./scripts/release.sh v1.2.3   # explicit version
#
# Environment:
#   GITEA_TOKEN - API token for git.qrty.ink (prompted if not set)
#   GITEA_URL   - Gitea instance URL (default: https://git.qrty.ink)

# cd to repo root (directory containing this script's parent)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."

GITEA_URL="${GITEA_URL:-https://git.qrty.ink}"
REPO_OWNER="blu"
REPO_NAME="lux"
API_BASE="$GITEA_URL/api/v1"

# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'

info() { printf "${CYAN}::${NC} %s\n" "$1"; }
ok()   { printf "${GREEN}ok${NC} %s\n" "$1"; }
warn() { printf "${YELLOW}!!${NC} %s\n" "$1"; }
err()  { printf "${RED}error:${NC} %s\n" "$1" >&2; exit 1; }

# --- Determine version ---
CURRENT=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
BUMP="${1:-patch}"

bump_version() {
  local ver="$1" part="$2"
  IFS='.' read -r major minor patch <<< "$ver"
  case "$part" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "$major.$((minor + 1)).0" ;;
    patch) echo "$major.$minor.$((patch + 1))" ;;
    *) echo "$part" ;;  # treat as explicit version
  esac
}

case "$BUMP" in
  major|minor|patch)
    VERSION=$(bump_version "$CURRENT" "$BUMP")
    info "Bumping $BUMP: $CURRENT → $VERSION"
    ;;
  *)
    # Explicit version — strip v prefix if present
    VERSION="${BUMP#v}"
    info "Explicit version: $VERSION"
    ;;
esac

TAG="v$VERSION"

# --- Check for clean working tree ---
if [ -n "$(git status --porcelain)" ]; then
  warn "Working tree has uncommitted changes:"
  git status --short
  printf "\n"
  read -rp "Continue anyway? [y/N] " confirm
  [[ "$confirm" =~ ^[Yy]$ ]] || exit 1
fi

# --- Check if tag already exists ---
if git rev-parse "$TAG" >/dev/null 2>&1; then
  err "Tag $TAG already exists. Choose a different version."
fi

# --- Update version in source files ---
if [ "$VERSION" != "$CURRENT" ]; then
  info "Updating version in Cargo.toml and flake.nix..."
  sed -i "0,/^version = \"$CURRENT\"/s//version = \"$VERSION\"/" Cargo.toml
  sed -i "s/version = \"$CURRENT\";/version = \"$VERSION\";/g" flake.nix
  sed -i "s/v$CURRENT/v$VERSION/g" flake.nix
  git add Cargo.toml flake.nix
  git commit --no-gpg-sign -m "chore: bump version to $VERSION"
  ok "Version updated and committed"
fi

# --- Generate changelog ---
info "Generating changelog..."
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "")
if [ -n "$LAST_TAG" ]; then
  RANGE="$LAST_TAG..HEAD"
  info "Changes since $LAST_TAG:"
else
  RANGE="HEAD"
  info "First release — summarizing recent commits:"
fi

CHANGELOG=$(git log "$RANGE" --pretty=format:"- %s" --no-merges 2>/dev/null | head -50 || true)
if [ -z "$CHANGELOG" ]; then
  CHANGELOG="- Initial release"
fi

# --- Build static binary ---
info "Building static binary (nix build .#static)..."
nix build .#static
BINARY="result/bin/lux"
if [ ! -f "$BINARY" ]; then
  err "Static binary not found at $BINARY"
fi

BINARY_SIZE=$(ls -lh "$BINARY" | awk '{print $5}')
BINARY_TYPE=$(file "$BINARY" | sed 's/.*: //')
ok "Binary: $BINARY_SIZE, $BINARY_TYPE"

# --- Prepare release artifact ---
ARTIFACT="/tmp/lux-${TAG}-linux-x86_64"
cp "$BINARY" "$ARTIFACT"
chmod +x "$ARTIFACT"

# --- Show release summary ---
printf "\n"
printf "${BOLD}═══ Release Summary ═══${NC}\n"
printf "\n"
printf "  ${BOLD}Tag:${NC}     %s\n" "$TAG"
printf "  ${BOLD}Binary:${NC}  %s (%s)\n" "lux-${TAG}-linux-x86_64" "$BINARY_SIZE"
printf "  ${BOLD}Commit:${NC}  %s\n" "$(git rev-parse --short HEAD)"
printf "\n"
printf "${BOLD}Changelog:${NC}\n"
printf "%s\n" "$CHANGELOG"
printf "\n"

# --- Confirm ---
read -rp "Create release $TAG? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || { info "Aborted."; exit 0; }

# --- Get Gitea token ---
if [ -z "${GITEA_TOKEN:-}" ]; then
  printf "\n"
  info "Gitea API token required (create at $GITEA_URL/user/settings/applications)"
  read -rsp "Token: " GITEA_TOKEN
  printf "\n"
fi

if [ -z "$GITEA_TOKEN" ]; then
  err "No token provided"
fi

# --- Create and push tag ---
info "Creating tag $TAG..."
git tag -a "$TAG" -m "Release $TAG" --no-sign
ok "Tag created"

info "Pushing tag to origin..."
git push origin "$TAG"
ok "Tag pushed"

# --- Create Gitea release ---
info "Creating release on Gitea..."

RELEASE_BODY=$(printf "## Lux %s\n\n### Changes\n\n%s\n\n### Installation\n\n\`\`\`bash\ncurl -Lo lux %s/%s/%s/releases/download/%s/lux-linux-x86_64\nchmod +x lux\n./lux --version\n\`\`\`" \
  "$TAG" "$CHANGELOG" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG")

RELEASE_JSON=$(jq -n \
  --arg tag "$TAG" \
  --arg name "Lux $TAG" \
  --arg body "$RELEASE_BODY" \
  '{tag_name: $tag, name: $name, body: $body, draft: false, prerelease: false}')

RELEASE_RESPONSE=$(curl -s -X POST \
  "$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/json" \
  -d "$RELEASE_JSON")

RELEASE_ID=$(echo "$RELEASE_RESPONSE" | jq -r '.id // empty')
if [ -z "$RELEASE_ID" ]; then
  echo "$RELEASE_RESPONSE" | jq . 2>/dev/null || echo "$RELEASE_RESPONSE"
  err "Failed to create release"
fi
ok "Release created (id: $RELEASE_ID)"

# --- Upload binary ---
info "Uploading binary..."
UPLOAD_RESPONSE=$(curl -s -X POST \
  "$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases/$RELEASE_ID/assets?name=lux-linux-x86_64" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary "@$ARTIFACT")

ASSET_NAME=$(echo "$UPLOAD_RESPONSE" | jq -r '.name // empty')
if [ -z "$ASSET_NAME" ]; then
  echo "$UPLOAD_RESPONSE" | jq . 2>/dev/null || echo "$UPLOAD_RESPONSE"
  err "Failed to upload binary"
fi
ok "Binary uploaded: $ASSET_NAME"

# --- Done ---
printf "\n"
printf "${GREEN}${BOLD}Release $TAG published!${NC}\n"
printf "\n"
printf "  ${BOLD}URL:${NC}      %s/%s/%s/releases/tag/%s\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "  ${BOLD}Download:${NC} %s/%s/%s/releases/download/%s/lux-linux-x86_64\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "\n"

# Cleanup
rm -f "$ARTIFACT"
```
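The semver bump logic in `scripts/release.sh` is a small pure function, so it is easy to sanity-check in isolation. A minimal harness, reproducing `bump_version` from the script verbatim:

```shell
# Copy of bump_version from scripts/release.sh, exercised standalone.
bump_version() {
  local ver="$1" part="$2"
  IFS='.' read -r major minor patch <<< "$ver"
  case "$part" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "$major.$((minor + 1)).0" ;;
    patch) echo "$major.$minor.$((patch + 1))" ;;
    *) echo "$part" ;;  # anything else passes through unchanged
  esac
}

bump_version 0.2.0 patch   # → 0.2.1
bump_version 0.2.0 minor   # → 0.3.0
bump_version 0.2.0 major   # → 1.0.0
bump_version 0.2.0 v1.2.3  # → v1.2.3 (the script's case statement, not
                           #   bump_version, strips the leading v)
```

Note the fall-through branch echoes its argument as-is; in the script, explicit versions take the `*)` branch of the outer `case` and have their `v` prefix stripped before tagging.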
211 scripts/validate.sh Executable file

@@ -0,0 +1,211 @@

```bash
#!/usr/bin/env bash
set -euo pipefail

# Lux Full Validation Script
# Runs all checks: Rust tests, package tests, type checking, example compilation.
# Run after every committable change to ensure no regressions.

# cd to repo root (directory containing this script's parent)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."

LUX="$(pwd)/target/release/lux"
PACKAGES_DIR="$(pwd)/../packages"
PROJECTS_DIR="$(pwd)/projects"
EXAMPLES_DIR="$(pwd)/examples"

RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'

FAILED=0
TOTAL=0

step() {
  TOTAL=$((TOTAL + 1))
  printf "${CYAN}[%d]${NC} %s... " "$TOTAL" "$1"
}

ok()   { printf "${GREEN}ok${NC} %s\n" "${1:-}"; }
fail() { printf "${RED}FAIL${NC} %s\n" "${1:-}"; FAILED=$((FAILED + 1)); }

# --- Rust checks ---
step "cargo check"
if nix develop --command cargo check 2>/dev/null; then ok; else fail; fi

step "cargo test"
OUTPUT=$(nix develop --command cargo test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "test result:" || echo "no result")
if echo "$RESULT" | grep -q "0 failed"; then ok "$RESULT"; else fail "$RESULT"; fi

# --- Build release binary ---
step "cargo build --release"
if nix develop --command cargo build --release 2>/dev/null; then ok; else fail; fi

# --- Package tests ---
for pkg in path frontmatter xml rss markdown; do
  PKG_DIR="$PACKAGES_DIR/$pkg"
  if [ -d "$PKG_DIR" ]; then
    step "lux test ($pkg)"
    OUTPUT=$(cd "$PKG_DIR" && "$LUX" test 2>&1 || true)
    RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
    if echo "$RESULT" | grep -q "passed"; then ok "$RESULT"; else fail "$RESULT"; fi
  fi
done

# --- Lux check on packages ---
for pkg in path frontmatter xml rss markdown; do
  PKG_DIR="$PACKAGES_DIR/$pkg"
  if [ -d "$PKG_DIR" ]; then
    step "lux check ($pkg)"
    OUTPUT=$(cd "$PKG_DIR" && "$LUX" check 2>&1 || true)
    RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
    if echo "$RESULT" | grep -q "passed"; then ok; else fail "$RESULT"; fi
  fi
done

# --- Project checks ---
for proj_dir in "$PROJECTS_DIR"/*/; do
  proj=$(basename "$proj_dir")
  if [ -f "$proj_dir/main.lux" ]; then
    step "lux check (project: $proj)"
    OUTPUT=$("$LUX" check "$proj_dir/main.lux" 2>&1 || true)
    if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
  fi
  # Check any standalone .lux files in the project
  for lux_file in "$proj_dir"/*.lux; do
    [ -f "$lux_file" ] || continue
    fname=$(basename "$lux_file")
    [ "$fname" = "main.lux" ] && continue
    step "lux check (project: $proj/$fname)"
    OUTPUT=$("$LUX" check "$lux_file" 2>&1 || true)
    if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
  done
done

# === Compilation & Interpreter Checks ===

# --- Interpreter: examples ---
# Skip: http_api, http, http_router, http_server (network), postgres_demo (db),
# random, property_testing (Random effect), shell (Process), json (File I/O),
# file_io (File I/O), test_math, test_lists (Test effect), stress_shared_rc,
# test_rc_comparison (internal tests), modules/* (need cwd)
INTERP_SKIP="http_api http http_router http_server postgres_demo random property_testing shell json file_io test_math test_lists stress_shared_rc test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
  name=$(basename "$f" .lux)
  skip=false
  for s in $INTERP_SKIP; do [ "$name" = "$s" ] && skip=true; done
  $skip && continue
  step "interpreter (examples/$name)"
  if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done

# --- Interpreter: examples/standard ---
# Skip: guessing_game (reads stdin)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "guessing_game" ] && continue
  step "interpreter (standard/$name)"
  if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done

# --- Interpreter: examples/showcase ---
# Skip: task_manager (parse error in current version)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "task_manager" ] && continue
  step "interpreter (showcase/$name)"
  if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done

# --- Interpreter: projects ---
# Skip: guessing-game (Random), rest-api (HttpServer)
PROJ_INTERP_SKIP="guessing-game rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
  proj=$(basename "$proj_dir")
  [ -f "$proj_dir/main.lux" ] || continue
  skip=false
  for s in $PROJ_INTERP_SKIP; do [ "$proj" = "$s" ] && skip=true; done
  $skip && continue
  step "interpreter (project: $proj)"
  if timeout 10 "$LUX" "$proj_dir/main.lux" >/dev/null 2>&1; then ok; else fail; fi
done

# --- JS compilation: examples ---
# Skip files that fail JS compilation (unsupported features)
JS_SKIP="http_api http http_router postgres_demo property_testing json test_lists test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
  name=$(basename "$f" .lux)
  skip=false
  for s in $JS_SKIP; do [ "$name" = "$s" ] && skip=true; done
  $skip && continue
  step "compile JS (examples/$name)"
  if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done

# --- JS compilation: examples/standard ---
# Skip: stdlib_demo (uses String.toUpper not in JS backend)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "stdlib_demo" ] && continue
  step "compile JS (standard/$name)"
  if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done

# --- JS compilation: examples/showcase ---
# Skip: task_manager (unsupported features)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "task_manager" ] && continue
  step "compile JS (showcase/$name)"
  if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done

# --- JS compilation: projects ---
# Skip: json-parser, rest-api (unsupported features)
JS_PROJ_SKIP="json-parser rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
  proj=$(basename "$proj_dir")
  [ -f "$proj_dir/main.lux" ] || continue
  skip=false
  for s in $JS_PROJ_SKIP; do [ "$proj" = "$s" ] && skip=true; done
  $skip && continue
  step "compile JS (project: $proj)"
  if "$LUX" compile "$proj_dir/main.lux" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done

# --- C compilation: examples ---
# Only compile examples known to work with C backend
C_EXAMPLES="hello factorial pipelines tailcall jit_test"
for name in $C_EXAMPLES; do
  f="$EXAMPLES_DIR/$name.lux"
  [ -f "$f" ] || continue
  step "compile C (examples/$name)"
  if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done

# --- C compilation: examples/standard ---
C_STD_EXAMPLES="hello_world factorial fizzbuzz primes guessing_game"
for name in $C_STD_EXAMPLES; do
  f="$EXAMPLES_DIR/standard/$name.lux"
  [ -f "$f" ] || continue
  step "compile C (standard/$name)"
  if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done

# --- Cleanup ---
rm -f /tmp/lux_validate.js /tmp/lux_validate_bin

# --- Summary ---
printf "\n${BOLD}═══ Validation Summary ═══${NC}\n"
if [ $FAILED -eq 0 ]; then
  printf "${GREEN}All %d checks passed.${NC}\n" "$TOTAL"
```
|
else
|
||||||
|
printf "${RED}%d/%d checks failed.${NC}\n" "$FAILED" "$TOTAL"
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
33 src/ast.rs
@@ -221,6 +221,8 @@ pub enum Declaration {
     Trait(TraitDecl),
     /// Trait implementation: impl Trait for Type { ... }
     Impl(ImplDecl),
+    /// Extern function declaration (FFI): extern fn name(params): ReturnType
+    ExternFn(ExternFnDecl),
 }

 /// Function declaration
@@ -428,6 +430,21 @@ pub struct ImplMethod {
     pub span: Span,
 }

+/// Extern function declaration (FFI)
+#[derive(Debug, Clone)]
+pub struct ExternFnDecl {
+    pub visibility: Visibility,
+    /// Documentation comment
+    pub doc: Option<String>,
+    pub name: Ident,
+    pub type_params: Vec<Ident>,
+    pub params: Vec<Parameter>,
+    pub return_type: TypeExpr,
+    /// Optional JS name override: extern fn foo(...): T = "jsFoo"
+    pub js_name: Option<String>,
+    pub span: Span,
+}
+
 /// Type expressions
 #[derive(Debug, Clone, PartialEq, Eq)]
 pub enum TypeExpr {
@@ -499,6 +516,12 @@ pub enum Expr {
         field: Ident,
         span: Span,
     },
+    /// Tuple index access: tuple.0, tuple.1
+    TupleIndex {
+        object: Box<Expr>,
+        index: usize,
+        span: Span,
+    },
     /// Lambda: fn(x, y) => x + y or fn(x: Int): Int => x + 1
     Lambda {
         params: Vec<Parameter>,
@@ -535,7 +558,9 @@ pub enum Expr {
         span: Span,
     },
     /// Record literal: { name: "Alice", age: 30 }
+    /// With optional spread: { ...base, name: "Bob" }
     Record {
+        spread: Option<Box<Expr>>,
         fields: Vec<(Ident, Expr)>,
         span: Span,
     },
@@ -563,6 +588,7 @@ impl Expr {
            Expr::Call { span, .. } => *span,
            Expr::EffectOp { span, .. } => *span,
            Expr::Field { span, .. } => *span,
+            Expr::TupleIndex { span, .. } => *span,
            Expr::Lambda { span, .. } => *span,
            Expr::Let { span, .. } => *span,
            Expr::If { span, .. } => *span,
@@ -614,7 +640,8 @@ pub enum BinaryOp {
     And,
     Or,
     // Other
     Pipe,   // |>
+    Concat, // ++
 }

 impl fmt::Display for BinaryOp {
@@ -634,6 +661,7 @@ impl fmt::Display for BinaryOp {
             BinaryOp::And => write!(f, "&&"),
             BinaryOp::Or => write!(f, "||"),
             BinaryOp::Pipe => write!(f, "|>"),
+            BinaryOp::Concat => write!(f, "++"),
         }
     }
 }
@@ -686,8 +714,9 @@ pub enum Pattern {
     Var(Ident),
     /// Literal: 42, "hello", true
     Literal(Literal),
-    /// Constructor: Some(x), None, Ok(v)
+    /// Constructor: Some(x), None, Ok(v), module.Constructor(x)
     Constructor {
+        module: Option<Ident>,
         name: Ident,
         fields: Vec<Pattern>,
         span: Span,
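The `Concat` variant above gives `++` its own AST node instead of overloading `Add`. In the JS backend touched by this same compare, `++` is lowered to string `+` when either side is a string and to `Array.prototype.concat` for lists. A small sketch of the two lowered forms as plain JavaScript (the sample values are illustrative; `.concat` returns a new array rather than mutating either operand):

```javascript
// "foo" ++ "bar" — either side is a string, so the backend emits `+`
const s = ("foo" + "bar");

// [1, 2] ++ [3] — list concatenation is emitted as .concat()
const xs = [1, 2].concat([3]);

console.log(s);  // foobar
console.log(xs); // [ 1, 2, 3 ]
```
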
(File diff suppressed because it is too large)
@@ -69,6 +69,10 @@ pub struct JsBackend {
     has_handlers: bool,
     /// Variable substitutions for let binding
     var_substitutions: HashMap<String, String>,
+    /// Effects actually used in the program (for tree-shaking runtime)
+    used_effects: HashSet<String>,
+    /// Extern function names mapped to their JS names
+    extern_fns: HashMap<String, String>,
 }

 impl JsBackend {
@@ -90,6 +94,8 @@ impl JsBackend {
             effectful_functions: HashSet::new(),
             has_handlers: false,
             var_substitutions: HashMap::new(),
+            used_effects: HashSet::new(),
+            extern_fns: HashMap::new(),
         }
     }

@@ -97,9 +103,6 @@ impl JsBackend {
     pub fn generate(&mut self, program: &Program) -> Result<String, JsGenError> {
         self.output.clear();

-        // Emit runtime helpers
-        self.emit_runtime();
-
         // First pass: collect all function names, types, and effects
         for decl in &program.declarations {
             match decl {
@@ -112,10 +115,24 @@ impl JsBackend {
                 Declaration::Type(t) => {
                     self.collect_type(t)?;
                 }
+                Declaration::ExternFn(ext) => {
+                    let js_name = ext
+                        .js_name
+                        .clone()
+                        .unwrap_or_else(|| ext.name.name.clone());
+                    self.extern_fns.insert(ext.name.name.clone(), js_name);
+                    self.functions.insert(ext.name.name.clone());
+                }
                 _ => {}
             }
         }

+        // Collect used effects for tree-shaking
+        self.collect_used_effects(program);
+
+        // Emit runtime helpers (tree-shaken based on used effects)
+        self.emit_runtime();
+
         // Emit type constructors
         for decl in &program.declarations {
             if let Declaration::Type(t) = decl {
@@ -163,32 +180,181 @@ impl JsBackend {
         Ok(self.output.clone())
     }

-    /// Emit the minimal Lux runtime
+    /// Collect all effects used in the program for runtime tree-shaking
+    fn collect_used_effects(&mut self, program: &Program) {
+        for decl in &program.declarations {
+            match decl {
+                Declaration::Function(f) => {
+                    for effect in &f.effects {
+                        self.used_effects.insert(effect.name.clone());
+                    }
+                    self.collect_effects_from_expr(&f.body);
+                }
+                Declaration::Let(l) => {
+                    self.collect_effects_from_expr(&l.value);
+                }
+                Declaration::Handler(h) => {
+                    self.used_effects.insert(h.effect.name.clone());
+                    for imp in &h.implementations {
+                        self.collect_effects_from_expr(&imp.body);
+                    }
+                }
+                _ => {}
+            }
+        }
+    }
+
+    /// Recursively collect effect names from an expression
+    fn collect_effects_from_expr(&mut self, expr: &Expr) {
+        match expr {
+            Expr::EffectOp { effect, args, .. } => {
+                self.used_effects.insert(effect.name.clone());
+                for arg in args {
+                    self.collect_effects_from_expr(arg);
+                }
+            }
+            Expr::Run { expr, handlers, .. } => {
+                self.collect_effects_from_expr(expr);
+                for (effect, handler) in handlers {
+                    self.used_effects.insert(effect.name.clone());
+                    self.collect_effects_from_expr(handler);
+                }
+            }
+            Expr::Call { func, args, .. } => {
+                self.collect_effects_from_expr(func);
+                for arg in args {
+                    self.collect_effects_from_expr(arg);
+                }
+            }
+            Expr::Lambda { body, effects, .. } => {
+                for effect in effects {
+                    self.used_effects.insert(effect.name.clone());
+                }
+                self.collect_effects_from_expr(body);
+            }
+            Expr::Let { value, body, .. } => {
+                self.collect_effects_from_expr(value);
+                self.collect_effects_from_expr(body);
+            }
+            Expr::If { condition, then_branch, else_branch, .. } => {
+                self.collect_effects_from_expr(condition);
+                self.collect_effects_from_expr(then_branch);
+                self.collect_effects_from_expr(else_branch);
+            }
+            Expr::Match { scrutinee, arms, .. } => {
+                self.collect_effects_from_expr(scrutinee);
+                for arm in arms {
+                    self.collect_effects_from_expr(&arm.body);
+                    if let Some(guard) = &arm.guard {
+                        self.collect_effects_from_expr(guard);
+                    }
+                }
+            }
+            Expr::Block { statements, result, .. } => {
+                for stmt in statements {
+                    match stmt {
+                        Statement::Expr(e) => self.collect_effects_from_expr(e),
+                        Statement::Let { value, .. } => self.collect_effects_from_expr(value),
+                    }
+                }
+                self.collect_effects_from_expr(result);
+            }
+            Expr::BinaryOp { left, right, .. } => {
+                self.collect_effects_from_expr(left);
+                self.collect_effects_from_expr(right);
+            }
+            Expr::UnaryOp { operand, .. } => {
+                self.collect_effects_from_expr(operand);
+            }
+            Expr::Field { object, .. } => {
+                self.collect_effects_from_expr(object);
+            }
+            Expr::TupleIndex { object, .. } => {
+                self.collect_effects_from_expr(object);
+            }
+            Expr::Record { spread, fields, .. } => {
+                if let Some(s) = spread {
+                    self.collect_effects_from_expr(s);
+                }
+                for (_, expr) in fields {
+                    self.collect_effects_from_expr(expr);
+                }
+            }
+            Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
+                for el in elements {
+                    self.collect_effects_from_expr(el);
+                }
+            }
+            Expr::Resume { value, .. } => {
+                self.collect_effects_from_expr(value);
+            }
+            Expr::Literal(_) | Expr::Var(_) => {}
+        }
+    }
+
+    /// Emit the Lux runtime, tree-shaken based on used effects
     fn emit_runtime(&mut self) {
+        let uses_console = self.used_effects.contains("Console");
+        let uses_random = self.used_effects.contains("Random");
+        let uses_time = self.used_effects.contains("Time");
+        let uses_http = self.used_effects.contains("Http");
+        let uses_dom = self.used_effects.contains("Dom");
+        let uses_html = self.used_effects.contains("Html") || uses_dom;
+
         self.writeln("// Lux Runtime");
         self.writeln("const Lux = {");
         self.indent += 1;

-        // Option helpers
+        // Core helpers — always emitted
         self.writeln("Some: (value) => ({ tag: \"Some\", value }),");
         self.writeln("None: () => ({ tag: \"None\" }),");
         self.writeln("");

-        // Result helpers
         self.writeln("Ok: (value) => ({ tag: \"Ok\", value }),");
         self.writeln("Err: (error) => ({ tag: \"Err\", error }),");
         self.writeln("");

-        // List helpers
         self.writeln("Cons: (head, tail) => [head, ...tail],");
         self.writeln("Nil: () => [],");
         self.writeln("");

-        // Default handlers for effects
+        // Default handlers — only include effects that are used
         self.writeln("defaultHandlers: {");
         self.indent += 1;

-        // Console effect
+        if uses_console {
+            self.emit_console_handler();
+        }
+        if uses_random {
+            self.emit_random_handler();
+        }
+        if uses_time {
+            self.emit_time_handler();
+        }
+        if uses_http {
+            self.emit_http_handler();
+        }
+        if uses_dom {
+            self.emit_dom_handler();
+        }
+
+        self.indent -= 1;
+        self.writeln("},");
+
+        // HTML rendering — only if Html or Dom effects are used
+        if uses_html {
+            self.emit_html_helpers();
+        }
+
+        // TEA runtime — only if Dom is used
+        if uses_dom {
+            self.emit_tea_runtime();
+        }
+
+        self.indent -= 1;
+        self.writeln("};");
+        self.writeln("");
+    }
+
+    fn emit_console_handler(&mut self) {
         self.writeln("Console: {");
         self.indent += 1;
         self.writeln("print: (msg) => console.log(msg),");
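For reference, a hand-written sketch of what the tree-shaken runtime reduces to for a program that only uses the `Console` effect. The constructor shapes are copied from the `writeln` strings above; `getOrDefault` is an illustrative consumer, not part of the emitted code:

```javascript
const Lux = {
  // Core helpers (always emitted): Option/Result as tagged objects, lists as arrays
  Some: (value) => ({ tag: "Some", value }),
  None: () => ({ tag: "None" }),
  Ok: (value) => ({ tag: "Ok", value }),
  Err: (error) => ({ tag: "Err", error }),
  Cons: (head, tail) => [head, ...tail],
  Nil: () => [],
  // Only the used effect's default handler survives tree-shaking
  defaultHandlers: {
    Console: { print: (msg) => console.log(msg) },
  },
};

// A Lux match on Option compiles down to a tag check like this:
const getOrDefault = (opt, dflt) => (opt.tag === "Some" ? opt.value : dflt);

Lux.defaultHandlers.Console.print(getOrDefault(Lux.Some("hi"), "fallback")); // hi
```
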
@@ -207,8 +373,9 @@ impl JsBackend {
|
|||||||
self.writeln("readInt: () => parseInt(Lux.defaultHandlers.Console.readLine(), 10)");
|
self.writeln("readInt: () => parseInt(Lux.defaultHandlers.Console.readLine(), 10)");
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
}
|
||||||
|
|
||||||
// Random effect
|
fn emit_random_handler(&mut self) {
|
||||||
self.writeln("Random: {");
|
self.writeln("Random: {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("int: (min, max) => Math.floor(Math.random() * (max - min + 1)) + min,");
|
self.writeln("int: (min, max) => Math.floor(Math.random() * (max - min + 1)) + min,");
|
||||||
@@ -216,16 +383,18 @@ impl JsBackend {
|
|||||||
self.writeln("float: () => Math.random()");
|
self.writeln("float: () => Math.random()");
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
}
|
||||||
|
|
||||||
// Time effect
|
fn emit_time_handler(&mut self) {
|
||||||
self.writeln("Time: {");
|
self.writeln("Time: {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("now: () => Date.now(),");
|
self.writeln("now: () => Date.now(),");
|
||||||
self.writeln("sleep: (ms) => new Promise(resolve => setTimeout(resolve, ms))");
|
self.writeln("sleep: (ms) => new Promise(resolve => setTimeout(resolve, ms))");
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
}
|
||||||
|
|
||||||
// Http effect (browser/Node compatible)
|
fn emit_http_handler(&mut self) {
|
||||||
self.writeln("Http: {");
|
self.writeln("Http: {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("get: async (url) => {");
|
self.writeln("get: async (url) => {");
|
||||||
@@ -287,8 +456,9 @@ impl JsBackend {
|
|||||||
self.writeln("}");
|
self.writeln("}");
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
}
|
||||||
|
|
||||||
// Dom effect (browser only - stubs for Node.js)
|
fn emit_dom_handler(&mut self) {
|
||||||
self.writeln("Dom: {");
|
self.writeln("Dom: {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
|
|
||||||
@@ -316,7 +486,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Element creation
|
|
||||||
self.writeln("createElement: (tag) => {");
|
self.writeln("createElement: (tag) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (typeof document === 'undefined') return null;");
|
self.writeln("if (typeof document === 'undefined') return null;");
|
||||||
@@ -331,7 +500,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// DOM manipulation
|
|
||||||
self.writeln("appendChild: (parent, child) => {");
|
self.writeln("appendChild: (parent, child) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (parent && child) parent.appendChild(child);");
|
self.writeln("if (parent && child) parent.appendChild(child);");
|
||||||
@@ -356,7 +524,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Content
|
|
||||||
self.writeln("setTextContent: (el, text) => {");
|
self.writeln("setTextContent: (el, text) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (el) el.textContent = text;");
|
self.writeln("if (el) el.textContent = text;");
|
||||||
@@ -381,7 +548,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Attributes
|
|
||||||
self.writeln("setAttribute: (el, name, value) => {");
|
self.writeln("setAttribute: (el, name, value) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (el) el.setAttribute(name, value);");
|
self.writeln("if (el) el.setAttribute(name, value);");
|
||||||
@@ -408,7 +574,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Classes
|
|
||||||
self.writeln("addClass: (el, className) => {");
|
self.writeln("addClass: (el, className) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (el) el.classList.add(className);");
|
self.writeln("if (el) el.classList.add(className);");
|
||||||
@@ -433,7 +598,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Styles
|
|
||||||
self.writeln("setStyle: (el, property, value) => {");
|
self.writeln("setStyle: (el, property, value) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (el) el.style[property] = value;");
|
self.writeln("if (el) el.style[property] = value;");
|
||||||
@@ -446,7 +610,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Form elements
|
|
||||||
self.writeln("getValue: (el) => {");
|
self.writeln("getValue: (el) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("return el ? el.value : '';");
|
self.writeln("return el ? el.value : '';");
|
||||||
@@ -471,7 +634,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Events
|
|
||||||
self.writeln("addEventListener: (el, event, handler) => {");
|
self.writeln("addEventListener: (el, event, handler) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (el) el.addEventListener(event, handler);");
|
self.writeln("if (el) el.addEventListener(event, handler);");
|
||||||
@@ -484,7 +646,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Focus
|
|
||||||
self.writeln("focus: (el) => {");
|
self.writeln("focus: (el) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (el && el.focus) el.focus();");
|
self.writeln("if (el && el.focus) el.focus();");
|
||||||
@@ -497,7 +658,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Document
|
|
||||||
self.writeln("getBody: () => {");
|
self.writeln("getBody: () => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (typeof document === 'undefined') return null;");
|
self.writeln("if (typeof document === 'undefined') return null;");
|
||||||
@@ -512,7 +672,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Window
|
|
||||||
self.writeln("getWindow: () => {");
|
self.writeln("getWindow: () => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (typeof window === 'undefined') return null;");
|
self.writeln("if (typeof window === 'undefined') return null;");
|
||||||
@@ -545,7 +704,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Scroll
|
|
||||||
self.writeln("scrollTo: (x, y) => {");
|
self.writeln("scrollTo: (x, y) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (typeof window !== 'undefined') window.scrollTo(x, y);");
|
self.writeln("if (typeof window !== 'undefined') window.scrollTo(x, y);");
|
||||||
@@ -558,7 +716,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Dimensions
|
|
||||||
self.writeln("getBoundingClientRect: (el) => {");
|
self.writeln("getBoundingClientRect: (el) => {");
|
||||||
self.indent += 1;
|
self.indent += 1;
|
||||||
self.writeln("if (!el) return { top: 0, left: 0, width: 0, height: 0, right: 0, bottom: 0 };");
|
self.writeln("if (!el) return { top: 0, left: 0, width: 0, height: 0, right: 0, bottom: 0 };");
|
||||||
@@ -574,13 +731,11 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("}");
|
self.writeln("}");
|
||||||
|
|
||||||
self.indent -= 1;
|
|
||||||
self.writeln("}");
|
|
||||||
|
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
}
|
||||||
|
|
||||||
// HTML rendering helpers
|
fn emit_html_helpers(&mut self) {
|
||||||
self.writeln("");
|
self.writeln("");
|
||||||
self.writeln("// HTML rendering");
|
self.writeln("// HTML rendering");
|
||||||
self.writeln("renderHtml: (node) => {");
|
self.writeln("renderHtml: (node) => {");
|
||||||
@@ -682,8 +837,9 @@ impl JsBackend {
|
|||||||
self.writeln("return el;");
|
self.writeln("return el;");
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
}
|
||||||
|
|
||||||
// TEA (The Elm Architecture) runtime
|
fn emit_tea_runtime(&mut self) {
|
||||||
self.writeln("");
|
self.writeln("");
|
||||||
self.writeln("// The Elm Architecture (TEA) runtime");
|
self.writeln("// The Elm Architecture (TEA) runtime");
|
||||||
self.writeln("app: (config) => {");
|
self.writeln("app: (config) => {");
|
||||||
@@ -727,7 +883,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Simple app (for string-based views like the counter example)
|
|
||||||
self.writeln("");
|
self.writeln("");
|
||||||
self.writeln("// Simple TEA app (string-based view)");
|
self.writeln("// Simple TEA app (string-based view)");
|
||||||
self.writeln("simpleApp: (config) => {");
|
self.writeln("simpleApp: (config) => {");
|
||||||
@@ -757,7 +912,6 @@ impl JsBackend {
|
|||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("},");
|
self.writeln("},");
|
||||||
|
|
||||||
// Diff and patch (basic implementation for view_deps optimization)
|
|
||||||
self.writeln("");
|
self.writeln("");
|
||||||
self.writeln("// Basic diff - checks if model fields changed");
|
self.writeln("// Basic diff - checks if model fields changed");
|
||||||
self.writeln("hasChanged: (oldModel, newModel, ...paths) => {");
|
self.writeln("hasChanged: (oldModel, newModel, ...paths) => {");
|
||||||
@@ -777,11 +931,7 @@ impl JsBackend {
|
|||||||
self.writeln("}");
|
self.writeln("}");
|
||||||
self.writeln("return false;");
|
self.writeln("return false;");
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("}");
|
self.writeln("},");
|
||||||
|
|
||||||
self.indent -= 1;
|
|
||||||
self.writeln("};");
|
|
||||||
self.writeln("");
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Collect type information from a type declaration
|
/// Collect type information from a type declaration
|
||||||
@@ -888,7 +1038,8 @@ impl JsBackend {
|
|||||||
let prev_has_handlers = self.has_handlers;
|
let prev_has_handlers = self.has_handlers;
|
||||||
self.has_handlers = is_effectful;
|
self.has_handlers = is_effectful;
|
||||||
|
|
||||||
// Clear var substitutions for this function
|
// Save and clear var substitutions for this function scope
|
||||||
|
let saved_substitutions = self.var_substitutions.clone();
|
||||||
self.var_substitutions.clear();
|
self.var_substitutions.clear();
|
||||||
|
|
||||||
// Emit function body
|
// Emit function body
|
||||||
@@ -896,6 +1047,7 @@ impl JsBackend {
|
|||||||
self.writeln(&format!("return {};", body_code));
|
self.writeln(&format!("return {};", body_code));
|
||||||
|
|
||||||
self.has_handlers = prev_has_handlers;
|
self.has_handlers = prev_has_handlers;
|
||||||
|
self.var_substitutions = saved_substitutions;
|
||||||
|
|
||||||
self.indent -= 1;
|
self.indent -= 1;
|
||||||
self.writeln("}");
|
self.writeln("}");
|
||||||
@@ -909,13 +1061,16 @@ impl JsBackend {
|
|||||||
let val = self.emit_expr(&let_decl.value)?;
|
let val = self.emit_expr(&let_decl.value)?;
|
||||||
let var_name = &let_decl.name.name;
|
let var_name = &let_decl.name.name;
|
||||||
|
|
||||||
// Check if this is a run expression (often results in undefined)
|
if var_name == "_" {
|
||||||
// We still want to execute it for its side effects
|
// Wildcard binding: just execute for side effects
|
||||||
self.writeln(&format!("const {} = {};", var_name, val));
|
self.writeln(&format!("{};", val));
|
||||||
|
} else {
|
||||||
|
self.writeln(&format!("const {} = {};", var_name, val));
|
||||||
|
|
||||||
// Register the variable for future use
|
// Register the variable for future use
|
||||||
self.var_substitutions
|
self.var_substitutions
|
||||||
.insert(var_name.clone(), var_name.clone());
|
.insert(var_name.clone(), var_name.clone());
|
||||||
|
}
|
||||||
|
|
||||||
Ok(())
|
Ok(())
|
||||||
}
|
}
|
||||||
@@ -954,12 +1109,17 @@ impl JsBackend {
|
|||||||
let r = self.emit_expr(right)?;
|
let r = self.emit_expr(right)?;
|
||||||
|
|
||||||
// Check for string concatenation
|
// Check for string concatenation
|
||||||
if matches!(op, BinaryOp::Add) {
|
if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
|
||||||
if self.is_string_expr(left) || self.is_string_expr(right) {
|
if self.is_string_expr(left) || self.is_string_expr(right) {
|
||||||
return Ok(format!("({} + {})", l, r));
|
return Ok(format!("({} + {})", l, r));
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ++ on lists: use .concat()
|
||||||
|
if matches!(op, BinaryOp::Concat) {
|
||||||
|
return Ok(format!("{}.concat({})", l, r));
|
||||||
|
}
|
||||||
|
|
||||||
let op_str = match op {
|
let op_str = match op {
|
||||||
BinaryOp::Add => "+",
|
BinaryOp::Add => "+",
|
||||||
BinaryOp::Sub => "-",
|
BinaryOp::Sub => "-",
|
||||||
@@ -974,6 +1134,7 @@ impl JsBackend {
|
|||||||
BinaryOp::Ge => ">=",
|
BinaryOp::Ge => ">=",
|
||||||
BinaryOp::And => "&&",
|
BinaryOp::And => "&&",
|
||||||
BinaryOp::Or => "||",
|
BinaryOp::Or => "||",
|
||||||
|
BinaryOp::Concat => unreachable!("handled above"),
|
||||||
BinaryOp::Pipe => {
|
BinaryOp::Pipe => {
|
||||||
// Pipe operator: x |> f becomes f(x)
|
// Pipe operator: x |> f becomes f(x)
|
||||||
return Ok(format!("{}({})", r, l));
|
return Ok(format!("{}({})", r, l));
|
||||||
@@ -1034,18 +1195,26 @@ impl JsBackend {
|
|||||||
name, value, body, ..
|
name, value, body, ..
|
||||||
} => {
|
} => {
|
||||||
let val = self.emit_expr(value)?;
|
let val = self.emit_expr(value)?;
|
||||||
let var_name = format!("{}_{}", name.name, self.fresh_name());
|
|
||||||
|
|
||||||
self.writeln(&format!("const {} = {};", var_name, val));
|
if name.name == "_" {
|
||||||
|
// Wildcard binding: just execute for side effects
|
||||||
|
self.writeln(&format!("{};", val));
|
||||||
|
} else {
|
||||||
|
let var_name = format!("{}_{}", name.name, self.fresh_name());
|
||||||
|
|
||||||
// Add substitution
|
self.writeln(&format!("const {} = {};", var_name, val));
|
||||||
self.var_substitutions
|
|
||||||
.insert(name.name.clone(), var_name.clone());
|
// Add substitution
|
||||||
|
self.var_substitutions
|
||||||
|
.insert(name.name.clone(), var_name.clone());
|
||||||
|
}
|
||||||
|
|
||||||
let body_result = self.emit_expr(body)?;
|
let body_result = self.emit_expr(body)?;
|
||||||
|
|
||||||
// Remove substitution
|
// Remove substitution
|
||||||
self.var_substitutions.remove(&name.name);
|
if name.name != "_" {
|
||||||
|
self.var_substitutions.remove(&name.name);
|
||||||
|
}
|
||||||
|
|
||||||
Ok(body_result)
|
Ok(body_result)
|
||||||
}
|
}
|
||||||
@@ -1057,6 +1226,31 @@ impl JsBackend {
             if module_name.name == "List" {
                 return self.emit_list_operation(&field.name, args);
             }
+            if module_name.name == "Map" {
+                return self.emit_map_operation(&field.name, args);
+            }
+        }
+    }
+
+    // Int/Float module operations
+    if let Expr::Field { object, field, .. } = func.as_ref() {
+        if let Expr::Var(module_name) = object.as_ref() {
+            if module_name.name == "Int" {
+                let arg = self.emit_expr(&args[0])?;
+                match field.name.as_str() {
+                    "toFloat" => return Ok(arg),
+                    "toString" => return Ok(format!("String({})", arg)),
+                    _ => {}
+                }
+            }
+            if module_name.name == "Float" {
+                let arg = self.emit_expr(&args[0])?;
+                match field.name.as_str() {
+                    "toInt" => return Ok(format!("Math.trunc({})", arg)),
+                    "toString" => return Ok(format!("String({})", arg)),
+                    _ => {}
+                }
+            }
         }
     }

@@ -1066,6 +1260,10 @@ impl JsBackend {
             let arg = self.emit_expr(&args[0])?;
             return Ok(format!("String({})", arg));
         }
+        if ident.name == "print" {
+            let arg = self.emit_expr(&args[0])?;
+            return Ok(format!("console.log({})", arg));
+        }
     }

     let arg_strs: Result<Vec<_>, _> = args.iter().map(|a| self.emit_expr(a)).collect();

@@ -1142,6 +1340,26 @@ impl JsBackend {
         return self.emit_math_operation(&operation.name, args);
     }

+    // Special case: Int module operations
+    if effect.name == "Int" {
+        let arg = self.emit_expr(&args[0])?;
+        match operation.name.as_str() {
+            "toFloat" => return Ok(arg), // JS numbers are already floats
+            "toString" => return Ok(format!("String({})", arg)),
+            _ => {}
+        }
+    }
+
+    // Special case: Float module operations
+    if effect.name == "Float" {
+        let arg = self.emit_expr(&args[0])?;
+        match operation.name.as_str() {
+            "toInt" => return Ok(format!("Math.trunc({})", arg)),
+            "toString" => return Ok(format!("String({})", arg)),
+            _ => {}
+        }
+    }
+
     // Special case: Result module operations (not an effect)
     if effect.name == "Result" {
         return self.emit_result_operation(&operation.name, args);

@@ -1152,6 +1370,11 @@ impl JsBackend {
         return self.emit_json_operation(&operation.name, args);
     }

+    // Special case: Map module operations (not an effect)
+    if effect.name == "Map" {
+        return self.emit_map_operation(&operation.name, args);
+    }
+
     // Special case: Html module operations (not an effect)
     if effect.name == "Html" {
         return self.emit_html_operation(&operation.name, args);
@@ -1197,18 +1420,39 @@ impl JsBackend {
             param_names
         };

-        // Save handler state
+        // Save state
         let prev_has_handlers = self.has_handlers;
+        let saved_substitutions = self.var_substitutions.clone();
         self.has_handlers = !effects.is_empty();

+        // Register lambda params as themselves (override any outer substitutions)
+        for p in &all_params {
+            self.var_substitutions.insert(p.clone(), p.clone());
+        }
+
+        // Capture any statements emitted during body evaluation
+        let output_start = self.output.len();
+        let prev_indent = self.indent;
+        self.indent += 1;
+
         let body_code = self.emit_expr(body)?;
+        self.writeln(&format!("return {};", body_code));
+
+        // Extract body statements and restore output
+        let body_statements = self.output[output_start..].to_string();
+        self.output.truncate(output_start);
+        self.indent = prev_indent;

+        // Restore state
         self.has_handlers = prev_has_handlers;
+        self.var_substitutions = saved_substitutions;
+
+        let indent_str = " ".repeat(self.indent);
         Ok(format!(
-            "(function({}) {{ return {}; }})",
+            "(function({}) {{\n{}{}}})",
             all_params.join(", "),
-            body_code
+            body_statements,
+            indent_str,
         ))
     }
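The lambda hunk above snapshots the substitution table on entry, registers each parameter as itself so it shadows any outer fresh-named binding, and restores the snapshot on exit. A minimal sketch of that save/restore discipline (the `with_lambda_scope` helper is a hypothetical simplification, not code from this diff):

```rust
use std::collections::HashMap;

// Entering a lambda body snapshots the substitution table, registers each
// parameter as itself, runs the body, and restores the snapshot afterwards.
fn with_lambda_scope(
    substitutions: &mut HashMap<String, String>,
    params: &[&str],
    body: impl FnOnce(&HashMap<String, String>) -> String,
) -> String {
    let saved = substitutions.clone();
    for p in params {
        substitutions.insert(p.to_string(), p.to_string());
    }
    let code = body(substitutions);
    *substitutions = saved; // outer bindings are intact again
    code
}

fn main() {
    let mut subs = HashMap::from([("x".to_string(), "x_7".to_string())]);
    // Inside a lambda `fn(x) = ...`, `x` must mean the parameter, not x_7.
    let inner = with_lambda_scope(&mut subs, &["x"], |s| s["x"].clone());
    assert_eq!(inner, "x");
    assert_eq!(subs["x"], "x_7"); // restored after the lambda
    println!("ok");
}
```

Without the restore step, a fresh name introduced by an enclosing `let` would leak into sibling expressions emitted after the lambda.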
@@ -1228,10 +1472,15 @@ impl JsBackend {
             }
             Statement::Let { name, value, .. } => {
                 let val = self.emit_expr(value)?;
-                let var_name = format!("{}_{}", name.name, self.fresh_name());
-                self.writeln(&format!("const {} = {};", var_name, val));
-                self.var_substitutions
-                    .insert(name.name.clone(), var_name.clone());
+                if name.name == "_" {
+                    self.writeln(&format!("{};", val));
+                } else {
+                    let var_name =
+                        format!("{}_{}", name.name, self.fresh_name());
+                    self.writeln(&format!("const {} = {};", var_name, val));
+                    self.var_substitutions
+                        .insert(name.name.clone(), var_name.clone());
+                }
             }
         }
     }

@@ -1240,15 +1489,19 @@ impl JsBackend {
         self.emit_expr(result)
     }

-        Expr::Record { fields, .. } => {
-            let field_strs: Result<Vec<_>, _> = fields
-                .iter()
-                .map(|(name, expr)| {
-                    let val = self.emit_expr(expr)?;
-                    Ok(format!("{}: {}", name.name, val))
-                })
-                .collect();
-            Ok(format!("{{ {} }}", field_strs?.join(", ")))
+        Expr::Record {
+            spread, fields, ..
+        } => {
+            let mut parts = Vec::new();
+            if let Some(spread_expr) = spread {
+                let spread_code = self.emit_expr(spread_expr)?;
+                parts.push(format!("...{}", spread_code));
+            }
+            for (name, expr) in fields {
+                let val = self.emit_expr(expr)?;
+                parts.push(format!("{}: {}", name.name, val));
+            }
+            Ok(format!("{{ {} }}", parts.join(", ")))
         }

         Expr::Tuple { elements, .. } => {

@@ -1268,6 +1521,11 @@ impl JsBackend {
             Ok(format!("{}.{}", obj, field.name))
         }

+        Expr::TupleIndex { object, index, .. } => {
+            let obj = self.emit_expr(object)?;
+            Ok(format!("{}[{}]", obj, index))
+        }
+
         Expr::Run {
             expr, handlers, ..
         } => {
@@ -1565,6 +1823,18 @@ impl JsBackend {
                 end, start, start
             ))
         }
+        "sort" => {
+            let list = self.emit_expr(&args[0])?;
+            Ok(format!(
+                "[...{}].sort((a, b) => a < b ? -1 : a > b ? 1 : 0)",
+                list
+            ))
+        }
+        "sortBy" => {
+            let list = self.emit_expr(&args[0])?;
+            let func = self.emit_expr(&args[1])?;
+            Ok(format!("[...{}].sort({})", list, func))
+        }
         _ => Err(JsGenError {
             message: format!("Unknown List operation: {}", operation),
             span: None,
@@ -2062,6 +2332,86 @@ impl JsBackend {
         }
     }

+    /// Emit Map module operations using JS Map
+    fn emit_map_operation(
+        &mut self,
+        operation: &str,
+        args: &[Expr],
+    ) -> Result<String, JsGenError> {
+        match operation {
+            "new" => Ok("new Map()".to_string()),
+            "set" => {
+                let map = self.emit_expr(&args[0])?;
+                let key = self.emit_expr(&args[1])?;
+                let val = self.emit_expr(&args[2])?;
+                Ok(format!(
+                    "(function() {{ var m = new Map({}); m.set({}, {}); return m; }})()",
+                    map, key, val
+                ))
+            }
+            "get" => {
+                let map = self.emit_expr(&args[0])?;
+                let key = self.emit_expr(&args[1])?;
+                Ok(format!(
+                    "({0}.has({1}) ? Lux.Some({0}.get({1})) : Lux.None())",
+                    map, key
+                ))
+            }
+            "contains" => {
+                let map = self.emit_expr(&args[0])?;
+                let key = self.emit_expr(&args[1])?;
+                Ok(format!("{}.has({})", map, key))
+            }
+            "remove" => {
+                let map = self.emit_expr(&args[0])?;
+                let key = self.emit_expr(&args[1])?;
+                Ok(format!(
+                    "(function() {{ var m = new Map({}); m.delete({}); return m; }})()",
+                    map, key
+                ))
+            }
+            "keys" => {
+                let map = self.emit_expr(&args[0])?;
+                Ok(format!("Array.from({}.keys()).sort()", map))
+            }
+            "values" => {
+                let map = self.emit_expr(&args[0])?;
+                Ok(format!(
+                    "Array.from({0}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }}).map(function(e) {{ return e[1]; }})",
+                    map
+                ))
+            }
+            "size" => {
+                let map = self.emit_expr(&args[0])?;
+                Ok(format!("{}.size", map))
+            }
+            "isEmpty" => {
+                let map = self.emit_expr(&args[0])?;
+                Ok(format!("({}.size === 0)", map))
+            }
+            "fromList" => {
+                let list = self.emit_expr(&args[0])?;
+                Ok(format!("new Map({}.map(function(t) {{ return [t[0], t[1]]; }}))", list))
+            }
+            "toList" => {
+                let map = self.emit_expr(&args[0])?;
+                Ok(format!(
+                    "Array.from({}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }})",
+                    map
+                ))
+            }
+            "merge" => {
+                let m1 = self.emit_expr(&args[0])?;
+                let m2 = self.emit_expr(&args[1])?;
+                Ok(format!("new Map([...{}, ...{}])", m1, m2))
+            }
+            _ => Err(JsGenError {
+                message: format!("Unknown Map operation: {}", operation),
+                span: None,
+            }),
+        }
+    }
+
     /// Emit Html module operations for type-safe HTML construction
     fn emit_html_operation(
         &mut self,
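The JS emitted for `set` and `remove` above copies the map before mutating, so Lux maps stay value-like: the original map is never observed changing. The same copy-on-write shape, sketched in Rust (hypothetical helper names, values narrowed to `i64` for brevity; the interpreter's Map value is HashMap-backed in the same spirit):

```rust
use std::collections::HashMap;

// Persistent-style update: clone, then mutate the clone.
fn map_set(m: &HashMap<String, i64>, k: &str, v: i64) -> HashMap<String, i64> {
    let mut copy = m.clone();
    copy.insert(k.to_string(), v);
    copy
}

fn map_remove(m: &HashMap<String, i64>, k: &str) -> HashMap<String, i64> {
    let mut copy = m.clone();
    copy.remove(k);
    copy
}

fn main() {
    let m1 = map_set(&HashMap::new(), "a", 1);
    let m2 = map_set(&m1, "b", 2);
    let m3 = map_remove(&m2, "a");
    assert_eq!(m1.len(), 1); // m1 is unchanged by later updates
    assert_eq!(m2.len(), 2);
    assert_eq!(m3.get("a"), None);
    println!("ok");
}
```

Full copies are O(n) per update; that is the simple-and-correct choice here, and a structural-sharing map could replace it later without changing the observable semantics.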
@@ -2333,7 +2683,7 @@ impl JsBackend {
             }
         }
         Expr::BinaryOp { op, left, right, .. } => {
-            matches!(op, BinaryOp::Add)
+            matches!(op, BinaryOp::Add | BinaryOp::Concat)
                 && (self.is_string_expr(left) || self.is_string_expr(right))
         }
         _ => false,

@@ -2384,6 +2734,10 @@ impl JsBackend {

     /// Mangle a Lux name to a valid JavaScript name
     fn mangle_name(&self, name: &str) -> String {
+        // Extern functions use their JS name directly (no mangling)
+        if let Some(js_name) = self.extern_fns.get(name) {
+            return js_name.clone();
+        }
         format!("{}_lux", name)
     }
@@ -3732,7 +4086,7 @@ line3"

     #[test]
     fn test_js_runtime_generated() {
-        // Test that the Lux runtime is properly generated
+        // Test that the Lux runtime core is always generated
         use crate::parser::Parser;

         let source = r#"

@@ -3743,21 +4097,51 @@ line3"
         let mut backend = JsBackend::new();
         let js_code = backend.generate(&program).expect("Should generate");

-        // Check that Lux runtime includes key functions
+        // Core runtime is always present
         assert!(js_code.contains("const Lux = {"), "Lux object should be defined");
         assert!(js_code.contains("Some:"), "Option Some should be defined");
         assert!(js_code.contains("None:"), "Option None should be defined");
-        assert!(js_code.contains("renderHtml:"), "renderHtml should be defined");
-        assert!(js_code.contains("renderToDom:"), "renderToDom should be defined");
-        assert!(js_code.contains("escapeHtml:"), "escapeHtml should be defined");
-        assert!(js_code.contains("app:"), "TEA app should be defined");
-        assert!(js_code.contains("simpleApp:"), "simpleApp should be defined");
-        assert!(js_code.contains("hasChanged:"), "hasChanged should be defined");
+
+        // Console-only program should NOT include Dom, Html, or TEA sections
+        assert!(!js_code.contains("Dom:"), "Dom handler should not be in Console-only program");
+        assert!(!js_code.contains("renderHtml:"), "renderHtml should not be in Console-only program");
+        assert!(!js_code.contains("app:"), "TEA app should not be in Console-only program");
+        assert!(!js_code.contains("Http:"), "Http should not be in Console-only program");
+
+        // Console should be present
+        assert!(js_code.contains("Console:"), "Console handler should exist");
+    }
+
+    #[test]
+    fn test_js_runtime_tree_shaking_all_effects() {
+        // Test that all effects are included when all are used
+        use crate::parser::Parser;
+
+        let source = r#"
+fn main(): Unit with {Console, Dom} = {
+    Console.print("Hello")
+    let _ = Dom.getElementById("app")
+    ()
+}
+"#;
+
+        let program = Parser::parse_source(source).expect("Should parse");
+        let mut backend = JsBackend::new();
+        let js_code = backend.generate(&program).expect("Should generate");
+
+        assert!(js_code.contains("Console:"), "Console handler should exist");
+        assert!(js_code.contains("Dom:"), "Dom handler should exist");
+        assert!(js_code.contains("renderHtml:"), "renderHtml should be defined when Dom is used");
+        assert!(js_code.contains("renderToDom:"), "renderToDom should be defined when Dom is used");
+        assert!(js_code.contains("escapeHtml:"), "escapeHtml should be defined when Dom is used");
+        assert!(js_code.contains("app:"), "TEA app should be defined when Dom is used");
+        assert!(js_code.contains("simpleApp:"), "simpleApp should be defined when Dom is used");
+        assert!(js_code.contains("hasChanged:"), "hasChanged should be defined when Dom is used");
     }

     #[test]
     fn test_js_runtime_default_handlers() {
-        // Test that default handlers are properly generated
+        // Test that only used effect handlers are generated
         use crate::parser::Parser;

         let source = r#"

@@ -3768,12 +4152,12 @@ line3"
         let mut backend = JsBackend::new();
         let js_code = backend.generate(&program).expect("Should generate");

-        // Check that default handlers include all effects
+        // Only Console should be present
         assert!(js_code.contains("Console:"), "Console handler should exist");
-        assert!(js_code.contains("Random:"), "Random handler should exist");
-        assert!(js_code.contains("Time:"), "Time handler should exist");
-        assert!(js_code.contains("Http:"), "Http handler should exist");
-        assert!(js_code.contains("Dom:"), "Dom handler should exist");
+        assert!(!js_code.contains("Random:"), "Random handler should not exist in Console-only program");
+        assert!(!js_code.contains("Time:"), "Time handler should not exist in Console-only program");
+        assert!(!js_code.contains("Http:"), "Http handler should not exist in Console-only program");
+        assert!(!js_code.contains("Dom:"), "Dom handler should not exist in Console-only program");
     }

     #[test]
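The tests above pin down the tree-shaking behavior: only runtime sections for effects the program actually uses are emitted, with a small always-present core. A minimal sketch of that gating (hypothetical, heavily simplified; `emit_runtime` and the handler placeholders are not code from this diff):

```rust
use std::collections::HashSet;

// Emit the core runtime unconditionally, then one section per used effect.
fn emit_runtime(used_effects: &HashSet<&str>) -> String {
    let mut out = String::from("const Lux = {\n");
    // Core: always present regardless of which effects are used.
    out.push_str("  Some: (v) => ({ tag: 'Some', value: v }),\n");
    out.push_str("  None: () => ({ tag: 'None' }),\n");
    for effect in ["Console", "Random", "Time", "Http", "Dom"] {
        if used_effects.contains(effect) {
            out.push_str(&format!("  {}: {{ /* handler */ }},\n", effect));
        }
    }
    out.push_str("};\n");
    out
}

fn main() {
    let console_only: HashSet<&str> = ["Console"].into_iter().collect();
    let js = emit_runtime(&console_only);
    assert!(js.contains("Console:"));
    assert!(!js.contains("Dom:"));
    println!("ok");
}
```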
@@ -333,11 +333,13 @@ mod tests {
     fn test_option_exhaustive() {
         let patterns = vec![
             Pattern::Constructor {
+                module: None,
                 name: make_ident("None"),
                 fields: vec![],
                 span: span(),
             },
             Pattern::Constructor {
+                module: None,
                 name: make_ident("Some"),
                 fields: vec![Pattern::Wildcard(span())],
                 span: span(),

@@ -352,6 +354,7 @@ mod tests {
     #[test]
     fn test_option_missing_none() {
         let patterns = vec![Pattern::Constructor {
+            module: None,
             name: make_ident("Some"),
             fields: vec![Pattern::Wildcard(span())],
             span: span(),

@@ -391,11 +394,13 @@ mod tests {
     fn test_result_exhaustive() {
         let patterns = vec![
             Pattern::Constructor {
+                module: None,
                 name: make_ident("Ok"),
                 fields: vec![Pattern::Wildcard(span())],
                 span: span(),
             },
             Pattern::Constructor {
+                module: None,
                 name: make_ident("Err"),
                 fields: vec![Pattern::Wildcard(span())],
                 span: span(),
src/formatter.rs (117 changed lines)

@@ -3,9 +3,9 @@
 //! Formats Lux source code according to standard style guidelines.

 use crate::ast::{
-    BehavioralProperty, BinaryOp, Declaration, EffectDecl, Expr, FunctionDecl, HandlerDecl,
-    ImplDecl, ImplMethod, LetDecl, Literal, LiteralKind, Pattern, Program, Statement, TraitDecl,
-    TypeDecl, TypeDef, TypeExpr, UnaryOp, VariantFields,
+    BehavioralProperty, BinaryOp, Declaration, EffectDecl, ExternFnDecl, Expr, FunctionDecl,
+    HandlerDecl, ImplDecl, ImplMethod, LetDecl, Literal, LiteralKind, Pattern, Program, Statement,
+    TraitDecl, TypeDecl, TypeDef, TypeExpr, UnaryOp, VariantFields, Visibility,
 };
 use crate::lexer::Lexer;
 use crate::parser::Parser;

@@ -103,9 +103,55 @@ impl Formatter {
             Declaration::Handler(h) => self.format_handler(h),
             Declaration::Trait(t) => self.format_trait(t),
             Declaration::Impl(i) => self.format_impl(i),
+            Declaration::ExternFn(e) => self.format_extern_fn(e),
         }
     }

+    fn format_extern_fn(&mut self, ext: &ExternFnDecl) {
+        let indent = self.indent();
+        self.write(&indent);
+
+        if ext.visibility == Visibility::Public {
+            self.write("pub ");
+        }
+
+        self.write("extern fn ");
+        self.write(&ext.name.name);
+
+        // Type parameters
+        if !ext.type_params.is_empty() {
+            self.write("<");
+            self.write(
+                &ext.type_params
+                    .iter()
+                    .map(|p| p.name.clone())
+                    .collect::<Vec<_>>()
+                    .join(", "),
+            );
+            self.write(">");
+        }
+
+        // Parameters
+        self.write("(");
+        let params: Vec<String> = ext
+            .params
+            .iter()
+            .map(|p| format!("{}: {}", p.name.name, self.format_type_expr(&p.typ)))
+            .collect();
+        self.write(&params.join(", "));
+        self.write("): ");
+
+        // Return type
+        self.write(&self.format_type_expr(&ext.return_type));
+
+        // Optional JS name
+        if let Some(js_name) = &ext.js_name {
+            self.write(&format!(" = \"{}\"", js_name));
+        }
+
+        self.newline();
+    }
+
     fn format_function(&mut self, func: &FunctionDecl) {
         let indent = self.indent();
         self.write(&indent);
@@ -598,6 +644,9 @@ impl Formatter {
             Expr::Field { object, field, .. } => {
                 format!("{}.{}", self.format_expr(object), field.name)
             }
+            Expr::TupleIndex { object, index, .. } => {
+                format!("{}.{}", self.format_expr(object), index)
+            }
             Expr::If { condition, then_branch, else_branch, .. } => {
                 format!(
                     "if {} then {} else {}",

@@ -685,15 +734,17 @@ impl Formatter {
                     .join(", ")
             )
         }
-        Expr::Record { fields, .. } => {
-            format!(
-                "{{ {} }}",
-                fields
-                    .iter()
-                    .map(|(name, val)| format!("{}: {}", name.name, self.format_expr(val)))
-                    .collect::<Vec<_>>()
-                    .join(", ")
-            )
+        Expr::Record {
+            spread, fields, ..
+        } => {
+            let mut parts = Vec::new();
+            if let Some(spread_expr) = spread {
+                parts.push(format!("...{}", self.format_expr(spread_expr)));
+            }
+            for (name, val) in fields {
+                parts.push(format!("{}: {}", name.name, self.format_expr(val)));
+            }
+            format!("{{ {} }}", parts.join(", "))
         }
         Expr::EffectOp { effect, operation, args, .. } => {
             format!(
@@ -728,7 +779,30 @@ impl Formatter {
         match &lit.kind {
             LiteralKind::Int(n) => n.to_string(),
             LiteralKind::Float(f) => format!("{}", f),
-            LiteralKind::String(s) => format!("\"{}\"", s.replace('\\', "\\\\").replace('"', "\\\"")),
+            LiteralKind::String(s) => {
+                if s.contains('\n') {
+                    // Use triple-quoted multiline string
+                    let tab = " ".repeat(self.config.indent_size);
+                    let base_indent = tab.repeat(self.indent_level);
+                    let content_indent = tab.repeat(self.indent_level + 1);
+                    let lines: Vec<&str> = s.split('\n').collect();
+                    let mut result = String::from("\"\"\"\n");
+                    for line in &lines {
+                        if line.is_empty() {
+                            result.push('\n');
+                        } else {
+                            result.push_str(&content_indent);
+                            result.push_str(&line.replace('{', "\\{").replace('}', "\\}"));
+                            result.push('\n');
+                        }
+                    }
+                    result.push_str(&base_indent);
+                    result.push_str("\"\"\"");
+                    result
+                } else {
+                    format!("\"{}\"", s.replace('\\', "\\\\").replace('"', "\\\"").replace('{', "\\{").replace('}', "\\}"))
+                }
+            },
             LiteralKind::Char(c) => format!("'{}'", c),
             LiteralKind::Bool(b) => b.to_string(),
             LiteralKind::Unit => "()".to_string(),

@@ -750,6 +824,7 @@ impl Formatter {
             BinaryOp::Ge => ">=",
             BinaryOp::And => "&&",
             BinaryOp::Or => "||",
+            BinaryOp::Concat => "++",
             BinaryOp::Pipe => "|>",
         }
     }

@@ -766,12 +841,22 @@ impl Formatter {
             Pattern::Wildcard(_) => "_".to_string(),
             Pattern::Var(ident) => ident.name.clone(),
             Pattern::Literal(lit) => self.format_literal(lit),
-            Pattern::Constructor { name, fields, .. } => {
+            Pattern::Constructor {
+                module,
+                name,
+                fields,
+                ..
+            } => {
+                let prefix = match module {
+                    Some(m) => format!("{}.", m.name),
+                    None => String::new(),
+                };
                 if fields.is_empty() {
-                    name.name.clone()
+                    format!("{}{}", prefix, name.name)
                 } else {
                     format!(
-                        "{}({})",
+                        "{}{}({})",
+                        prefix,
                         name.name,
                         fields
                             .iter()
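The `Pattern::Constructor` hunk above adds an optional module qualifier that the formatter prints as a dotted prefix. The prefix logic in isolation (a hypothetical free-function simplification of the method in the diff):

```rust
// A pattern may carry a module qualifier, printed as `Module.Name(...)`.
fn format_constructor(module: Option<&str>, name: &str, fields: &[String]) -> String {
    let prefix = module.map(|m| format!("{}.", m)).unwrap_or_default();
    if fields.is_empty() {
        format!("{}{}", prefix, name)
    } else {
        format!("{}{}({})", prefix, name, fields.join(", "))
    }
}

fn main() {
    assert_eq!(format_constructor(None, "None", &[]), "None");
    assert_eq!(
        format_constructor(Some("Option"), "Some", &["x".to_string()]),
        "Option.Some(x)"
    );
    println!("ok");
}
```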
|
|||||||
@@ -28,6 +28,8 @@ pub enum BuiltinFn {
|
|||||||
ListGet,
|
ListGet,
|
||||||
ListRange,
|
ListRange,
|
||||||
ListForEach,
|
ListForEach,
|
||||||
|
ListSort,
|
||||||
|
ListSortBy,
|
||||||
|
|
||||||
// String operations
|
// String operations
|
||||||
StringSplit,
|
StringSplit,
|
||||||
@@ -74,14 +76,21 @@ pub enum BuiltinFn {
|
|||||||
MathFloor,
|
MathFloor,
|
||||||
MathCeil,
|
MathCeil,
|
||||||
MathRound,
|
MathRound,
|
||||||
|
MathSin,
|
||||||
|
MathCos,
|
||||||
|
MathAtan2,
|
||||||
|
|
||||||
// Additional List operations
|
// Additional List operations
|
||||||
ListIsEmpty,
|
ListIsEmpty,
|
||||||
ListFind,
|
ListFind,
|
||||||
|
ListFindIndex,
|
||||||
ListAny,
|
ListAny,
|
||||||
ListAll,
|
ListAll,
|
||||||
ListTake,
|
ListTake,
|
||||||
ListDrop,
|
ListDrop,
|
||||||
|
ListZip,
|
||||||
|
ListFlatten,
|
||||||
|
ListContains,
|
||||||
|
|
||||||
// Additional String operations
|
// Additional String operations
|
||||||
StringStartsWith,
|
StringStartsWith,
|
||||||
@@ -95,6 +104,12 @@ pub enum BuiltinFn {
|
|||||||
StringLastIndexOf,
|
StringLastIndexOf,
|
||||||
StringRepeat,
|
StringRepeat,
|
||||||
|
|
||||||
|
// Int/Float operations
|
||||||
|
IntToString,
|
||||||
|
IntToFloat,
|
||||||
|
FloatToString,
|
||||||
|
FloatToInt,
|
||||||
|
|
||||||
// JSON operations
|
// JSON operations
|
||||||
JsonParse,
|
JsonParse,
|
||||||
JsonStringify,
|
JsonStringify,
|
||||||
@@ -115,6 +130,20 @@ pub enum BuiltinFn {
|
|||||||
JsonString,
|
JsonString,
|
||||||
JsonArray,
|
JsonArray,
|
||||||
JsonObject,
|
JsonObject,
|
||||||
|
|
||||||
|
// Map operations
|
||||||
|
MapNew,
|
||||||
|
MapSet,
|
||||||
|
MapGet,
|
||||||
|
MapContains,
|
||||||
|
MapRemove,
|
||||||
|
MapKeys,
|
||||||
|
MapValues,
|
||||||
|
MapSize,
|
||||||
|
MapIsEmpty,
|
||||||
|
MapFromList,
|
||||||
|
MapToList,
|
||||||
|
MapMerge,
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Runtime value
|
/// Runtime value
|
||||||
@@ -129,6 +158,7 @@ pub enum Value {
|
|||||||
List(Vec<Value>),
|
List(Vec<Value>),
|
||||||
Tuple(Vec<Value>),
|
Tuple(Vec<Value>),
|
||||||
Record(HashMap<String, Value>),
|
Record(HashMap<String, Value>),
|
||||||
|
Map(HashMap<String, Value>),
|
||||||
Function(Rc<Closure>),
|
Function(Rc<Closure>),
|
||||||
Handler(Rc<HandlerValue>),
|
Handler(Rc<HandlerValue>),
|
||||||
/// Built-in function
|
/// Built-in function
|
||||||
@@ -146,6 +176,11 @@ pub enum Value {
|
|||||||
},
|
},
|
||||||
/// JSON value (for JSON parsing/manipulation)
|
/// JSON value (for JSON parsing/manipulation)
|
||||||
Json(serde_json::Value),
|
Json(serde_json::Value),
|
||||||
|
/// Extern function (FFI — only callable from JS backend)
|
||||||
|
ExternFn {
|
||||||
|
name: String,
|
||||||
|
arity: usize,
|
||||||
|
},
|
||||||
}
|
}
|
||||||
|
|
||||||
impl Value {
|
impl Value {
|
||||||
@@ -160,12 +195,14 @@ impl Value {
|
|||||||
Value::List(_) => "List",
|
Value::List(_) => "List",
|
||||||
Value::Tuple(_) => "Tuple",
|
Value::Tuple(_) => "Tuple",
|
||||||
Value::Record(_) => "Record",
|
Value::Record(_) => "Record",
|
||||||
|
Value::Map(_) => "Map",
|
||||||
Value::Function(_) => "Function",
|
Value::Function(_) => "Function",
|
||||||
Value::Handler(_) => "Handler",
|
Value::Handler(_) => "Handler",
|
||||||
Value::Builtin(_) => "Function",
|
Value::Builtin(_) => "Function",
|
||||||
Value::Constructor { .. } => "Constructor",
|
Value::Constructor { .. } => "Constructor",
|
||||||
Value::Versioned { .. } => "Versioned",
|
Value::Versioned { .. } => "Versioned",
|
||||||
Value::Json(_) => "Json",
|
Value::Json(_) => "Json",
|
||||||
|
Value::ExternFn { .. } => "ExternFn",
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -208,6 +245,11 @@ impl Value {
|
|||||||
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
|
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
|
(Value::Map(xs), Value::Map(ys)) => {
|
||||||
|
xs.len() == ys.len() && xs.iter().all(|(k, v)| {
|
||||||
|
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
|
||||||
|
})
|
||||||
|
}
|
||||||
(Value::Constructor { name: n1, fields: f1 }, Value::Constructor { name: n2, fields: f2 }) => {
|
(Value::Constructor { name: n1, fields: f1 }, Value::Constructor { name: n2, fields: f2 }) => {
|
||||||
n1 == n2 && f1.len() == f2.len() && f1.iter().zip(f2.iter()).all(|(x, y)| Value::values_equal(x, y))
|
n1 == n2 && f1.len() == f2.len() && f1.iter().zip(f2.iter()).all(|(x, y)| Value::values_equal(x, y))
|
||||||
}
|
}
|
||||||
@@ -278,6 +320,16 @@ impl TryFromValue for Vec<Value> {
     }
 }
 
+impl TryFromValue for HashMap<String, Value> {
+    const TYPE_NAME: &'static str = "Map";
+    fn try_from_value(value: &Value) -> Option<Self> {
+        match value {
+            Value::Map(m) => Some(m.clone()),
+            _ => None,
+        }
+    }
+}
+
 impl TryFromValue for Value {
     const TYPE_NAME: &'static str = "any";
     fn try_from_value(value: &Value) -> Option<Self> {
@@ -324,6 +376,18 @@ impl fmt::Display for Value {
                 }
                 write!(f, " }}")
             }
+            Value::Map(entries) => {
+                write!(f, "Map {{")?;
+                let mut sorted: Vec<_> = entries.iter().collect();
+                sorted.sort_by_key(|(k, _)| (*k).clone());
+                for (i, (key, value)) in sorted.iter().enumerate() {
+                    if i > 0 {
+                        write!(f, ", ")?;
+                    }
+                    write!(f, "\"{}\": {}", key, value)?;
+                }
+                write!(f, "}}")
+            }
             Value::Function(_) => write!(f, "<function>"),
             Value::Builtin(b) => write!(f, "<builtin:{:?}>", b),
             Value::Handler(_) => write!(f, "<handler>"),
@@ -349,6 +413,7 @@ impl fmt::Display for Value {
                 write!(f, "{} @v{}", value, version)
             }
             Value::Json(json) => write!(f, "{}", json),
+            Value::ExternFn { name, .. } => write!(f, "<extern fn {}>", name),
         }
     }
 }
@@ -920,14 +985,23 @@ impl Interpreter {
                 Value::Builtin(BuiltinFn::ListIsEmpty),
             ),
             ("find".to_string(), Value::Builtin(BuiltinFn::ListFind)),
+            ("findIndex".to_string(), Value::Builtin(BuiltinFn::ListFindIndex)),
             ("any".to_string(), Value::Builtin(BuiltinFn::ListAny)),
             ("all".to_string(), Value::Builtin(BuiltinFn::ListAll)),
             ("take".to_string(), Value::Builtin(BuiltinFn::ListTake)),
             ("drop".to_string(), Value::Builtin(BuiltinFn::ListDrop)),
+            ("zip".to_string(), Value::Builtin(BuiltinFn::ListZip)),
+            ("flatten".to_string(), Value::Builtin(BuiltinFn::ListFlatten)),
+            ("contains".to_string(), Value::Builtin(BuiltinFn::ListContains)),
             (
                 "forEach".to_string(),
                 Value::Builtin(BuiltinFn::ListForEach),
             ),
+            ("sort".to_string(), Value::Builtin(BuiltinFn::ListSort)),
+            (
+                "sortBy".to_string(),
+                Value::Builtin(BuiltinFn::ListSortBy),
+            ),
         ]));
         env.define("List", list_module);
 
@@ -1068,9 +1142,26 @@ impl Interpreter {
             ("floor".to_string(), Value::Builtin(BuiltinFn::MathFloor)),
             ("ceil".to_string(), Value::Builtin(BuiltinFn::MathCeil)),
             ("round".to_string(), Value::Builtin(BuiltinFn::MathRound)),
+            ("sin".to_string(), Value::Builtin(BuiltinFn::MathSin)),
+            ("cos".to_string(), Value::Builtin(BuiltinFn::MathCos)),
+            ("atan2".to_string(), Value::Builtin(BuiltinFn::MathAtan2)),
         ]));
         env.define("Math", math_module);
 
+        // Int module
+        let int_module = Value::Record(HashMap::from([
+            ("toString".to_string(), Value::Builtin(BuiltinFn::IntToString)),
+            ("toFloat".to_string(), Value::Builtin(BuiltinFn::IntToFloat)),
+        ]));
+        env.define("Int", int_module);
+
+        // Float module
+        let float_module = Value::Record(HashMap::from([
+            ("toString".to_string(), Value::Builtin(BuiltinFn::FloatToString)),
+            ("toInt".to_string(), Value::Builtin(BuiltinFn::FloatToInt)),
+        ]));
+        env.define("Float", float_module);
+
         // JSON module
         let json_module = Value::Record(HashMap::from([
             ("parse".to_string(), Value::Builtin(BuiltinFn::JsonParse)),
@@ -1094,16 +1185,72 @@ impl Interpreter {
             ("object".to_string(), Value::Builtin(BuiltinFn::JsonObject)),
         ]));
         env.define("Json", json_module);
 
+        // Map module
+        let map_module = Value::Record(HashMap::from([
+            ("new".to_string(), Value::Builtin(BuiltinFn::MapNew)),
+            ("set".to_string(), Value::Builtin(BuiltinFn::MapSet)),
+            ("get".to_string(), Value::Builtin(BuiltinFn::MapGet)),
+            ("contains".to_string(), Value::Builtin(BuiltinFn::MapContains)),
+            ("remove".to_string(), Value::Builtin(BuiltinFn::MapRemove)),
+            ("keys".to_string(), Value::Builtin(BuiltinFn::MapKeys)),
+            ("values".to_string(), Value::Builtin(BuiltinFn::MapValues)),
+            ("size".to_string(), Value::Builtin(BuiltinFn::MapSize)),
+            ("isEmpty".to_string(), Value::Builtin(BuiltinFn::MapIsEmpty)),
+            ("fromList".to_string(), Value::Builtin(BuiltinFn::MapFromList)),
+            ("toList".to_string(), Value::Builtin(BuiltinFn::MapToList)),
+            ("merge".to_string(), Value::Builtin(BuiltinFn::MapMerge)),
+        ]));
+        env.define("Map", map_module);
     }
 
     /// Execute a program
     pub fn run(&mut self, program: &Program) -> Result<Value, RuntimeError> {
         let mut last_value = Value::Unit;
+        let mut has_main_let = false;
+
         for decl in &program.declarations {
+            // Track if there's a top-level `let main = ...`
+            if let Declaration::Let(let_decl) = decl {
+                if let_decl.name.name == "main" {
+                    has_main_let = true;
+                }
+            }
             last_value = self.eval_declaration(decl)?;
         }
+
+        // Auto-invoke main if it was defined as a let binding with a function value
+        if has_main_let {
+            if let Some(main_val) = self.global_env.get("main") {
+                if let Value::Function(ref closure) = main_val {
+                    if closure.params.is_empty() {
+                        let span = Span { start: 0, end: 0 };
+                        let mut result = self.eval_call(main_val.clone(), vec![], span)?;
+                        // Trampoline loop
+                        loop {
+                            match result {
+                                EvalResult::Value(v) => {
+                                    last_value = v;
+                                    break;
+                                }
+                                EvalResult::Effect(req) => {
+                                    last_value = self.handle_effect(req)?;
+                                    break;
+                                }
+                                EvalResult::TailCall { func, args, span } => {
+                                    result = self.eval_call(func, args, span)?;
+                                }
+                                EvalResult::Resume(v) => {
+                                    last_value = v;
+                                    break;
+                                }
+                            }
+                        }
+                    }
+                }
+            }
+        }
 
         Ok(last_value)
     }
 
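The `run` hunk above drives `main` through a trampoline: each evaluation step returns either a finished value or a requested tail call, and a flat `loop` performs the next step instead of recursing, so deeply tail-recursive Lux programs cannot overflow the host stack. A minimal standalone sketch of the same pattern (the `Step` enum and `sum` example are illustrative, not the interpreter's actual types):

```rust
// Minimal trampoline: a step either finishes with a value or
// requests another call; the driver loops instead of recursing.
enum Step {
    Done(i64),
    Call(i64, i64), // (accumulator, remaining)
}

// One evaluation step of a tail-recursive sum: the "tail call"
// is returned as data rather than made on the host stack.
fn step(acc: i64, n: i64) -> Step {
    if n == 0 {
        Step::Done(acc)
    } else {
        Step::Call(acc + n, n - 1)
    }
}

// The trampoline loop: constant host-stack depth for any n.
fn run(n: i64) -> i64 {
    let mut state = Step::Call(0, n);
    loop {
        match state {
            Step::Done(v) => return v,
            Step::Call(acc, rem) => state = step(acc, rem),
        }
    }
}

fn main() {
    assert_eq!(run(10), 55);
    assert_eq!(run(1_000_000), 500_000_500_000); // no stack growth
}
```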
@@ -1265,6 +1412,25 @@ impl Interpreter {
                 Ok(Value::Unit)
             }
+
+            Declaration::ExternFn(ext) => {
+                // Register a placeholder that errors at runtime
+                let name = ext.name.name.clone();
+                let arity = ext.params.len();
+                // Create a closure that produces a clear error
+                let closure = Closure {
+                    params: ext.params.iter().map(|p| p.name.name.clone()).collect(),
+                    body: Expr::Literal(crate::ast::Literal {
+                        kind: crate::ast::LiteralKind::Unit,
+                        span: ext.span,
+                    }),
+                    env: self.global_env.clone(),
+                };
+                // We store an ExternFn marker value
+                self.global_env
+                    .define(&name, Value::ExternFn { name: name.clone(), arity });
+                Ok(Value::Unit)
+            }
 
             Declaration::Effect(_) | Declaration::Trait(_) | Declaration::Impl(_) => {
                 // These are compile-time only
                 Ok(Value::Unit)
@@ -1415,6 +1581,34 @@ impl Interpreter {
                 }
             }
+
+            Expr::TupleIndex {
+                object,
+                index,
+                span,
+            } => {
+                let obj_val = self.eval_expr(object, env)?;
+                match obj_val {
+                    Value::Tuple(elements) => {
+                        if *index < elements.len() {
+                            Ok(EvalResult::Value(elements[*index].clone()))
+                        } else {
+                            Err(RuntimeError {
+                                message: format!(
+                                    "Tuple index {} out of bounds for tuple with {} elements",
+                                    index,
+                                    elements.len()
+                                ),
+                                span: Some(*span),
+                            })
+                        }
+                    }
+                    _ => Err(RuntimeError {
+                        message: format!("Cannot use tuple index on {}", obj_val.type_name()),
+                        span: Some(*span),
+                    }),
+                }
+            }
 
             Expr::Lambda { params, body, .. } => {
                 let closure = Closure {
                     params: params.iter().map(|p| p.name.name.clone()).collect(),
@@ -1481,8 +1675,28 @@ impl Interpreter {
                 self.eval_expr_tail(result, &block_env, tail)
             }
 
-            Expr::Record { fields, .. } => {
+            Expr::Record {
+                spread, fields, ..
+            } => {
                 let mut record = HashMap::new();
+
+                // If there's a spread, evaluate it and start with its fields
+                if let Some(spread_expr) = spread {
+                    let spread_val = self.eval_expr(spread_expr, env)?;
+                    if let Value::Record(spread_fields) = spread_val {
+                        record = spread_fields;
+                    } else {
+                        return Err(RuntimeError {
+                            message: format!(
+                                "Spread expression must evaluate to a record, got {}",
+                                spread_val.type_name()
+                            ),
+                            span: Some(expr.span()),
+                        });
+                    }
+                }
+
+                // Override with explicit fields
                 for (name, expr) in fields {
                     let val = self.eval_expr(expr, env)?;
                     record.insert(name.name.clone(), val);
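The record-spread hunk above evaluates the spread first and then lets explicit fields overwrite on key collision, because a later `HashMap::insert` replaces an earlier value. A small standalone sketch of that override order (the function name and `i64` payload are illustrative):

```rust
use std::collections::HashMap;

// Record spread semantics: `{ ...base, x: 1 }` starts from base's
// fields, then explicit fields win on key collisions.
fn spread_then_override(
    base: &HashMap<String, i64>,
    explicit: &[(&str, i64)],
) -> HashMap<String, i64> {
    let mut record = base.clone();
    for (k, v) in explicit {
        record.insert((*k).to_string(), *v); // later insert overrides
    }
    record
}

fn main() {
    let base: HashMap<String, i64> =
        [("x".to_string(), 0), ("y".to_string(), 2)].into_iter().collect();
    let r = spread_then_override(&base, &[("x", 1)]);
    assert_eq!(r.get("x"), Some(&1)); // overridden by explicit field
    assert_eq!(r.get("y"), Some(&2)); // inherited from the spread
}
```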
@@ -1555,6 +1769,18 @@ impl Interpreter {
                     span: Some(span),
                 }),
             },
+            BinaryOp::Concat => match (left, right) {
+                (Value::String(a), Value::String(b)) => Ok(Value::String(a + &b)),
+                (Value::List(a), Value::List(b)) => {
+                    let mut result = a;
+                    result.extend(b);
+                    Ok(Value::List(result))
+                }
+                (l, r) => Err(RuntimeError {
+                    message: format!("Cannot concatenate {} and {}", l.type_name(), r.type_name()),
+                    span: Some(span),
+                }),
+            },
             BinaryOp::Sub => match (left, right) {
                 (Value::Int(a), Value::Int(b)) => Ok(Value::Int(a - b)),
                 (Value::Float(a), Value::Float(b)) => Ok(Value::Float(a - b)),
@@ -1724,6 +1950,13 @@ impl Interpreter {
                 }))
             }
             Value::Builtin(builtin) => self.eval_builtin(builtin, args, span),
+            Value::ExternFn { name, .. } => Err(RuntimeError {
+                message: format!(
+                    "Extern function '{}' can only be called when compiled to JavaScript (use `lux build --target js`)",
+                    name
+                ),
+                span: Some(span),
+            }),
             v => Err(RuntimeError {
                 message: format!("Cannot call {}", v.type_name()),
                 span: Some(span),
@@ -2223,6 +2456,46 @@ impl Interpreter {
                 Ok(EvalResult::Value(Value::String(result)))
             }
 
+            BuiltinFn::IntToString => {
+                if args.len() != 1 {
+                    return Err(err("Int.toString requires 1 argument"));
+                }
+                match &args[0] {
+                    Value::Int(n) => Ok(EvalResult::Value(Value::String(format!("{}", n)))),
+                    v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
+                }
+            }
+
+            BuiltinFn::FloatToString => {
+                if args.len() != 1 {
+                    return Err(err("Float.toString requires 1 argument"));
+                }
+                match &args[0] {
+                    Value::Float(f) => Ok(EvalResult::Value(Value::String(format!("{}", f)))),
+                    v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
+                }
+            }
+
+            BuiltinFn::IntToFloat => {
+                if args.len() != 1 {
+                    return Err(err("Int.toFloat requires 1 argument"));
+                }
+                match &args[0] {
+                    Value::Int(n) => Ok(EvalResult::Value(Value::Float(*n as f64))),
+                    v => Err(err(&format!("Int.toFloat expects Int, got {}", v.type_name()))),
+                }
+            }
+
+            BuiltinFn::FloatToInt => {
+                if args.len() != 1 {
+                    return Err(err("Float.toInt requires 1 argument"));
+                }
+                match &args[0] {
+                    Value::Float(f) => Ok(EvalResult::Value(Value::Int(*f as i64))),
+                    v => Err(err(&format!("Float.toInt expects Float, got {}", v.type_name()))),
+                }
+            }
+
             BuiltinFn::TypeOf => {
                 if args.len() != 1 {
                     return Err(err("typeOf requires 1 argument"));
@@ -2399,6 +2672,45 @@ impl Interpreter {
                 }
             }
+
+            BuiltinFn::MathSin => {
+                if args.len() != 1 {
+                    return Err(err("Math.sin requires 1 argument"));
+                }
+                match &args[0] {
+                    Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.sin()))),
+                    Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).sin()))),
+                    v => Err(err(&format!("Math.sin expects number, got {}", v.type_name()))),
+                }
+            }
+
+            BuiltinFn::MathCos => {
+                if args.len() != 1 {
+                    return Err(err("Math.cos requires 1 argument"));
+                }
+                match &args[0] {
+                    Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.cos()))),
+                    Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).cos()))),
+                    v => Err(err(&format!("Math.cos expects number, got {}", v.type_name()))),
+                }
+            }
+
+            BuiltinFn::MathAtan2 => {
+                if args.len() != 2 {
+                    return Err(err("Math.atan2 requires 2 arguments: y, x"));
+                }
+                let y = match &args[0] {
+                    Value::Float(n) => *n,
+                    Value::Int(n) => *n as f64,
+                    v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
+                };
+                let x = match &args[1] {
+                    Value::Float(n) => *n,
+                    Value::Int(n) => *n as f64,
+                    v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
+                };
+                Ok(EvalResult::Value(Value::Float(y.atan2(x))))
+            }
+
             // Additional List operations
             BuiltinFn::ListIsEmpty => {
                 let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.isEmpty", span)?;
@@ -2452,6 +2764,55 @@ impl Interpreter {
                 Ok(EvalResult::Value(Value::Bool(true)))
             }
+
+            BuiltinFn::ListFindIndex => {
+                let (list, func) = Self::expect_args_2::<Vec<Value>, Value>(&args, "List.findIndex", span)?;
+                for (i, item) in list.iter().enumerate() {
+                    let v = self.eval_call_to_value(func.clone(), vec![item.clone()], span)?;
+                    match v {
+                        Value::Bool(true) => {
+                            return Ok(EvalResult::Value(Value::Constructor {
+                                name: "Some".to_string(),
+                                fields: vec![Value::Int(i as i64)],
+                            }));
+                        }
+                        Value::Bool(false) => {}
+                        _ => return Err(err("List.findIndex predicate must return Bool")),
+                    }
+                }
+                Ok(EvalResult::Value(Value::Constructor {
+                    name: "None".to_string(),
+                    fields: vec![],
+                }))
+            }
+
+            BuiltinFn::ListZip => {
+                let (list1, list2) = Self::expect_args_2::<Vec<Value>, Vec<Value>>(&args, "List.zip", span)?;
+                let result: Vec<Value> = list1
+                    .into_iter()
+                    .zip(list2.into_iter())
+                    .map(|(a, b)| Value::Tuple(vec![a, b]))
+                    .collect();
+                Ok(EvalResult::Value(Value::List(result)))
+            }
+
+            BuiltinFn::ListFlatten => {
+                let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.flatten", span)?;
+                let mut result = Vec::new();
+                for item in list {
+                    match item {
+                        Value::List(inner) => result.extend(inner),
+                        other => result.push(other),
+                    }
+                }
+                Ok(EvalResult::Value(Value::List(result)))
+            }
+
+            BuiltinFn::ListContains => {
+                let (list, target) = Self::expect_args_2::<Vec<Value>, Value>(&args, "List.contains", span)?;
+                let found = list.iter().any(|item| Value::values_equal(item, &target));
+                Ok(EvalResult::Value(Value::Bool(found)))
+            }
+
             BuiltinFn::ListTake => {
                 let (list, n) = Self::expect_args_2::<Vec<Value>, i64>(&args, "List.take", span)?;
                 let n = n.max(0) as usize;
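`List.findIndex` above (like `Map.get` later in this diff) encodes an optional result as a tagged `Value::Constructor` named `"Some"` or `"None"`, so the interpreter's own Option type flows back into user code. A standalone sketch of that encoding (the trimmed-down `Value` enum here is illustrative, not the interpreter's full type):

```rust
// Option values are encoded as tagged constructors, mirroring the
// List.findIndex return shape in the diff above.
#[derive(Debug, PartialEq)]
enum Value {
    Int(i64),
    Constructor { name: String, fields: Vec<Value> },
}

// Return Some(index) for the first element matching the predicate,
// or a zero-field None constructor otherwise.
fn find_index(items: &[i64], pred: impl Fn(i64) -> bool) -> Value {
    for (i, &item) in items.iter().enumerate() {
        if pred(item) {
            return Value::Constructor {
                name: "Some".to_string(),
                fields: vec![Value::Int(i as i64)],
            };
        }
    }
    Value::Constructor { name: "None".to_string(), fields: vec![] }
}

fn main() {
    let hit = find_index(&[4, 7, 9], |n| n > 5);
    assert_eq!(
        hit,
        Value::Constructor { name: "Some".to_string(), fields: vec![Value::Int(1)] }
    );
    let miss = find_index(&[1, 2], |n| n > 5);
    assert_eq!(
        miss,
        Value::Constructor { name: "None".to_string(), fields: vec![] }
    );
}
```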
@@ -2478,6 +2839,67 @@ impl Interpreter {
                 Ok(EvalResult::Value(Value::Unit))
             }
+
+            BuiltinFn::ListSort => {
+                // List.sort(list) - sort using natural ordering (Int, Float, String, Bool)
+                let mut list =
+                    Self::expect_arg_1::<Vec<Value>>(&args, "List.sort", span)?;
+                list.sort_by(|a, b| Self::compare_values(a, b));
+                Ok(EvalResult::Value(Value::List(list)))
+            }
+
+            BuiltinFn::ListSortBy => {
+                // List.sortBy(list, fn(a, b) => Int) - sort with custom comparator
+                // Comparator returns negative (a < b), 0 (a == b), or positive (a > b)
+                let (list, func) =
+                    Self::expect_args_2::<Vec<Value>, Value>(&args, "List.sortBy", span)?;
+                let mut indexed: Vec<(usize, Value)> =
+                    list.into_iter().enumerate().collect();
+                let mut err: Option<RuntimeError> = None;
+                let func_ref = &func;
+                let self_ptr = self as *mut Self;
+                indexed.sort_by(|a, b| {
+                    if err.is_some() {
+                        return std::cmp::Ordering::Equal;
+                    }
+                    // Safety: we're in a single-threaded context and the closure
+                    // needs mutable access to call eval_call_to_value
+                    let interp = unsafe { &mut *self_ptr };
+                    match interp.eval_call_to_value(
+                        func_ref.clone(),
+                        vec![a.1.clone(), b.1.clone()],
+                        span,
+                    ) {
+                        Ok(Value::Int(n)) => {
+                            if n < 0 {
+                                std::cmp::Ordering::Less
+                            } else if n > 0 {
+                                std::cmp::Ordering::Greater
+                            } else {
+                                std::cmp::Ordering::Equal
+                            }
+                        }
+                        Ok(_) => {
+                            err = Some(RuntimeError {
+                                message: "List.sortBy comparator must return Int"
+                                    .to_string(),
+                                span: Some(span),
+                            });
+                            std::cmp::Ordering::Equal
+                        }
+                        Err(e) => {
+                            err = Some(e);
+                            std::cmp::Ordering::Equal
+                        }
+                    }
+                });
+                if let Some(e) = err {
+                    return Err(e);
+                }
+                let result: Vec<Value> =
+                    indexed.into_iter().map(|(_, v)| v).collect();
+                Ok(EvalResult::Value(Value::List(result)))
+            }
+
            // Additional String operations
            BuiltinFn::StringStartsWith => {
                let (s, prefix) = Self::expect_args_2::<String, String>(&args, "String.startsWith", span)?;
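The `List.sortBy` hunk above needs `&mut self` inside `sort_by`'s comparator (to evaluate the user callback), and it reaches for a raw `self as *mut Self` pointer plus an `err` side channel because `slice::sort_by` takes an infallible comparator. One way to avoid the `unsafe` block entirely is to drive a simple sort loop yourself so the fallible comparator can propagate errors with `?`. A sketch of that alternative (an editorial suggestion, not code from this repository; `sort_by_fallible` is a hypothetical helper):

```rust
// Fallible insertion sort: slice::sort_by cannot be used directly when
// the comparator may fail, so we drive the sort ourselves and return
// the first error. Stable, O(n^2) — fine for small interpreter lists.
fn sort_by_fallible<T, E>(
    items: &mut Vec<T>,
    mut cmp: impl FnMut(&T, &T) -> Result<std::cmp::Ordering, E>,
) -> Result<(), E> {
    for i in 1..items.len() {
        let mut j = i;
        // Bubble items[i] left while the previous element compares Greater.
        while j > 0 && cmp(&items[j - 1], &items[j])? == std::cmp::Ordering::Greater {
            items.swap(j - 1, j);
            j -= 1;
        }
    }
    Ok(())
}

fn main() {
    let mut v = vec![3, 1, 2];
    sort_by_fallible(&mut v, |a: &i32, b: &i32| Ok::<_, ()>(a.cmp(b))).unwrap();
    assert_eq!(v, vec![1, 2, 3]);

    // An error from the comparator aborts the sort and is propagated.
    let mut w = vec![2, 1];
    let r = sort_by_fallible(&mut w, |_: &i32, _: &i32| Err::<std::cmp::Ordering, _>("boom"));
    assert_eq!(r, Err("boom"));
}
```

The trade-off is O(n^2) comparisons versus `sort_by`'s O(n log n); since the comparator here re-enters the interpreter, callback overhead usually dominates anyway.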
@@ -2888,6 +3310,128 @@ impl Interpreter {
                 }
                 Ok(EvalResult::Value(Value::Json(serde_json::Value::Object(map))))
             }
+
+            // Map operations
+            BuiltinFn::MapNew => {
+                Ok(EvalResult::Value(Value::Map(HashMap::new())))
+            }
+
+            BuiltinFn::MapSet => {
+                if args.len() != 3 {
+                    return Err(err("Map.set requires 3 arguments: map, key, value"));
+                }
+                let mut map = match &args[0] {
+                    Value::Map(m) => m.clone(),
+                    v => return Err(err(&format!("Map.set expects Map as first argument, got {}", v.type_name()))),
+                };
+                let key = match &args[1] {
+                    Value::String(s) => s.clone(),
+                    v => return Err(err(&format!("Map.set expects String key, got {}", v.type_name()))),
+                };
+                map.insert(key, args[2].clone());
+                Ok(EvalResult::Value(Value::Map(map)))
+            }
+
+            BuiltinFn::MapGet => {
+                let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.get", span)?;
+                match map.get(&key) {
+                    Some(v) => Ok(EvalResult::Value(Value::Constructor {
+                        name: "Some".to_string(),
+                        fields: vec![v.clone()],
+                    })),
+                    None => Ok(EvalResult::Value(Value::Constructor {
+                        name: "None".to_string(),
+                        fields: vec![],
+                    })),
+                }
+            }
+
+            BuiltinFn::MapContains => {
+                let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.contains", span)?;
+                Ok(EvalResult::Value(Value::Bool(map.contains_key(&key))))
+            }
+
+            BuiltinFn::MapRemove => {
+                let (mut map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.remove", span)?;
+                map.remove(&key);
+                Ok(EvalResult::Value(Value::Map(map)))
+            }
+
+            BuiltinFn::MapKeys => {
+                let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.keys", span)?;
+                let mut keys: Vec<String> = map.keys().cloned().collect();
+                keys.sort();
+                Ok(EvalResult::Value(Value::List(
+                    keys.into_iter().map(Value::String).collect(),
+                )))
+            }
+
+            BuiltinFn::MapValues => {
+                let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.values", span)?;
+                let mut entries: Vec<(String, Value)> = map.into_iter().collect();
+                entries.sort_by(|(a, _), (b, _)| a.cmp(b));
+                Ok(EvalResult::Value(Value::List(
+                    entries.into_iter().map(|(_, v)| v).collect(),
+                )))
+            }
+
+            BuiltinFn::MapSize => {
+                let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.size", span)?;
+                Ok(EvalResult::Value(Value::Int(map.len() as i64)))
+            }
+
+            BuiltinFn::MapIsEmpty => {
+                let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.isEmpty", span)?;
+                Ok(EvalResult::Value(Value::Bool(map.is_empty())))
+            }
+
+            BuiltinFn::MapFromList => {
+                let list = Self::expect_arg_1::<Vec<Value>>(&args, "Map.fromList", span)?;
+                let mut map = HashMap::new();
+                for item in list {
+                    match item {
+                        Value::Tuple(fields) if fields.len() == 2 => {
+                            let key = match &fields[0] {
+                                Value::String(s) => s.clone(),
+                                v => return Err(err(&format!("Map.fromList expects (String, V) tuples, got {} key", v.type_name()))),
+                            };
+                            map.insert(key, fields[1].clone());
+                        }
+                        _ => return Err(err("Map.fromList expects List<(String, V)>")),
+                    }
+                }
+                Ok(EvalResult::Value(Value::Map(map)))
+            }
+
+            BuiltinFn::MapToList => {
+                let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.toList", span)?;
+                let mut entries: Vec<(String, Value)> = map.into_iter().collect();
+                entries.sort_by(|(a, _), (b, _)| a.cmp(b));
+                Ok(EvalResult::Value(Value::List(
+                    entries
+                        .into_iter()
+                        .map(|(k, v)| Value::Tuple(vec![Value::String(k), v]))
+                        .collect(),
+                )))
+            }
+
+            BuiltinFn::MapMerge => {
+                if args.len() != 2 {
+                    return Err(err("Map.merge requires 2 arguments: map1, map2"));
+                }
+                let mut map1 = match &args[0] {
+                    Value::Map(m) => m.clone(),
+                    v => return Err(err(&format!("Map.merge expects Map as first argument, got {}", v.type_name()))),
+                };
+                let map2 = match &args[1] {
+                    Value::Map(m) => m.clone(),
+                    v => return Err(err(&format!("Map.merge expects Map as second argument, got {}", v.type_name()))),
+                };
+                for (k, v) in map2 {
+                    map1.insert(k, v);
+                }
+                Ok(EvalResult::Value(Value::Map(map1)))
+            }
         }
     }
 
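Note that the Map builtins above are persistent in style: `Map.set`, `Map.remove`, and `Map.merge` clone the incoming `HashMap` and return a new `Value::Map`, leaving the original untouched (an O(n) copy per update). A standalone sketch of that clone-on-update semantics (the `set` helper here is illustrative):

```rust
use std::collections::HashMap;

// Persistent-style update: clone, modify the copy, return it.
// The original map is unchanged, matching Map.set's semantics.
fn set(map: &HashMap<String, i64>, key: &str, value: i64) -> HashMap<String, i64> {
    let mut copy = map.clone();
    copy.insert(key.to_string(), value);
    copy
}

fn main() {
    let m1: HashMap<String, i64> = HashMap::new();
    let m2 = set(&m1, "a", 1);
    assert!(m1.is_empty());           // original unchanged
    assert_eq!(m2.get("a"), Some(&1)); // update visible only in the copy
}
```

A structure-sharing persistent map (e.g. a HAMT) would avoid the full copy, at the cost of a more involved implementation.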
@@ -2971,6 +3515,18 @@ impl Interpreter {
         })
     }
+
+    /// Compare two values for natural ordering (used by List.sort)
+    fn compare_values(a: &Value, b: &Value) -> std::cmp::Ordering {
+        match (a, b) {
+            (Value::Int(x), Value::Int(y)) => x.cmp(y),
+            (Value::Float(x), Value::Float(y)) => x.partial_cmp(y).unwrap_or(std::cmp::Ordering::Equal),
+            (Value::String(x), Value::String(y)) => x.cmp(y),
+            (Value::Bool(x), Value::Bool(y)) => x.cmp(y),
+            (Value::Char(x), Value::Char(y)) => x.cmp(y),
+            _ => std::cmp::Ordering::Equal,
+        }
+    }
 
     fn match_pattern(&self, pattern: &Pattern, value: &Value) -> Option<Vec<(String, Value)>> {
         match pattern {
             Pattern::Wildcard(_) => Some(Vec::new()),
@@ -3053,6 +3609,11 @@ impl Interpreter {
                     b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
                 })
             }
+            (Value::Map(a), Value::Map(b)) => {
+                a.len() == b.len() && a.iter().all(|(k, v)| {
+                    b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
+                })
+            }
             (
                 Value::Constructor {
                     name: n1,
@@ -3473,6 +4034,119 @@ impl Interpreter {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
("File", "copy") => {
|
||||||
|
let source = match request.args.first() {
|
||||||
|
Some(Value::String(s)) => s.clone(),
|
||||||
|
_ => return Err(RuntimeError {
|
||||||
|
message: "File.copy requires a string source path".to_string(),
|
||||||
|
span: None,
|
||||||
|
}),
|
||||||
|
};
|
||||||
|
let dest = match request.args.get(1) {
|
||||||
|
Some(Value::String(s)) => s.clone(),
|
||||||
|
_ => return Err(RuntimeError {
|
||||||
|
message: "File.copy requires a string destination path".to_string(),
|
||||||
|
span: None,
|
||||||
|
}),
|
||||||
|
};
|
||||||
|
match std::fs::copy(&source, &dest) {
|
||||||
|
Ok(_) => Ok(Value::Unit),
|
||||||
|
Err(e) => Err(RuntimeError {
|
||||||
|
message: format!("Failed to copy '{}' to '{}': {}", source, dest, e),
|
||||||
|
span: None,
|
||||||
|
}),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
("File", "glob") => {
|
||||||
|
let pattern = match request.args.first() {
+                    Some(Value::String(s)) => s.clone(),
+                    _ => return Err(RuntimeError {
+                        message: "File.glob requires a string pattern".to_string(),
+                        span: None,
+                    }),
+                };
+                match glob::glob(&pattern) {
+                    Ok(paths) => {
+                        let entries: Vec<Value> = paths
+                            .filter_map(|entry| entry.ok())
+                            .map(|path| Value::String(path.to_string_lossy().to_string()))
+                            .collect();
+                        Ok(Value::List(entries))
+                    }
+                    Err(e) => Err(RuntimeError {
+                        message: format!("Invalid glob pattern '{}': {}", pattern, e),
+                        span: None,
+                    }),
+                }
+            }
+
+            // ===== File Effect (safe Result-returning variants) =====
+            ("File", "tryRead") => {
+                let path = match request.args.first() {
+                    Some(Value::String(s)) => s.clone(),
+                    _ => return Err(RuntimeError {
+                        message: "File.tryRead requires a string path".to_string(),
+                        span: None,
+                    }),
+                };
+                match std::fs::read_to_string(&path) {
+                    Ok(content) => Ok(Value::Constructor {
+                        name: "Ok".to_string(),
+                        fields: vec![Value::String(content)],
+                    }),
+                    Err(e) => Ok(Value::Constructor {
+                        name: "Err".to_string(),
+                        fields: vec![Value::String(format!("Failed to read file '{}': {}", path, e))],
+                    }),
+                }
+            }
+            ("File", "tryWrite") => {
+                let path = match request.args.first() {
+                    Some(Value::String(s)) => s.clone(),
+                    _ => return Err(RuntimeError {
+                        message: "File.tryWrite requires a string path".to_string(),
+                        span: None,
+                    }),
+                };
+                let content = match request.args.get(1) {
+                    Some(Value::String(s)) => s.clone(),
+                    _ => return Err(RuntimeError {
+                        message: "File.tryWrite requires string content".to_string(),
+                        span: None,
+                    }),
+                };
+                match std::fs::write(&path, &content) {
+                    Ok(()) => Ok(Value::Constructor {
+                        name: "Ok".to_string(),
+                        fields: vec![Value::Unit],
+                    }),
+                    Err(e) => Ok(Value::Constructor {
+                        name: "Err".to_string(),
+                        fields: vec![Value::String(format!("Failed to write file '{}': {}", path, e))],
+                    }),
+                }
+            }
+            ("File", "tryDelete") => {
+                let path = match request.args.first() {
+                    Some(Value::String(s)) => s.clone(),
+                    _ => return Err(RuntimeError {
+                        message: "File.tryDelete requires a string path".to_string(),
+                        span: None,
+                    }),
+                };
+                match std::fs::remove_file(&path) {
+                    Ok(()) => Ok(Value::Constructor {
+                        name: "Ok".to_string(),
+                        fields: vec![Value::Unit],
+                    }),
+                    Err(e) => Ok(Value::Constructor {
+                        name: "Err".to_string(),
+                        fields: vec![Value::String(format!("Failed to delete file '{}': {}", path, e))],
+                    }),
+                }
+            }
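The `try*` handlers above reify I/O failure as data: instead of propagating an `io::Error` as a runtime error, the outcome becomes an `Ok`/`Err` constructor value the guest language can pattern-match on. A minimal standalone sketch of that pattern (the `Value` enum here is a simplified stand-in for the interpreter's real value type, not the actual definition):

```rust
// Simplified stand-in for the interpreter's value type.
#[derive(Debug, PartialEq)]
enum Value {
    Unit,
    String(String),
    Constructor { name: String, fields: Vec<Value> },
}

// Mirror of the File.tryRead handler: never fails, always returns
// a tagged Ok(...)/Err(...) constructor value.
fn try_read(path: &str) -> Value {
    match std::fs::read_to_string(path) {
        Ok(content) => Value::Constructor {
            name: "Ok".to_string(),
            fields: vec![Value::String(content)],
        },
        Err(e) => Value::Constructor {
            name: "Err".to_string(),
            fields: vec![Value::String(format!("Failed to read file '{}': {}", path, e))],
        },
    }
}

fn main() {
    // Reading a path that does not exist yields an Err constructor, not a panic.
    match try_read("/definitely/not/a/real/path") {
        Value::Constructor { name, .. } => assert_eq!(name, "Err"),
        other => panic!("expected a constructor, got {:?}", other),
    }
}
```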
             // ===== Process Effect =====
             ("Process", "exec") => {
                 use std::process::Command;
@@ -3828,6 +4502,26 @@ impl Interpreter {
                 }
                 Ok(Value::Unit)
             }
+            ("Test", "assertEqualMsg") => {
+                let expected = request.args.first().cloned().unwrap_or(Value::Unit);
+                let actual = request.args.get(1).cloned().unwrap_or(Value::Unit);
+                let label = match request.args.get(2) {
+                    Some(Value::String(s)) => s.clone(),
+                    _ => "Values not equal".to_string(),
+                };
+
+                if Value::values_equal(&expected, &actual) {
+                    self.test_results.borrow_mut().passed += 1;
+                } else {
+                    self.test_results.borrow_mut().failed += 1;
+                    self.test_results.borrow_mut().failures.push(TestFailure {
+                        message: label,
+                        expected: Some(format!("{}", expected)),
+                        actual: Some(format!("{}", actual)),
+                    });
+                }
+                Ok(Value::Unit)
+            }
             ("Test", "assertNotEqual") => {
                 let a = request.args.first().cloned().unwrap_or(Value::Unit);
                 let b = request.args.get(1).cloned().unwrap_or(Value::Unit);
@@ -4960,6 +5654,7 @@ mod tests {
         // Create a simple migration that adds a field
         // Migration: old.name -> { name: old.name, email: "unknown" }
         let migration_body = Expr::Record {
+            spread: None,
             fields: vec![
                 (
                     Ident::new("name", Span::default()),
266 src/lexer.rs
@@ -42,6 +42,7 @@ pub enum TokenKind {
     Effect,
     Handler,
     Run,
+    Handle,
     Resume,
     Type,
     True,
@@ -54,6 +55,7 @@ pub enum TokenKind {
     Trait, // trait (for type classes)
     Impl, // impl (for trait implementations)
     For, // for (in impl Trait for Type)
+    Extern, // extern (for FFI declarations)

     // Documentation
     DocComment(String), // /// doc comment
@@ -70,6 +72,7 @@ pub enum TokenKind {

     // Operators
     Plus, // +
+    PlusPlus, // ++
     Minus, // -
     Star, // *
     Slash, // /
@@ -89,6 +92,7 @@ pub enum TokenKind {
     Arrow, // =>
     ThinArrow, // ->
     Dot, // .
+    DotDotDot, // ...
     Colon, // :
     ColonColon, // ::
     Comma, // ,
@@ -138,6 +142,7 @@ impl fmt::Display for TokenKind {
             TokenKind::Effect => write!(f, "effect"),
             TokenKind::Handler => write!(f, "handler"),
             TokenKind::Run => write!(f, "run"),
+            TokenKind::Handle => write!(f, "handle"),
             TokenKind::Resume => write!(f, "resume"),
             TokenKind::Type => write!(f, "type"),
             TokenKind::Import => write!(f, "import"),
@@ -148,6 +153,7 @@ impl fmt::Display for TokenKind {
             TokenKind::Trait => write!(f, "trait"),
             TokenKind::Impl => write!(f, "impl"),
             TokenKind::For => write!(f, "for"),
+            TokenKind::Extern => write!(f, "extern"),
             TokenKind::DocComment(s) => write!(f, "/// {}", s),
             TokenKind::Is => write!(f, "is"),
             TokenKind::Pure => write!(f, "pure"),
@@ -160,6 +166,7 @@ impl fmt::Display for TokenKind {
             TokenKind::True => write!(f, "true"),
             TokenKind::False => write!(f, "false"),
             TokenKind::Plus => write!(f, "+"),
+            TokenKind::PlusPlus => write!(f, "++"),
             TokenKind::Minus => write!(f, "-"),
             TokenKind::Star => write!(f, "*"),
             TokenKind::Slash => write!(f, "/"),
@@ -179,6 +186,7 @@ impl fmt::Display for TokenKind {
             TokenKind::Arrow => write!(f, "=>"),
             TokenKind::ThinArrow => write!(f, "->"),
             TokenKind::Dot => write!(f, "."),
+            TokenKind::DotDotDot => write!(f, "..."),
             TokenKind::Colon => write!(f, ":"),
             TokenKind::ColonColon => write!(f, "::"),
             TokenKind::Comma => write!(f, ","),
@@ -268,7 +276,14 @@ impl<'a> Lexer<'a> {

         let kind = match c {
             // Single-character tokens
-            '+' => TokenKind::Plus,
+            '+' => {
+                if self.peek() == Some('+') {
+                    self.advance();
+                    TokenKind::PlusPlus
+                } else {
+                    TokenKind::Plus
+                }
+            }
             '*' => TokenKind::Star,
             '%' => TokenKind::Percent,
             '(' => TokenKind::LParen,
@@ -364,7 +379,22 @@ impl<'a> Lexer<'a> {
                     TokenKind::Pipe
                 }
             }
-            '.' => TokenKind::Dot,
+            '.' => {
+                if self.peek() == Some('.') {
+                    // Check for ... (need to peek past second dot)
+                    // We look at source directly since we can only peek one ahead
+                    let next_next = self.source[self.pos..].chars().nth(1);
+                    if next_next == Some('.') {
+                        self.advance(); // consume second '.'
+                        self.advance(); // consume third '.'
+                        TokenKind::DotDotDot
+                    } else {
+                        TokenKind::Dot
+                    }
+                } else {
+                    TokenKind::Dot
+                }
+            }
             ':' => {
                 if self.peek() == Some(':') {
                     self.advance();
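Both new operator cases follow the same maximal-munch rule: peek at the upcoming character(s) and consume them only if they extend the token, so `++` never lexes as two `+` tokens while `..` still lexes as two dots (only `...` is special). A self-contained sketch of that decision logic, with token names borrowed from the diff but the scanner itself simplified to index-based lookahead:

```rust
// Minimal maximal-munch scanner for '+', '++', '.', '...'.
#[derive(Debug, PartialEq)]
enum Tok {
    Plus,
    PlusPlus,
    Dot,
    DotDotDot,
    Other(char),
}

fn scan(src: &str) -> Vec<Tok> {
    let chars: Vec<char> = src.chars().collect();
    let mut toks = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        match chars[i] {
            '+' if chars.get(i + 1) == Some(&'+') => {
                toks.push(Tok::PlusPlus);
                i += 2;
            }
            '+' => {
                toks.push(Tok::Plus);
                i += 1;
            }
            // '...' needs a two-character lookahead, which is why the real
            // lexer reaches into `self.source[self.pos..].chars().nth(1)`.
            '.' if chars.get(i + 1) == Some(&'.') && chars.get(i + 2) == Some(&'.') => {
                toks.push(Tok::DotDotDot);
                i += 3;
            }
            '.' => {
                toks.push(Tok::Dot);
                i += 1;
            }
            c => {
                toks.push(Tok::Other(c));
                i += 1;
            }
        }
    }
    toks
}

fn main() {
    assert_eq!(scan("++"), vec![Tok::PlusPlus]);
    assert_eq!(scan("+.+"), vec![Tok::Plus, Tok::Dot, Tok::Plus]);
    assert_eq!(scan("..."), vec![Tok::DotDotDot]);
    // '..' is NOT special-cased, so it falls back to two Dot tokens.
    assert_eq!(scan(".."), vec![Tok::Dot, Tok::Dot]);
}
```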
@@ -383,7 +413,26 @@ impl<'a> Lexer<'a> {
             }

             // String literals
-            '"' => self.scan_string(start)?,
+            '"' => {
+                // Check for triple-quote multiline string """
+                if self.peek() == Some('"') {
+                    // Clone to peek at the second char
+                    let mut lookahead = self.chars.clone();
+                    lookahead.next(); // consume first peeked "
+                    if lookahead.peek() == Some(&'"') {
+                        // It's a triple-quote: consume both remaining quotes
+                        self.advance(); // second "
+                        self.advance(); // third "
+                        self.scan_multiline_string(start)?
+                    } else {
+                        // It's an empty string ""
+                        self.advance(); // consume closing "
+                        TokenKind::String(String::new())
+                    }
+                } else {
+                    self.scan_string(start)?
+                }
+            }

             // Char literals
             '\'' => self.scan_char(start)?,
@@ -493,6 +542,8 @@ impl<'a> Lexer<'a> {
             Some('"') => '"',
             Some('0') => '\0',
             Some('\'') => '\'',
+            Some('{') => '{',
+            Some('}') => '}',
             Some('x') => {
                 // Hex escape \xNN
                 let h1 = self.advance().and_then(|c| c.to_digit(16));
@@ -639,6 +690,211 @@ impl<'a> Lexer<'a> {
         Ok(TokenKind::InterpolatedString(parts))
     }

+    fn scan_multiline_string(&mut self, _start: usize) -> Result<TokenKind, LexError> {
+        let mut parts: Vec<StringPart> = Vec::new();
+        let mut current_literal = String::new();
+
+        // Skip the first newline after opening """ if present
+        if self.peek() == Some('\n') {
+            self.advance();
+        } else if self.peek() == Some('\r') {
+            self.advance();
+            if self.peek() == Some('\n') {
+                self.advance();
+            }
+        }
+
+        loop {
+            match self.advance() {
+                Some('"') => {
+                    // Check for closing """
+                    if self.peek() == Some('"') {
+                        let mut lookahead = self.chars.clone();
+                        lookahead.next(); // consume first peeked "
+                        if lookahead.peek() == Some(&'"') {
+                            // Closing """ found
+                            self.advance(); // second "
+                            self.advance(); // third "
+                            break;
+                        }
+                    }
+                    // Not closing triple-quote, just a regular " in the string
+                    current_literal.push('"');
+                }
+                Some('\\') => {
+                    // Handle escape sequences (same as regular strings)
+                    match self.peek() {
+                        Some('{') => {
+                            self.advance();
+                            current_literal.push('{');
+                        }
+                        Some('}') => {
+                            self.advance();
+                            current_literal.push('}');
+                        }
+                        _ => {
+                            let escape_start = self.pos;
+                            let escaped = match self.advance() {
+                                Some('n') => '\n',
+                                Some('r') => '\r',
+                                Some('t') => '\t',
+                                Some('\\') => '\\',
+                                Some('"') => '"',
+                                Some('0') => '\0',
+                                Some('\'') => '\'',
+                                Some(c) => {
+                                    return Err(LexError {
+                                        message: format!("Invalid escape sequence: \\{}", c),
+                                        span: Span::new(escape_start - 1, self.pos),
+                                    });
+                                }
+                                None => {
+                                    return Err(LexError {
+                                        message: "Unterminated multiline string".into(),
+                                        span: Span::new(_start, self.pos),
+                                    });
+                                }
+                            };
+                            current_literal.push(escaped);
+                        }
+                    }
+                }
+                Some('{') => {
+                    // Interpolation (same as regular strings)
+                    if !current_literal.is_empty() {
+                        parts.push(StringPart::Literal(std::mem::take(&mut current_literal)));
+                    }
+
+                    let mut expr_text = String::new();
+                    let mut brace_depth = 1;
+
+                    loop {
+                        match self.advance() {
+                            Some('{') => {
+                                brace_depth += 1;
+                                expr_text.push('{');
+                            }
+                            Some('}') => {
+                                brace_depth -= 1;
+                                if brace_depth == 0 {
+                                    break;
+                                }
+                                expr_text.push('}');
+                            }
+                            Some(c) => expr_text.push(c),
+                            None => {
+                                return Err(LexError {
+                                    message: "Unterminated interpolation in multiline string"
+                                        .into(),
+                                    span: Span::new(_start, self.pos),
+                                });
+                            }
+                        }
+                    }
+
+                    parts.push(StringPart::Expr(expr_text));
+                }
+                Some(c) => current_literal.push(c),
+                None => {
+                    return Err(LexError {
+                        message: "Unterminated multiline string".into(),
+                        span: Span::new(_start, self.pos),
+                    });
+                }
+            }
+        }
+
+        // Strip common leading whitespace from all lines
+        let strip_indent = |s: &str| -> String {
+            if s.is_empty() {
+                return String::new();
+            }
+            let lines: Vec<&str> = s.split('\n').collect();
+            // Find minimum indentation of non-empty lines
+            let min_indent = lines
+                .iter()
+                .filter(|line| !line.trim().is_empty())
+                .map(|line| line.len() - line.trim_start().len())
+                .min()
+                .unwrap_or(0);
+            // Strip that indentation from each line
+            lines
+                .iter()
+                .map(|line| {
+                    if line.len() >= min_indent {
+                        &line[min_indent..]
+                    } else {
+                        line.trim_start()
+                    }
+                })
+                .collect::<Vec<_>>()
+                .join("\n")
+        };
+
+        // Strip trailing whitespace-only line before closing """
+        let trim_trailing = |s: &mut String| {
+            // Remove trailing spaces/tabs (indent before closing """)
+            while s.ends_with(' ') || s.ends_with('\t') {
+                s.pop();
+            }
+            // Remove the trailing newline
+            if s.ends_with('\n') {
+                s.pop();
+                if s.ends_with('\r') {
+                    s.pop();
+                }
+            }
+        };
+
+        if parts.is_empty() {
+            trim_trailing(&mut current_literal);
+            let result = strip_indent(&current_literal);
+            return Ok(TokenKind::String(result));
+        }
+
+        // Add remaining literal
+        if !current_literal.is_empty() {
+            trim_trailing(&mut current_literal);
+            parts.push(StringPart::Literal(current_literal));
+        }
+
+        // For interpolated multiline strings, strip indent from literal parts
+        // First, collect all literal content to find min indent
+        let mut all_text = String::new();
+        for part in &parts {
+            if let StringPart::Literal(lit) = part {
+                all_text.push_str(lit);
+            }
+        }
+        let lines: Vec<&str> = all_text.split('\n').collect();
+        let min_indent = lines
+            .iter()
+            .filter(|line| !line.trim().is_empty())
+            .map(|line| line.len() - line.trim_start().len())
+            .min()
+            .unwrap_or(0);
+
+        if min_indent > 0 {
+            for part in &mut parts {
+                if let StringPart::Literal(lit) = part {
+                    let stripped_lines: Vec<&str> = lit
+                        .split('\n')
+                        .map(|line| {
+                            if line.len() >= min_indent {
+                                &line[min_indent..]
+                            } else {
+                                line.trim_start()
+                            }
+                        })
+                        .collect();
+                    *lit = stripped_lines.join("\n");
+                }
+            }
+        }
+
+        Ok(TokenKind::InterpolatedString(parts))
+    }
+
     fn scan_char(&mut self, start: usize) -> Result<TokenKind, LexError> {
         let c = match self.advance() {
             Some('\\') => match self.advance() {
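The indent-stripping closure in `scan_multiline_string` is the core of the triple-quote semantics: find the minimum indentation over non-blank lines, then remove exactly that prefix from every line. It can be exercised on its own as a free function (same logic as the closure in the diff, outside the lexer):

```rust
// Standalone version of the common-indent stripping applied
// to a triple-quoted string body.
fn strip_common_indent(s: &str) -> String {
    let lines: Vec<&str> = s.split('\n').collect();
    // Minimum indentation, measured over non-blank lines only.
    let min_indent = lines
        .iter()
        .filter(|line| !line.trim().is_empty())
        .map(|line| line.len() - line.trim_start().len())
        .min()
        .unwrap_or(0);
    // Remove that many leading characters from every line.
    lines
        .iter()
        .map(|line| {
            if line.len() >= min_indent {
                &line[min_indent..]
            } else {
                line.trim_start()
            }
        })
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let body = "    let x = 1\n      let y = 2\n    x + y";
    // Four spaces are common to every line, so exactly four are removed;
    // relative indentation is preserved.
    assert_eq!(strip_common_indent(body), "let x = 1\n  let y = 2\nx + y");
    // Blank lines do not drag the minimum down to zero.
    assert_eq!(strip_common_indent("  a\n\n  b"), "a\n\nb");
}
```

Note the blank-line filter: without it, an empty line inside the string would force `min_indent` to 0 and no stripping would ever happen.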
@@ -743,6 +999,7 @@ impl<'a> Lexer<'a> {
             "effect" => TokenKind::Effect,
             "handler" => TokenKind::Handler,
             "run" => TokenKind::Run,
+            "handle" => TokenKind::Handle,
             "resume" => TokenKind::Resume,
             "type" => TokenKind::Type,
             "import" => TokenKind::Import,
@@ -753,6 +1010,7 @@ impl<'a> Lexer<'a> {
             "trait" => TokenKind::Trait,
             "impl" => TokenKind::Impl,
             "for" => TokenKind::For,
+            "extern" => TokenKind::Extern,
             "is" => TokenKind::Is,
             "pure" => TokenKind::Pure,
             "total" => TokenKind::Total,
@@ -761,6 +1019,8 @@ impl<'a> Lexer<'a> {
             "commutative" => TokenKind::Commutative,
             "where" => TokenKind::Where,
             "assume" => TokenKind::Assume,
+            "and" => TokenKind::And,
+            "or" => TokenKind::Or,
             "true" => TokenKind::Bool(true),
             "false" => TokenKind::Bool(false),
             _ => TokenKind::Ident(ident.to_string()),
@@ -403,6 +403,9 @@ impl Linter {
             Declaration::Function(f) => {
                 self.defined_functions.insert(f.name.name.clone());
            }
+            Declaration::ExternFn(e) => {
+                self.defined_functions.insert(e.name.name.clone());
+            }
             Declaration::Let(l) => {
                 self.define_var(&l.name.name);
             }
@@ -510,10 +513,13 @@ impl Linter {
                     self.collect_refs_expr(&arm.body);
                 }
             }
-            Expr::Field { object, .. } => {
+            Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
                 self.collect_refs_expr(object);
             }
-            Expr::Record { fields, .. } => {
+            Expr::Record { spread, fields, .. } => {
+                if let Some(spread_expr) = spread {
+                    self.collect_refs_expr(spread_expr);
+                }
                 for (_, val) in fields {
                     self.collect_refs_expr(val);
                 }
294 src/lsp.rs
@@ -317,66 +317,227 @@ impl LspServer {
         let doc = self.documents.get(&uri)?;
         let source = &doc.text;

-        // Try to get info from symbol table first
+        // Try to get info from symbol table first (position-based lookup)
         if let Some(ref table) = doc.symbol_table {
             let offset = self.position_to_offset(source, position);
             if let Some(symbol) = table.definition_at_position(offset) {
-                let signature = symbol.type_signature.as_ref()
-                    .map(|s| s.as_str())
-                    .unwrap_or(&symbol.name);
-                let kind_str = match symbol.kind {
-                    SymbolKind::Function => "function",
-                    SymbolKind::Variable => "variable",
-                    SymbolKind::Parameter => "parameter",
-                    SymbolKind::Type => "type",
-                    SymbolKind::TypeParameter => "type parameter",
-                    SymbolKind::Variant => "variant",
-                    SymbolKind::Effect => "effect",
-                    SymbolKind::EffectOperation => "effect operation",
-                    SymbolKind::Field => "field",
-                    SymbolKind::Module => "module",
-                };
-                let doc_str = symbol.documentation.as_ref()
-                    .map(|d| format!("\n\n{}", d))
-                    .unwrap_or_default();
-
-                // Format signature: wrap long signatures onto multiple lines
-                let formatted_sig = format_signature_for_hover(signature);
-
-                // Add behavioral property documentation if present
-                let property_docs = extract_property_docs(signature);
-
-                return Some(Hover {
-                    contents: HoverContents::Markup(MarkupContent {
-                        kind: MarkupKind::Markdown,
-                        value: format!("```lux\n{}\n```\n\n*{}*{}{}", formatted_sig, kind_str, property_docs, doc_str),
-                    }),
-                    range: None,
-                });
+                return Some(self.format_symbol_hover(symbol));
             }
         }

-        // Fall back to hardcoded info
-
-        // Extract the word at the cursor position
+        // Get the word under cursor
         let word = self.get_word_at_position(source, position)?;

-        // Look up rich documentation for known symbols
-        let info = self.get_rich_symbol_info(&word)
-            .or_else(|| self.get_symbol_info(&word).map(|(s, d)| (s.to_string(), d.to_string())));
+        // When hovering on a keyword like 'fn', 'type', 'effect', 'let', 'trait',
+        // look ahead to find the declaration name and show that symbol's info
+        if let Some(ref table) = doc.symbol_table {
+            if matches!(word.as_str(), "fn" | "type" | "effect" | "let" | "trait" | "handler" | "impl") {
+                let offset = self.position_to_offset(source, position);
+                if let Some(name) = self.find_next_ident(source, offset + word.len()) {
+                    for sym in table.global_symbols() {
+                        if sym.name == name {
+                            return Some(self.format_symbol_hover(sym));
+                        }
+                    }
+                }
+            }

-        if let Some((signature, doc)) = info {
-            let formatted_sig = format_signature_for_hover(&signature);
-            Some(Hover {
+            // Try name-based lookup in symbol table (for usage sites)
+            for sym in table.global_symbols() {
+                if sym.name == word {
+                    return Some(self.format_symbol_hover(sym));
+                }
+            }
+        }
+
+        // Check for module names (Console, List, String, etc.)
+        if let Some(hover) = self.get_module_hover(&word) {
+            return Some(hover);
+        }
+
+        // Rich documentation for behavioral property keywords
+        if let Some((signature, doc_text)) = self.get_rich_symbol_info(&word) {
+            return Some(Hover {
                 contents: HoverContents::Markup(MarkupContent {
                     kind: MarkupKind::Markdown,
-                    value: format!("```lux\n{}\n```\n\n{}", formatted_sig, doc),
+                    value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
                 }),
                 range: None,
-            })
-        } else {
-            None
-        }
+            });
+        }
+
+        // Builtin keyword/function info
+        if let Some((signature, doc_text)) = self.get_symbol_info(&word) {
+            return Some(Hover {
+                contents: HoverContents::Markup(MarkupContent {
+                    kind: MarkupKind::Markdown,
+                    value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
+                }),
+                range: None,
+            });
+        }
+
+        None
+    }
+
+    /// Format a symbol into a hover response
+    fn format_symbol_hover(&self, symbol: &crate::symbol_table::Symbol) -> Hover {
+        let signature = symbol.type_signature.as_ref()
+            .map(|s| s.as_str())
+            .unwrap_or(&symbol.name);
+        let kind_str = match symbol.kind {
+            SymbolKind::Function => "function",
+            SymbolKind::Variable => "variable",
+            SymbolKind::Parameter => "parameter",
+            SymbolKind::Type => "type",
+            SymbolKind::TypeParameter => "type parameter",
+            SymbolKind::Variant => "variant",
+            SymbolKind::Effect => "effect",
+            SymbolKind::EffectOperation => "effect operation",
+            SymbolKind::Field => "field",
+            SymbolKind::Module => "module",
+        };
+        let doc_str = symbol.documentation.as_ref()
+            .map(|d| format!("\n\n{}", d))
+            .unwrap_or_default();
+        let formatted_sig = format_signature_for_hover(signature);
+        let property_docs = extract_property_docs(signature);
+
+        Hover {
+            contents: HoverContents::Markup(MarkupContent {
+                kind: MarkupKind::Markdown,
+                value: format!(
+                    "```lux\n{}\n```\n*{}*{}{}",
+                    formatted_sig, kind_str, property_docs, doc_str
+                ),
+            }),
+            range: None,
+        }
+    }
+
+    /// Get hover info for built-in module names
+    fn get_module_hover(&self, name: &str) -> Option<Hover> {
+        let (sig, doc) = match name {
+            "Console" => (
+                "effect Console",
+                "**Console I/O**\n\n\
+                - `Console.print(msg: String): Unit` — print to stdout\n\
+                - `Console.readLine(): String` — read a line from stdin\n\
+                - `Console.readInt(): Int` — read an integer from stdin",
+            ),
+            "File" => (
+                "effect File",
+                "**File System**\n\n\
+                - `File.read(path: String): String` — read file contents\n\
+                - `File.write(path: String, content: String): Unit` — write to file\n\
+                - `File.append(path: String, content: String): Unit` — append to file\n\
+                - `File.exists(path: String): Bool` — check if file exists\n\
+                - `File.delete(path: String): Unit` — delete a file\n\
+                - `File.list(path: String): List<String>` — list directory",
+            ),
+            "Http" => (
+                "effect Http",
+                "**HTTP Client**\n\n\
+                - `Http.get(url: String): String` — GET request\n\
+                - `Http.post(url: String, body: String): String` — POST request\n\
+                - `Http.put(url: String, body: String): String` — PUT request\n\
+                - `Http.delete(url: String): String` — DELETE request",
+            ),
+            "Sql" => (
+                "effect Sql",
+                "**SQL Database**\n\n\
+                - `Sql.open(path: String): Connection` — open database\n\
+                - `Sql.execute(conn: Connection, sql: String): Unit` — execute SQL\n\
+                - `Sql.query(conn: Connection, sql: String): List<Row>` — query rows\n\
+                - `Sql.close(conn: Connection): Unit` — close connection",
+            ),
+            "Random" => (
+                "effect Random",
+                "**Random Number Generation**\n\n\
+                - `Random.int(min: Int, max: Int): Int` — random integer\n\
+                - `Random.float(): Float` — random float 0.0–1.0\n\
+                - `Random.bool(): Bool` — random boolean",
+            ),
+            "Time" => (
+                "effect Time",
+                "**Time**\n\n\
+                - `Time.now(): Int` — current Unix timestamp (ms)\n\
+                - `Time.sleep(ms: Int): Unit` — sleep for milliseconds",
+            ),
+            "Process" => (
+                "effect Process",
+                "**Process / System**\n\n\
+                - `Process.exec(cmd: String): String` — run shell command\n\
+                - `Process.env(name: String): String` — get env variable\n\
+                - `Process.args(): List<String>` — command-line arguments\n\
+                - `Process.exit(code: Int): Unit` — exit with code",
+            ),
+            "Math" => (
+                "module Math",
+                "**Math Functions**\n\n\
+                - `Math.abs(n: Int): Int` — absolute value\n\
+                - `Math.min(a: Int, b: Int): Int` — minimum\n\
+                - `Math.max(a: Int, b: Int): Int` — maximum\n\
+                - `Math.sqrt(n: Float): Float` — square root\n\
+                - `Math.pow(base: Float, exp: Float): Float` — power\n\
+                - `Math.floor(n: Float): Int` — round down\n\
+                - `Math.ceil(n: Float): Int` — round up",
+            ),
+            "List" => (
+                "module List",
+                "**List Operations**\n\n\
+                - `List.map(list, f)` — transform each element\n\
+                - `List.filter(list, p)` — keep matching elements\n\
+                - `List.fold(list, init, f)` — reduce to single value\n\
+                - `List.head(list)` — first element (Option)\n\
+                - `List.tail(list)` — all except first (Option)\n\
+                - `List.length(list)` — number of elements\n\
+                - `List.concat(a, b)` — concatenate lists\n\
+                - `List.range(start, end)` — integer range\n\
+                - `List.reverse(list)` — reverse order\n\
+                - `List.get(list, i)` — element at index (Option)",
+            ),
+            "String" => (
+                "module String",
+                "**String Operations**\n\n\
+                - `String.length(s)` — string length\n\
+                - `String.split(s, delim)` — split by delimiter\n\
+                - `String.join(list, delim)` — join with delimiter\n\
+                - `String.trim(s)` — trim whitespace\n\
+                - `String.contains(s, sub)` — check substring\n\
+                - `String.replace(s, from, to)` — replace occurrences\n\
+                - `String.startsWith(s, prefix)` — check prefix\n\
+                - `String.endsWith(s, suffix)` — check suffix\n\
+                - `String.substring(s, start, end)` — extract range\n\
+                - `String.chars(s)` — list of characters",
+            ),
+            "Option" => (
+                "type Option<A> = Some(A) | None",
+                "**Optional Value**\n\n\
+                - `Option.isSome(opt)` — has a value?\n\
+                - `Option.isNone(opt)` — is empty?\n\
+                - `Option.getOrElse(opt, default)` — unwrap or default\n\
+                - `Option.map(opt, f)` — transform if present\n\
+                - `Option.flatMap(opt, f)` — chain operations",
+            ),
+            "Result" => (
+                "type Result<A, E> = Ok(A) | Err(E)",
+                "**Result of Fallible Operation**\n\n\
+                - `Result.isOk(r)` — succeeded?\n\
+                - `Result.isErr(r)` — failed?\n\
+                - `Result.map(r, f)` — transform success value\n\
+                - `Result.mapErr(r, f)` — transform error value",
+            ),
+            _ => return None,
+        };
+
+        Some(Hover {
+            contents: HoverContents::Markup(MarkupContent {
+                kind: MarkupKind::Markdown,
+                value: format!("```lux\n{}\n```\n{}", sig, doc),
+            }),
+            range: None,
+        })
     }

     fn get_word_at_position(&self, source: &str, position: Position) -> Option<String> {
@@ -402,6 +563,26 @@ impl LspServer
         }
     }

+    /// Find the next identifier in source after the given offset (skipping whitespace)
+    fn find_next_ident(&self, source: &str, start: usize) -> Option<String> {
+        let chars: Vec<char> = source.chars().collect();
+        let mut pos = start;
+        // Skip whitespace
+        while pos < chars.len() && (chars[pos] == ' ' || chars[pos] == '\t' || chars[pos] == '\n' || chars[pos] == '\r') {
+            pos += 1;
+        }
+        // Collect identifier
+        let ident_start = pos;
+        while pos < chars.len() && (chars[pos].is_alphanumeric() || chars[pos] == '_') {
+            pos += 1;
+        }
+        if pos > ident_start {
+            Some(chars[ident_start..pos].iter().collect())
+        } else {
+            None
+        }
+    }

     fn get_symbol_info(&self, word: &str) -> Option<(&'static str, &'static str)> {
         match word {
             // Keywords
@@ -607,17 +788,11 @@ impl LspServer

     fn position_to_offset(&self, source: &str, position: Position) -> usize {
         let mut offset = 0;
-        let mut line = 0u32;
-        for (i, c) in source.char_indices() {
-            if line == position.line {
-                let col = i - offset;
-                return offset + (position.character as usize).min(col + 1);
-            }
-            if c == '\n' {
-                line += 1;
-                offset = i + 1;
+        for (line_idx, line) in source.lines().enumerate() {
+            if line_idx == position.line as usize {
+                return offset + (position.character as usize).min(line.len());
             }
+            offset += line.len() + 1; // +1 for newline
         }
         source.len()
     }
@@ -1396,12 +1571,15 @@ fn collect_call_site_hints(
            collect_call_site_hints(source, e, param_names, hints);
        }
    }
-   Expr::Record { fields, .. } => {
+   Expr::Record { spread, fields, .. } => {
+       if let Some(spread_expr) = spread {
+           collect_call_site_hints(source, spread_expr, param_names, hints);
+       }
        for (_, e) in fields {
            collect_call_site_hints(source, e, param_names, hints);
        }
    }
-   Expr::Field { object, .. } => {
+   Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
        collect_call_site_hints(source, object, param_names, hints);
    }
    Expr::Run { expr, handlers, .. } => {
640	src/main.rs
@@ -1,4 +1,7 @@
-//! Lux - A functional programming language with first-class effects
+//! Lux — Make the important things visible.
+//!
+//! A functional programming language with first-class effects, schema evolution,
+//! and behavioral types. See `lux philosophy` or docs/PHILOSOPHY.md.

 mod analysis;
 mod ast;
@@ -34,7 +37,7 @@ use std::borrow::Cow;
 use std::collections::HashSet;
 use typechecker::TypeChecker;

-const VERSION: &str = "0.1.0";
+const VERSION: &str = env!("CARGO_PKG_VERSION");

 const HELP: &str = r#"
 Lux - A functional language with first-class effects
@@ -171,9 +174,14 @@ fn main() {
                .and_then(|s| s.parse::<u16>().ok())
                .unwrap_or(8080);

-            let dir = args.get(2)
-                .filter(|a| !a.starts_with('-'))
-                .map(|s| s.as_str())
+            let port_value_idx = args.iter()
+                .position(|a| a == "--port" || a == "-p")
+                .map(|i| i + 1);
+            let dir = args.iter().enumerate()
+                .skip(2)
+                .filter(|(i, a)| !a.starts_with('-') && Some(*i) != port_value_idx)
+                .map(|(_, a)| a.as_str())
+                .next()
                .unwrap_or(".");

            serve_static_files(dir, port);
@@ -185,10 +193,12 @@ fn main() {
                eprintln!("  lux compile <file.lux> --run");
                eprintln!("  lux compile <file.lux> --emit-c [-o file.c]");
                eprintln!("  lux compile <file.lux> --target js [-o file.js]");
+               eprintln!("  lux compile <file.lux> --watch");
                std::process::exit(1);
            }
            let run_after = args.iter().any(|a| a == "--run");
            let emit_c = args.iter().any(|a| a == "--emit-c");
+           let watch = args.iter().any(|a| a == "--watch");
            let target_js = args.iter()
                .position(|a| a == "--target")
                .and_then(|i| args.get(i + 1))
@@ -204,17 +214,34 @@ fn main() {
            } else {
                compile_to_c(&args[2], output_path, run_after, emit_c);
            }
+
+           if watch {
+               // Build the args to replay for each recompilation (without --watch)
+               let compile_args: Vec<String> = args.iter()
+                   .skip(1)
+                   .filter(|a| a.as_str() != "--watch")
+                   .cloned()
+                   .collect();
+               watch_and_rerun(&args[2], &compile_args);
+           }
+       }
+       "repl" => {
+           // Start REPL
+           run_repl();
        }
        "doc" => {
            // Generate API documentation
            generate_docs(&args[2..]);
        }
+       "philosophy" => {
+           print_philosophy();
+       }
        cmd => {
            // Check if it looks like a command typo
            if !std::path::Path::new(cmd).exists() && !cmd.starts_with('-') && !cmd.contains('.') && !cmd.contains('/') {
                let known_commands = vec![
                    "fmt", "lint", "test", "watch", "init", "check", "debug",
-                   "pkg", "registry", "serve", "compile", "doc",
+                   "pkg", "registry", "serve", "compile", "doc", "repl", "philosophy",
                ];
                let suggestions = diagnostics::find_similar_names(cmd, known_commands.into_iter(), 2);
                if !suggestions.is_empty() {
@@ -229,18 +256,24 @@ fn main() {
            }
        }
    } else {
-       // Start REPL
-       run_repl();
+       // No arguments — show help
+       print_help();
    }
 }

 fn print_help() {
    println!("{}", bc(colors::GREEN, &format!("Lux {}", VERSION)));
-   println!("{}", c(colors::DIM, "A functional language with first-class effects"));
+   println!("{}", c(colors::DIM, "Make the important things visible."));
+   println!();
+   println!("  {} Effects in types — see what code does", c(colors::DIM, "·"));
+   println!("  {} Composition over configuration — no DI frameworks", c(colors::DIM, "·"));
+   println!("  {} Safety without ceremony — inference where it helps", c(colors::DIM, "·"));
+   println!("  {} One right way — opinionated formatter, integrated tools", c(colors::DIM, "·"));
    println!();
    println!("{}", bc("", "Usage:"));
    println!();
-   println!("  {} Start the REPL", bc(colors::CYAN, "lux"));
+   println!("  {} Show this help", bc(colors::CYAN, "lux"));
+   println!("  {} Start the REPL", bc(colors::CYAN, "lux repl"));
    println!("  {} {} Run a file (interpreter)", bc(colors::CYAN, "lux"), c(colors::YELLOW, "<file.lux>"));
    println!("  {} {} {} Compile to native binary", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<file.lux>"));
    println!("  {} {} {} {} Compile with output name", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<f>"), c(colors::YELLOW, "-o app"));
@@ -275,6 +308,8 @@ fn print_help() {
        c(colors::DIM, "(alias: s)"));
    println!("  {} {} {} Generate API documentation",
        bc(colors::CYAN, "lux"), bc(colors::CYAN, "doc"), c(colors::YELLOW, "[file] [-o dir]"));
+   println!("  {} {} Show language philosophy",
+       bc(colors::CYAN, "lux"), bc(colors::CYAN, "philosophy"));
    println!("  {} {} Start LSP server",
        bc(colors::CYAN, "lux"), c(colors::YELLOW, "--lsp"));
    println!("  {} {} Show this help",
@@ -283,6 +318,36 @@ fn print_help() {
        bc(colors::CYAN, "lux"), c(colors::YELLOW, "--version"));
 }

+fn print_philosophy() {
+    println!("{}", bc(colors::GREEN, &format!("The Lux Philosophy")));
+    println!();
+    println!("  {}", bc("", "Make the important things visible."));
+    println!();
+    println!("  Most languages hide what matters most in production: what code");
+    println!("  can do, how data changes over time, and what guarantees functions");
+    println!("  provide. Lux makes all three first-class, compiler-checked features.");
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "1. Explicit over implicit"), c(colors::DIM, "— effects in types, not hidden behind interfaces"));
+    println!("     fn processOrder(order: Order): Receipt {} {}", c(colors::YELLOW, "with {Database, Email}"), c(colors::DIM, "// signature IS documentation"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "2. Composition over configuration"), c(colors::DIM, "— no DI frameworks, no monad transformers"));
+    println!("     run app() {} {}", c(colors::YELLOW, "with { Database = mock, Http = mock }"), c(colors::DIM, "// swap handlers, not libraries"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "3. Safety without ceremony"), c(colors::DIM, "— type inference where it helps, annotations where they document"));
+    println!("     let x = 42 {}", c(colors::DIM, "// inferred"));
+    println!("     fn f(x: Int): Int = x * 2 {}", c(colors::DIM, "// annotated: API contract"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "4. Practical over academic"), c(colors::DIM, "— ML semantics in C-family syntax, no monads to learn"));
+    println!("     {} {} {}", c(colors::DIM, "fn main(): Unit"), c(colors::YELLOW, "with {Console}"), c(colors::DIM, "= Console.print(\"Hello!\")"));
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "5. One right way"), c(colors::DIM, "— opinionated formatter, integrated tooling, built-in testing"));
+    println!("     lux fmt | lux lint | lux check | lux test | lux compile");
+    println!();
+    println!("  {} {}", bc(colors::CYAN, "6. Tools are the language"), c(colors::DIM, "— formatter knows the AST, linter knows the types, LSP knows the effects"));
+    println!();
+    println!("  See {} for the full philosophy with language comparisons.", c(colors::CYAN, "docs/PHILOSOPHY.md"));
+}
+
 fn format_files(args: &[String]) {
    use formatter::{format, FormatConfig};
    use std::path::Path;
@@ -721,6 +786,36 @@ fn collect_lux_files_nonrecursive(dir: &str, pattern: Option<&str>, files: &mut
    }
 }

+/// Find a C compiler. Priority: $CC env var, build-time embedded path, PATH search.
+fn find_c_compiler() -> String {
+    // 1. Explicit env var
+    if let Ok(cc) = std::env::var("CC") {
+        if !cc.is_empty() {
+            return cc;
+        }
+    }
+    // 2. Path captured at build time (e.g. absolute nix store path)
+    let built_in = env!("LUX_CC_PATH");
+    if !built_in.is_empty() && std::path::Path::new(built_in).exists() {
+        return built_in.to_string();
+    }
+    // 3. Search PATH
+    for name in &["cc", "gcc", "clang"] {
+        if let Ok(output) = std::process::Command::new("which").arg(name).output() {
+            if output.status.success() {
+                if let Ok(p) = String::from_utf8(output.stdout) {
+                    let p = p.trim();
+                    if !p.is_empty() {
+                        return p.to_string();
+                    }
+                }
+            }
+        }
+    }
+    // 4. Last resort
+    "cc".to_string()
+}
+
 fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c: bool) {
    use codegen::c_backend::CBackend;
    use modules::ModuleLoader;
@@ -764,7 +859,7 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:

    // Generate C code
    let mut backend = CBackend::new();
-   let c_code = match backend.generate(&program) {
+   let c_code = match backend.generate(&program, loader.module_cache()) {
        Ok(code) => code,
        Err(e) => {
            eprintln!("{} C codegen: {}", c(colors::RED, "error:"), e);
@@ -812,13 +907,14 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
        std::process::exit(1);
    }

-   // Find C compiler
-   let cc = std::env::var("CC").unwrap_or_else(|_| "cc".to_string());
+   // Find C compiler: $CC env var > embedded build-time path > PATH search
+   let cc = find_c_compiler();

    let compile_result = Command::new(&cc)
        .args(["-O2", "-o"])
        .arg(&output_bin)
        .arg(&temp_c)
+       .arg("-lm")
        .output();

    match compile_result {
@@ -1002,7 +1098,7 @@ fn run_tests(args: &[String]) {
    for test_file in &test_files {
        let path_str = test_file.to_string_lossy().to_string();

-       // Read and parse the file
+       // Read and parse the file (with module loading)
        let source = match fs::read_to_string(test_file) {
            Ok(s) => s,
            Err(e) => {
@@ -1012,7 +1108,13 @@ fn run_tests(args: &[String]) {
            }
        };

-       let program = match Parser::parse_source(&source) {
+       use modules::ModuleLoader;
+       let mut loader = ModuleLoader::new();
+       if let Some(parent) = test_file.parent() {
+           loader.add_search_path(parent.to_path_buf());
+       }
+
+       let program = match loader.load_source(&source, Some(test_file.as_path())) {
            Ok(p) => p,
            Err(e) => {
                println!("  {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, &format!("parse error: {}", e)));
@@ -1021,9 +1123,9 @@ fn run_tests(args: &[String]) {
            }
        };

-       // Type check
+       // Type check with module support
        let mut checker = typechecker::TypeChecker::new();
-       if let Err(errors) = checker.check_program(&program) {
+       if let Err(errors) = checker.check_program_with_modules(&program, &loader) {
            println!("  {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, "type error"));
            for err in errors {
                eprintln!("    {}", err);
@@ -1051,7 +1153,7 @@ fn run_tests(args: &[String]) {
        interp.register_auto_migrations(&auto_migrations);
        interp.reset_test_results();

-       match interp.run(&program) {
+       match interp.run_with_modules(&program, &loader) {
            Ok(_) => {
                let results = interp.get_test_results();
                if results.failed == 0 && results.passed == 0 {
@@ -1085,8 +1187,8 @@ fn run_tests(args: &[String]) {
        interp.register_auto_migrations(&auto_migrations);
        interp.reset_test_results();

-       // First run the file to define all functions
-       if let Err(e) = interp.run(&program) {
+       // First run the file to define all functions and load imports
+       if let Err(e) = interp.run_with_modules(&program, &loader) {
            println!("  {} {} {}", c(colors::RED, "\u{2717}"), test_name, c(colors::RED, &e.to_string()));
            total_failed += 1;
            continue;
@@ -1261,6 +1363,64 @@ fn watch_file(path: &str) {
    }
 }

+fn watch_and_rerun(path: &str, compile_args: &[String]) {
+    use std::time::{Duration, SystemTime};
+    use std::path::Path;
+
+    let file_path = Path::new(path);
+    if !file_path.exists() {
+        eprintln!("File not found: {}", path);
+        std::process::exit(1);
+    }
+
+    println!();
+    println!("Watching {} for changes (Ctrl+C to stop)...", path);
+
+    let mut last_modified = std::fs::metadata(file_path)
+        .and_then(|m| m.modified())
+        .unwrap_or(SystemTime::UNIX_EPOCH);
+
+    loop {
+        std::thread::sleep(Duration::from_millis(500));
+
+        let modified = match std::fs::metadata(file_path).and_then(|m| m.modified()) {
+            Ok(m) => m,
+            Err(_) => continue,
+        };
+
+        if modified > last_modified {
+            last_modified = modified;
+
+            // Clear screen
+            print!("\x1B[2J\x1B[H");
+
+            println!("=== Compiling {} ===", path);
+            println!();
+
+            let result = std::process::Command::new(std::env::current_exe().unwrap())
+                .args(compile_args)
+                .status();
+
+            match result {
+                Ok(status) if status.success() => {
+                    println!();
+                    println!("=== Success ===");
+                }
+                Ok(_) => {
+                    println!();
+                    println!("=== Failed ===");
+                }
+                Err(e) => {
+                    eprintln!("Error running compiler: {}", e);
+                }
+            }
+
+            println!();
+            println!("Watching for changes...");
+        }
+    }
+}
+
 fn serve_static_files(dir: &str, port: u16) {
    use std::io::{Write, BufRead, BufReader};
    use std::net::TcpListener;
@@ -2128,6 +2288,29 @@ fn extract_module_doc(source: &str, path: &str) -> Result<ModuleDoc, String> {
                    is_public: matches!(t.visibility, ast::Visibility::Public),
                });
            }
+           ast::Declaration::ExternFn(ext) => {
+               let params: Vec<String> = ext.params.iter()
+                   .map(|p| format!("{}: {}", p.name.name, format_type(&p.typ)))
+                   .collect();
+               let js_note = ext.js_name.as_ref()
+                   .map(|n| format!(" = \"{}\"", n))
+                   .unwrap_or_default();
+               let signature = format!(
+                   "extern fn {}({}): {}{}",
+                   ext.name.name,
+                   params.join(", "),
+                   format_type(&ext.return_type),
+                   js_note
+               );
+               let doc = extract_doc_comment(source, ext.span.start);
+               functions.push(FunctionDoc {
+                   name: ext.name.name.clone(),
+                   signature,
+                   description: doc,
+                   is_public: matches!(ext.visibility, ast::Visibility::Public),
+                   properties: vec![],
+               });
+           }
            ast::Declaration::Effect(e) => {
                let doc = extract_doc_comment(source, e.span.start);
                let ops: Vec<String> = e.operations.iter()
@@ -3765,6 +3948,49 @@ c")"#;
        assert_eq!(eval(source).unwrap(), r#""literal {braces}""#);
    }

+   #[test]
+   fn test_multiline_string() {
+       let source = r#"
+let s = """
+hello
+world
+"""
+let result = String.length(s)
+"#;
+       // "hello\nworld" = 11 chars
+       assert_eq!(eval(source).unwrap(), "11");
+   }
+
+   #[test]
+   fn test_multiline_string_with_quotes() {
+       // Quotes are fine in the middle of triple-quoted strings
+       let source = "let s = \"\"\"\n She said \"hello\" to him.\n\"\"\"";
+       assert_eq!(eval(source).unwrap(), r#""She said "hello" to him.""#);
+   }
+
+   #[test]
+   fn test_multiline_string_interpolation() {
+       let source = r#"
+let name = "Lux"
+let s = """
+Hello, {name}!
+"""
+"#;
+       assert_eq!(eval(source).unwrap(), r#""Hello, Lux!""#);
+   }
+
+   #[test]
+   fn test_multiline_string_empty() {
+       let source = r#"let s = """""""#;
+       assert_eq!(eval(source).unwrap(), r#""""#);
+   }
+
+   #[test]
+   fn test_multiline_string_inline() {
+       let source = r#"let s = """hello world""""#;
+       assert_eq!(eval(source).unwrap(), r#""hello world""#);
+   }
+
    // Option tests
    #[test]
    fn test_option_constructors() {
@@ -3878,6 +4104,146 @@ c")"#;
        assert_eq!(eval("let x = { a: 1, b: 2 } == { a: 1, b: 3 }").unwrap(), "false");
    }

+   #[test]
+   fn test_record_spread() {
+       let source = r#"
+let base = { x: 1, y: 2, z: 3 }
+let updated = { ...base, y: 20 }
+let result = updated.y
+"#;
+       assert_eq!(eval(source).unwrap(), "20");
+   }
+
+   #[test]
+   fn test_deep_path_record_update() {
+       // Basic deep path: { ...base, pos.x: val } desugars to { ...base, pos: { ...base.pos, x: val } }
+       let source = r#"
+let npc = { name: "Goblin", pos: { x: 10, y: 20 } }
+let moved = { ...npc, pos.x: 50, pos.y: 60 }
+let result = moved.pos.x
+"#;
+       assert_eq!(eval(source).unwrap(), "50");
+
+       // Verify other fields are preserved through spread
+       let source2 = r#"
+let npc = { name: "Goblin", pos: { x: 10, y: 20 } }
+let moved = { ...npc, pos.x: 50 }
+let result = moved.pos.y
+"#;
+       assert_eq!(eval(source2).unwrap(), "20");
+
+       // Verify top-level spread fields preserved
+       let source3 = r#"
+let npc = { name: "Goblin", pos: { x: 10, y: 20 } }
+let moved = { ...npc, pos.x: 50 }
+let result = moved.name
+"#;
+       assert_eq!(eval(source3).unwrap(), "\"Goblin\"");
+
+       // Mix of flat and deep path fields
+       let source4 = r#"
+let npc = { name: "Goblin", pos: { x: 10, y: 20 }, hp: 100 }
+let updated = { ...npc, pos.x: 50, hp: 80 }
+let result = (updated.pos.x, updated.hp, updated.name)
+"#;
+       assert_eq!(eval(source4).unwrap(), "(50, 80, \"Goblin\")");
+   }
+
+   #[test]
+   fn test_deep_path_record_multilevel() {
+       // Multi-level deep path: world.physics.gravity
+       let source = r#"
+let world = { name: "Earth", physics: { gravity: { x: 0, y: -10 }, drag: 1 } }
+let updated = { ...world, physics.gravity.y: -20 }
+let result = (updated.physics.gravity.y, updated.physics.drag, updated.name)
+"#;
+       assert_eq!(eval(source).unwrap(), "(-20, 1, \"Earth\")");
+   }
+
+   #[test]
+   fn test_deep_path_conflict_error() {
+       // Field appears as both flat and deep path — should error
+       let result = eval(r#"
+let base = { pos: { x: 1, y: 2 } }
+let bad = { ...base, pos: { x: 10, y: 20 }, pos.x: 30 }
+"#);
+       assert!(result.is_err());
+   }
+
+   #[test]
+   fn test_extern_fn_parse() {
+       // Extern fn should parse successfully
+       let source = r#"
+extern fn getElementById(id: String): String
+let x = 42
+"#;
+       assert_eq!(eval(source).unwrap(), "42");
+   }
+
+   #[test]
+   fn test_extern_fn_with_js_name() {
+       // Extern fn with JS name override
+       let source = r#"
+extern fn getCtx(el: String, kind: String): String = "getContext"
+let x = 42
+"#;
+       assert_eq!(eval(source).unwrap(), "42");
+   }
+
+   #[test]
+   fn test_extern_fn_call_errors_in_interpreter() {
+       // Calling an extern fn in the interpreter should produce a clear error
+       let source = r#"
+extern fn alert(msg: String): Unit
+let x = alert("hello")
+"#;
+       let result = eval(source);
+       assert!(result.is_err());
+       let err = result.unwrap_err();
+       assert!(err.contains("extern") || err.contains("Extern") || err.contains("JavaScript"),
+           "Error should mention extern/JavaScript: {}", err);
+   }
+
+   #[test]
+   fn test_pub_extern_fn() {
+       // pub extern fn should parse
+       let source = r#"
+pub extern fn requestAnimationFrame(callback: fn(): Unit): Int
+let x = 42
+"#;
+       assert_eq!(eval(source).unwrap(), "42");
+   }
+
+   #[test]
+   fn test_extern_fn_js_codegen() {
+       // Verify JS backend emits extern fn calls without _lux suffix
+       use crate::codegen::js_backend::JsBackend;
+       use crate::parser::Parser;
+       use crate::lexer::Lexer;
+
+       let source = r#"
+extern fn getElementById(id: String): String
+extern fn getContext(el: String, kind: String): String = "getContext"
+fn main(): Unit = {
+    let el = getElementById("canvas")
+    let ctx = getContext(el, "2d")
+    ()
+}
+"#;
+
+       let tokens = Lexer::new(source).tokenize().unwrap();
+       let program = Parser::new(tokens).parse_program().unwrap();
+       let mut backend = JsBackend::new();
+       let js = backend.generate(&program).unwrap();
+
+       // getElementById should appear as-is (no _lux suffix)
+       assert!(js.contains("getElementById("), "JS should call getElementById directly: {}", js);
+       // getContext should use the JS name override
+       assert!(js.contains("getContext("), "JS should call getContext directly: {}", js);
+       // main should still be mangled
+       assert!(js.contains("main_lux"), "main should be mangled: {}", js);
+   }
+
    #[test]
    fn test_invalid_escape_sequence() {
        let result = eval(r#"let x = "\z""#);
@@ -4831,6 +5197,71 @@ c")"#;
 }
 }
+
+// ============ Multi-line Arguments Tests ============
+
+#[test]
+fn test_multiline_function_args() {
+    let source = r#"
+fn add(a: Int, b: Int): Int = a + b
+let result = add(
+  1,
+  2
+)
+"#;
+    assert_eq!(eval(source).unwrap(), "3");
+}
+
+#[test]
+fn test_multiline_function_args_with_lambda() {
+    let source = r#"
+let xs = List.map(
+  [1, 2, 3],
+  fn(x) => x * 2
+)
+"#;
+    assert_eq!(eval(source).unwrap(), "[2, 4, 6]");
+}
+
+// ============ Tuple Index Tests ============
+
+#[test]
+fn test_tuple_index_access() {
+    let source = r#"
+let pair = (42, "hello")
+let first = pair.0
+"#;
+    assert_eq!(eval(source).unwrap(), "42");
+}
+
+#[test]
+fn test_tuple_index_access_second() {
+    let source = r#"
+let pair = (42, "hello")
+let second = pair.1
+"#;
+    assert_eq!(eval(source).unwrap(), "\"hello\"");
+}
+
+#[test]
+fn test_tuple_index_triple() {
+    let source = r#"
+let triple = (1, 2, 3)
+let sum = triple.0 + triple.1 + triple.2
+"#;
+    assert_eq!(eval(source).unwrap(), "6");
+}
+
+#[test]
+fn test_tuple_index_in_function() {
+    let source = r#"
+fn first(pair: (Int, String)): Int = pair.0
+fn second(pair: (Int, String)): String = pair.1
+let p = (42, "hello")
+let result = first(p)
+"#;
+    assert_eq!(eval(source).unwrap(), "42");
+}
+
 // Exhaustiveness checking tests
 mod exhaustiveness_tests {
     use super::*;
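The tuple-index tests above assert that `pair.0` projects the first element of an evaluated tuple. A minimal Rust sketch of that projection, assuming a hypothetical `Value` enum rather than the Lux interpreter's actual runtime representation:

```rust
// Hypothetical value type for illustration only; the interpreter's real
// representation is not shown in this diff.
#[derive(Debug, Clone, PartialEq)]
enum Value {
    Int(i64),
    Str(String),
    Tuple(Vec<Value>),
}

// Evaluate `expr.N` on an already-evaluated value: only tuples support
// positional projection, and an out-of-range index yields None.
fn tuple_index(v: &Value, index: usize) -> Option<&Value> {
    match v {
        Value::Tuple(elems) => elems.get(index),
        _ => None,
    }
}

fn main() {
    let pair = Value::Tuple(vec![Value::Int(42), Value::Str("hello".into())]);
    assert_eq!(tuple_index(&pair, 0), Some(&Value::Int(42)));
    assert_eq!(tuple_index(&pair, 2), None);
    println!("ok");
}
```

The parser-side half of the feature (recognizing an integer token after `.`) appears in the `src/parser.rs` hunks further down.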
@@ -5286,4 +5717,173 @@ c")"#;
         check_file("projects/rest-api/main.lux").unwrap();
     }
 }
+
+// === Map type tests ===
+
+#[test]
+fn test_map_new_and_size() {
+    let source = r#"
+let m = Map.new()
+let result = Map.size(m)
+"#;
+    assert_eq!(eval(source).unwrap(), "0");
+}
+
+#[test]
+fn test_map_set_and_get() {
+    let source = r#"
+let m = Map.new()
+let m2 = Map.set(m, "name", "Alice")
+let result = Map.get(m2, "name")
+"#;
+    assert_eq!(eval(source).unwrap(), "Some(\"Alice\")");
+}
+
+#[test]
+fn test_map_get_missing() {
+    let source = r#"
+let m = Map.new()
+let result = Map.get(m, "missing")
+"#;
+    assert_eq!(eval(source).unwrap(), "None");
+}
+
+#[test]
+fn test_map_contains() {
+    let source = r#"
+let m = Map.set(Map.new(), "x", 1)
+let result = (Map.contains(m, "x"), Map.contains(m, "y"))
+"#;
+    assert_eq!(eval(source).unwrap(), "(true, false)");
+}
+
+#[test]
+fn test_map_remove() {
+    let source = r#"
+let m = Map.set(Map.set(Map.new(), "a", 1), "b", 2)
+let m2 = Map.remove(m, "a")
+let result = (Map.size(m2), Map.contains(m2, "a"), Map.contains(m2, "b"))
+"#;
+    assert_eq!(eval(source).unwrap(), "(1, false, true)");
+}
+
+#[test]
+fn test_map_keys_and_values() {
+    let source = r#"
+let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
+let result = Map.keys(m)
+"#;
+    assert_eq!(eval(source).unwrap(), "[\"a\", \"b\"]");
+}
+
+#[test]
+fn test_map_from_list() {
+    let source = r#"
+let m = Map.fromList([("x", 10), ("y", 20)])
+let result = (Map.get(m, "x"), Map.size(m))
+"#;
+    assert_eq!(eval(source).unwrap(), "(Some(10), 2)");
+}
+
+#[test]
+fn test_map_to_list() {
+    let source = r#"
+let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
+let result = Map.toList(m)
+"#;
+    assert_eq!(eval(source).unwrap(), "[(\"a\", 1), (\"b\", 2)]");
+}
+
+#[test]
+fn test_map_merge() {
+    let source = r#"
+let m1 = Map.fromList([("a", 1), ("b", 2)])
+let m2 = Map.fromList([("b", 3), ("c", 4)])
+let merged = Map.merge(m1, m2)
+let result = (Map.get(merged, "a"), Map.get(merged, "b"), Map.get(merged, "c"))
+"#;
+    assert_eq!(eval(source).unwrap(), "(Some(1), Some(3), Some(4))");
+}
+
+#[test]
+fn test_map_immutability() {
+    let source = r#"
+let m1 = Map.fromList([("a", 1)])
+let m2 = Map.set(m1, "b", 2)
+let result = (Map.size(m1), Map.size(m2))
+"#;
+    assert_eq!(eval(source).unwrap(), "(1, 2)");
+}
+
+#[test]
+fn test_map_is_empty() {
+    let source = r#"
+let m1 = Map.new()
+let m2 = Map.set(m1, "x", 1)
+let result = (Map.isEmpty(m1), Map.isEmpty(m2))
+"#;
+    assert_eq!(eval(source).unwrap(), "(true, false)");
+}
+
+#[test]
+fn test_map_type_annotation() {
+    let source = r#"
+fn lookup(m: Map<String, Int>, key: String): Option<Int> =
+  Map.get(m, key)
+let m = Map.fromList([("age", 30)])
+let result = lookup(m, "age")
+"#;
+    assert_eq!(eval(source).unwrap(), "Some(30)");
+}
+
+#[test]
+fn test_file_copy() {
+    use std::io::Write;
+    // Create a temp file, copy it, verify contents
+    let dir = std::env::temp_dir().join("lux_test_file_copy");
+    let _ = std::fs::create_dir_all(&dir);
+    let src = dir.join("src.txt");
+    let dst = dir.join("dst.txt");
+    std::fs::File::create(&src).unwrap().write_all(b"hello copy").unwrap();
+    let _ = std::fs::remove_file(&dst);
+
+    let source = format!(r#"
+fn main(): Unit with {{File}} =
+  File.copy("{}", "{}")
+let _ = run main() with {{}}
+let result = "done"
+"#, src.display(), dst.display());
+    let result = eval(&source);
+    assert!(result.is_ok(), "File.copy failed: {:?}", result);
+    let contents = std::fs::read_to_string(&dst).unwrap();
+    assert_eq!(contents, "hello copy");
+
+    // Cleanup
+    let _ = std::fs::remove_dir_all(&dir);
+}
+
+#[test]
+fn test_effectful_callback_propagation() {
+    // WISH-7: effectful callbacks in List.forEach should propagate effects
+    // This should type-check successfully because Console effect is inferred
+    let source = r#"
+fn printAll(items: List<String>): Unit =
+  List.forEach(items, fn(x: String): Unit => Console.print(x))
+let result = "ok"
+"#;
+    let result = eval(source);
+    assert!(result.is_ok(), "Effectful callback should type-check: {:?}", result);
+}
+
+#[test]
+fn test_effectful_callback_in_map() {
+    // Effectful callback in List.map should propagate effects
+    let source = r#"
+fn readAll(paths: List<String>): List<String> =
+  List.map(paths, fn(p: String): String => File.read(p))
+let result = "ok"
+"#;
+    let result = eval(source);
+    assert!(result.is_ok(), "Effectful callback in map should type-check: {:?}", result);
+}
 }
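The Map tests above pin down persistent-map semantics: `Map.set` returns a new map and leaves the original untouched (`test_map_immutability`), keys iterate in sorted order (`test_map_keys_and_values`), and `Map.merge` is right-biased (`test_map_merge` expects `Some(3)` for `"b"`). A clone-based sketch over `BTreeMap` that reproduces those observable behaviors; this is an assumption for illustration, not the interpreter's actual data structure:

```rust
use std::collections::BTreeMap;

// Persistent-update sketch: copy the map, then insert. A real implementation
// would likely share structure instead of cloning, but the observable
// semantics the tests assert are the same.
fn map_set(m: &BTreeMap<String, i64>, k: &str, v: i64) -> BTreeMap<String, i64> {
    let mut next = m.clone();
    next.insert(k.to_string(), v);
    next
}

// Right-biased merge: entries from `right` win on key collisions,
// matching test_map_merge's expectation of Some(3) for "b".
fn map_merge(left: &BTreeMap<String, i64>, right: &BTreeMap<String, i64>) -> BTreeMap<String, i64> {
    let mut out = left.clone();
    out.extend(right.iter().map(|(k, v)| (k.clone(), *v)));
    out
}

fn main() {
    let m1 = map_set(&BTreeMap::new(), "a", 1);
    let m2 = map_set(&m1, "b", 2);
    // Matches test_map_immutability: sizes are (1, 2)
    assert_eq!((m1.len(), m2.len()), (1, 2));
    // BTreeMap iterates keys in sorted order, matching test_map_keys_and_values
    let keys: Vec<_> = m2.keys().cloned().collect();
    assert_eq!(keys, vec!["a".to_string(), "b".to_string()]);
    // Right operand wins, matching test_map_merge
    let merged = map_merge(&m2, &map_set(&BTreeMap::new(), "b", 3));
    assert_eq!(merged.get("b"), Some(&3));
    println!("ok");
}
```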
@@ -52,6 +52,7 @@ impl Module {
             Declaration::Let(l) => l.visibility == Visibility::Public,
             Declaration::Type(t) => t.visibility == Visibility::Public,
             Declaration::Trait(t) => t.visibility == Visibility::Public,
+            Declaration::ExternFn(e) => e.visibility == Visibility::Public,
             // Effects, handlers, and impls are always public for now
             Declaration::Effect(_) | Declaration::Handler(_) | Declaration::Impl(_) => true,
         }
@@ -279,6 +280,12 @@ impl ModuleLoader {
                 }
                 Declaration::Type(t) if t.visibility == Visibility::Public => {
                     exports.insert(t.name.name.clone());
+                    // Also export constructors for ADT types
+                    if let crate::ast::TypeDef::Enum(variants) = &t.definition {
+                        for variant in variants {
+                            exports.insert(variant.name.name.clone());
+                        }
+                    }
                 }
                 Declaration::Effect(e) => {
                     // Effects are always exported
@@ -288,6 +295,9 @@ impl ModuleLoader {
                     // Handlers are always exported
                     exports.insert(h.name.name.clone());
                 }
+                Declaration::ExternFn(e) if e.visibility == Visibility::Public => {
+                    exports.insert(e.name.name.clone());
+                }
                 _ => {}
             }
         }
@@ -305,6 +315,11 @@ impl ModuleLoader {
         self.cache.iter()
     }
 
+    /// Get the module cache (for passing to C backend)
+    pub fn module_cache(&self) -> &HashMap<String, Module> {
+        &self.cache
+    }
+
     /// Clear the module cache
     pub fn clear_cache(&mut self) {
         self.cache.clear();

342 src/parser.rs
@@ -238,6 +238,7 @@ impl Parser {
 
         match self.peek_kind() {
             TokenKind::Fn => Ok(Declaration::Function(self.parse_function_decl(visibility, doc)?)),
+            TokenKind::Extern => Ok(Declaration::ExternFn(self.parse_extern_fn_decl(visibility, doc)?)),
             TokenKind::Effect => Ok(Declaration::Effect(self.parse_effect_decl(doc)?)),
             TokenKind::Handler => Ok(Declaration::Handler(self.parse_handler_decl()?)),
             TokenKind::Type => Ok(Declaration::Type(self.parse_type_decl(visibility, doc)?)),
@@ -245,7 +246,8 @@ impl Parser {
             TokenKind::Trait => Ok(Declaration::Trait(self.parse_trait_decl(visibility, doc)?)),
             TokenKind::Impl => Ok(Declaration::Impl(self.parse_impl_decl()?)),
             TokenKind::Run => Err(self.error("Bare 'run' expressions are not allowed at top level. Use 'let _ = run ...' or 'let result = run ...'")),
-            _ => Err(self.error("Expected declaration (fn, effect, handler, type, trait, impl, or let)")),
+            TokenKind::Handle => Err(self.error("Bare 'handle' expressions are not allowed at top level. Use 'let _ = handle ...' or 'let result = handle ...'")),
+            _ => Err(self.error("Expected declaration (fn, extern, effect, handler, type, trait, impl, or let)")),
         }
     }
 
@@ -322,6 +324,57 @@ impl Parser {
         })
     }
 
+    /// Parse extern function declaration: extern fn name<T>(params): ReturnType = "jsName"
+    fn parse_extern_fn_decl(&mut self, visibility: Visibility, doc: Option<String>) -> Result<ExternFnDecl, ParseError> {
+        let start = self.current_span();
+        self.expect(TokenKind::Extern)?;
+        self.expect(TokenKind::Fn)?;
+
+        let name = self.parse_ident()?;
+
+        // Optional type parameters
+        let type_params = if self.check(TokenKind::Lt) {
+            self.parse_type_params()?
+        } else {
+            Vec::new()
+        };
+
+        self.expect(TokenKind::LParen)?;
+        let params = self.parse_params()?;
+        self.expect(TokenKind::RParen)?;
+
+        // Return type
+        self.expect(TokenKind::Colon)?;
+        let return_type = self.parse_type()?;
+
+        // Optional JS name override: = "jsName"
+        let js_name = if self.check(TokenKind::Eq) {
+            self.advance();
+            match self.peek_kind() {
+                TokenKind::String(s) => {
+                    let name = s.clone();
+                    self.advance();
+                    Some(name)
+                }
+                _ => return Err(self.error("Expected string literal for JS name in extern fn")),
+            }
+        } else {
+            None
+        };
+
+        let span = start.merge(self.previous_span());
+        Ok(ExternFnDecl {
+            visibility,
+            doc,
+            name,
+            type_params,
+            params,
+            return_type,
+            js_name,
+            span,
+        })
+    }
+
     /// Parse effect declaration
     fn parse_effect_decl(&mut self, doc: Option<String>) -> Result<EffectDecl, ParseError> {
         let start = self.current_span();
@@ -845,6 +898,7 @@ impl Parser {
     /// Parse function parameters
     fn parse_params(&mut self) -> Result<Vec<Parameter>, ParseError> {
         let mut params = Vec::new();
+        self.skip_newlines();
 
         while !self.check(TokenKind::RParen) {
             let start = self.current_span();
@@ -854,9 +908,11 @@ impl Parser {
             let span = start.merge(self.previous_span());
 
             params.push(Parameter { name, typ, span });
+            self.skip_newlines();
 
             if !self.check(TokenKind::RParen) {
                 self.expect(TokenKind::Comma)?;
+                self.skip_newlines();
             }
         }
 
@@ -1558,6 +1614,7 @@ impl Parser {
         loop {
             let op = match self.peek_kind() {
                 TokenKind::Plus => BinaryOp::Add,
+                TokenKind::PlusPlus => BinaryOp::Concat,
                 TokenKind::Minus => BinaryOp::Sub,
                 _ => break,
             };
@@ -1646,6 +1703,20 @@ impl Parser {
         } else if self.check(TokenKind::Dot) {
             let start = expr.span();
             self.advance();
+
+            // Check for tuple index access: expr.0, expr.1, etc.
+            if let TokenKind::Int(n) = self.peek_kind() {
+                let index = n as usize;
+                self.advance();
+                let span = start.merge(self.previous_span());
+                expr = Expr::TupleIndex {
+                    object: Box::new(expr),
+                    index,
+                    span,
+                };
+                continue;
+            }
+
             let field = self.parse_ident()?;
 
             // Check if this is an effect operation: Effect.operation(args)
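The hunks above and below repeat one pattern at three call sites per list parser: `skip_newlines` before the loop, after each element, and after each comma. Reduced to a self-contained sketch over a plain token slice (the `Tok` type here is a stand-in, not the parser's actual `TokenKind`):

```rust
// Stand-in token type for illustration.
#[derive(Debug, Clone, PartialEq)]
enum Tok { Item(i64), Comma, Newline, RParen }

// Parse `Item (, Item)* )` while tolerating newlines before the list,
// after each item, and after each comma -- the same three spots patched
// in parse_params, parse_args, and the pattern parsers.
fn parse_list(toks: &[Tok]) -> Option<Vec<i64>> {
    let mut pos = 0;
    let skip_newlines = |pos: &mut usize| {
        while toks.get(*pos) == Some(&Tok::Newline) { *pos += 1 }
    };
    let mut items = Vec::new();
    skip_newlines(&mut pos);
    while toks.get(pos) != Some(&Tok::RParen) {
        match toks.get(pos)? {
            Tok::Item(n) => { items.push(*n); pos += 1; }
            _ => return None,
        }
        skip_newlines(&mut pos);
        if toks.get(pos) != Some(&Tok::RParen) {
            if toks.get(pos) != Some(&Tok::Comma) { return None; }
            pos += 1; // consume comma
            skip_newlines(&mut pos);
        }
    }
    Some(items)
}

fn main() {
    use Tok::*;
    // Token stream for: add(\n  1,\n  2\n)
    let toks = [Newline, Item(1), Comma, Newline, Item(2), Newline, RParen];
    assert_eq!(parse_list(&toks), Some(vec![1, 2]));
    println!("ok");
}
```

Because newlines are only skipped at those three points, an item still cannot be split from its comma by anything other than whitespace, so error positions stay precise.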
@@ -1681,11 +1752,14 @@ impl Parser {
 
     fn parse_args(&mut self) -> Result<Vec<Expr>, ParseError> {
         let mut args = Vec::new();
+        self.skip_newlines();
 
         while !self.check(TokenKind::RParen) {
            args.push(self.parse_expr()?);
+            self.skip_newlines();
             if !self.check(TokenKind::RParen) {
                 self.expect(TokenKind::Comma)?;
+                self.skip_newlines();
             }
         }
 
@@ -1757,6 +1831,7 @@ impl Parser {
             TokenKind::Let => self.parse_let_expr(),
             TokenKind::Fn => self.parse_lambda_expr(),
             TokenKind::Run => self.parse_run_expr(),
+            TokenKind::Handle => self.parse_handle_expr(),
             TokenKind::Resume => self.parse_resume_expr(),
 
             // Delimiters
@@ -1774,6 +1849,7 @@ impl Parser {
 
         let condition = Box::new(self.parse_expr()?);
 
+        self.skip_newlines();
         self.expect(TokenKind::Then)?;
         self.skip_newlines();
         let then_branch = Box::new(self.parse_expr()?);
@@ -1898,9 +1974,27 @@ impl Parser {
             TokenKind::Ident(name) => {
                 // Check if it starts with uppercase (constructor) or lowercase (variable)
                 if name.chars().next().map_or(false, |c| c.is_uppercase()) {
-                    self.parse_constructor_pattern()
+                    self.parse_constructor_pattern_with_module(None)
                 } else {
                     let ident = self.parse_ident()?;
+                    // Check for module-qualified constructor: module.Constructor
+                    if self.check(TokenKind::Dot) {
+                        // Peek ahead to see if next is an uppercase identifier
+                        let dot_pos = self.pos;
+                        self.advance(); // skip dot
+                        if let TokenKind::Ident(next_name) = self.peek_kind() {
+                            if next_name
+                                .chars()
+                                .next()
+                                .map_or(false, |c| c.is_uppercase())
+                            {
+                                return self
+                                    .parse_constructor_pattern_with_module(Some(ident));
+                            }
+                        }
+                        // Not a module-qualified constructor, backtrack
+                        self.pos = dot_pos;
+                    }
                     Ok(Pattern::Var(ident))
                 }
             }
@@ -1910,25 +2004,40 @@ impl Parser {
         }
     }
 
-    fn parse_constructor_pattern(&mut self) -> Result<Pattern, ParseError> {
-        let start = self.current_span();
+    fn parse_constructor_pattern_with_module(
+        &mut self,
+        module: Option<Ident>,
+    ) -> Result<Pattern, ParseError> {
+        let start = module
+            .as_ref()
+            .map(|m| m.span)
+            .unwrap_or_else(|| self.current_span());
         let name = self.parse_ident()?;
+
         if self.check(TokenKind::LParen) {
             self.advance();
+            self.skip_newlines();
             let mut fields = Vec::new();
             while !self.check(TokenKind::RParen) {
                 fields.push(self.parse_pattern()?);
+                self.skip_newlines();
                 if !self.check(TokenKind::RParen) {
                     self.expect(TokenKind::Comma)?;
+                    self.skip_newlines();
                 }
             }
             self.expect(TokenKind::RParen)?;
             let span = start.merge(self.previous_span());
-            Ok(Pattern::Constructor { name, fields, span })
-        } else {
-            let span = name.span;
             Ok(Pattern::Constructor {
+                module,
+                name,
+                fields,
+                span,
+            })
+        } else {
+            let span = start.merge(name.span);
+            Ok(Pattern::Constructor {
+                module,
                 name,
                 fields: Vec::new(),
                 span,
@@ -1939,12 +2048,15 @@ impl Parser {
     fn parse_tuple_pattern(&mut self) -> Result<Pattern, ParseError> {
         let start = self.current_span();
         self.expect(TokenKind::LParen)?;
+        self.skip_newlines();
 
         let mut elements = Vec::new();
         while !self.check(TokenKind::RParen) {
             elements.push(self.parse_pattern()?);
+            self.skip_newlines();
             if !self.check(TokenKind::RParen) {
                 self.expect(TokenKind::Comma)?;
+                self.skip_newlines();
             }
         }
 
@@ -2074,6 +2186,7 @@ impl Parser {
 
     fn parse_lambda_params(&mut self) -> Result<Vec<Parameter>, ParseError> {
         let mut params = Vec::new();
+        self.skip_newlines();
 
         while !self.check(TokenKind::RParen) {
             let start = self.current_span();
@@ -2089,9 +2202,11 @@ impl Parser {
 
             let span = start.merge(self.previous_span());
             params.push(Parameter { name, typ, span });
+            self.skip_newlines();
 
             if !self.check(TokenKind::RParen) {
                 self.expect(TokenKind::Comma)?;
+                self.skip_newlines();
             }
         }
 
@@ -2132,6 +2247,40 @@ impl Parser {
         })
     }
 
+    fn parse_handle_expr(&mut self) -> Result<Expr, ParseError> {
+        let start = self.current_span();
+        self.expect(TokenKind::Handle)?;
+
+        let expr = Box::new(self.parse_call_expr()?);
+
+        self.expect(TokenKind::With)?;
+        self.expect(TokenKind::LBrace)?;
+        self.skip_newlines();
+
+        let mut handlers = Vec::new();
+        while !self.check(TokenKind::RBrace) {
+            let effect = self.parse_ident()?;
+            self.expect(TokenKind::Eq)?;
+            let handler = self.parse_expr()?;
+            handlers.push((effect, handler));
+
+            self.skip_newlines();
+            if self.check(TokenKind::Comma) {
+                self.advance();
+            }
+            self.skip_newlines();
+        }
+
+        let end = self.current_span();
+        self.expect(TokenKind::RBrace)?;
+
+        Ok(Expr::Run {
+            expr,
+            handlers,
+            span: start.merge(end),
+        })
+    }
+
     fn parse_resume_expr(&mut self) -> Result<Expr, ParseError> {
         let start = self.current_span();
         self.expect(TokenKind::Resume)?;
@@ -2145,6 +2294,7 @@ impl Parser {
     fn parse_tuple_or_paren_expr(&mut self) -> Result<Expr, ParseError> {
         let start = self.current_span();
         self.expect(TokenKind::LParen)?;
+        self.skip_newlines();
 
         if self.check(TokenKind::RParen) {
             self.advance();
@@ -2155,16 +2305,19 @@ impl Parser {
         }
 
         let first = self.parse_expr()?;
+        self.skip_newlines();
 
         if self.check(TokenKind::Comma) {
             // Tuple
             let mut elements = vec![first];
             while self.check(TokenKind::Comma) {
                 self.advance();
+                self.skip_newlines();
                 if self.check(TokenKind::RParen) {
                     break;
                 }
                 elements.push(self.parse_expr()?);
+                self.skip_newlines();
             }
             self.expect(TokenKind::RParen)?;
             let span = start.merge(self.previous_span());
@@ -2190,12 +2343,39 @@ impl Parser {
|
|||||||
}));
|
}));
|
||||||
}
|
}
|
||||||
|
|
||||||
// Check if it's a record (ident: expr) or block
|
// Check for record spread: { ...expr, field: val }
|
||||||
|
if matches!(self.peek_kind(), TokenKind::DotDotDot) {
|
||||||
|
return self.parse_record_expr_rest(start);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if it's a record (ident: expr or ident.path: expr) or block
|
||||||
if matches!(self.peek_kind(), TokenKind::Ident(_)) {
|
if matches!(self.peek_kind(), TokenKind::Ident(_)) {
|
||||||
let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
|
let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
|
||||||
if matches!(lookahead, Some(TokenKind::Colon)) {
|
if matches!(lookahead, Some(TokenKind::Colon)) {
|
||||||
return self.parse_record_expr_rest(start);
|
return self.parse_record_expr_rest(start);
|
||||||
}
|
}
|
||||||
|
// Check for deep path record: { ident.ident...: expr }
|
||||||
|
if matches!(lookahead, Some(TokenKind::Dot)) {
|
||||||
|
let mut look = self.pos + 2;
|
||||||
|
loop {
|
||||||
|
match self.tokens.get(look).map(|t| &t.kind) {
|
||||||
|
Some(TokenKind::Ident(_)) => {
|
||||||
|
look += 1;
|
||||||
|
match self.tokens.get(look).map(|t| &t.kind) {
|
||||||
|
Some(TokenKind::Colon) => {
|
||||||
|
return self.parse_record_expr_rest(start);
|
||||||
|
}
|
||||||
|
Some(TokenKind::Dot) => {
|
||||||
|
look += 1;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
_ => break,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
_ => break,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// It's a block
|
// It's a block
|
||||||
@@ -2203,13 +2383,40 @@ impl Parser {
|
|||||||
}
|
}
|
||||||
|
|
||||||
fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
|
fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
|
||||||
let mut fields = Vec::new();
|
let mut raw_fields: Vec<(Vec<Ident>, Expr)> = Vec::new();
|
||||||
|
let mut spread = None;
|
||||||
|
let mut has_deep_paths = false;
|
||||||
|
|
||||||
|
// Check for spread: { ...expr, ... }
|
||||||
|
if self.check(TokenKind::DotDotDot) {
|
||||||
|
self.advance(); // consume ...
|
||||||
|
let spread_expr = self.parse_expr()?;
|
||||||
|
spread = Some(Box::new(spread_expr));
|
||||||
|
|
||||||
|
self.skip_newlines();
|
||||||
|
if self.check(TokenKind::Comma) {
|
||||||
|
self.advance();
|
||||||
|
}
|
||||||
|
self.skip_newlines();
|
||||||
|
}
|
||||||
|
|
||||||
while !self.check(TokenKind::RBrace) {
|
while !self.check(TokenKind::RBrace) {
|
||||||
let name = self.parse_ident()?;
|
let name = self.parse_ident()?;
|
||||||
|
|
||||||
|
// Check for dotted path: pos.x, pos.x.y, etc.
|
||||||
|
let mut path = vec![name];
|
||||||
|
while self.check(TokenKind::Dot) {
|
||||||
|
self.advance(); // consume .
|
||||||
|
let segment = self.parse_ident()?;
|
||||||
|
path.push(segment);
|
||||||
|
}
|
||||||
|
if path.len() > 1 {
|
||||||
|
has_deep_paths = true;
|
||||||
|
}
|
||||||
|
|
||||||
self.expect(TokenKind::Colon)?;
|
self.expect(TokenKind::Colon)?;
|
||||||
let value = self.parse_expr()?;
|
let value = self.parse_expr()?;
|
||||||
fields.push((name, value));
|
raw_fields.push((path, value));
|
||||||
|
|
||||||
self.skip_newlines();
|
self.skip_newlines();
|
||||||
if self.check(TokenKind::Comma) {
|
if self.check(TokenKind::Comma) {
|
||||||
@@ -2220,7 +2427,120 @@ impl Parser {
|
|||||||
|
|
||||||
self.expect(TokenKind::RBrace)?;
|
self.expect(TokenKind::RBrace)?;
|
||||||
let span = start.merge(self.previous_span());
|
let span = start.merge(self.previous_span());
|
||||||
Ok(Expr::Record { fields, span })
|
|
||||||
|
if has_deep_paths {
|
||||||
|
Self::desugar_deep_fields(spread, raw_fields, span)
|
||||||
|
} else {
|
||||||
|
// No deep paths — use flat fields directly (common case, no allocation overhead)
|
||||||
|
let fields = raw_fields
|
||||||
|
.into_iter()
|
||||||
|
.map(|(mut path, value)| (path.remove(0), value))
|
||||||
|
.collect();
|
||||||
|
Ok(Expr::Record {
|
||||||
|
spread,
|
||||||
|
fields,
|
+                span,
+            })
+        }
+    }
+
+    /// Desugar deep path record fields into nested record spread expressions.
+    /// `{ ...base, pos.x: vx, pos.y: vy }` becomes `{ ...base, pos: { ...base.pos, x: vx, y: vy } }`
+    fn desugar_deep_fields(
+        spread: Option<Box<Expr>>,
+        raw_fields: Vec<(Vec<Ident>, Expr)>,
+        outer_span: Span,
+    ) -> Result<Expr, ParseError> {
+        use std::collections::HashMap;
+
+        // Group fields by first path segment, preserving order
+        let mut groups: Vec<(String, Vec<(Vec<Ident>, Expr)>)> = Vec::new();
+        let mut group_map: HashMap<String, usize> = HashMap::new();
+
+        for (path, value) in raw_fields {
+            let key = path[0].name.clone();
+            if let Some(&idx) = group_map.get(&key) {
+                groups[idx].1.push((path, value));
+            } else {
+                group_map.insert(key.clone(), groups.len());
+                groups.push((key, vec![(path, value)]));
+            }
+        }
+
+        let mut fields = Vec::new();
+        for (_, group) in groups {
+            let first_ident = group[0].0[0].clone();
+
+            let has_flat = group.iter().any(|(p, _)| p.len() == 1);
+            let has_deep = group.iter().any(|(p, _)| p.len() > 1);
+
+            if has_flat && has_deep {
+                return Err(ParseError {
+                    message: format!(
+                        "Field '{}' appears as both a direct field and a deep path prefix",
+                        first_ident.name
+                    ),
+                    span: first_ident.span,
+                });
+            }
+
+            if has_flat {
+                if group.len() > 1 {
+                    return Err(ParseError {
+                        message: format!("Duplicate field '{}'", first_ident.name),
+                        span: group[1].0[0].span,
+                    });
+                }
+                let (_, value) = group.into_iter().next().unwrap();
+                fields.push((first_ident, value));
+            } else {
+                // Deep paths — create nested record with spread from parent
+                let sub_spread = spread.as_ref().map(|s| {
+                    Box::new(Expr::Field {
+                        object: s.clone(),
+                        field: first_ident.clone(),
+                        span: first_ident.span,
+                    })
+                });
+
+                // Strip first segment from all paths
+                let sub_fields: Vec<(Vec<Ident>, Expr)> = group
+                    .into_iter()
+                    .map(|(mut path, value)| {
+                        path.remove(0);
+                        (path, value)
+                    })
+                    .collect();
+
+                let has_nested_deep = sub_fields.iter().any(|(p, _)| p.len() > 1);
+                if has_nested_deep {
+                    // Recursively desugar deeper paths
+                    let nested =
+                        Self::desugar_deep_fields(sub_spread, sub_fields, first_ident.span)?;
+                    fields.push((first_ident, nested));
+                } else {
+                    // All sub-paths are single-segment — build Record directly
+                    let flat_fields: Vec<(Ident, Expr)> = sub_fields
+                        .into_iter()
+                        .map(|(mut path, value)| (path.remove(0), value))
+                        .collect();
+                    fields.push((
+                        first_ident.clone(),
+                        Expr::Record {
+                            spread: sub_spread,
+                            fields: flat_fields,
+                            span: first_ident.span,
+                        },
+                    ));
+                }
+            }
+        }
+
+        Ok(Expr::Record {
+            spread,
+            fields,
+            span: outer_span,
+        })
     }

     fn parse_block_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
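The grouping step at the top of `desugar_deep_fields` can be exercised in isolation. The sketch below is a simplified stand-in (plain `String` paths and `i32` values in place of the parser's `Ident` and `Expr` types) showing that groups keep first-appearance order while a `HashMap` gives constant-time lookup of an existing group:

```rust
use std::collections::HashMap;

// Hypothetical, simplified version of the grouping loop in
// `desugar_deep_fields`: group (path, value) pairs by the first path
// segment, preserving the order in which each segment first appears.
fn group_by_first_segment(
    raw_fields: Vec<(Vec<String>, i32)>,
) -> Vec<(String, Vec<(Vec<String>, i32)>)> {
    let mut groups: Vec<(String, Vec<(Vec<String>, i32)>)> = Vec::new();
    let mut group_map: HashMap<String, usize> = HashMap::new();

    for (path, value) in raw_fields {
        let key = path[0].clone();
        if let Some(&idx) = group_map.get(&key) {
            // Existing group: append, keeping the group's position stable.
            groups[idx].1.push((path, value));
        } else {
            // New group: record its index, then push it at the end.
            group_map.insert(key.clone(), groups.len());
            groups.push((key, vec![(path, value)]));
        }
    }
    groups
}

fn main() {
    // `pos.x`, `name`, `pos.y` as in the doc comment's example
    let fields = vec![
        (vec!["pos".to_string(), "x".to_string()], 1),
        (vec!["name".to_string()], 2),
        (vec!["pos".to_string(), "y".to_string()], 3),
    ];
    let groups = group_by_first_segment(fields);
    let keys: Vec<&str> = groups.iter().map(|(k, _)| k.as_str()).collect();
    assert_eq!(keys, vec!["pos", "name"]); // first-appearance order kept
    assert_eq!(groups[0].1.len(), 2); // pos.x and pos.y grouped together
    println!("{:?}", keys);
}
```

Keeping a `Vec` for the groups plus a side `HashMap` for indices is what preserves the user's field order in the desugared record, which a bare `HashMap` alone would not.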
@@ -228,13 +228,14 @@ impl SymbolTable {
             Declaration::Let(let_decl) => {
                 let is_public = matches!(let_decl.visibility, Visibility::Public);
                 let type_sig = let_decl.typ.as_ref().map(|t| self.type_expr_to_string(t));
-                let symbol = self.new_symbol(
+                let mut symbol = self.new_symbol(
                     let_decl.name.name.clone(),
                     SymbolKind::Variable,
                     let_decl.span,
                     type_sig,
                     is_public,
                 );
+                symbol.documentation = let_decl.doc.clone();
                 let id = self.add_symbol(scope_idx, symbol);
                 self.add_reference(id, let_decl.name.span, true, true);
@@ -244,6 +245,30 @@ impl SymbolTable {
             Declaration::Handler(h) => self.visit_handler(h, scope_idx),
             Declaration::Trait(t) => self.visit_trait(t, scope_idx),
             Declaration::Impl(i) => self.visit_impl(i, scope_idx),
+            Declaration::ExternFn(ext) => {
+                let is_public = matches!(ext.visibility, Visibility::Public);
+                let params: Vec<String> = ext
+                    .params
+                    .iter()
+                    .map(|p| format!("{}: {}", p.name.name, self.type_expr_to_string(&p.typ)))
+                    .collect();
+                let sig = format!(
+                    "extern fn {}({}): {}",
+                    ext.name.name,
+                    params.join(", "),
+                    self.type_expr_to_string(&ext.return_type)
+                );
+                let mut symbol = self.new_symbol(
+                    ext.name.name.clone(),
+                    SymbolKind::Function,
+                    ext.span,
+                    Some(sig),
+                    is_public,
+                );
+                symbol.documentation = ext.doc.clone();
+                let id = self.add_symbol(scope_idx, symbol);
+                self.add_reference(id, ext.name.span, true, true);
+            }
         }
     }
@@ -279,13 +304,14 @@ impl SymbolTable {
         };
         let type_sig = format!("fn {}({}): {}{}{}", f.name.name, param_types.join(", "), return_type, properties, effects);

-        let symbol = self.new_symbol(
+        let mut symbol = self.new_symbol(
             f.name.name.clone(),
             SymbolKind::Function,
             f.name.span,
             Some(type_sig),
             is_public,
         );
+        symbol.documentation = f.doc.clone();
         let fn_id = self.add_symbol(scope_idx, symbol);
         self.add_reference(fn_id, f.name.span, true, false);
@@ -326,13 +352,14 @@ impl SymbolTable {
         let is_public = matches!(t.visibility, Visibility::Public);
         let type_sig = format!("type {}", t.name.name);

-        let symbol = self.new_symbol(
+        let mut symbol = self.new_symbol(
             t.name.name.clone(),
             SymbolKind::Type,
             t.name.span,
             Some(type_sig),
             is_public,
         );
+        symbol.documentation = t.doc.clone();
         let type_id = self.add_symbol(scope_idx, symbol);
         self.add_reference(type_id, t.name.span, true, false);
@@ -372,13 +399,14 @@ impl SymbolTable {
         let is_public = true; // Effects are typically public
         let type_sig = format!("effect {}", e.name.name);

-        let symbol = self.new_symbol(
+        let mut symbol = self.new_symbol(
             e.name.name.clone(),
             SymbolKind::Effect,
             e.name.span,
             Some(type_sig),
             is_public,
         );
+        symbol.documentation = e.doc.clone();
         let effect_id = self.add_symbol(scope_idx, symbol);

         // Add operations
@@ -409,13 +437,14 @@ impl SymbolTable {
         let is_public = matches!(t.visibility, Visibility::Public);
         let type_sig = format!("trait {}", t.name.name);

-        let symbol = self.new_symbol(
+        let mut symbol = self.new_symbol(
             t.name.name.clone(),
             SymbolKind::Type, // Traits are like types
             t.name.span,
             Some(type_sig),
             is_public,
         );
+        symbol.documentation = t.doc.clone();
         self.add_symbol(scope_idx, symbol);
     }
@@ -479,7 +508,7 @@ impl SymbolTable {
                     self.visit_expr(arg, scope_idx);
                 }
             }
-            Expr::Field { object, .. } => {
+            Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
                 self.visit_expr(object, scope_idx);
             }
             Expr::If { condition, then_branch, else_branch, .. } => {
@@ -522,7 +551,10 @@ impl SymbolTable {
                     self.visit_expr(e, scope_idx);
                 }
             }
-            Expr::Record { fields, .. } => {
+            Expr::Record { spread, fields, .. } => {
+                if let Some(spread_expr) = spread {
+                    self.visit_expr(spread_expr, scope_idx);
+                }
                 for (_, e) in fields {
                     self.visit_expr(e, scope_idx);
                 }
@@ -5,9 +5,9 @@
 use std::collections::HashMap;

 use crate::ast::{
-    self, BinaryOp, Declaration, EffectDecl, Expr, FunctionDecl, HandlerDecl, Ident, ImplDecl,
-    ImportDecl, LetDecl, Literal, LiteralKind, MatchArm, Parameter, Pattern, Program, Span,
-    Statement, TraitDecl, TypeDecl, TypeExpr, UnaryOp, VariantFields,
+    self, BinaryOp, Declaration, EffectDecl, ExternFnDecl, Expr, FunctionDecl, HandlerDecl, Ident,
+    ImplDecl, ImportDecl, LetDecl, Literal, LiteralKind, MatchArm, Parameter, Pattern, Program,
+    Span, Statement, TraitDecl, TypeDecl, TypeExpr, UnaryOp, VariantFields,
 };
 use crate::diagnostics::{find_similar_names, format_did_you_mean, Diagnostic, ErrorCode, Severity};
 use crate::exhaustiveness::{check_exhaustiveness, missing_patterns_hint};
@@ -335,11 +335,14 @@ fn references_params(expr: &Expr, params: &[&str]) -> bool {
             Statement::Expr(e) => references_params(e, params),
         }) || references_params(result, params)
         }
-        Expr::Field { object, .. } => references_params(object, params),
+        Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => references_params(object, params),
         Expr::Lambda { body, .. } => references_params(body, params),
         Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
         Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
-        Expr::Record { fields, .. } => fields.iter().any(|(_, e)| references_params(e, params)),
+        Expr::Record { spread, fields, .. } => {
+            spread.as_ref().is_some_and(|s| references_params(s, params))
+                || fields.iter().any(|(_, e)| references_params(e, params))
+        }
         Expr::Match { scrutinee, arms, .. } => {
             references_params(scrutinee, params)
                 || arms.iter().any(|a| references_params(&a.body, params))
@@ -516,10 +519,11 @@ fn has_recursive_calls(func_name: &str, body: &Expr) -> bool {
         Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
             elements.iter().any(|e| has_recursive_calls(func_name, e))
         }
-        Expr::Record { fields, .. } => {
-            fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
+        Expr::Record { spread, fields, .. } => {
+            spread.as_ref().is_some_and(|s| has_recursive_calls(func_name, s))
+                || fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
         }
-        Expr::Field { object, .. } => has_recursive_calls(func_name, object),
+        Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => has_recursive_calls(func_name, object),
         Expr::Let { value, body, .. } => {
             has_recursive_calls(func_name, value) || has_recursive_calls(func_name, body)
         }
@@ -672,6 +676,7 @@ fn generate_auto_migration_expr(

     // Build the record expression
     Some(Expr::Record {
+        spread: None,
         fields: field_exprs,
         span,
     })
@@ -976,6 +981,13 @@ impl TypeChecker {
                 if !fields.is_empty() {
                     self.env.bind(&name, TypeScheme::mono(Type::Record(fields)));
                 }
+
+                // Also copy type definitions so imported types are usable
+                for (type_name, type_def) in &module_checker.env.types {
+                    if !self.env.types.contains_key(type_name) {
+                        self.env.types.insert(type_name.clone(), type_def.clone());
+                    }
+                }
             }
             ImportKind::Direct => {
                 // Import a specific name directly
@@ -1215,6 +1227,17 @@ impl TypeChecker {
                 let trait_impl = self.collect_impl(impl_decl);
                 self.env.trait_impls.push(trait_impl);
             }
+            Declaration::ExternFn(ext) => {
+                // Register extern fn type signature (like a regular function but no body)
+                let param_types: Vec<Type> = ext
+                    .params
+                    .iter()
+                    .map(|p| self.resolve_type(&p.typ))
+                    .collect();
+                let return_type = self.resolve_type(&ext.return_type);
+                let fn_type = Type::function(param_types, return_type);
+                self.env.bind(&ext.name.name, TypeScheme::mono(fn_type));
+            }
         }
     }
@@ -1536,7 +1559,7 @@ impl TypeChecker {
         // Use the declared type if present, otherwise use inferred
         let final_type = if let Some(ref type_expr) = let_decl.typ {
             let declared = self.resolve_type(type_expr);
-            if let Err(e) = unify(&inferred, &declared) {
+            if let Err(e) = unify_with_env(&inferred, &declared, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Variable '{}' has type {}, but declared type is {}: {}",
@@ -1673,6 +1696,42 @@ impl TypeChecker {
                 span,
             } => self.infer_field(object, field, *span),

+            Expr::TupleIndex {
+                object,
+                index,
+                span,
+            } => {
+                let object_type = self.infer_expr(object);
+                match &object_type {
+                    Type::Tuple(types) => {
+                        if *index < types.len() {
+                            types[*index].clone()
+                        } else {
+                            self.errors.push(TypeError {
+                                message: format!(
+                                    "Tuple index {} out of bounds for tuple with {} elements",
+                                    index,
+                                    types.len()
+                                ),
+                                span: *span,
+                            });
+                            Type::Error
+                        }
+                    }
+                    Type::Var(_) => Type::var(),
+                    _ => {
+                        self.errors.push(TypeError {
+                            message: format!(
+                                "Cannot use tuple index on non-tuple type {}",
+                                object_type
+                            ),
+                            span: *span,
+                        });
+                        Type::Error
+                    }
+                }
+            }
+
             Expr::Lambda {
                 params,
                 return_type,
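The new `Expr::TupleIndex` inference rule above reduces to a small decision on the receiver's type: a known tuple yields its element type or an out-of-bounds error, and anything else (other than an unresolved variable) is a type error. A minimal sketch, with a hypothetical cut-down `Type` enum standing in for the checker's real one and error reporting collapsed into a `Type::Error` result:

```rust
// Hypothetical, simplified version of the TupleIndex inference rule.
#[derive(Debug, Clone, PartialEq)]
enum Type {
    Int,
    Bool,
    Tuple(Vec<Type>),
    Error,
}

fn infer_tuple_index(object_type: &Type, index: usize) -> Type {
    match object_type {
        Type::Tuple(types) => {
            if index < types.len() {
                // In-bounds index: the result is the element's type.
                types[index].clone()
            } else {
                // "Tuple index out of bounds" in the real checker.
                Type::Error
            }
        }
        // "Cannot use tuple index on non-tuple type" in the real checker.
        _ => Type::Error,
    }
}

fn main() {
    let pair = Type::Tuple(vec![Type::Int, Type::Bool]);
    assert_eq!(infer_tuple_index(&pair, 1), Type::Bool);
    assert_eq!(infer_tuple_index(&pair, 2), Type::Error);
    assert_eq!(infer_tuple_index(&Type::Int, 0), Type::Error);
    println!("ok");
}
```

The real rule also returns a fresh variable for a still-unresolved receiver (`Type::Var(_) => Type::var()`), which this sketch omits for brevity.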
@@ -1708,7 +1767,11 @@ impl TypeChecker {
                 span,
             } => self.infer_block(statements, result, *span),

-            Expr::Record { fields, span } => self.infer_record(fields, *span),
+            Expr::Record {
+                spread,
+                fields,
+                span,
+            } => self.infer_record(spread.as_deref(), fields, *span),

             Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
@@ -1747,7 +1810,7 @@ impl TypeChecker {
         match op {
             BinaryOp::Add => {
                 // Add supports both numeric types and string concatenation
-                if let Err(e) = unify(&left_type, &right_type) {
+                if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Operands of '{}' must have same type: {}", op, e),
                         span,
@@ -1768,9 +1831,32 @@ impl TypeChecker {
                 }
             }

+            BinaryOp::Concat => {
+                // Concat (++) supports strings and lists
+                if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
+                    self.errors.push(TypeError {
+                        message: format!("Operands of '++' must have same type: {}", e),
+                        span,
+                    });
+                }
+                match &left_type {
+                    Type::String | Type::List(_) | Type::Var(_) => left_type,
+                    _ => {
+                        self.errors.push(TypeError {
+                            message: format!(
+                                "Operator '++' requires String or List operands, got {}",
+                                left_type
+                            ),
+                            span,
+                        });
+                        Type::Error
+                    }
+                }
+            }
+
             BinaryOp::Sub | BinaryOp::Mul | BinaryOp::Div | BinaryOp::Mod => {
                 // Arithmetic: both operands must be same numeric type
-                if let Err(e) = unify(&left_type, &right_type) {
+                if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Operands of '{}' must have same type: {}", op, e),
                         span,
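The `BinaryOp::Concat` rule added above can be condensed to: after unifying the operand types, the result is the left type when it is a `String`, a `List`, or a still-unresolved variable, and a type error otherwise. A hypothetical stand-alone sketch of just that decision, using a trimmed-down `Type` enum in place of the checker's real one:

```rust
// Hypothetical, simplified mirror of the checker's `++` result rule.
#[derive(Debug, Clone, PartialEq)]
enum Type {
    String,
    List(Box<Type>),
    Var(u32), // unresolved type variable
    Int,
    Error,
}

fn concat_result(left: &Type) -> Type {
    match left {
        // Strings, lists, and unresolved variables are acceptable; the
        // result type is the (already unified) left operand's type.
        Type::String | Type::List(_) | Type::Var(_) => left.clone(),
        // Anything else: "Operator '++' requires String or List operands".
        _ => Type::Error,
    }
}

fn main() {
    assert_eq!(concat_result(&Type::String), Type::String);
    assert_eq!(
        concat_result(&Type::List(Box::new(Type::Int))),
        Type::List(Box::new(Type::Int))
    );
    assert_eq!(concat_result(&Type::Var(0)), Type::Var(0));
    assert_eq!(concat_result(&Type::Int), Type::Error);
    println!("ok");
}
```

Accepting `Type::Var(_)` is what lets `++` appear in code whose operand types are only pinned down later by unification, without producing a spurious error.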
@@ -1794,7 +1880,7 @@ impl TypeChecker {

             BinaryOp::Eq | BinaryOp::Ne => {
                 // Equality: operands must have same type
-                if let Err(e) = unify(&left_type, &right_type) {
+                if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Operands of '{}' must have same type: {}", op, e),
                         span,
@@ -1805,7 +1891,7 @@ impl TypeChecker {

             BinaryOp::Lt | BinaryOp::Le | BinaryOp::Gt | BinaryOp::Ge => {
                 // Comparison: operands must be same orderable type
-                if let Err(e) = unify(&left_type, &right_type) {
+                if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Operands of '{}' must have same type: {}", op, e),
                         span,
@@ -1816,13 +1902,13 @@ impl TypeChecker {

             BinaryOp::And | BinaryOp::Or => {
                 // Logical: both must be Bool
-                if let Err(e) = unify(&left_type, &Type::Bool) {
+                if let Err(e) = unify_with_env(&left_type, &Type::Bool, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Left operand of '{}' must be Bool: {}", op, e),
                         span: left.span(),
                     });
                 }
-                if let Err(e) = unify(&right_type, &Type::Bool) {
+                if let Err(e) = unify_with_env(&right_type, &Type::Bool, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Right operand of '{}' must be Bool: {}", op, e),
                         span: right.span(),
@@ -1836,7 +1922,7 @@ impl TypeChecker {
                 // right must be a function that accepts left's type
                 let result_type = Type::var();
                 let expected_fn = Type::function(vec![left_type.clone()], result_type.clone());
-                if let Err(e) = unify(&right_type, &expected_fn) {
+                if let Err(e) = unify_with_env(&right_type, &expected_fn, &self.env) {
                     self.errors.push(TypeError {
                         message: format!(
                             "Pipe target must be a function accepting {}: {}",
@@ -1868,7 +1954,7 @@ impl TypeChecker {
                 }
             },
             UnaryOp::Not => {
-                if let Err(e) = unify(&operand_type, &Type::Bool) {
+                if let Err(e) = unify_with_env(&operand_type, &Type::Bool, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Operator '!' requires Bool operand: {}", e),
                         span,
@@ -1883,6 +1969,17 @@ impl TypeChecker {
         let func_type = self.infer_expr(func);
         let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();

+        // Propagate effects from callback arguments to enclosing scope
+        for arg_type in &arg_types {
+            if let Type::Function { effects, .. } = arg_type {
+                for effect in &effects.effects {
+                    if self.inferring_effects {
+                        self.inferred_effects.insert(effect.clone());
+                    }
+                }
+            }
+        }
+
         // Check property constraints from where clauses
         if let Expr::Var(func_id) = func {
             if let Some(constraints) = self.property_constraints.get(&func_id.name).cloned() {
@@ -1919,7 +2016,7 @@ impl TypeChecker {
             self.current_effects.clone(),
         );

-        match unify(&func_type, &expected_fn) {
+        match unify_with_env(&func_type, &expected_fn, &self.env) {
             Ok(subst) => result_type.apply(&subst),
             Err(e) => {
                 // Provide more detailed error message based on the type of mismatch
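The "propagate effects from callback arguments" loop added above folds the effect row of every function-typed argument into the caller's inferred effect set while effect inference is active. A simplified sketch, with a hypothetical two-variant `Type` and plain strings standing in for the checker's effect representation:

```rust
use std::collections::HashSet;

// Hypothetical, simplified mirror of the callback-effect propagation loop:
// any effect attached to a function-typed argument is inserted into the
// enclosing scope's inferred effect set.
#[derive(Debug, Clone)]
enum Type {
    Int,
    Function { effects: Vec<String> },
}

fn propagate_callback_effects(arg_types: &[Type], inferred: &mut HashSet<String>) {
    for arg_type in arg_types {
        if let Type::Function { effects } = arg_type {
            for effect in effects {
                inferred.insert(effect.clone());
            }
        }
    }
}

fn main() {
    // A call like `map(xs, fn(x) { print(x) })`: the callback carries IO.
    let args = vec![
        Type::Int,
        Type::Function { effects: vec!["IO".to_string()] },
    ];
    let mut inferred = HashSet::new();
    propagate_callback_effects(&args, &mut inferred);
    assert!(inferred.contains("IO"));
    assert_eq!(inferred.len(), 1);
    println!("ok");
}
```

Using a set makes the propagation idempotent, so the same loop can run at every call-inference site (the diff adds it in three places) without duplicating effects.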
@@ -1993,10 +2090,22 @@ impl TypeChecker {
                 if let Some((_, field_type)) = fields.iter().find(|(n, _)| n == &operation.name) {
                     // It's a function call on a module field
                     let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();

+                    // Propagate effects from callback arguments to enclosing scope
+                    for arg_type in &arg_types {
+                        if let Type::Function { effects, .. } = arg_type {
+                            for effect in &effects.effects {
+                                if self.inferring_effects {
+                                    self.inferred_effects.insert(effect.clone());
+                                }
+                            }
+                        }
+                    }
+
                     let result_type = Type::var();
                     let expected_fn = Type::function(arg_types, result_type.clone());

-                    if let Err(e) = unify(field_type, &expected_fn) {
+                    if let Err(e) = unify_with_env(field_type, &expected_fn, &self.env) {
                         self.errors.push(TypeError {
                             message: format!(
                                 "Type mismatch in {}.{} call: {}",
@@ -2052,6 +2161,17 @@ impl TypeChecker {
                 // Check argument types
                 let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();

+                // Propagate effects from callback arguments to enclosing scope
+                for arg_type in &arg_types {
+                    if let Type::Function { effects, .. } = arg_type {
+                        for effect in &effects.effects {
+                            if self.inferring_effects {
+                                self.inferred_effects.insert(effect.clone());
+                            }
+                        }
+                    }
+                }
+
                 if arg_types.len() != op.params.len() {
                     self.errors.push(TypeError {
                         message: format!(
@@ -2068,7 +2188,7 @@ impl TypeChecker {
                 for (i, (arg_type, (_, param_type))) in
                     arg_types.iter().zip(op.params.iter()).enumerate()
                 {
-                    if let Err(e) = unify(arg_type, param_type) {
+                    if let Err(e) = unify_with_env(arg_type, param_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!(
                                 "Argument {} of '{}.{}' has type {}, expected {}: {}",
@@ -2101,6 +2221,7 @@ impl TypeChecker {

     fn infer_field(&mut self, object: &Expr, field: &Ident, span: Span) -> Type {
         let object_type = self.infer_expr(object);
+        let object_type = self.env.expand_type_alias(&object_type);

         match &object_type {
             Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) {
@@ -2181,7 +2302,7 @@ impl TypeChecker {
         // Check return type if specified
         let ret_type = if let Some(rt) = return_type {
             let declared = self.resolve_type(rt);
-            if let Err(e) = unify(&body_type, &declared) {
+            if let Err(e) = unify_with_env(&body_type, &declared, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Lambda body type {} doesn't match declared {}: {}",
@@ -2247,7 +2368,7 @@ impl TypeChecker {
         span: Span,
     ) -> Type {
         let cond_type = self.infer_expr(condition);
-        if let Err(e) = unify(&cond_type, &Type::Bool) {
+        if let Err(e) = unify_with_env(&cond_type, &Type::Bool, &self.env) {
             self.errors.push(TypeError {
                 message: format!("If condition must be Bool, got {}: {}", cond_type, e),
                 span: condition.span(),
@@ -2257,7 +2378,7 @@ impl TypeChecker {
         let then_type = self.infer_expr(then_branch);
         let else_type = self.infer_expr(else_branch);

-        match unify(&then_type, &else_type) {
+        match unify_with_env(&then_type, &else_type, &self.env) {
             Ok(subst) => then_type.apply(&subst),
             Err(e) => {
                 self.errors.push(TypeError {
@@ -2298,7 +2419,7 @@ impl TypeChecker {
             // Check guard if present
             if let Some(ref guard) = arm.guard {
                 let guard_type = self.infer_expr(guard);
-                if let Err(e) = unify(&guard_type, &Type::Bool) {
+                if let Err(e) = unify_with_env(&guard_type, &Type::Bool, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Match guard must be Bool: {}", e),
                         span: guard.span(),
@@ -2314,7 +2435,7 @@ impl TypeChecker {
             match &result_type {
                 None => result_type = Some(body_type),
                 Some(prev) => {
-                    if let Err(e) = unify(prev, &body_type) {
+                    if let Err(e) = unify_with_env(prev, &body_type, &self.env) {
                         self.errors.push(TypeError {
                             message: format!(
                                 "Match arm has incompatible type: expected {}, got {}: {}",
@@ -2364,7 +2485,7 @@ impl TypeChecker {

             Pattern::Literal(lit) => {
                 let lit_type = self.infer_literal(lit);
-                if let Err(e) = unify(&lit_type, expected) {
+                if let Err(e) = unify_with_env(&lit_type, expected, &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Pattern literal type mismatch: {}", e),
                         span: lit.span,
@@ -2373,12 +2494,12 @@ impl TypeChecker {
                 Vec::new()
             }

-            Pattern::Constructor { name, fields, span } => {
+            Pattern::Constructor { name, fields, span, .. } => {
                 // Look up constructor
                 // For now, handle Option specially
                 match name.name.as_str() {
                     "None" => {
-                        if let Err(e) = unify(expected, &Type::Option(Box::new(Type::var()))) {
+                        if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(Type::var())), &self.env) {
                             self.errors.push(TypeError {
                                 message: format!(
                                     "None pattern doesn't match type {}: {}",
@@ -2391,7 +2512,7 @@ impl TypeChecker {
                     }
                     "Some" => {
                         let inner_type = Type::var();
-                        if let Err(e) = unify(expected, &Type::Option(Box::new(inner_type.clone())))
+                        if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(inner_type.clone())), &self.env)
                         {
                             self.errors.push(TypeError {
                                 message: format!(
@@ -2420,7 +2541,7 @@ impl TypeChecker {

             Pattern::Tuple { elements, span } => {
                 let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect();
-                if let Err(e) = unify(expected, &Type::Tuple(element_types.clone())) {
+                if let Err(e) = unify_with_env(expected, &Type::Tuple(element_types.clone()), &self.env) {
                     self.errors.push(TypeError {
                         message: format!("Tuple pattern doesn't match type {}: {}", expected, e),
                         span: *span,
@@ -2470,7 +2591,7 @@ impl TypeChecker {

         if let Some(type_expr) = typ {
             let declared = self.resolve_type(type_expr);
-            if let Err(e) = unify(&value_type, &declared) {
+            if let Err(e) = unify_with_env(&value_type, &declared, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Variable '{}' has type {}, but declared type is {}: {}",
|
||||||
@@ -2491,12 +2612,47 @@ impl TypeChecker {
|
|||||||
self.infer_expr(result)
|
self.infer_expr(result)
|
||||||
}
|
}
|
||||||
|
|
||||||
fn infer_record(&mut self, fields: &[(Ident, Expr)], _span: Span) -> Type {
|
fn infer_record(
|
||||||
let field_types: Vec<(String, Type)> = fields
|
&mut self,
|
||||||
|
spread: Option<&Expr>,
|
||||||
|
fields: &[(Ident, Expr)],
|
||||||
|
span: Span,
|
||||||
|
) -> Type {
|
||||||
|
// Start with spread fields if present
|
||||||
|
let mut field_types: Vec<(String, Type)> = if let Some(spread_expr) = spread {
|
||||||
|
let spread_type = self.infer_expr(spread_expr);
|
||||||
|
let spread_type = self.env.expand_type_alias(&spread_type);
|
||||||
|
match spread_type {
|
||||||
|
Type::Record(spread_fields) => spread_fields,
|
||||||
|
_ => {
|
||||||
|
self.errors.push(TypeError {
|
||||||
|
message: format!(
|
||||||
|
"Spread expression must be a record type, got {}",
|
||||||
|
spread_type
|
||||||
|
),
|
||||||
|
span,
|
||||||
|
});
|
||||||
|
Vec::new()
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
Vec::new()
|
||||||
|
};
|
||||||
|
|
||||||
|
// Apply explicit field overrides
|
||||||
|
let explicit_types: Vec<(String, Type)> = fields
|
||||||
.iter()
|
.iter()
|
||||||
.map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
|
.map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
|
||||||
.collect();
|
.collect();
|
||||||
|
|
||||||
|
for (name, typ) in explicit_types {
|
||||||
|
if let Some(existing) = field_types.iter_mut().find(|(n, _)| n == &name) {
|
||||||
|
existing.1 = typ;
|
||||||
|
} else {
|
||||||
|
field_types.push((name, typ));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
Type::Record(field_types)
|
Type::Record(field_types)
|
||||||
}
|
}
|
||||||
|
|
||||||
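The rewritten `infer_record` above merges spread fields with explicit ones: fields start from the spread record, and each explicit field either overrides a same-named spread field or is appended. A sketch of that merge rule in isolation, using `i32` as a stand-in for the compiler's `Type` (the function name `merge_fields` is illustrative, not part of the codebase):

```rust
// Explicit record fields override spread fields by name; new names append.
// `i32` stands in for the inferred Type of each field.
fn merge_fields(
    spread: Vec<(String, i32)>,
    explicit: Vec<(String, i32)>,
) -> Vec<(String, i32)> {
    let mut fields = spread;
    for (name, value) in explicit {
        if let Some(existing) = fields.iter_mut().find(|(n, _)| n == &name) {
            existing.1 = value; // override same-named field
        } else {
            fields.push((name, value)); // append new field
        }
    }
    fields
}

fn main() {
    let merged = merge_fields(
        vec![("x".to_string(), 1), ("y".to_string(), 2)],
        vec![("y".to_string(), 9), ("z".to_string(), 3)],
    );
    assert_eq!(
        merged,
        vec![("x".to_string(), 1), ("y".to_string(), 9), ("z".to_string(), 3)]
    );
}
```

Note that this preserves the spread record's field order, so `{ ..base, y: 9 }` keeps `y` in its original position rather than moving it to the end.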
@@ -2513,7 +2669,7 @@ impl TypeChecker {
        let first_type = self.infer_expr(&elements[0]);
        for elem in &elements[1..] {
            let elem_type = self.infer_expr(elem);
-            if let Err(e) = unify(&first_type, &elem_type) {
+            if let Err(e) = unify_with_env(&first_type, &elem_type, &self.env) {
                self.errors.push(TypeError {
                    message: format!("List elements must have same type: {}", e),
                    span,
@@ -2819,7 +2975,7 @@ impl TypeChecker {
        // Check return type matches if specified
        if let Some(ref return_type_expr) = impl_method.return_type {
            let return_type = self.resolve_type(return_type_expr);
-            if let Err(e) = unify(&body_type, &return_type) {
+            if let Err(e) = unify_with_env(&body_type, &return_type, &self.env) {
                self.errors.push(TypeError {
                    message: format!(
                        "Method '{}' body has type {}, but declared return type is {}: {}",
@@ -2862,6 +3018,9 @@ impl TypeChecker {
            "Option" if resolved_args.len() == 1 => {
                return Type::Option(Box::new(resolved_args[0].clone()));
            }
+            "Map" if resolved_args.len() == 2 => {
+                return Type::Map(Box::new(resolved_args[0].clone()), Box::new(resolved_args[1].clone()));
+            }
            _ => {}
        }
    }
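The change repeated throughout this file swaps `unify` for `unify_with_env`, which carries the type environment into unification. Based on the `expand_type_alias` calls elsewhere in the diff, the point is that a bare structural `unify` cannot see type aliases, while the env-aware version can expand them first. A minimal model of that difference; the `Ty`, `Env`, and both function bodies here are illustrative stand-ins, not the compiler's real definitions:

```rust
use std::collections::HashMap;

#[derive(Clone, Debug, PartialEq)]
enum Ty {
    Int,
    Named(String),      // possibly an alias
    List(Box<Ty>),
}

struct Env {
    aliases: HashMap<String, Ty>,
}

impl Env {
    // Recursively replace alias names with their targets.
    fn expand(&self, t: &Ty) -> Ty {
        match t {
            Ty::Named(n) => match self.aliases.get(n) {
                Some(target) => self.expand(target),
                None => t.clone(),
            },
            Ty::List(inner) => Ty::List(Box::new(self.expand(inner))),
            _ => t.clone(),
        }
    }
}

// Purely structural unification: rejects `UserId` vs `Int`.
fn unify(a: &Ty, b: &Ty) -> Result<(), String> {
    match (a, b) {
        (Ty::List(x), Ty::List(y)) => unify(x, y),
        _ if a == b => Ok(()),
        _ => Err(format!("cannot unify {:?} with {:?}", a, b)),
    }
}

// Alias-aware unification: expand both sides first.
fn unify_with_env(a: &Ty, b: &Ty, env: &Env) -> Result<(), String> {
    unify(&env.expand(a), &env.expand(b))
}

fn main() {
    // assume: type UserId = Int
    let env = Env {
        aliases: HashMap::from([("UserId".to_string(), Ty::Int)]),
    };
    let alias = Ty::Named("UserId".to_string());
    assert!(unify(&alias, &Ty::Int).is_err());               // structural: fails
    assert!(unify_with_env(&alias, &Ty::Int, &env).is_ok()); // alias-aware: ok
}
```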
src/types.rs (240 changed lines)

@@ -47,6 +47,8 @@ pub enum Type {
    List(Box<Type>),
    /// Option type (sugar for App(Option, [T]))
    Option(Box<Type>),
+    /// Map type (sugar for App(Map, [K, V]))
+    Map(Box<Type>, Box<Type>),
    /// Versioned type (e.g., User @v2)
    Versioned {
        base: Box<Type>,
@@ -119,6 +121,7 @@ impl Type {
            Type::Tuple(elements) => elements.iter().any(|e| e.contains_var(var)),
            Type::Record(fields) => fields.iter().any(|(_, t)| t.contains_var(var)),
            Type::List(inner) | Type::Option(inner) => inner.contains_var(var),
+            Type::Map(k, v) => k.contains_var(var) || v.contains_var(var),
            Type::Versioned { base, .. } => base.contains_var(var),
            _ => false,
        }
@@ -158,6 +161,7 @@ impl Type {
            ),
            Type::List(inner) => Type::List(Box::new(inner.apply(subst))),
            Type::Option(inner) => Type::Option(Box::new(inner.apply(subst))),
+            Type::Map(k, v) => Type::Map(Box::new(k.apply(subst)), Box::new(v.apply(subst))),
            Type::Versioned { base, version } => Type::Versioned {
                base: Box::new(base.apply(subst)),
                version: version.clone(),
@@ -208,6 +212,11 @@ impl Type {
                vars
            }
            Type::List(inner) | Type::Option(inner) => inner.free_vars(),
+            Type::Map(k, v) => {
+                let mut vars = k.free_vars();
+                vars.extend(v.free_vars());
+                vars
+            }
            Type::Versioned { base, .. } => base.free_vars(),
            _ => HashSet::new(),
        }
@@ -279,6 +288,7 @@ impl fmt::Display for Type {
            }
            Type::List(inner) => write!(f, "List<{}>", inner),
            Type::Option(inner) => write!(f, "Option<{}>", inner),
+            Type::Map(k, v) => write!(f, "Map<{}, {}>", k, v),
            Type::Versioned { base, version } => {
                write!(f, "{} {}", base, version)
            }
@@ -946,6 +956,46 @@ impl TypeEnv {
                    params: vec![("path".to_string(), Type::String)],
                    return_type: Type::Unit,
                },
+                EffectOpDef {
+                    name: "copy".to_string(),
+                    params: vec![
+                        ("source".to_string(), Type::String),
+                        ("dest".to_string(), Type::String),
+                    ],
+                    return_type: Type::Unit,
+                },
+                EffectOpDef {
+                    name: "glob".to_string(),
+                    params: vec![("pattern".to_string(), Type::String)],
+                    return_type: Type::List(Box::new(Type::String)),
+                },
+                EffectOpDef {
+                    name: "tryRead".to_string(),
+                    params: vec![("path".to_string(), Type::String)],
+                    return_type: Type::App {
+                        constructor: Box::new(Type::Named("Result".to_string())),
+                        args: vec![Type::String, Type::String],
+                    },
+                },
+                EffectOpDef {
+                    name: "tryWrite".to_string(),
+                    params: vec![
+                        ("path".to_string(), Type::String),
+                        ("content".to_string(), Type::String),
+                    ],
+                    return_type: Type::App {
+                        constructor: Box::new(Type::Named("Result".to_string())),
+                        args: vec![Type::Unit, Type::String],
+                    },
+                },
+                EffectOpDef {
+                    name: "tryDelete".to_string(),
+                    params: vec![("path".to_string(), Type::String)],
+                    return_type: Type::App {
+                        constructor: Box::new(Type::Named("Result".to_string())),
+                        args: vec![Type::Unit, Type::String],
+                    },
+                },
            ],
        },
    );
@@ -1146,6 +1196,15 @@ impl TypeEnv {
            ],
            return_type: Type::Unit,
        },
+        EffectOpDef {
+            name: "assertEqualMsg".to_string(),
+            params: vec![
+                ("expected".to_string(), Type::Var(0)),
+                ("actual".to_string(), Type::Var(0)),
+                ("label".to_string(), Type::String),
+            ],
+            return_type: Type::Unit,
+        },
        EffectOpDef {
            name: "assertNotEqual".to_string(),
            params: vec![
@@ -1480,6 +1539,16 @@ impl TypeEnv {
                Type::Option(Box::new(Type::var())),
            ),
        ),
+        (
+            "findIndex".to_string(),
+            Type::function(
+                vec![
+                    Type::List(Box::new(Type::var())),
+                    Type::function(vec![Type::var()], Type::Bool),
+                ],
+                Type::Option(Box::new(Type::Int)),
+            ),
+        ),
        (
            "any".to_string(),
            Type::function(
@@ -1524,6 +1593,50 @@ impl TypeEnv {
                Type::Unit,
            ),
        ),
+        (
+            "sort".to_string(),
+            Type::function(
+                vec![Type::List(Box::new(Type::var()))],
+                Type::List(Box::new(Type::var())),
+            ),
+        ),
+        (
+            "sortBy".to_string(),
+            {
+                let elem = Type::var();
+                Type::function(
+                    vec![
+                        Type::List(Box::new(elem.clone())),
+                        Type::function(vec![elem.clone(), elem], Type::Int),
+                    ],
+                    Type::List(Box::new(Type::var())),
+                )
+            },
+        ),
+        (
+            "zip".to_string(),
+            Type::function(
+                vec![
+                    Type::List(Box::new(Type::var())),
+                    Type::List(Box::new(Type::var())),
+                ],
+                Type::List(Box::new(Type::Tuple(vec![Type::var(), Type::var()]))),
+            ),
+        ),
+        (
+            "flatten".to_string(),
+            Type::function(
+                vec![Type::List(Box::new(Type::List(Box::new(Type::var()))))],
+                Type::List(Box::new(Type::var())),
+            ),
+        ),
+        (
+            "contains".to_string(),
+            Type::function(
+                vec![Type::List(Box::new(Type::var())), Type::var()],
+                Type::Bool,
+            ),
+        ),
    ]);
    env.bind("List", TypeScheme::mono(list_module_type));
 
@@ -1599,6 +1712,14 @@ impl TypeEnv {
            "parseFloat".to_string(),
            Type::function(vec![Type::String], Type::Option(Box::new(Type::Float))),
        ),
+        (
+            "indexOf".to_string(),
+            Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
+        ),
+        (
+            "lastIndexOf".to_string(),
+            Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
+        ),
    ]);
    env.bind("String", TypeScheme::mono(string_module_type));
 
@@ -1758,6 +1879,73 @@ impl TypeEnv {
    ]);
    env.bind("Option", TypeScheme::mono(option_module_type));
 
+    // Map module
+    let map_v = || Type::var();
+    let map_type = || Type::Map(Box::new(Type::String), Box::new(Type::var()));
+    let map_module_type = Type::Record(vec![
+        (
+            "new".to_string(),
+            Type::function(vec![], map_type()),
+        ),
+        (
+            "set".to_string(),
+            Type::function(
+                vec![map_type(), Type::String, map_v()],
+                map_type(),
+            ),
+        ),
+        (
+            "get".to_string(),
+            Type::function(
+                vec![map_type(), Type::String],
+                Type::Option(Box::new(map_v())),
+            ),
+        ),
+        (
+            "contains".to_string(),
+            Type::function(vec![map_type(), Type::String], Type::Bool),
+        ),
+        (
+            "remove".to_string(),
+            Type::function(vec![map_type(), Type::String], map_type()),
+        ),
+        (
+            "keys".to_string(),
+            Type::function(vec![map_type()], Type::List(Box::new(Type::String))),
+        ),
+        (
+            "values".to_string(),
+            Type::function(vec![map_type()], Type::List(Box::new(map_v()))),
+        ),
+        (
+            "size".to_string(),
+            Type::function(vec![map_type()], Type::Int),
+        ),
+        (
+            "isEmpty".to_string(),
+            Type::function(vec![map_type()], Type::Bool),
+        ),
+        (
+            "fromList".to_string(),
+            Type::function(
+                vec![Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()])))],
+                map_type(),
+            ),
+        ),
+        (
+            "toList".to_string(),
+            Type::function(
+                vec![map_type()],
+                Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()]))),
+            ),
+        ),
+        (
+            "merge".to_string(),
+            Type::function(vec![map_type(), map_type()], map_type()),
+        ),
+    ]);
+    env.bind("Map", TypeScheme::mono(map_module_type));
+
    // Result module
    let result_type = Type::App {
        constructor: Box::new(Type::Named("Result".to_string())),
@@ -1870,9 +2058,47 @@ impl TypeEnv {
            "round".to_string(),
            Type::function(vec![Type::var()], Type::Int),
        ),
+        (
+            "sin".to_string(),
+            Type::function(vec![Type::Float], Type::Float),
+        ),
+        (
+            "cos".to_string(),
+            Type::function(vec![Type::Float], Type::Float),
+        ),
+        (
+            "atan2".to_string(),
+            Type::function(vec![Type::Float, Type::Float], Type::Float),
+        ),
    ]);
    env.bind("Math", TypeScheme::mono(math_module_type));
 
+    // Int module
+    let int_module_type = Type::Record(vec![
+        (
+            "toString".to_string(),
+            Type::function(vec![Type::Int], Type::String),
+        ),
+        (
+            "toFloat".to_string(),
+            Type::function(vec![Type::Int], Type::Float),
+        ),
+    ]);
+    env.bind("Int", TypeScheme::mono(int_module_type));
+
+    // Float module
+    let float_module_type = Type::Record(vec![
+        (
+            "toString".to_string(),
+            Type::function(vec![Type::Float], Type::String),
+        ),
+        (
+            "toInt".to_string(),
+            Type::function(vec![Type::Float], Type::Int),
+        ),
+    ]);
+    env.bind("Float", TypeScheme::mono(float_module_type));
+
    env
}
 
@@ -1956,6 +2182,9 @@ impl TypeEnv {
        Type::Option(inner) => {
            Type::Option(Box::new(self.expand_type_alias(inner)))
        }
+        Type::Map(k, v) => {
+            Type::Map(Box::new(self.expand_type_alias(k)), Box::new(self.expand_type_alias(v)))
+        }
        Type::Versioned { base, version } => {
            Type::Versioned {
                base: Box::new(self.expand_type_alias(base)),
@@ -2032,7 +2261,9 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
            // Function's required effects (e1) must be a subset of available effects (e2)
            // A pure function (empty effects) can be called anywhere
            // A function requiring {Logger} can be called in context with {Logger} or {Logger, Console}
-            if !e1.is_subset(&e2) {
+            // When expected effects (e2) are empty, it means "no constraint" (e.g., callback parameter)
+            // so we allow any actual effects through
+            if !e2.is_empty() && !e1.is_subset(&e2) {
                return Err(format!(
                    "Effect mismatch: expected {{{}}}, got {{{}}}",
                    e1, e2
@@ -2114,6 +2345,13 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
        // Option
        (Type::Option(a), Type::Option(b)) => unify(a, b),
 
+        // Map
+        (Type::Map(k1, v1), Type::Map(k2, v2)) => {
+            let s1 = unify(k1, k2)?;
+            let s2 = unify(&v1.apply(&s1), &v2.apply(&s1))?;
+            Ok(s1.compose(&s2))
+        }
+
        // Versioned types
        (
            Type::Versioned {
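The effect-check change in `unify` above relaxes the subset rule: a function requiring effects e1 still must satisfy e1 ⊆ e2, but an empty expected set e2 now means "no constraint" (e.g. an unannotated callback parameter) rather than "must be pure". That predicate in isolation, with plain `HashSet`s standing in for the compiler's effect sets (the function name is illustrative):

```rust
use std::collections::HashSet;

// required = the function's declared effects (e1)
// available = the effects expected at the call site (e2)
// Empty `available` is treated as unconstrained, per the comment in the diff.
fn effects_compatible(required: &HashSet<&str>, available: &HashSet<&str>) -> bool {
    available.is_empty() || required.is_subset(available)
}

fn main() {
    let logger: HashSet<&str> = HashSet::from(["Logger"]);
    let both: HashSet<&str> = HashSet::from(["Logger", "Console"]);
    let unconstrained: HashSet<&str> = HashSet::new();

    assert!(effects_compatible(&logger, &both));   // {Logger} ⊆ {Logger, Console}
    assert!(!effects_compatible(&both, &logger));  // Console is missing
    assert!(effects_compatible(&both, &unconstrained)); // empty = no constraint
}
```

Before this change, the third case would have failed, since any nonempty e1 is not a subset of an empty e2.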
@@ -14,6 +14,7 @@
 pub type Html<M> =
     | Element(String, List<Attr<M>>, List<Html<M>>)
     | Text(String)
+    | RawHtml(String)
     | Empty
 
 // Attributes that can be applied to elements
@@ -41,6 +42,7 @@ pub type Attr<M> =
     | OnKeyDown(fn(String): M)
     | OnKeyUp(fn(String): M)
     | DataAttr(String, String)
+    | Attribute(String, String)
 
 // ============================================================================
 // Element builders - Container elements
@@ -180,6 +182,28 @@ pub fn video<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
 pub fn audio<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
     Element("audio", attrs, children)
 
+// ============================================================================
+// Element builders - Document / Head elements
+// ============================================================================
+
+pub fn meta<M>(attrs: List<Attr<M>>): Html<M> =
+    Element("meta", attrs, [])
+
+pub fn link<M>(attrs: List<Attr<M>>): Html<M> =
+    Element("link", attrs, [])
+
+pub fn script<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
+    Element("script", attrs, children)
+
+pub fn iframe<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
+    Element("iframe", attrs, children)
+
+pub fn figure<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
+    Element("figure", attrs, children)
+
+pub fn figcaption<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
+    Element("figcaption", attrs, children)
+
 // ============================================================================
 // Element builders - Tables
 // ============================================================================
@@ -285,6 +309,12 @@ pub fn onKeyUp<M>(h: fn(String): M): Attr<M> =
 pub fn data<M>(name: String, value: String): Attr<M> =
     DataAttr(name, value)
 
+pub fn attr<M>(name: String, value: String): Attr<M> =
+    Attribute(name, value)
+
+pub fn rawHtml<M>(content: String): Html<M> =
+    RawHtml(content)
+
 // ============================================================================
 // Utility functions
 // ============================================================================
@@ -319,6 +349,7 @@ pub fn renderAttr<M>(attr: Attr<M>): String =
     Checked(false) => "",
     Name(n) => " name=\"" + n + "\"",
     DataAttr(name, value) => " data-" + name + "=\"" + value + "\"",
+    Attribute(name, value) => " " + name + "=\"" + value + "\"",
     // Event handlers are ignored in static rendering
     OnClick(_) => "",
     OnInput(_) => "",
@@ -355,6 +386,7 @@ pub fn render<M>(html: Html<M>): String =
         }
     },
     Text(content) => escapeHtml(content),
+    RawHtml(content) => content,
     Empty => ""
 }
 
@@ -368,15 +400,47 @@ pub fn escapeHtml(s: String): String = {
     s4
 }
 
-// Render a full HTML document
+// Render a full HTML document (basic)
 pub fn document(title: String, headExtra: List<Html<M>>, bodyContent: List<Html<M>>): String = {
     let headElements = List.concat([
-        [Element("meta", [DataAttr("charset", "UTF-8")], [])],
-        [Element("meta", [Name("viewport"), Value("width=device-width, initial-scale=1.0")], [])],
+        [Element("meta", [Attribute("charset", "UTF-8")], [])],
+        [Element("meta", [Name("viewport"), Attribute("content", "width=device-width, initial-scale=1.0")], [])],
         [Element("title", [], [Text(title)])],
         headExtra
     ])
-    let doc = Element("html", [DataAttr("lang", "en")], [
+    let doc = Element("html", [Attribute("lang", "en")], [
+        Element("head", [], headElements),
+        Element("body", [], bodyContent)
+    ])
+    "<!DOCTYPE html>\n" + render(doc)
+}
+
+// Render a full HTML document with SEO meta tags
+pub fn seoDocument(
+    title: String,
+    description: String,
+    url: String,
+    ogImage: String,
+    headExtra: List<Html<M>>,
+    bodyContent: List<Html<M>>
+): String = {
+    let headElements = List.concat([
+        [Element("meta", [Attribute("charset", "UTF-8")], [])],
+        [Element("meta", [Name("viewport"), Attribute("content", "width=device-width, initial-scale=1.0")], [])],
+        [Element("title", [], [Text(title)])],
+        [Element("meta", [Name("description"), Attribute("content", description)], [])],
+        [Element("meta", [Attribute("property", "og:title"), Attribute("content", title)], [])],
+        [Element("meta", [Attribute("property", "og:description"), Attribute("content", description)], [])],
+        [Element("meta", [Attribute("property", "og:type"), Attribute("content", "website")], [])],
+        [Element("meta", [Attribute("property", "og:url"), Attribute("content", url)], [])],
+        [Element("meta", [Attribute("property", "og:image"), Attribute("content", ogImage)], [])],
+        [Element("meta", [Name("twitter:card"), Attribute("content", "summary_large_image")], [])],
+        [Element("meta", [Name("twitter:title"), Attribute("content", title)], [])],
+        [Element("meta", [Name("twitter:description"), Attribute("content", description)], [])],
+        [Element("link", [Attribute("rel", "canonical"), Href(url)], [])],
+        headExtra
+    ])
+    let doc = Element("html", [Attribute("lang", "en")], [
         Element("head", [], headElements),
         Element("body", [], bodyContent)
     ])
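The new generic `Attribute(String, String)` variant renders as a bare `name="value"` pair, unlike `DataAttr`, which prefixes `data-`; that distinction is what lets `document` and `seoDocument` emit `charset`, `lang`, and `content` attributes correctly. The two `renderAttr` arms mirrored in Rust as a stand-alone sketch (the enum and function here are illustrative, not the stdlib itself):

```rust
// Static attribute rendering: generic attributes emit the name verbatim,
// data attributes get the "data-" prefix, event handlers render to nothing.
enum Attr {
    DataAttr(String, String),
    Attribute(String, String),
    OnClick, // stand-in for all event-handler variants
}

fn render_attr(attr: &Attr) -> String {
    match attr {
        Attr::DataAttr(name, value) => format!(" data-{}=\"{}\"", name, value),
        Attr::Attribute(name, value) => format!(" {}=\"{}\"", name, value),
        Attr::OnClick => String::new(), // ignored in static rendering
    }
}

fn main() {
    assert_eq!(
        render_attr(&Attr::Attribute("charset".to_string(), "UTF-8".to_string())),
        " charset=\"UTF-8\""
    );
    assert_eq!(
        render_attr(&Attr::DataAttr("id".to_string(), "42".to_string())),
        " data-id=\"42\""
    );
    assert_eq!(render_attr(&Attr::OnClick), "");
}
```

Note that neither arm escapes the value, so attribute values containing `"` would need escaping by the caller; the same holds for the stdlib arms shown in the diff.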
@@ -625,6 +625,41 @@ pub fn router(routes: List<Route>, notFound: fn(Request): Response): Handler =
     }
 }
 
+// ============================================================
+// Static File Serving
+// ============================================================
+
+// Serve a static file from disk
+pub fn serveStaticFile(basePath: String, requestPath: String): Response with {File} = {
+    let filePath = basePath + requestPath
+    if File.exists(filePath) then {
+        let content = File.read(filePath)
+        let mime = getMimeType(filePath)
+        { status: 200, headers: [("Content-Type", mime)], body: content }
+    } else
+        { status: 404, headers: textHeaders(), body: "Not Found" }
+}
+
+// ============================================================
+// Form Body Parsing
+// ============================================================
+
+// Parse URL-encoded form body (same format as query strings)
+pub fn parseFormBody(body: String): List<(String, String)> =
+    parseQueryParams(body)
+
+// Get a form field value by name
+pub fn getFormField(fields: List<(String, String)>, name: String): Option<String> =
+    getParam(fields, name)
+
+// ============================================================
+// Response Helpers
+// ============================================================
+
+// Send a Response using HttpServer effect (convenience wrapper)
+pub fn sendResponse(resp: Response): Unit with {HttpServer} =
+    HttpServer.respondWithHeaders(resp.status, resp.body, resp.headers)
+
 // ============================================================
 // Example Usage
 // ============================================================
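`parseFormBody` above simply delegates to `parseQueryParams`, since `application/x-www-form-urlencoded` bodies use the same `k=v&k=v` format as query strings. A hypothetical Rust sketch of such a parser, assuming the usual conventions (`+` decodes to space, `%XX` hex escapes); this is an illustration of the shared format, not the stdlib's actual implementation, whose decoding details the diff does not show:

```rust
// Split "k=v&k=v" pairs, percent-decoding keys and values.
fn parse_form_body(body: &str) -> Vec<(String, String)> {
    body.split('&')
        .filter(|pair| !pair.is_empty())
        .map(|pair| {
            let (k, v) = pair.split_once('=').unwrap_or((pair, ""));
            (decode(k), decode(v))
        })
        .collect()
}

// Minimal url-decoding: '+' becomes space, "%XX" becomes the byte 0xXX;
// a malformed escape is kept literally.
fn decode(s: &str) -> String {
    let bytes = s.as_bytes();
    let mut out = Vec::with_capacity(bytes.len());
    let mut i = 0;
    while i < bytes.len() {
        match bytes[i] {
            b'+' => { out.push(b' '); i += 1; }
            b'%' if i + 2 < bytes.len() => {
                let hex = std::str::from_utf8(&bytes[i + 1..i + 3]).ok()
                    .and_then(|h| u8::from_str_radix(h, 16).ok());
                match hex {
                    Some(b) => { out.push(b); i += 3; }
                    None => { out.push(b'%'); i += 1; }
                }
            }
            b => { out.push(b); i += 1; }
        }
    }
    String::from_utf8_lossy(&out).into_owned()
}

fn main() {
    let fields = parse_form_body("name=Ada+Lovelace&age=36");
    assert_eq!(fields[0], ("name".to_string(), "Ada Lovelace".to_string()));
    assert_eq!(fields[1], ("age".to_string(), "36".to_string()));
}
```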