28 Commits

Author SHA1 Message Date
fd5ed53b29 chore: bump version to 0.1.6 2026-02-19 15:22:32 -05:00
2800ce4e2d chore: sync Cargo.lock
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 09:26:20 -05:00
ec365ebb3f feat: add File.copy and propagate effectful callback effects (WISH-7, WISH-14)
File.copy(source, dest) copies files via interpreter (std::fs::copy) and
C backend (fread/fwrite). Effectful callbacks passed to higher-order
functions like List.map/forEach now propagate their effects to the
enclosing function's inferred effect set.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 09:24:28 -05:00
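The C-backend half of this change can be sketched as a plain fread/fwrite loop. The following is an illustrative, hand-written version of that pattern, not the compiler's actual generated code; the helper name `lux_file_copy` is hypothetical.

```c
#include <stdio.h>
#include <string.h>  /* string.h and assert.h are used by the checks below */
#include <assert.h>

/* Copy `source` to `dest` through a fixed-size buffer.
 * Returns 0 on success, -1 on any I/O error. */
static int lux_file_copy(const char *source, const char *dest) {
    FILE *in = fopen(source, "rb");
    if (!in) return -1;
    FILE *out = fopen(dest, "wb");
    if (!out) { fclose(in); return -1; }

    char buf[4096];
    size_t n;
    int status = 0;
    while ((n = fread(buf, 1, sizeof buf, in)) > 0) {
        if (fwrite(buf, 1, n, out) != n) { status = -1; break; }
    }
    fclose(in);
    fclose(out);
    return status;
}
```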
52dcc88051 chore: bump version to 0.1.5 2026-02-19 03:47:28 -05:00
1842b668e5 chore: sync Cargo.lock with version 0.1.4
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 03:47:11 -05:00
c67e3f31c3 feat: add and/or keywords, handle alias, --watch flag, JS tree-shaking
- WISH-008: `and`/`or` as aliases for `&&`/`||` boolean operators
- WISH-006: `handle` as alias for `run ... with` (same AST output)
- WISH-005: `--watch` flag for `lux compile` recompiles on file change
- WISH-009: Tree-shake unused runtime sections from JS output based on
  which effects are actually used (Console, Random, Time, Http, Dom)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 03:35:47 -05:00
b0ccde749c chore: bump version to 0.1.4 2026-02-19 02:48:56 -05:00
4ba7a23ae3 feat: add comprehensive compilation checks to validate.sh
Adds interpreter, JS compilation, and C compilation checks for all
examples, showcase programs, standard examples, and projects (113 total
checks). Skip lists exclude programs requiring unsupported effects or
interactive I/O.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:43:46 -05:00
89741b4a32 fix: move top-level let initialization into main() in C backend
Top-level let bindings with function calls (e.g., `let result = factorial(10)`)
were emitted as static initializers, which is invalid C since function calls
aren't compile-time constants. Now globals are declared with zero-init and
initialized inside main() before any run expressions execute.

Also fixes validate.sh to use exit codes instead of grep for cargo check/build.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:31:49 -05:00
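The fix described above comes down to a basic C rule: a file-scope initializer must be a compile-time constant, so a call like `factorial(10)` has to move into runtime code. A hand-written sketch of the pattern, not actual compiler output (`lux_init_globals` is a hypothetical name):

```c
#include <assert.h>  /* used by the checks below */

long factorial(long n) { return n <= 1 ? 1 : n * factorial(n - 1); }

/* `long result = factorial(10);` at file scope is invalid C
 * (function calls are not constant expressions), so the global is
 * declared zero-initialized instead. */
long result = 0;

/* Called at the top of the emitted main(), before any run
 * expressions execute. */
void lux_init_globals(void) {
    result = factorial(10);
}
```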
3a2376cd49 feat: port AST definitions to Lux (self-hosting)
Translate all 30+ type definitions from src/ast.rs (727 lines Rust)
into Lux ADTs in projects/lux-compiler/ast.lux.

Types ported: Span, Ident, Visibility, Version, VersionConstraint,
BehavioralProperty, WhereClause, ModulePath, ImportDecl, Program,
Declaration, FunctionDecl, Parameter, EffectDecl, EffectOp, TypeDecl,
TypeDef, RecordField, Variant, VariantFields, Migration, HandlerDecl,
HandlerImpl, LetDecl, TraitDecl, TraitMethod, TraitBound, ImplDecl,
TraitConstraint, ImplMethod, TypeExpr, Expr (19 variants), Literal,
LiteralKind, BinaryOp, UnaryOp, Statement, MatchArm, Pattern.

Passes `lux check` and `lux run`.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:07:30 -05:00
4dfb04a1b6 chore: sync Cargo.lock with version 0.1.3
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:51 -05:00
3cdde02eb2 feat: add Int.toFloat/Float.toInt JS backend support and fix Map C codegen
- JS backend: Add Int/Float module dispatch in both Call and EffectOp paths
  for toFloat, toInt, and toString operations
- C backend: Fix lux_strdup → lux_string_dup in Map module codegen

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:40 -05:00
a5762d0397 feat: add built-in Map type with String keys
Add Map<String, V> as a first-class built-in type for key-value storage,
needed for self-hosting the compiler (parser/typechecker/interpreter all
rely heavily on hashmaps).

- types.rs: Type::Map(K,V) variant, all match arms (unify, apply, etc.)
- interpreter.rs: Value::Map, 12 BuiltinFn variants (new/set/get/contains/
  remove/keys/values/size/isEmpty/fromList/toList/merge), immutable semantics
- typechecker.rs: Map<K,V> resolution in resolve_type
- js_backend.rs: Map as JS Map with emit_map_operation()
- c_backend.rs: LuxMap struct (linear-scan), runtime fns, emit_map_operation()
- main.rs: 12 tests covering all Map operations
- validate.sh: now checks all projects/ directories too

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:45:13 -05:00
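For reference, a linear-scan string-keyed map in the spirit of the LuxMap struct mentioned above can be sketched as follows. Names and layout are illustrative, not the actual runtime API, and it is shown with in-place mutation for brevity even though Lux's Map has immutable semantics.

```c
#include <stdlib.h>
#include <string.h>
#include <assert.h>

typedef struct {
    char **keys;
    long *values;
    size_t len;
    size_t cap;
} LuxMap;

LuxMap *lux_map_new(void) {
    return calloc(1, sizeof(LuxMap));
}

void lux_map_set(LuxMap *m, const char *key, long value) {
    /* Linear scan: overwrite the value if the key already exists. */
    for (size_t i = 0; i < m->len; i++) {
        if (strcmp(m->keys[i], key) == 0) { m->values[i] = value; return; }
    }
    if (m->len == m->cap) {
        m->cap = m->cap ? m->cap * 2 : 4;
        m->keys = realloc(m->keys, m->cap * sizeof(char *));
        m->values = realloc(m->values, m->cap * sizeof(long));
    }
    size_t klen = strlen(key) + 1;
    char *k = malloc(klen);
    memcpy(k, key, klen);
    m->keys[m->len] = k;
    m->values[m->len] = value;
    m->len++;
}

/* Returns 1 and writes *out if the key is present, 0 otherwise. */
int lux_map_get(const LuxMap *m, const char *key, long *out) {
    for (size_t i = 0; i < m->len; i++) {
        if (strcmp(m->keys[i], key) == 0) { *out = m->values[i]; return 1; }
    }
    return 0;
}
```

Linear scan keeps the generated runtime small; it is fine for the modest map sizes a compiler pass touches, though it is O(n) per lookup.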
1132c621c6 fix: allow newlines before then in if/then/else expressions
The parser now skips newlines between the condition and `then` keyword,
enabling multiline if expressions like:
  if long_condition
    then expr1
    else expr2

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:38:05 -05:00
a0fff1814e fix: JS backend scoping for let/match/if inside closures
Three related bugs fixed:
- BUG-009: let bindings inside lambdas hoisted to top-level
- BUG-011: match expressions inside lambdas hoisted to top-level
- BUG-012: variable name deduplication leaked across function scopes

Root cause: emit_expr() uses writeln() for statements, but lambdas
captured only the return value, not the emitted statements. Also,
var_substitutions from emit_function() leaked to subsequent code.

Fix: Lambda handler now captures all output emitted during body
evaluation and places it inside the function body. Both emit_function
and Lambda save/restore var_substitutions to prevent cross-scope leaks.
Lambda params are registered as identity substitutions to override any
outer bindings with the same name.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:10:55 -05:00
4e9e823246 fix: record spread works with named type aliases
Resolve type aliases (e.g. Player -> { pos: Vec2, speed: Float })
before checking if spread expression is a record type. Previously
{ ...p, field: val } failed with "must be a record type, got Player"
when the variable had a named type annotation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 00:01:20 -05:00
6a2e4a7ac1 chore: bump version to 0.1.3 2026-02-18 23:06:10 -05:00
3d706cb32b feat: add record spread syntax { ...base, field: val }
Adds spread operator for records, allowing concise record updates:
  let p2 = { ...p, x: 5.0 }

Changes across the full pipeline:
- Lexer: new DotDotDot (...) token
- AST: optional spread field on Record variant
- Parser: detect ... at start of record expression
- Typechecker: merge spread record fields with explicit overrides
- Interpreter: evaluate spread, overlay explicit fields
- JS backend: emit native JS spread syntax
- C backend: copy spread into temp, assign overrides
- Formatter, linter, LSP, symbol table: propagate spread

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:27 -05:00
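The C backend's "copy spread into temp, assign overrides" strategy maps directly onto C struct assignment. A hand-written sketch with hypothetical names, not actual codegen output:

```c
#include <assert.h>  /* used by the checks below */

typedef struct { double x; double y; } Vec2;

/* { ...p, x: 5.0 } lowers to: copy the spread source wholesale into a
 * temporary (struct assignment copies every field), then assign the
 * explicit override on top. The original record is untouched. */
Vec2 spread_update_x(Vec2 base, double new_x) {
    Vec2 tmp = base;
    tmp.x = new_x;
    return tmp;
}
```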
7c3bfa9301 feat: add Math.sin, Math.cos, Math.atan2 trig functions
Adds trigonometric functions to the Math module across interpreter,
type system, and C backend. JS backend already supported them.
Also adds #include <math.h> to C preamble and handles Math module
calls through both Call and EffectOp paths in C backend.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:12 -05:00
b56c5461f1 fix: JS const _ duplication and hardcoded version string
- JS backend now emits wildcard let bindings as side-effect statements
  instead of const _ declarations, fixing SyntaxError on multiple let _ = ...
- Version string now uses env!("CARGO_PKG_VERSION") to auto-sync with Cargo.toml
- Add -lm linker flag for math library support

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:03 -05:00
61e1469845 feat: add ++ concat operator and auto-invoke main
BUG-004: Add ++ operator for string and list concatenation across all
backends (interpreter, C, JS) with type checking and formatting support.

BUG-001: Auto-invoke top-level `let main = fn () => ...` when main is
a zero-parameter function, instead of just printing the function value.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 22:01:41 -05:00
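On the C side, string `++` is typically lowered to an allocate-and-copy helper. A minimal sketch under that assumption (`lux_string_concat` is a hypothetical name, not the actual runtime function):

```c
#include <stdlib.h>
#include <string.h>
#include <assert.h>

/* Allocate a fresh buffer holding a followed by b. The caller owns
 * the result; a garbage-collected or arena-based runtime would manage
 * this differently. */
char *lux_string_concat(const char *a, const char *b) {
    size_t la = strlen(a), lb = strlen(b);
    char *out = malloc(la + lb + 1);
    memcpy(out, a, la);
    memcpy(out + la, b, lb + 1);  /* +1 also copies b's NUL terminator */
    return out;
}
```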
bb0a288210 chore: bump version to 0.1.2 2026-02-18 21:16:44 -05:00
5d7f4633e1 docs: add explicit commit instructions to CLAUDE.md
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:27 -05:00
d05b13d840 fix: JS backend compiles print() to console.log()
Bare `print()` calls in Lux now emit `console.log()` in JS output
instead of undefined `print()`. Fixes BUG-006.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:07 -05:00
0ee3050704 chore: bump version to 0.1.1 2026-02-18 20:41:43 -05:00
80b1276f9f fix: release script auto-bumps patch by default
Release script now supports: patch (default), minor, major, or explicit
version. Auto-updates Cargo.toml and flake.nix before building.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:41:29 -05:00
bd843d2219 fix: record type aliases now work for unification and field access
Expand type aliases via unify_with_env() everywhere in the type checker,
not just in a few places. This fixes named record types like
`type Vec2 = { x: Float, y: Float }` — they now properly unify with
anonymous records and support field access (v.x, v.y).

Also adds scripts/validate.sh for automated full-suite regression
testing (Rust tests + all 5 package test suites + type checking).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:21:29 -05:00
d76aa17b38 feat: static binary builds and automated release script
Switch reqwest from native-tls (openssl) to rustls-tls for a pure-Rust
TLS stack, enabling fully static musl builds. Add `nix build .#static`
for portable Linux binaries and `scripts/release.sh` for automated
Gitea releases with changelog generation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 19:09:32 -05:00
20 changed files with 2613 additions and 304 deletions


@@ -42,17 +42,45 @@ When making changes:
 7. **Fix language limitations**: If you encounter parser/type system limitations, fix them (without regressions on guarantees or speed)
 8. **Git commits**: Always use `--no-gpg-sign` flag
-### Post-work checklist (run after each major piece of work)
+### Post-work checklist (run after each committable change)
+**MANDATORY: Run the full validation script after every committable change:**
 ```bash
-nix develop --command cargo check # No Rust errors
-nix develop --command cargo test # All tests pass (currently 381)
-./target/release/lux check # Type check + lint all .lux files
-./target/release/lux fmt # Format all .lux files
-./target/release/lux lint # Standalone lint pass
+./scripts/validate.sh
 ```
+This script runs ALL of the following checks and will fail if any regress:
+1. `cargo check` — no Rust compilation errors
+2. `cargo test` — all Rust tests pass (currently 387)
+3. `cargo build --release` — release binary builds
+4. `lux test` on every package (path, frontmatter, xml, rss, markdown) — all 286 package tests pass
+5. `lux check` on every package — type checking + lint passes
+If `validate.sh` is not available or you need to run manually:
+```bash
+nix develop --command cargo check # No Rust errors
+nix develop --command cargo test # All Rust tests pass
+nix develop --command cargo build --release # Build release binary
+cd ../packages/path && ../../lang/target/release/lux test # Package tests
+cd ../packages/frontmatter && ../../lang/target/release/lux test
+cd ../packages/xml && ../../lang/target/release/lux test
+cd ../packages/rss && ../../lang/target/release/lux test
+cd ../packages/markdown && ../../lang/target/release/lux test
+```
+**Do NOT commit if any check fails.** Fix the issue first.
 ### Commit after every piece of work
-**After completing each logical unit of work, commit immediately.** Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
+**After completing each logical unit of work, commit immediately.** This is NOT optional — every fix, feature, or change MUST be committed right away. Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
+**Commit workflow:**
+1. Make the change
+2. Run `./scripts/validate.sh` (all 13 checks must pass)
+3. `git add` the relevant files
+4. `git commit --no-gpg-sign -m "type: description"` (use conventional commits: fix/feat/chore/docs)
+5. Move on to the next task
+**Never skip committing.** If you fixed a bug, commit it. If you added a feature, commit it. If you updated docs, commit it. Do not batch unrelated changes into one commit.
 **IMPORTANT: Always verify Lux code you write:**
 - Run with interpreter: `./target/release/lux file.lux`
@@ -109,7 +137,7 @@ When working on any major task that involves writing Lux code, **document every
 ## Code Quality
 - Fix all compiler warnings before committing
-- Ensure all tests pass (currently 381 tests)
+- Ensure all tests pass (currently 387 tests)
 - Add new tests when adding features
 - Keep examples and documentation in sync

Cargo.lock generated

@@ -135,16 +135,6 @@ dependencies = [
  "libc",
 ]
-
-[[package]]
-name = "core-foundation"
-version = "0.10.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b2a6cd9ae233e7f62ba4e9353e81a88df7fc8a5987b8d445b4d90c879bd156f6"
-dependencies = [
- "core-foundation-sys",
- "libc",
-]

 [[package]]
 name = "core-foundation-sys"
 version = "0.8.7"
@@ -235,7 +225,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "39cab71617ae0d63f51a36d69f866391735b51691dbda63cf6f96d042b63efeb"
 dependencies = [
  "libc",
- "windows-sys 0.61.2",
+ "windows-sys 0.59.0",
 ]

 [[package]]
@@ -297,21 +287,6 @@ version = "0.1.5"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
-
-[[package]]
-name = "foreign-types"
-version = "0.3.2"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
-dependencies = [
- "foreign-types-shared",
-]
-
-[[package]]
-name = "foreign-types-shared"
-version = "0.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"

 [[package]]
 name = "form_urlencoded"
 version = "1.2.2"
@@ -552,16 +527,17 @@ dependencies = [
 ]

 [[package]]
-name = "hyper-tls"
-version = "0.5.0"
+name = "hyper-rustls"
+version = "0.24.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d6183ddfa99b85da61a140bea0efc93fdf56ceaa041b37d553518030827f9905"
+checksum = "ec3efd23720e2049821a693cbc7e65ea87c72f1c58ff2f9522ff332b1491e590"
 dependencies = [
- "bytes",
+ "futures-util",
+ "http",
  "hyper",
- "native-tls",
+ "rustls",
  "tokio",
- "tokio-native-tls",
+ "tokio-rustls",
 ]

 [[package]]
@@ -794,7 +770,7 @@ dependencies = [
 [[package]]
 name = "lux"
-version = "0.1.0"
+version = "0.1.5"
 dependencies = [
  "lsp-server",
  "lsp-types",
@@ -843,23 +819,6 @@ dependencies = [
  "windows-sys 0.61.2",
 ]
-
-[[package]]
-name = "native-tls"
-version = "0.2.16"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9d5d26952a508f321b4d3d2e80e78fc2603eaefcdf0c30783867f19586518bdc"
-dependencies = [
- "libc",
- "log",
- "openssl",
- "openssl-probe",
- "openssl-sys",
- "schannel",
- "security-framework",
- "security-framework-sys",
- "tempfile",
-]

 [[package]]
 name = "nibble_vec"
 version = "0.1.0"
@@ -905,50 +864,6 @@ version = "1.21.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "42f5e15c9953c5e4ccceeb2e7382a716482c34515315f7b03532b8b4e8393d2d"
-
-[[package]]
-name = "openssl"
-version = "0.10.75"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "08838db121398ad17ab8531ce9de97b244589089e290a384c900cb9ff7434328"
-dependencies = [
- "bitflags 2.10.0",
- "cfg-if",
- "foreign-types",
- "libc",
- "once_cell",
- "openssl-macros",
- "openssl-sys",
-]
-
-[[package]]
-name = "openssl-macros"
-version = "0.1.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
-dependencies = [
- "proc-macro2",
- "quote",
- "syn",
-]
-
-[[package]]
-name = "openssl-probe"
-version = "0.2.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7c87def4c32ab89d880effc9e097653c8da5d6ef28e6b539d313baaacfbafcbe"
-
-[[package]]
-name = "openssl-sys"
-version = "0.9.111"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "82cab2d520aa75e3c58898289429321eb788c3106963d0dc886ec7a5f4adc321"
-dependencies = [
- "cc",
- "libc",
- "pkg-config",
- "vcpkg",
-]

 [[package]]
 name = "parking_lot"
 version = "0.12.5"
@@ -1203,15 +1118,15 @@ dependencies = [
  "http",
  "http-body",
  "hyper",
- "hyper-tls",
+ "hyper-rustls",
  "ipnet",
  "js-sys",
  "log",
  "mime",
- "native-tls",
  "once_cell",
  "percent-encoding",
  "pin-project-lite",
+ "rustls",
  "rustls-pemfile",
  "serde",
  "serde_json",
@@ -1219,15 +1134,30 @@ dependencies = [
  "sync_wrapper",
  "system-configuration",
  "tokio",
- "tokio-native-tls",
+ "tokio-rustls",
  "tower-service",
  "url",
  "wasm-bindgen",
  "wasm-bindgen-futures",
  "web-sys",
+ "webpki-roots",
  "winreg",
 ]

+[[package]]
+name = "ring"
+version = "0.17.14"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "a4689e6c2294d81e88dc6261c768b63bc4fcdb852be6d1352498b114f61383b7"
+dependencies = [
+ "cc",
+ "cfg-if",
+ "getrandom 0.2.17",
+ "libc",
+ "untrusted",
+ "windows-sys 0.52.0",
+]
+
 [[package]]
 name = "rusqlite"
 version = "0.31.0"
@@ -1252,7 +1182,19 @@ dependencies = [
  "errno",
  "libc",
  "linux-raw-sys",
- "windows-sys 0.61.2",
+ "windows-sys 0.59.0",
 ]

+[[package]]
+name = "rustls"
+version = "0.21.12"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "3f56a14d1f48b391359b22f731fd4bd7e43c97f3c50eee276f3aa09c94784d3e"
+dependencies = [
+ "log",
+ "ring",
+ "rustls-webpki",
+ "sct",
+]
+
 [[package]]
@@ -1264,6 +1206,16 @@ dependencies = [
  "base64 0.21.7",
 ]

+[[package]]
+name = "rustls-webpki"
+version = "0.101.7"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8b6275d1ee7a1cd780b64aca7726599a1dbc893b1e64144529e55c3c2f745765"
+dependencies = [
+ "ring",
+ "untrusted",
+]
+
 [[package]]
 name = "rustversion"
 version = "1.0.22"
@@ -1298,15 +1250,6 @@ version = "1.0.23"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "9774ba4a74de5f7b1c1451ed6cd5285a32eddb5cccb8cc655a4e50009e06477f"
-
-[[package]]
-name = "schannel"
-version = "0.1.28"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
-dependencies = [
- "windows-sys 0.61.2",
-]

 [[package]]
 name = "scopeguard"
 version = "1.2.0"
@@ -1314,26 +1257,13 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"

 [[package]]
-name = "security-framework"
-version = "3.6.0"
+name = "sct"
+version = "0.7.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d17b898a6d6948c3a8ee4372c17cb384f90d2e6e912ef00895b14fd7ab54ec38"
+checksum = "da046153aa2352493d6cb7da4b6e5c0c057d8a1d0a9aa8560baffdd945acd414"
 dependencies = [
- "bitflags 2.10.0",
- "core-foundation 0.10.1",
- "core-foundation-sys",
- "libc",
- "security-framework-sys",
+ "ring",
+ "untrusted",
 ]
-
-[[package]]
-name = "security-framework-sys"
-version = "2.16.0"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "321c8673b092a9a42605034a9879d73cb79101ed5fd117bc9a597b89b4e9e61a"
-dependencies = [
- "core-foundation-sys",
- "libc",
-]

 [[package]]
@@ -1521,7 +1451,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ba3a3adc5c275d719af8cb4272ea1c4a6d668a777f37e115f6d11ddbc1c8e0e7"
 dependencies = [
  "bitflags 1.3.2",
- "core-foundation 0.9.4",
+ "core-foundation",
  "system-configuration-sys",
 ]
@@ -1545,7 +1475,7 @@ dependencies = [
  "getrandom 0.4.1",
  "once_cell",
  "rustix",
- "windows-sys 0.61.2",
+ "windows-sys 0.59.0",
 ]

 [[package]]
@@ -1619,16 +1549,6 @@ dependencies = [
  "windows-sys 0.61.2",
 ]
-
-[[package]]
-name = "tokio-native-tls"
-version = "0.3.1"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
-dependencies = [
- "native-tls",
- "tokio",
-]

 [[package]]
 name = "tokio-postgres"
 version = "0.7.16"
@@ -1655,6 +1575,16 @@ dependencies = [
  "whoami",
 ]

+[[package]]
+name = "tokio-rustls"
+version = "0.24.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "c28327cf380ac148141087fbfb9de9d7bd4e84ab5d2c28fbc911d753de8a7081"
+dependencies = [
+ "rustls",
+ "tokio",
+]
+
 [[package]]
 name = "tokio-util"
 version = "0.7.18"
@@ -1750,6 +1680,12 @@ version = "0.2.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
+
+[[package]]
+name = "untrusted"
+version = "0.9.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8ecb6da28b8a351d773b68d5825ac39017e680750f980f3a1a85cd8dd28a47c1"

 [[package]]
 name = "url"
 version = "2.5.8"
@@ -1941,6 +1877,12 @@ dependencies = [
  "wasm-bindgen",
 ]
+
+[[package]]
+name = "webpki-roots"
+version = "0.25.4"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "5f20c57d8d7db6d3b86154206ae5d8fba62dd39573114de97c2cb0578251f8e1"

 [[package]]
 name = "whoami"
 version = "2.1.1"


@@ -1,6 +1,6 @@
 [package]
 name = "lux"
-version = "0.1.0"
+version = "0.1.6"
 edition = "2021"
 description = "A functional programming language with first-class effects, schema evolution, and behavioral types"
 license = "MIT"
@@ -13,7 +13,7 @@ lsp-types = "0.94"
 serde = { version = "1", features = ["derive"] }
 serde_json = "1"
 rand = "0.8"
-reqwest = { version = "0.11", features = ["blocking", "json"] }
+reqwest = { version = "0.11", default-features = false, features = ["blocking", "json", "rustls-tls"] }
 tiny_http = "0.12"
 rusqlite = { version = "0.31", features = ["bundled"] }
 postgres = "0.19"


@@ -14,6 +14,7 @@
 pkgs = import nixpkgs { inherit system overlays; };
 rustToolchain = pkgs.rust-bin.stable.latest.default.override {
   extensions = [ "rust-src" "rust-analyzer" ];
+  targets = [ "x86_64-unknown-linux-musl" ];
 };
 in
 {
@@ -22,8 +23,8 @@
 rustToolchain
 cargo-watch
 cargo-edit
-pkg-config
-openssl
+# Static builds
+pkgsStatic.stdenv.cc
 # Benchmark tools
 hyperfine
 poop
@@ -43,7 +44,7 @@
 printf "\n"
 printf " \033[1;35m \033[0m\n"
 printf " \033[1;35m \033[0m\n"
-printf " \033[1;35m \033[0m v0.1.0\n"
+printf " \033[1;35m \033[0m v0.1.6\n"
 printf "\n"
 printf " Functional language with first-class effects\n"
 printf "\n"
@@ -61,18 +62,47 @@
 packages.default = pkgs.rustPlatform.buildRustPackage {
   pname = "lux";
-  version = "0.1.0";
+  version = "0.1.6";
   src = ./.;
   cargoLock.lockFile = ./Cargo.lock;
-  nativeBuildInputs = [ pkgs.pkg-config ];
-  buildInputs = [ pkgs.openssl ];
   doCheck = false;
 };

-# Benchmark scripts
+packages.static = let
+  muslPkgs = import nixpkgs {
+    inherit system;
+    crossSystem = {
+      config = "x86_64-unknown-linux-musl";
+      isStatic = true;
+    };
+  };
+in muslPkgs.rustPlatform.buildRustPackage {
+  pname = "lux";
+  version = "0.1.6";
+  src = ./.;
+  cargoLock.lockFile = ./Cargo.lock;
+  CARGO_BUILD_TARGET = "x86_64-unknown-linux-musl";
+  CARGO_BUILD_RUSTFLAGS = "-C target-feature=+crt-static";
+  doCheck = false;
+  postInstall = ''
+    $STRIP $out/bin/lux 2>/dev/null || true
+  '';
+};

 apps = {
+  # Release automation
+  release = {
+    type = "app";
+    program = toString (pkgs.writeShellScript "lux-release" ''
+      exec ${self}/scripts/release.sh "$@"
+    '');
+  };
+
+  # Benchmark scripts
   # Run hyperfine benchmark comparison
   bench = {
     type = "app";


@@ -0,0 +1,225 @@
// Lux AST — Self-hosted Abstract Syntax Tree definitions
//
// Direct translation of src/ast.rs into Lux ADTs.
// These types represent the parsed structure of a Lux program.
//
// Naming conventions to avoid collisions:
// Ex = Expr variant, Pat = Pattern, Te = TypeExpr
// Td = TypeDef, Vf = VariantFields, Op = Operator
// Decl = Declaration, St = Statement
// === Source Location ===
type Span = | Span(Int, Int)
// === Identifiers ===
type Ident = | Ident(String, Span)
// === Visibility ===
type Visibility = | Public | Private
// === Schema Evolution ===
type Version = | Version(Int, Span)
type VersionConstraint =
| VcExact(Version)
| VcAtLeast(Version)
| VcLatest(Span)
// === Behavioral Types ===
type BehavioralProperty =
| BpPure
| BpTotal
| BpIdempotent
| BpDeterministic
| BpCommutative
// === Trait Bound (needed before WhereClause) ===
type TraitBound = | TraitBound(Ident, List<TypeExpr>, Span)
// === Trait Constraint (needed before WhereClause) ===
type TraitConstraint = | TraitConstraint(Ident, List<TraitBound>, Span)
// === Where Clauses ===
type WhereClause =
| WcProperty(Ident, BehavioralProperty, Span)
| WcResult(Expr, Span)
| WcTrait(TraitConstraint)
// === Module Path ===
type ModulePath = | ModulePath(List<Ident>, Span)
// === Import ===
// path, alias, items, wildcard, span
type ImportDecl = | ImportDecl(ModulePath, Option<Ident>, Option<List<Ident>>, Bool, Span)
// === Program ===
type Program = | Program(List<ImportDecl>, List<Declaration>)
// === Declarations ===
type Declaration =
| DeclFunction(FunctionDecl)
| DeclEffect(EffectDecl)
| DeclType(TypeDecl)
| DeclHandler(HandlerDecl)
| DeclLet(LetDecl)
| DeclTrait(TraitDecl)
| DeclImpl(ImplDecl)
// === Parameter ===
type Parameter = | Parameter(Ident, TypeExpr, Span)
// === Effect Operation ===
type EffectOp = | EffectOp(Ident, List<Parameter>, TypeExpr, Span)
// === Record Field ===
type RecordField = | RecordField(Ident, TypeExpr, Span)
// === Variant Fields ===
type VariantFields =
| VfUnit
| VfTuple(List<TypeExpr>)
| VfRecord(List<RecordField>)
// === Variant ===
type Variant = | Variant(Ident, VariantFields, Span)
// === Migration ===
type Migration = | Migration(Version, Expr, Span)
// === Handler Impl ===
// op_name, params, resume, body, span
type HandlerImpl = | HandlerImpl(Ident, List<Ident>, Option<Ident>, Expr, Span)
// === Impl Method ===
// name, params, return_type, body, span
type ImplMethod = | ImplMethod(Ident, List<Parameter>, Option<TypeExpr>, Expr, Span)
// === Trait Method ===
// name, type_params, params, return_type, default_impl, span
type TraitMethod = | TraitMethod(Ident, List<Ident>, List<Parameter>, TypeExpr, Option<Expr>, Span)
// === Type Expressions ===
type TypeExpr =
| TeNamed(Ident)
| TeApp(TypeExpr, List<TypeExpr>)
| TeFunction(List<TypeExpr>, TypeExpr, List<Ident>)
| TeTuple(List<TypeExpr>)
| TeRecord(List<RecordField>)
| TeUnit
| TeVersioned(TypeExpr, VersionConstraint)
// === Literal ===
type LiteralKind =
| LitInt(Int)
| LitFloat(String)
| LitString(String)
| LitChar(Char)
| LitBool(Bool)
| LitUnit
type Literal = | Literal(LiteralKind, Span)
// === Binary Operators ===
type BinaryOp =
| OpAdd | OpSub | OpMul | OpDiv | OpMod
| OpEq | OpNe | OpLt | OpLe | OpGt | OpGe
| OpAnd | OpOr
| OpPipe | OpConcat
// === Unary Operators ===
type UnaryOp = | OpNeg | OpNot
// === Statements ===
type Statement =
| StExpr(Expr)
| StLet(Ident, Option<TypeExpr>, Expr, Span)
// === Match Arms ===
type MatchArm = | MatchArm(Pattern, Option<Expr>, Expr, Span)
// === Patterns ===
type Pattern =
| PatWildcard(Span)
| PatVar(Ident)
| PatLiteral(Literal)
| PatConstructor(Ident, List<Pattern>, Span)
| PatRecord(List<(Ident, Pattern)>, Span)
| PatTuple(List<Pattern>, Span)
// === Function Declaration ===
// visibility, doc, name, type_params, params, return_type, effects, properties, where_clauses, body, span
type FunctionDecl = | FunctionDecl(Visibility, Option<String>, Ident, List<Ident>, List<Parameter>, TypeExpr, List<Ident>, List<BehavioralProperty>, List<WhereClause>, Expr, Span)
// === Effect Declaration ===
// doc, name, type_params, operations, span
type EffectDecl = | EffectDecl(Option<String>, Ident, List<Ident>, List<EffectOp>, Span)
// === Type Declaration ===
// visibility, doc, name, type_params, version, definition, migrations, span
type TypeDecl = | TypeDecl(Visibility, Option<String>, Ident, List<Ident>, Option<Version>, TypeDef, List<Migration>, Span)
// === Handler Declaration ===
// name, params, effect, implementations, span
type HandlerDecl = | HandlerDecl(Ident, List<Parameter>, Ident, List<HandlerImpl>, Span)
// === Let Declaration ===
// visibility, doc, name, typ, value, span
type LetDecl = | LetDecl(Visibility, Option<String>, Ident, Option<TypeExpr>, Expr, Span)
// === Trait Declaration ===
// visibility, doc, name, type_params, super_traits, methods, span
type TraitDecl = | TraitDecl(Visibility, Option<String>, Ident, List<Ident>, List<TraitBound>, List<TraitMethod>, Span)
// === Impl Declaration ===
// type_params, constraints, trait_name, trait_args, target_type, methods, span
type ImplDecl = | ImplDecl(List<Ident>, List<TraitConstraint>, Ident, List<TypeExpr>, TypeExpr, List<ImplMethod>, Span)
// === Expressions ===
type Expr =
| ExLiteral(Literal)
| ExVar(Ident)
| ExBinaryOp(BinaryOp, Expr, Expr, Span)
| ExUnaryOp(UnaryOp, Expr, Span)
| ExCall(Expr, List<Expr>, Span)
| ExEffectOp(Ident, Ident, List<Expr>, Span)
| ExField(Expr, Ident, Span)
| ExTupleIndex(Expr, Int, Span)
| ExLambda(List<Parameter>, Option<TypeExpr>, List<Ident>, Expr, Span)
| ExLet(Ident, Option<TypeExpr>, Expr, Expr, Span)
| ExIf(Expr, Expr, Expr, Span)
| ExMatch(Expr, List<MatchArm>, Span)
| ExBlock(List<Statement>, Expr, Span)
| ExRecord(Option<Expr>, List<(Ident, Expr)>, Span)
| ExTuple(List<Expr>, Span)
| ExList(List<Expr>, Span)
| ExRun(Expr, List<(Ident, Expr)>, Span)
| ExResume(Expr, Span)
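To illustrate how these constructors compose, here is a hypothetical sketch (not actual parser output, spans elided for readability) of the AST for the Lux expression `n * f(n - 1)`:

```
// AST sketch for `n * f(n - 1)`, spans elided
ExBinaryOp(
  OpMul,
  ExVar(Ident("n")),
  ExCall(
    ExVar(Ident("f")),
    [ExBinaryOp(OpSub, ExVar(Ident("n")), ExLiteral(Literal(LitInt(1), span)), span)],
    span),
  span)
```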

scripts/release.sh Executable file

@@ -0,0 +1,213 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Release Script
# Builds a static binary, generates changelog, and creates a Gitea release.
#
# Usage:
# ./scripts/release.sh # auto-bump patch (0.2.0 → 0.2.1)
# ./scripts/release.sh patch # same as above
# ./scripts/release.sh minor # bump minor (0.2.0 → 0.3.0)
# ./scripts/release.sh major # bump major (0.2.0 → 1.0.0)
# ./scripts/release.sh v1.2.3 # explicit version
#
# Environment:
# GITEA_TOKEN - API token for git.qrty.ink (prompted if not set)
# GITEA_URL - Gitea instance URL (default: https://git.qrty.ink)
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
GITEA_URL="${GITEA_URL:-https://git.qrty.ink}"
REPO_OWNER="blu"
REPO_NAME="lux"
API_BASE="$GITEA_URL/api/v1"
# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
info() { printf "${CYAN}::${NC} %s\n" "$1"; }
ok() { printf "${GREEN}ok${NC} %s\n" "$1"; }
warn() { printf "${YELLOW}!!${NC} %s\n" "$1"; }
err() { printf "${RED}error:${NC} %s\n" "$1" >&2; exit 1; }
# --- Determine version ---
CURRENT=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
BUMP="${1:-patch}"
bump_version() {
  local ver="$1" part="$2"
  IFS='.' read -r major minor patch <<< "$ver"
  case "$part" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "$major.$((minor + 1)).0" ;;
    patch) echo "$major.$minor.$((patch + 1))" ;;
    *) echo "$part" ;; # treat as explicit version
  esac
}
case "$BUMP" in
  major|minor|patch)
    VERSION=$(bump_version "$CURRENT" "$BUMP")
    info "Bumping $BUMP: $CURRENT → $VERSION"
    ;;
  *)
    # Explicit version — strip v prefix if present
    VERSION="${BUMP#v}"
    info "Explicit version: $VERSION"
    ;;
esac
TAG="v$VERSION"
# --- Check for clean working tree ---
if [ -n "$(git status --porcelain)" ]; then
  warn "Working tree has uncommitted changes:"
  git status --short
  printf "\n"
  read -rp "Continue anyway? [y/N] " confirm
  [[ "$confirm" =~ ^[Yy]$ ]] || exit 1
fi
# --- Check if tag already exists ---
if git rev-parse "$TAG" >/dev/null 2>&1; then
  err "Tag $TAG already exists. Choose a different version."
fi
# --- Update version in source files ---
if [ "$VERSION" != "$CURRENT" ]; then
  info "Updating version in Cargo.toml and flake.nix..."
  sed -i "0,/^version = \"$CURRENT\"/s//version = \"$VERSION\"/" Cargo.toml
  sed -i "s/version = \"$CURRENT\";/version = \"$VERSION\";/g" flake.nix
  sed -i "s/v$CURRENT/v$VERSION/g" flake.nix
  git add Cargo.toml flake.nix
  git commit --no-gpg-sign -m "chore: bump version to $VERSION"
  ok "Version updated and committed"
fi
# --- Generate changelog ---
info "Generating changelog..."
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "")
if [ -n "$LAST_TAG" ]; then
  RANGE="$LAST_TAG..HEAD"
  info "Changes since $LAST_TAG:"
else
  RANGE="HEAD"
  info "First release — summarizing recent commits:"
fi
CHANGELOG=$(git log "$RANGE" --pretty=format:"- %s" --no-merges 2>/dev/null | head -50 || true)
if [ -z "$CHANGELOG" ]; then
  CHANGELOG="- Initial release"
fi
# --- Build static binary ---
info "Building static binary (nix build .#static)..."
nix build .#static
BINARY="result/bin/lux"
if [ ! -f "$BINARY" ]; then
  err "Static binary not found at $BINARY"
fi
BINARY_SIZE=$(ls -lh "$BINARY" | awk '{print $5}')
BINARY_TYPE=$(file "$BINARY" | sed 's/.*: //')
ok "Binary: $BINARY_SIZE, $BINARY_TYPE"
# --- Prepare release artifact ---
ARTIFACT="/tmp/lux-${TAG}-linux-x86_64"
cp "$BINARY" "$ARTIFACT"
chmod +x "$ARTIFACT"
# --- Show release summary ---
printf "\n"
printf "${BOLD}═══ Release Summary ═══${NC}\n"
printf "\n"
printf " ${BOLD}Tag:${NC} %s\n" "$TAG"
printf " ${BOLD}Binary:${NC} %s (%s)\n" "lux-${TAG}-linux-x86_64" "$BINARY_SIZE"
printf " ${BOLD}Commit:${NC} %s\n" "$(git rev-parse --short HEAD)"
printf "\n"
printf "${BOLD}Changelog:${NC}\n"
printf "%s\n" "$CHANGELOG"
printf "\n"
# --- Confirm ---
read -rp "Create release $TAG? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || { info "Aborted."; exit 0; }
# --- Get Gitea token ---
if [ -z "${GITEA_TOKEN:-}" ]; then
  printf "\n"
  info "Gitea API token required (create at $GITEA_URL/user/settings/applications)"
  read -rsp "Token: " GITEA_TOKEN
  printf "\n"
fi
if [ -z "$GITEA_TOKEN" ]; then
  err "No token provided"
fi
# --- Create and push tag ---
info "Creating tag $TAG..."
git tag -a "$TAG" -m "Release $TAG" --no-sign
ok "Tag created"
info "Pushing tag to origin..."
git push origin "$TAG"
ok "Tag pushed"
# --- Create Gitea release ---
info "Creating release on Gitea..."
RELEASE_BODY=$(printf "## Lux %s\n\n### Changes\n\n%s\n\n### Installation\n\n\`\`\`bash\ncurl -Lo lux %s/%s/%s/releases/download/%s/lux-linux-x86_64\nchmod +x lux\n./lux --version\n\`\`\`" \
  "$TAG" "$CHANGELOG" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG")
RELEASE_JSON=$(jq -n \
  --arg tag "$TAG" \
  --arg name "Lux $TAG" \
  --arg body "$RELEASE_BODY" \
  '{tag_name: $tag, name: $name, body: $body, draft: false, prerelease: false}')
RELEASE_RESPONSE=$(curl -s -X POST \
  "$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/json" \
  -d "$RELEASE_JSON")
RELEASE_ID=$(echo "$RELEASE_RESPONSE" | jq -r '.id // empty')
if [ -z "$RELEASE_ID" ]; then
  echo "$RELEASE_RESPONSE" | jq . 2>/dev/null || echo "$RELEASE_RESPONSE"
  err "Failed to create release"
fi
ok "Release created (id: $RELEASE_ID)"
# --- Upload binary ---
info "Uploading binary..."
UPLOAD_RESPONSE=$(curl -s -X POST \
  "$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases/$RELEASE_ID/assets?name=lux-linux-x86_64" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/octet-stream" \
  --data-binary "@$ARTIFACT")
ASSET_NAME=$(echo "$UPLOAD_RESPONSE" | jq -r '.name // empty')
if [ -z "$ASSET_NAME" ]; then
  echo "$UPLOAD_RESPONSE" | jq . 2>/dev/null || echo "$UPLOAD_RESPONSE"
  err "Failed to upload binary"
fi
ok "Binary uploaded: $ASSET_NAME"
# --- Done ---
printf "\n"
printf "${GREEN}${BOLD}Release $TAG published!${NC}\n"
printf "\n"
printf " ${BOLD}URL:${NC} %s/%s/%s/releases/tag/%s\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf " ${BOLD}Download:${NC} %s/%s/%s/releases/download/%s/lux-linux-x86_64\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "\n"
# Cleanup
rm -f "$ARTIFACT"
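The semver bump at the heart of the script is pure string arithmetic. A standalone sketch of the same logic (reimplemented here for illustration; in release.sh the explicit-version path is handled by the outer `case`, not `bump_version`):

```shell
# Sketch of release.sh's bump_version: split MAJOR.MINOR.PATCH and
# increment the requested component, zeroing the lower ones.
bump_version() {
  local ver="$1" part="$2"
  IFS='.' read -r major minor patch <<< "$ver"
  case "$part" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "$major.$((minor + 1)).0" ;;
    patch) echo "$major.$minor.$((patch + 1))" ;;
    *)     echo "$part" ;;  # anything else passes through as an explicit version
  esac
}

bump_version 0.2.0 patch   # -> 0.2.1
bump_version 0.2.0 minor   # -> 0.3.0
bump_version 0.2.0 major   # -> 1.0.0
```

Note that bumping `minor` or `major` resets the lower components, which is why `0.2.0 major` yields `1.0.0` rather than `1.2.0`.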

scripts/validate.sh Executable file

@@ -0,0 +1,211 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Full Validation Script
# Runs all checks: Rust tests, package tests, type checking, example compilation.
# Run after every committable change to ensure no regressions.
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
LUX="$(pwd)/target/release/lux"
PACKAGES_DIR="$(pwd)/../packages"
PROJECTS_DIR="$(pwd)/projects"
EXAMPLES_DIR="$(pwd)/examples"
RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
FAILED=0
TOTAL=0
step() {
  TOTAL=$((TOTAL + 1))
  printf "${CYAN}[%d]${NC} %s... " "$TOTAL" "$1"
}
ok() { printf "${GREEN}ok${NC} %s\n" "${1:-}"; }
fail() { printf "${RED}FAIL${NC} %s\n" "${1:-}"; FAILED=$((FAILED + 1)); }
# --- Rust checks ---
step "cargo check"
if nix develop --command cargo check 2>/dev/null; then ok; else fail; fi
step "cargo test"
OUTPUT=$(nix develop --command cargo test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "test result:" || echo "no result")
if echo "$RESULT" | grep -q "0 failed"; then ok "$RESULT"; else fail "$RESULT"; fi
# --- Build release binary ---
step "cargo build --release"
if nix develop --command cargo build --release 2>/dev/null; then ok; else fail; fi
# --- Package tests ---
for pkg in path frontmatter xml rss markdown; do
  PKG_DIR="$PACKAGES_DIR/$pkg"
  if [ -d "$PKG_DIR" ]; then
    step "lux test ($pkg)"
    OUTPUT=$(cd "$PKG_DIR" && "$LUX" test 2>&1 || true)
    RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
    if echo "$RESULT" | grep -q "passed"; then ok "$RESULT"; else fail "$RESULT"; fi
  fi
done

# --- Lux check on packages ---
for pkg in path frontmatter xml rss markdown; do
  PKG_DIR="$PACKAGES_DIR/$pkg"
  if [ -d "$PKG_DIR" ]; then
    step "lux check ($pkg)"
    OUTPUT=$(cd "$PKG_DIR" && "$LUX" check 2>&1 || true)
    RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
    if echo "$RESULT" | grep -q "passed"; then ok; else fail "$RESULT"; fi
  fi
done
# --- Project checks ---
for proj_dir in "$PROJECTS_DIR"/*/; do
  proj=$(basename "$proj_dir")
  if [ -f "$proj_dir/main.lux" ]; then
    step "lux check (project: $proj)"
    OUTPUT=$("$LUX" check "$proj_dir/main.lux" 2>&1 || true)
    if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
  fi
  # Check any standalone .lux files in the project
  for lux_file in "$proj_dir"/*.lux; do
    [ -f "$lux_file" ] || continue
    fname=$(basename "$lux_file")
    [ "$fname" = "main.lux" ] && continue
    step "lux check (project: $proj/$fname)"
    OUTPUT=$("$LUX" check "$lux_file" 2>&1 || true)
    if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
  done
done
# === Compilation & Interpreter Checks ===
# --- Interpreter: examples ---
# Skip: http_api, http, http_router, http_server (network), postgres_demo (db),
# random, property_testing (Random effect), shell (Process), json (File I/O),
# file_io (File I/O), test_math, test_lists (Test effect), stress_shared_rc,
# test_rc_comparison (internal tests), modules/* (need cwd)
INTERP_SKIP="http_api http http_router http_server postgres_demo random property_testing shell json file_io test_math test_lists stress_shared_rc test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
  name=$(basename "$f" .lux)
  skip=false
  for s in $INTERP_SKIP; do [ "$name" = "$s" ] && skip=true; done
  $skip && continue
  step "interpreter (examples/$name)"
  if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: examples/standard ---
# Skip: guessing_game (reads stdin)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "guessing_game" ] && continue
  step "interpreter (standard/$name)"
  if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done

# --- Interpreter: examples/showcase ---
# Skip: task_manager (parse error in current version)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "task_manager" ] && continue
  step "interpreter (showcase/$name)"
  if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: projects ---
# Skip: guessing-game (Random), rest-api (HttpServer)
PROJ_INTERP_SKIP="guessing-game rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
  proj=$(basename "$proj_dir")
  [ -f "$proj_dir/main.lux" ] || continue
  skip=false
  for s in $PROJ_INTERP_SKIP; do [ "$proj" = "$s" ] && skip=true; done
  $skip && continue
  step "interpreter (project: $proj)"
  if timeout 10 "$LUX" "$proj_dir/main.lux" >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples ---
# Skip files that fail JS compilation (unsupported features)
JS_SKIP="http_api http http_router postgres_demo property_testing json test_lists test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
  name=$(basename "$f" .lux)
  skip=false
  for s in $JS_SKIP; do [ "$name" = "$s" ] && skip=true; done
  $skip && continue
  step "compile JS (examples/$name)"
  if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done

# --- JS compilation: examples/standard ---
# Skip: stdlib_demo (uses String.toUpper not in JS backend)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "stdlib_demo" ] && continue
  step "compile JS (standard/$name)"
  if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done

# --- JS compilation: examples/showcase ---
# Skip: task_manager (unsupported features)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
  [ -f "$f" ] || continue
  name=$(basename "$f" .lux)
  [ "$name" = "task_manager" ] && continue
  step "compile JS (showcase/$name)"
  if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: projects ---
# Skip: json-parser, rest-api (unsupported features)
JS_PROJ_SKIP="json-parser rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
  proj=$(basename "$proj_dir")
  [ -f "$proj_dir/main.lux" ] || continue
  skip=false
  for s in $JS_PROJ_SKIP; do [ "$proj" = "$s" ] && skip=true; done
  $skip && continue
  step "compile JS (project: $proj)"
  if "$LUX" compile "$proj_dir/main.lux" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- C compilation: examples ---
# Only compile examples known to work with C backend
C_EXAMPLES="hello factorial pipelines tailcall jit_test"
for name in $C_EXAMPLES; do
  f="$EXAMPLES_DIR/$name.lux"
  [ -f "$f" ] || continue
  step "compile C (examples/$name)"
  if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done
# --- C compilation: examples/standard ---
C_STD_EXAMPLES="hello_world factorial fizzbuzz primes guessing_game"
for name in $C_STD_EXAMPLES; do
  f="$EXAMPLES_DIR/standard/$name.lux"
  [ -f "$f" ] || continue
  step "compile C (standard/$name)"
  if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done
# --- Cleanup ---
rm -f /tmp/lux_validate.js /tmp/lux_validate_bin
# --- Summary ---
printf "\n${BOLD}═══ Validation Summary ═══${NC}\n"
if [ $FAILED -eq 0 ]; then
  printf "${GREEN}All %d checks passed.${NC}\n" "$TOTAL"
else
  printf "${RED}%d/%d checks failed.${NC}\n" "$FAILED" "$TOTAL"
  exit 1
fi
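The skip lists above all use the same membership idiom: scan a space-separated list and flip a flag on a match. A minimal standalone sketch of that pattern (the `is_skipped` helper is hypothetical; validate.sh inlines the loop):

```shell
# Sketch of validate.sh's skip-list idiom: iterate an unquoted
# space-separated list, set a flag if the candidate matches any entry.
is_skipped() {
  local name="$1" skip_list="$2" s skip=false
  for s in $skip_list; do        # intentionally unquoted: word-split entries
    [ "$name" = "$s" ] && skip=true
  done
  $skip                          # return status: 0 if skipped, 1 otherwise
}

SKIP="http_api random shell"
is_skipped "random" "$SKIP" && echo "skip random"
is_skipped "hello"  "$SKIP" || echo "run hello"
```

The `[ ... ] && skip=true` form is safe under `set -e` because a failing command on the left of `&&` does not abort the script, which is why the loops can keep that one-liner shape.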


@@ -541,7 +541,9 @@ pub enum Expr {
         span: Span,
     },
     /// Record literal: { name: "Alice", age: 30 }
+    /// With optional spread: { ...base, name: "Bob" }
     Record {
+        spread: Option<Box<Expr>>,
         fields: Vec<(Ident, Expr)>,
         span: Span,
     },
@@ -621,7 +623,8 @@ pub enum BinaryOp {
     And,
     Or,
     // Other
     Pipe,   // |>
+    Concat, // ++
 }

 impl fmt::Display for BinaryOp {
@@ -641,6 +644,7 @@ impl fmt::Display for BinaryOp {
             BinaryOp::And => write!(f, "&&"),
             BinaryOp::Or => write!(f, "||"),
             BinaryOp::Pipe => write!(f, "|>"),
+            BinaryOp::Concat => write!(f, "++"),
         }
     }
 }


@@ -279,7 +279,7 @@ impl CBackend {
             Declaration::Let(let_decl) => {
                 // Skip run expressions - they're handled in the main wrapper
                 if !matches!(&let_decl.value, Expr::Run { .. }) {
-                    self.emit_global_let(&let_decl.name, &let_decl.value)?;
+                    self.emit_global_let(&let_decl.name)?;
                 }
             }
             _ => {}
@@ -730,10 +730,10 @@ impl CBackend {
         }

         // Check for string concatenation - use lux_string_concat instead of +
-        if matches!(op, BinaryOp::Add) {
+        if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
             let left_is_string = self.infer_expr_type(left).as_deref() == Some("LuxString");
             let right_is_string = self.infer_expr_type(right).as_deref() == Some("LuxString");
-            if left_is_string || right_is_string {
+            if left_is_string || right_is_string || matches!(op, BinaryOp::Concat) {
                 return Ok(format!("lux_string_concat({}, {})", l, r));
             }
         }
@@ -858,6 +858,7 @@ impl CBackend {
         self.writeln("#include <stdio.h>");
         self.writeln("#include <stdlib.h>");
         self.writeln("#include <string.h>");
+        self.writeln("#include <math.h>");
         self.writeln("");
         self.writeln("// === Lux Runtime Types ===");
         self.writeln("");
@@ -881,6 +882,14 @@ impl CBackend {
         self.writeln("    int64_t capacity;");
         self.writeln("};");
         self.writeln("");
+        self.writeln("// Map struct (linear-scan key-value table, string keys)");
+        self.writeln("typedef struct {");
+        self.writeln("    LuxString* keys;");
+        self.writeln("    void** values;");
+        self.writeln("    int64_t length;");
+        self.writeln("    int64_t capacity;");
+        self.writeln("} LuxMap;");
+        self.writeln("");
         self.writeln("// === Reference Counting Infrastructure ===");
         self.writeln("// Perceus-inspired RC system for automatic memory management.");
         self.writeln("// See docs/REFERENCE_COUNTING.md for details.");
@@ -1140,6 +1149,7 @@ impl CBackend {
         self.writeln("    void (*delete_file)(void* env, LuxString path);");
         self.writeln("    LuxBool (*isDir)(void* env, LuxString path);");
         self.writeln("    void (*mkdir)(void* env, LuxString path);");
+        self.writeln("    void (*copy)(void* env, LuxString src, LuxString dst);");
         self.writeln("    LuxList* (*readDir)(void* env, LuxString path);");
         self.writeln("    void* env;");
         self.writeln("} LuxFileHandler;");
@@ -1327,6 +1337,20 @@ impl CBackend {
         self.writeln("    mkdir(path, 0755);");
         self.writeln("}");
         self.writeln("");
+        self.writeln("static void lux_file_copy(LuxString src, LuxString dst) {");
+        self.writeln("    FILE* fin = fopen(src, \"rb\");");
+        self.writeln("    if (!fin) return;");
+        self.writeln("    FILE* fout = fopen(dst, \"wb\");");
+        self.writeln("    if (!fout) { fclose(fin); return; }");
+        self.writeln("    char buf[4096];");
+        self.writeln("    size_t n;");
+        self.writeln("    while ((n = fread(buf, 1, sizeof(buf), fin)) > 0) {");
+        self.writeln("        fwrite(buf, 1, n, fout);");
+        self.writeln("    }");
+        self.writeln("    fclose(fin);");
+        self.writeln("    fclose(fout);");
+        self.writeln("}");
+        self.writeln("");
         self.writeln("#include <dirent.h>");
         self.writeln("// Forward declarations needed by lux_file_readDir");
         self.writeln("static LuxList* lux_list_new(int64_t capacity);");
@@ -1382,6 +1406,11 @@ impl CBackend {
         self.writeln("    lux_file_mkdir(path);");
         self.writeln("}");
         self.writeln("");
+        self.writeln("static void default_file_copy(void* env, LuxString src, LuxString dst) {");
+        self.writeln("    (void)env;");
+        self.writeln("    lux_file_copy(src, dst);");
+        self.writeln("}");
+        self.writeln("");
         self.writeln("static LuxList* default_file_readDir(void* env, LuxString path) {");
         self.writeln("    (void)env;");
         self.writeln("    return lux_file_readDir(path);");
@@ -1395,6 +1424,7 @@ impl CBackend {
         self.writeln("    .delete_file = default_file_delete,");
         self.writeln("    .isDir = default_file_isDir,");
         self.writeln("    .mkdir = default_file_mkdir,");
+        self.writeln("    .copy = default_file_copy,");
         self.writeln("    .readDir = default_file_readDir,");
         self.writeln("    .env = NULL");
         self.writeln("};");
@@ -2042,6 +2072,76 @@ impl CBackend {
         self.writeln("    return result;");
         self.writeln("}");
         self.writeln("");
+        // === Map Runtime Functions ===
+        self.writeln("static LuxMap* lux_map_new(int64_t capacity) {");
+        self.writeln("    LuxMap* map = (LuxMap*)malloc(sizeof(LuxMap));");
+        self.writeln("    map->capacity = capacity > 0 ? capacity : 8;");
+        self.writeln("    map->keys = (LuxString*)calloc(map->capacity, sizeof(LuxString));");
+        self.writeln("    map->values = (void**)calloc(map->capacity, sizeof(void*));");
+        self.writeln("    map->length = 0;");
+        self.writeln("    return map;");
+        self.writeln("}");
+        self.writeln("");
+        self.writeln("static int64_t lux_map_find(LuxMap* map, LuxString key) {");
+        self.writeln("    for (int64_t i = 0; i < map->length; i++) {");
+        self.writeln("        if (map->keys[i] && strcmp(map->keys[i], key) == 0) return i;");
+        self.writeln("    }");
+        self.writeln("    return -1;");
+        self.writeln("}");
+        self.writeln("");
+        self.writeln("static LuxMap* lux_map_clone(LuxMap* map) {");
+        self.writeln("    LuxMap* result = lux_map_new(map->capacity);");
+        self.writeln("    result->length = map->length;");
+        self.writeln("    for (int64_t i = 0; i < map->length; i++) {");
+        self.writeln("        result->keys[i] = lux_string_dup(map->keys[i]);");
+        self.writeln("        result->values[i] = map->values[i];");
+        self.writeln("        lux_incref(map->values[i]);");
+        self.writeln("    }");
+        self.writeln("    return result;");
+        self.writeln("}");
+        self.writeln("");
+        self.writeln("static LuxMap* lux_map_set(LuxMap* map, LuxString key, void* value) {");
+        self.writeln("    LuxMap* result = lux_map_clone(map);");
+        self.writeln("    int64_t idx = lux_map_find(result, key);");
+        self.writeln("    if (idx >= 0) {");
+        self.writeln("        lux_decref(result->values[idx]);");
+        self.writeln("        result->values[idx] = value;");
+        self.writeln("        lux_incref(value);");
+        self.writeln("    } else {");
+        self.writeln("        if (result->length >= result->capacity) {");
+        self.writeln("            result->capacity *= 2;");
+        self.writeln("            result->keys = (LuxString*)realloc(result->keys, sizeof(LuxString) * result->capacity);");
+        self.writeln("            result->values = (void**)realloc(result->values, sizeof(void*) * result->capacity);");
+        self.writeln("        }");
+        self.writeln("        result->keys[result->length] = lux_string_dup(key);");
+        self.writeln("        result->values[result->length] = value;");
+        self.writeln("        lux_incref(value);");
+        self.writeln("        result->length++;");
+        self.writeln("    }");
+        self.writeln("    return result;");
+        self.writeln("}");
+        self.writeln("");
+        self.writeln("static int64_t lux_map_size(LuxMap* map) { return map->length; }");
+        self.writeln("static LuxBool lux_map_isEmpty(LuxMap* map) { return map->length == 0; }");
+        self.writeln("");
+        self.writeln("static LuxBool lux_map_contains(LuxMap* map, LuxString key) {");
+        self.writeln("    return lux_map_find(map, key) >= 0;");
+        self.writeln("}");
+        self.writeln("");
+        self.writeln("static LuxMap* lux_map_remove(LuxMap* map, LuxString key) {");
+        self.writeln("    LuxMap* result = lux_map_new(map->capacity);");
+        self.writeln("    for (int64_t i = 0; i < map->length; i++) {");
+        self.writeln("        if (strcmp(map->keys[i], key) != 0) {");
+        self.writeln("            result->keys[result->length] = lux_string_dup(map->keys[i]);");
+        self.writeln("            result->values[result->length] = map->values[i];");
+        self.writeln("            lux_incref(map->values[i]);");
+        self.writeln("            result->length++;");
+        self.writeln("        }");
+        self.writeln("    }");
+        self.writeln("    return result;");
+        self.writeln("}");
+        self.writeln("");
         self.writeln("static Option lux_option_none(void) { return (Option){Option_TAG_NONE}; }");
         self.writeln("static Option lux_option_some(void* value) { return (Option){Option_TAG_SOME, .data.some = {value}}; }");
         self.writeln("");
@@ -2839,8 +2939,18 @@ impl CBackend {
             }
         }

+        // String concatenation for ++ and +
+        if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
+            let left_is_string = self.infer_expr_type(left).as_deref() == Some("LuxString");
+            let right_is_string = self.infer_expr_type(right).as_deref() == Some("LuxString");
+            if left_is_string || right_is_string || matches!(op, BinaryOp::Concat) {
+                return Ok(format!("lux_string_concat({}, {})", l, r));
+            }
+        }
+
         let op_str = match op {
             BinaryOp::Add => "+",
+            BinaryOp::Concat => unreachable!("handled above"),
             BinaryOp::Sub => "-",
             BinaryOp::Mul => "*",
             BinaryOp::Div => "/",
@@ -3003,6 +3113,9 @@ impl CBackend {
         if module_name.name == "List" {
             return self.emit_list_operation(&field.name, args);
         }
+        if module_name.name == "Map" {
+            return self.emit_map_operation(&field.name, args);
+        }
         // Int module
         if module_name.name == "Int" && field.name == "toString" {
             let arg = self.emit_expr(&args[0])?;
@@ -3011,6 +3124,10 @@ impl CBackend {
             self.register_rc_var(&temp, "LuxString");
             return Ok(temp);
         }
+        if module_name.name == "Int" && field.name == "toFloat" {
+            let arg = self.emit_expr(&args[0])?;
+            return Ok(format!("((LuxFloat){})", arg));
+        }
         // Float module
         if module_name.name == "Float" && field.name == "toString" {
             let arg = self.emit_expr(&args[0])?;
@@ -3019,6 +3136,14 @@ impl CBackend {
             self.register_rc_var(&temp, "LuxString");
             return Ok(temp);
         }
+        if module_name.name == "Float" && field.name == "toInt" {
+            let arg = self.emit_expr(&args[0])?;
+            return Ok(format!("((LuxInt){})", arg));
+        }
+        // Math module
+        if module_name.name == "Math" {
+            return self.emit_math_operation(&field.name, args);
+        }
         // Check for user-defined module function
         let key = (module_name.name.clone(), field.name.clone());
         if let Some(c_name) = self.module_functions.get(&key).cloned() {
@@ -3364,6 +3489,10 @@ impl CBackend {
                     self.register_rc_var(&temp, "LuxString");
                     return Ok(temp);
                 }
+                "toFloat" => {
+                    let arg = self.emit_expr(&args[0])?;
+                    return Ok(format!("((LuxFloat){})", arg));
+                }
                 _ => {}
             }
         }
@@ -3378,10 +3507,24 @@ impl CBackend {
                     self.register_rc_var(&temp, "LuxString");
                     return Ok(temp);
                 }
+                "toInt" => {
+                    let arg = self.emit_expr(&args[0])?;
+                    return Ok(format!("((LuxInt){})", arg));
+                }
                 _ => {}
             }
         }

+        // Math module (treated as effect by parser but handled as direct C calls)
+        if effect.name == "Math" {
+            return self.emit_math_operation(&operation.name, args);
+        }
+
+        // Map module
+        if effect.name == "Map" {
+            return self.emit_map_operation(&operation.name, args);
+        }
+
         // Built-in Console effect
         if effect.name == "Console" {
             if operation.name == "print" {
@@ -3557,6 +3700,16 @@ impl CBackend {
                 }
                 return Ok("NULL".to_string());
             }
+            "copy" => {
+                let src = self.emit_expr(&args[0])?;
+                let dst = self.emit_expr(&args[1])?;
+                if self.has_evidence {
+                    self.writeln(&format!("ev->file->copy(ev->file->env, {}, {});", src, dst));
+                } else {
+                    self.writeln(&format!("lux_file_copy({}, {});", src, dst));
+                }
+                return Ok("NULL".to_string());
+            }
             "readDir" | "listDir" => {
                 let path = self.emit_expr(&args[0])?;
                 let temp = format!("_readdir_{}", self.fresh_name());
@@ -3844,12 +3997,34 @@ impl CBackend {
             }
         }

-        Expr::Record { fields, .. } => {
-            let field_strs: Result<Vec<_>, _> = fields.iter().map(|(name, val)| {
-                let v = self.emit_expr(val)?;
-                Ok(format!(".{} = {}", name.name, v))
-            }).collect();
-            Ok(format!("{{ {} }}", field_strs?.join(", ")))
+        Expr::Record {
+            spread, fields, ..
+        } => {
+            if let Some(spread_expr) = spread {
+                // Evaluate spread source, then override fields
+                let base = self.emit_expr(spread_expr)?;
+                if fields.is_empty() {
+                    Ok(base)
+                } else {
+                    // Copy spread into a temp, then override fields
+                    let temp = format!("_spread_{}", self.fresh_name());
+                    self.writeln(&format!("__auto_type {} = {};", temp, base));
+                    for (name, val) in fields {
+                        let v = self.emit_expr(val)?;
+                        self.writeln(&format!("{}.{} = {};", temp, name.name, v));
+                    }
+                    Ok(temp)
+                }
+            } else {
+                let field_strs: Result<Vec<_>, _> = fields
+                    .iter()
+                    .map(|(name, val)| {
+                        let v = self.emit_expr(val)?;
+                        Ok(format!(".{} = {}", name.name, v))
+                    })
+                    .collect();
+                Ok(format!("{{ {} }}", field_strs?.join(", ")))
+            }
         }

         Expr::Field { object, field, .. } => {
@@ -3919,6 +4094,64 @@ impl CBackend {
         }
     }

+    /// Emit code for Math module operations (Math.sin, Math.cos, etc.)
+    fn emit_math_operation(&mut self, op: &str, args: &[Expr]) -> Result<String, CGenError> {
+        match op {
+            "abs" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("fabs({})", x))
+            }
+            "min" => {
+                let a = self.emit_expr(&args[0])?;
+                let b = self.emit_expr(&args[1])?;
+                Ok(format!("fmin({}, {})", a, b))
+            }
+            "max" => {
+                let a = self.emit_expr(&args[0])?;
+                let b = self.emit_expr(&args[1])?;
+                Ok(format!("fmax({}, {})", a, b))
+            }
+            "sqrt" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("sqrt({})", x))
+            }
+            "pow" => {
+                let base = self.emit_expr(&args[0])?;
+                let exp = self.emit_expr(&args[1])?;
+                Ok(format!("pow({}, {})", base, exp))
+            }
+            "floor" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("(int64_t)floor({})", x))
+            }
+            "ceil" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("(int64_t)ceil({})", x))
+            }
+            "round" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("(int64_t)round({})", x))
+            }
+            "sin" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("sin({})", x))
+            }
+            "cos" => {
+                let x = self.emit_expr(&args[0])?;
+                Ok(format!("cos({})", x))
+            }
+            "atan2" => {
+                let y = self.emit_expr(&args[0])?;
+                let x = self.emit_expr(&args[1])?;
+                Ok(format!("atan2({}, {})", y, x))
+            }
+            _ => Err(CGenError {
+                message: format!("Math.{} not supported in C backend", op),
+                span: None,
+            }),
+        }
+    }
+
     /// Emit code for List module operations (List.map, List.filter, etc.)
     fn emit_list_operation(&mut self, op: &str, args: &[Expr]) -> Result<String, CGenError> {
         match op {
@@ -4420,6 +4653,140 @@ impl CBackend {
}
}
/// Emit code for Map module operations
fn emit_map_operation(&mut self, op: &str, args: &[Expr]) -> Result<String, CGenError> {
match op {
"new" => {
let temp = format!("_map_new_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_new(8);", temp));
Ok(temp)
}
"set" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let val = self.emit_expr(&args[2])?;
let boxed_val = self.box_value(&val, None);
let temp = format!("_map_set_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_set({}, {}, {});", temp, map, key, boxed_val));
Ok(temp)
}
"get" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let idx_temp = format!("_map_idx_{}", self.fresh_name());
let result_temp = format!("_map_get_{}", self.fresh_name());
self.writeln(&format!("int64_t {} = lux_map_find({}, {});", idx_temp, map, key));
self.writeln(&format!("Option {};", result_temp));
self.writeln(&format!("if ({} >= 0) {{", idx_temp));
self.indent += 1;
self.writeln(&format!("lux_incref({}->values[{}]);", map, idx_temp));
self.writeln(&format!("{} = lux_option_some({}->values[{}]);", result_temp, map, idx_temp));
self.indent -= 1;
self.writeln("} else {");
self.indent += 1;
self.writeln(&format!("{} = lux_option_none();", result_temp));
self.indent -= 1;
self.writeln("}");
Ok(result_temp)
}
"contains" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!("lux_map_contains({}, {})", map, key))
}
"remove" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let temp = format!("_map_rm_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_remove({}, {});", temp, map, key));
Ok(temp)
}
"keys" => {
let map = self.emit_expr(&args[0])?;
let temp = format!("_map_keys_{}", self.fresh_name());
self.writeln(&format!("LuxList* {} = lux_list_new({}->length);", temp, map));
// Copy keys into the list; sorted below
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", map));
self.indent += 1;
self.writeln(&format!("LuxString _ks = lux_string_dup({}->keys[_i]);", map));
self.writeln(&format!("lux_list_push({}, _ks);", temp));
self.indent -= 1;
self.writeln("}");
// Sort via bubble sort (small N)
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++)", temp));
self.writeln(&format!(" for (int64_t _j = _i+1; _j < {}->length; _j++)", temp));
self.writeln(&format!(" if (strcmp({}->elements[_i], {}->elements[_j]) > 0) {{", temp, temp));
self.writeln(&format!(" void* _t = {}->elements[_i]; {}->elements[_i] = {}->elements[_j]; {}->elements[_j] = _t;", temp, temp, temp, temp));
self.writeln(" }");
Ok(temp)
}
"values" => {
let map = self.emit_expr(&args[0])?;
let temp = format!("_map_vals_{}", self.fresh_name());
self.writeln(&format!("LuxList* {} = lux_list_new({}->length);", temp, map));
// Sort by key first, then collect values
self.writeln(&format!("int64_t* _idx = (int64_t*)malloc(sizeof(int64_t) * {}->length);", map));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) _idx[_i] = _i;", map));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++)", map));
self.writeln(&format!(" for (int64_t _j = _i+1; _j < {}->length; _j++)", map));
self.writeln(&format!(" if (strcmp({}->keys[_idx[_i]], {}->keys[_idx[_j]]) > 0) {{ int64_t _t = _idx[_i]; _idx[_i] = _idx[_j]; _idx[_j] = _t; }}", map, map));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", map));
self.indent += 1;
self.writeln(&format!("lux_incref({}->values[_idx[_i]]);", map));
self.writeln(&format!("lux_list_push({}, {}->values[_idx[_i]]);", temp, map));
self.indent -= 1;
self.writeln("}");
self.writeln("free(_idx);");
Ok(temp)
}
"size" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("lux_map_size({})", map))
}
"isEmpty" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("lux_map_isEmpty({})", map))
}
"fromList" => {
let list = self.emit_expr(&args[0])?;
let temp = format!("_map_fl_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_new({}->length);", temp, list));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", list));
self.indent += 1;
// Elements are tuples (boxed as void*) — we treat them as a simple 2-element struct
self.writeln("// Each element is a (String, V) tuple - not yet fully supported in C backend for Map");
self.indent -= 1;
self.writeln("}");
Ok(temp)
}
"toList" => {
let map = self.emit_expr(&args[0])?;
let temp = format!("_map_tl_{}", self.fresh_name());
self.writeln(&format!("LuxList* {} = lux_list_new({}->length);", temp, map));
self.writeln("// Map.toList not fully supported in C backend yet");
Ok(temp)
}
"merge" => {
let m1 = self.emit_expr(&args[0])?;
let m2 = self.emit_expr(&args[1])?;
let temp = format!("_map_merge_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_clone({});", temp, m1));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", m2));
self.indent += 1;
self.writeln(&format!("LuxMap* _next = lux_map_set({}, {}->keys[_i], {}->values[_i]);", temp, m2, m2));
self.writeln(&format!("free({}->keys); free({}->values); free({});", temp, temp, temp));
self.writeln(&format!("{} = _next;", temp));
self.indent -= 1;
self.writeln("}");
Ok(temp)
}
_ => Err(CGenError {
message: format!("Unsupported Map operation: {}", op),
span: None,
}),
}
}
fn emit_expr_with_substitution(&mut self, expr: &Expr, from: &str, to: &str) -> Result<String, CGenError> {
// Simple substitution - in a real implementation, this would be more sophisticated
match expr {
@@ -4732,11 +5099,13 @@ impl CBackend {
"toString" => return Some("LuxString".to_string()),
"parse" => return Some("Option".to_string()),
"abs" | "min" | "max" => return Some("LuxInt".to_string()),
"toFloat" => return Some("LuxFloat".to_string()),
_ => {}
},
"Float" => match field.name.as_str() {
"toString" => return Some("LuxString".to_string()),
"parse" => return Some("Option".to_string()),
"toInt" => return Some("LuxInt".to_string()),
_ => return Some("LuxFloat".to_string()),
},
_ => {
@@ -4788,6 +5157,7 @@ impl CBackend {
if effect.name == "Int" {
match operation.name.as_str() {
"toString" => return Some("LuxString".to_string()),
"toFloat" => return Some("LuxFloat".to_string()),
_ => return None,
}
}
@@ -4795,6 +5165,7 @@ impl CBackend {
if effect.name == "Float" {
match operation.name.as_str() {
"toString" => return Some("LuxString".to_string()),
"toInt" => return Some("LuxInt".to_string()),
_ => return None,
}
}
@@ -4835,7 +5206,7 @@ impl CBackend {
} else if effect.name == "File" {
match operation.name.as_str() {
"read" => Some("LuxString".to_string()),
"write" | "append" | "delete" | "mkdir" | "copy" => Some("void".to_string()),
"exists" | "isDir" => Some("LuxBool".to_string()),
"readDir" | "listDir" => Some("LuxList*".to_string()),
_ => None,
@@ -5221,9 +5592,9 @@ impl CBackend {
}
}
fn emit_global_let(&mut self, name: &Ident) -> Result<(), CGenError> {
// Declare global variable without initializer (initialized in main)
self.writeln(&format!("static LuxInt {} = 0;", name.name));
self.writeln("");
Ok(())
}
@@ -5234,12 +5605,16 @@ impl CBackend {
matches!(d, Declaration::Function(f) if f.name.name == "main")
});
// Check for top-level run expressions or let bindings
let has_run = program.declarations.iter().any(|d| {
matches!(d, Declaration::Let(let_decl) if matches!(&let_decl.value, Expr::Run { .. }))
});
let has_global_lets = program.declarations.iter().any(|d| {
matches!(d, Declaration::Let(let_decl) if !matches!(&let_decl.value, Expr::Run { .. }))
});
if has_main || has_run || has_global_lets {
self.writeln("int main(int argc, char** argv) {");
self.indent += 1;
@@ -5248,6 +5623,16 @@ impl CBackend {
self.writeln("lux_argv = argv;");
self.writeln("");
// Initialize top-level let bindings (non-run) inside main
for decl in &program.declarations {
if let Declaration::Let(let_decl) = decl {
if !matches!(&let_decl.value, Expr::Run { .. }) {
let val = self.emit_expr(&let_decl.value)?;
self.writeln(&format!("{} = {};", let_decl.name.name, val));
}
}
}
// Execute top-level let bindings with run expressions
// Track if main was already called via a run expression
let mut main_called_via_run = false;
@@ -5821,7 +6206,10 @@ impl CBackend {
}
self.collect_free_vars(body, &inner_bound, free);
}
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.collect_free_vars(spread_expr, bound, free);
}
for (_, val) in fields {
self.collect_free_vars(val, bound, free);
}


@@ -69,6 +69,8 @@ pub struct JsBackend {
has_handlers: bool,
/// Variable substitutions for let binding
var_substitutions: HashMap<String, String>,
/// Effects actually used in the program (for tree-shaking runtime)
used_effects: HashSet<String>,
}
impl JsBackend {
@@ -90,6 +92,7 @@ impl JsBackend {
effectful_functions: HashSet::new(),
has_handlers: false,
var_substitutions: HashMap::new(),
used_effects: HashSet::new(),
}
}
@@ -97,9 +100,6 @@ impl JsBackend {
pub fn generate(&mut self, program: &Program) -> Result<String, JsGenError> {
self.output.clear();
// First pass: collect all function names, types, and effects
for decl in &program.declarations {
match decl {
@@ -116,6 +116,12 @@ impl JsBackend {
}
}
// Collect used effects for tree-shaking
self.collect_used_effects(program);
// Emit runtime helpers (tree-shaken based on used effects)
self.emit_runtime();
// Emit type constructors
for decl in &program.declarations {
if let Declaration::Type(t) = decl {
@@ -163,32 +169,181 @@ impl JsBackend {
Ok(self.output.clone())
}
/// Collect all effects used in the program for runtime tree-shaking
fn collect_used_effects(&mut self, program: &Program) {
for decl in &program.declarations {
match decl {
Declaration::Function(f) => {
for effect in &f.effects {
self.used_effects.insert(effect.name.clone());
}
self.collect_effects_from_expr(&f.body);
}
Declaration::Let(l) => {
self.collect_effects_from_expr(&l.value);
}
Declaration::Handler(h) => {
self.used_effects.insert(h.effect.name.clone());
for imp in &h.implementations {
self.collect_effects_from_expr(&imp.body);
}
}
_ => {}
}
}
}
/// Recursively collect effect names from an expression
fn collect_effects_from_expr(&mut self, expr: &Expr) {
match expr {
Expr::EffectOp { effect, args, .. } => {
self.used_effects.insert(effect.name.clone());
for arg in args {
self.collect_effects_from_expr(arg);
}
}
Expr::Run { expr, handlers, .. } => {
self.collect_effects_from_expr(expr);
for (effect, handler) in handlers {
self.used_effects.insert(effect.name.clone());
self.collect_effects_from_expr(handler);
}
}
Expr::Call { func, args, .. } => {
self.collect_effects_from_expr(func);
for arg in args {
self.collect_effects_from_expr(arg);
}
}
Expr::Lambda { body, effects, .. } => {
for effect in effects {
self.used_effects.insert(effect.name.clone());
}
self.collect_effects_from_expr(body);
}
Expr::Let { value, body, .. } => {
self.collect_effects_from_expr(value);
self.collect_effects_from_expr(body);
}
Expr::If { condition, then_branch, else_branch, .. } => {
self.collect_effects_from_expr(condition);
self.collect_effects_from_expr(then_branch);
self.collect_effects_from_expr(else_branch);
}
Expr::Match { scrutinee, arms, .. } => {
self.collect_effects_from_expr(scrutinee);
for arm in arms {
self.collect_effects_from_expr(&arm.body);
if let Some(guard) = &arm.guard {
self.collect_effects_from_expr(guard);
}
}
}
Expr::Block { statements, result, .. } => {
for stmt in statements {
match stmt {
Statement::Expr(e) => self.collect_effects_from_expr(e),
Statement::Let { value, .. } => self.collect_effects_from_expr(value),
}
}
self.collect_effects_from_expr(result);
}
Expr::BinaryOp { left, right, .. } => {
self.collect_effects_from_expr(left);
self.collect_effects_from_expr(right);
}
Expr::UnaryOp { operand, .. } => {
self.collect_effects_from_expr(operand);
}
Expr::Field { object, .. } => {
self.collect_effects_from_expr(object);
}
Expr::TupleIndex { object, .. } => {
self.collect_effects_from_expr(object);
}
Expr::Record { spread, fields, .. } => {
if let Some(s) = spread {
self.collect_effects_from_expr(s);
}
for (_, expr) in fields {
self.collect_effects_from_expr(expr);
}
}
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
for el in elements {
self.collect_effects_from_expr(el);
}
}
Expr::Resume { value, .. } => {
self.collect_effects_from_expr(value);
}
Expr::Literal(_) | Expr::Var(_) => {}
}
}
/// Emit the Lux runtime, tree-shaken based on used effects
fn emit_runtime(&mut self) {
let uses_console = self.used_effects.contains("Console");
let uses_random = self.used_effects.contains("Random");
let uses_time = self.used_effects.contains("Time");
let uses_http = self.used_effects.contains("Http");
let uses_dom = self.used_effects.contains("Dom");
let uses_html = self.used_effects.contains("Html") || uses_dom;
self.writeln("// Lux Runtime");
self.writeln("const Lux = {");
self.indent += 1;
// Core helpers — always emitted
self.writeln("Some: (value) => ({ tag: \"Some\", value }),");
self.writeln("None: () => ({ tag: \"None\" }),");
self.writeln("");
self.writeln("Ok: (value) => ({ tag: \"Ok\", value }),");
self.writeln("Err: (error) => ({ tag: \"Err\", error }),");
self.writeln("");
self.writeln("Cons: (head, tail) => [head, ...tail],");
self.writeln("Nil: () => [],");
self.writeln("");
// Default handlers — only include effects that are used
self.writeln("defaultHandlers: {");
self.indent += 1;
if uses_console {
self.emit_console_handler();
}
if uses_random {
self.emit_random_handler();
}
if uses_time {
self.emit_time_handler();
}
if uses_http {
self.emit_http_handler();
}
if uses_dom {
self.emit_dom_handler();
}
self.indent -= 1;
self.writeln("},");
// HTML rendering — only if Html or Dom effects are used
if uses_html {
self.emit_html_helpers();
}
// TEA runtime — only if Dom is used
if uses_dom {
self.emit_tea_runtime();
}
self.indent -= 1;
self.writeln("};");
self.writeln("");
}
fn emit_console_handler(&mut self) {
self.writeln("Console: {");
self.indent += 1;
self.writeln("print: (msg) => console.log(msg),");
@@ -207,8 +362,9 @@ impl JsBackend {
self.writeln("readInt: () => parseInt(Lux.defaultHandlers.Console.readLine(), 10)");
self.indent -= 1;
self.writeln("},");
}
fn emit_random_handler(&mut self) {
self.writeln("Random: {");
self.indent += 1;
self.writeln("int: (min, max) => Math.floor(Math.random() * (max - min + 1)) + min,");
@@ -216,16 +372,18 @@ impl JsBackend {
self.writeln("float: () => Math.random()");
self.indent -= 1;
self.writeln("},");
}
fn emit_time_handler(&mut self) {
self.writeln("Time: {");
self.indent += 1;
self.writeln("now: () => Date.now(),");
self.writeln("sleep: (ms) => new Promise(resolve => setTimeout(resolve, ms))");
self.indent -= 1;
self.writeln("},");
}
fn emit_http_handler(&mut self) {
self.writeln("Http: {");
self.indent += 1;
self.writeln("get: async (url) => {");
@@ -287,8 +445,9 @@ impl JsBackend {
self.writeln("}");
self.indent -= 1;
self.writeln("},");
}
fn emit_dom_handler(&mut self) {
self.writeln("Dom: {");
self.indent += 1;
@@ -316,7 +475,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("createElement: (tag) => {");
self.indent += 1;
self.writeln("if (typeof document === 'undefined') return null;");
@@ -331,7 +489,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("appendChild: (parent, child) => {");
self.indent += 1;
self.writeln("if (parent && child) parent.appendChild(child);");
@@ -356,7 +513,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("setTextContent: (el, text) => {");
self.indent += 1;
self.writeln("if (el) el.textContent = text;");
@@ -381,7 +537,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("setAttribute: (el, name, value) => {");
self.indent += 1;
self.writeln("if (el) el.setAttribute(name, value);");
@@ -408,7 +563,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("addClass: (el, className) => {");
self.indent += 1;
self.writeln("if (el) el.classList.add(className);");
@@ -433,7 +587,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("setStyle: (el, property, value) => {");
self.indent += 1;
self.writeln("if (el) el.style[property] = value;");
@@ -446,7 +599,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("getValue: (el) => {");
self.indent += 1;
self.writeln("return el ? el.value : '';");
@@ -471,7 +623,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("addEventListener: (el, event, handler) => {");
self.indent += 1;
self.writeln("if (el) el.addEventListener(event, handler);");
@@ -484,7 +635,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("focus: (el) => {");
self.indent += 1;
self.writeln("if (el && el.focus) el.focus();");
@@ -497,7 +647,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("getBody: () => {");
self.indent += 1;
self.writeln("if (typeof document === 'undefined') return null;");
@@ -512,7 +661,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("getWindow: () => {");
self.indent += 1;
self.writeln("if (typeof window === 'undefined') return null;");
@@ -545,7 +693,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("scrollTo: (x, y) => {");
self.indent += 1;
self.writeln("if (typeof window !== 'undefined') window.scrollTo(x, y);");
@@ -558,7 +705,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("getBoundingClientRect: (el) => {");
self.indent += 1;
self.writeln("if (!el) return { top: 0, left: 0, width: 0, height: 0, right: 0, bottom: 0 };");
@@ -574,13 +720,11 @@ impl JsBackend {
self.indent -= 1;
self.writeln("}");
self.indent -= 1;
self.writeln("},");
}
fn emit_html_helpers(&mut self) {
self.writeln("");
self.writeln("// HTML rendering");
self.writeln("renderHtml: (node) => {");
@@ -682,8 +826,9 @@ impl JsBackend {
self.writeln("return el;");
self.indent -= 1;
self.writeln("},");
}
fn emit_tea_runtime(&mut self) {
self.writeln("");
self.writeln("// The Elm Architecture (TEA) runtime");
self.writeln("app: (config) => {");
@@ -727,7 +872,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("");
self.writeln("// Simple TEA app (string-based view)");
self.writeln("simpleApp: (config) => {");
@@ -757,7 +901,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
self.writeln("");
self.writeln("// Basic diff - checks if model fields changed");
self.writeln("hasChanged: (oldModel, newModel, ...paths) => {");
@@ -777,11 +920,7 @@ impl JsBackend {
self.writeln("}");
self.writeln("return false;");
self.indent -= 1;
self.writeln("},");
}
/// Collect type information from a type declaration
@@ -888,7 +1027,8 @@ impl JsBackend {
let prev_has_handlers = self.has_handlers;
self.has_handlers = is_effectful;
// Save and clear var substitutions for this function scope
let saved_substitutions = self.var_substitutions.clone();
self.var_substitutions.clear();
// Emit function body // Emit function body
@@ -896,6 +1036,7 @@ impl JsBackend {
self.writeln(&format!("return {};", body_code));
self.has_handlers = prev_has_handlers;
self.var_substitutions = saved_substitutions;
self.indent -= 1;
self.writeln("}");
@@ -909,13 +1050,16 @@ impl JsBackend {
let val = self.emit_expr(&let_decl.value)?;
let var_name = &let_decl.name.name;
if var_name == "_" {
// Wildcard binding: just execute for side effects
self.writeln(&format!("{};", val));
} else {
self.writeln(&format!("const {} = {};", var_name, val));
// Register the variable for future use
self.var_substitutions
.insert(var_name.clone(), var_name.clone());
}
Ok(())
}
@@ -954,12 +1098,17 @@ impl JsBackend {
let r = self.emit_expr(right)?;
// Check for string concatenation
if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
if self.is_string_expr(left) || self.is_string_expr(right) {
return Ok(format!("({} + {})", l, r));
}
}
// ++ on lists: use .concat()
if matches!(op, BinaryOp::Concat) {
return Ok(format!("{}.concat({})", l, r));
}
let op_str = match op {
BinaryOp::Add => "+",
BinaryOp::Sub => "-",
@@ -974,6 +1123,7 @@ impl JsBackend {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => unreachable!("handled above"),
BinaryOp::Pipe => {
// Pipe operator: x |> f becomes f(x)
return Ok(format!("{}({})", r, l));
@@ -1034,18 +1184,26 @@ impl JsBackend {
name, value, body, ..
} => {
let val = self.emit_expr(value)?;
if name.name == "_" {
// Wildcard binding: just execute for side effects
self.writeln(&format!("{};", val));
} else {
let var_name = format!("{}_{}", name.name, self.fresh_name());
self.writeln(&format!("const {} = {};", var_name, val));
// Add substitution
self.var_substitutions
.insert(name.name.clone(), var_name.clone());
}
let body_result = self.emit_expr(body)?;
// Remove substitution
if name.name != "_" {
self.var_substitutions.remove(&name.name);
}
Ok(body_result)
}
@@ -1057,6 +1215,31 @@ impl JsBackend {
if module_name.name == "List" {
return self.emit_list_operation(&field.name, args);
}
if module_name.name == "Map" {
return self.emit_map_operation(&field.name, args);
}
}
}
// Int/Float module operations
if let Expr::Field { object, field, .. } = func.as_ref() {
if let Expr::Var(module_name) = object.as_ref() {
if module_name.name == "Int" {
let arg = self.emit_expr(&args[0])?;
match field.name.as_str() {
"toFloat" => return Ok(arg),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
if module_name.name == "Float" {
let arg = self.emit_expr(&args[0])?;
match field.name.as_str() {
"toInt" => return Ok(format!("Math.trunc({})", arg)),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
}
}
@@ -1066,6 +1249,10 @@ impl JsBackend {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("String({})", arg));
}
if ident.name == "print" {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("console.log({})", arg));
}
}
let arg_strs: Result<Vec<_>, _> = args.iter().map(|a| self.emit_expr(a)).collect();
@@ -1142,6 +1329,26 @@ impl JsBackend {
return self.emit_math_operation(&operation.name, args);
}
// Special case: Int module operations
if effect.name == "Int" {
let arg = self.emit_expr(&args[0])?;
match operation.name.as_str() {
"toFloat" => return Ok(arg), // JS numbers are already floats
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
// Special case: Float module operations
if effect.name == "Float" {
let arg = self.emit_expr(&args[0])?;
match operation.name.as_str() {
"toInt" => return Ok(format!("Math.trunc({})", arg)),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
// Special case: Result module operations (not an effect)
if effect.name == "Result" {
return self.emit_result_operation(&operation.name, args);
@@ -1152,6 +1359,11 @@ impl JsBackend {
return self.emit_json_operation(&operation.name, args);
}
// Special case: Map module operations (not an effect)
if effect.name == "Map" {
return self.emit_map_operation(&operation.name, args);
}
// Special case: Html module operations (not an effect)
if effect.name == "Html" {
return self.emit_html_operation(&operation.name, args);
@@ -1197,18 +1409,39 @@ impl JsBackend {
param_names
};
// Save state
let prev_has_handlers = self.has_handlers;
let saved_substitutions = self.var_substitutions.clone();
self.has_handlers = !effects.is_empty();
// Register lambda params as themselves (override any outer substitutions)
for p in &all_params {
self.var_substitutions.insert(p.clone(), p.clone());
}
// Capture any statements emitted during body evaluation
let output_start = self.output.len();
let prev_indent = self.indent;
self.indent += 1;
let body_code = self.emit_expr(body)?;
self.writeln(&format!("return {};", body_code));
// Extract body statements and restore output
let body_statements = self.output[output_start..].to_string();
self.output.truncate(output_start);
self.indent = prev_indent;
// Restore state
self.has_handlers = prev_has_handlers;
self.var_substitutions = saved_substitutions;
let indent_str = " ".repeat(self.indent);
Ok(format!(
"(function({}) {{\n{}{}}})",
all_params.join(", "),
body_statements,
indent_str,
))
}
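The capture-and-restore trick above can be illustrated with a hypothetical standalone sketch (names like `Emitter` and `emit_lambda` are invented for illustration): statements emitted while generating the lambda body land in the shared output buffer, are sliced off, and moved inside the generated function literal, leaving the outer buffer untouched.

```rust
// Hypothetical sketch of the lambda-body capture technique: record the output
// length, emit body statements into the shared buffer, then slice them off
// and splice them into the function literal.
struct Emitter {
    output: String,
}

impl Emitter {
    fn emit_lambda(&mut self, params: &str, body_stmt: &str, body_expr: &str) -> String {
        let output_start = self.output.len();
        // Body evaluation may push statements into `self.output`.
        self.output.push_str(&format!("  const {};\n", body_stmt));
        self.output.push_str(&format!("  return {};\n", body_expr));
        // Extract the captured statements and restore the buffer.
        let body_statements = self.output[output_start..].to_string();
        self.output.truncate(output_start);
        format!("(function({}) {{\n{}}})", params, body_statements)
    }
}

fn main() {
    let mut e = Emitter { output: String::new() };
    let js = e.emit_lambda("x", "doubled_1 = x * 2", "doubled_1 + 1");
    println!("{}", js);
    // The outer output buffer is unchanged after the capture.
    assert!(e.output.is_empty());
}
```

This is why lambdas containing `let` statements can now be emitted at all: the old single-expression `(function(p) { return expr; })` shape had nowhere to put them.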
@@ -1228,10 +1461,15 @@ impl JsBackend {
}
Statement::Let { name, value, .. } => {
let val = self.emit_expr(value)?;
if name.name == "_" {
self.writeln(&format!("{};", val));
} else {
let var_name =
format!("{}_{}", name.name, self.fresh_name());
self.writeln(&format!("const {} = {};", var_name, val));
self.var_substitutions
.insert(name.name.clone(), var_name.clone());
}
}
}
}
@@ -1240,15 +1478,19 @@ impl JsBackend {
self.emit_expr(result)
}
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
let spread_code = self.emit_expr(spread_expr)?;
parts.push(format!("...{}", spread_code));
}
for (name, expr) in fields {
let val = self.emit_expr(expr)?;
parts.push(format!("{}: {}", name.name, val));
}
Ok(format!("{{ {} }}", parts.join(", ")))
}
Expr::Tuple { elements, .. } => {
@@ -2067,6 +2309,86 @@ impl JsBackend {
}
}
/// Emit Map module operations using JS Map
fn emit_map_operation(
&mut self,
operation: &str,
args: &[Expr],
) -> Result<String, JsGenError> {
match operation {
"new" => Ok("new Map()".to_string()),
"set" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let val = self.emit_expr(&args[2])?;
Ok(format!(
"(function() {{ var m = new Map({}); m.set({}, {}); return m; }})()",
map, key, val
))
}
"get" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!(
"({0}.has({1}) ? Lux.Some({0}.get({1})) : Lux.None())",
map, key
))
}
"contains" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!("{}.has({})", map, key))
}
"remove" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!(
"(function() {{ var m = new Map({}); m.delete({}); return m; }})()",
map, key
))
}
"keys" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("Array.from({}.keys()).sort()", map))
}
"values" => {
let map = self.emit_expr(&args[0])?;
Ok(format!(
"Array.from({0}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }}).map(function(e) {{ return e[1]; }})",
map
))
}
"size" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("{}.size", map))
}
"isEmpty" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("({}.size === 0)", map))
}
"fromList" => {
let list = self.emit_expr(&args[0])?;
Ok(format!("new Map({}.map(function(t) {{ return [t[0], t[1]]; }}))", list))
}
"toList" => {
let map = self.emit_expr(&args[0])?;
Ok(format!(
"Array.from({}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }})",
map
))
}
"merge" => {
let m1 = self.emit_expr(&args[0])?;
let m2 = self.emit_expr(&args[1])?;
Ok(format!("new Map([...{}, ...{}])", m1, m2))
}
_ => Err(JsGenError {
message: format!("Unknown Map operation: {}", operation),
span: None,
}),
}
}
/// Emit Html module operations for type-safe HTML construction
fn emit_html_operation(
&mut self,
@@ -2338,7 +2660,7 @@ impl JsBackend {
}
}
Expr::BinaryOp { op, left, right, .. } => {
matches!(op, BinaryOp::Add | BinaryOp::Concat)
&& (self.is_string_expr(left) || self.is_string_expr(right))
}
_ => false,
@@ -3737,7 +4059,7 @@ line3"
#[test]
fn test_js_runtime_generated() {
// Test that the Lux runtime core is always generated
use crate::parser::Parser;
let source = r#"
@@ -3748,21 +4070,51 @@ line3"
let mut backend = JsBackend::new();
let js_code = backend.generate(&program).expect("Should generate");
// Core runtime is always present
assert!(js_code.contains("const Lux = {"), "Lux object should be defined");
assert!(js_code.contains("Some:"), "Option Some should be defined");
assert!(js_code.contains("None:"), "Option None should be defined");
// Console-only program should NOT include Dom, Html, or TEA sections
assert!(!js_code.contains("Dom:"), "Dom handler should not be in Console-only program");
assert!(!js_code.contains("renderHtml:"), "renderHtml should not be in Console-only program");
assert!(!js_code.contains("app:"), "TEA app should not be in Console-only program");
assert!(!js_code.contains("Http:"), "Http should not be in Console-only program");
// Console should be present
assert!(js_code.contains("Console:"), "Console handler should exist");
}
#[test]
fn test_js_runtime_tree_shaking_all_effects() {
// Test that all effects are included when all are used
use crate::parser::Parser;
let source = r#"
fn main(): Unit with {Console, Dom} = {
Console.print("Hello")
let _ = Dom.getElementById("app")
()
}
"#;
let program = Parser::parse_source(source).expect("Should parse");
let mut backend = JsBackend::new();
let js_code = backend.generate(&program).expect("Should generate");
assert!(js_code.contains("Console:"), "Console handler should exist");
assert!(js_code.contains("Dom:"), "Dom handler should exist");
assert!(js_code.contains("renderHtml:"), "renderHtml should be defined when Dom is used");
assert!(js_code.contains("renderToDom:"), "renderToDom should be defined when Dom is used");
assert!(js_code.contains("escapeHtml:"), "escapeHtml should be defined when Dom is used");
assert!(js_code.contains("app:"), "TEA app should be defined when Dom is used");
assert!(js_code.contains("simpleApp:"), "simpleApp should be defined when Dom is used");
assert!(js_code.contains("hasChanged:"), "hasChanged should be defined when Dom is used");
}
#[test]
fn test_js_runtime_default_handlers() {
// Test that only used effect handlers are generated
use crate::parser::Parser;
let source = r#"
@@ -3773,12 +4125,12 @@ line3"
let mut backend = JsBackend::new();
let js_code = backend.generate(&program).expect("Should generate");
// Only Console should be present
assert!(js_code.contains("Console:"), "Console handler should exist");
assert!(!js_code.contains("Random:"), "Random handler should not exist in Console-only program");
assert!(!js_code.contains("Time:"), "Time handler should not exist in Console-only program");
assert!(!js_code.contains("Http:"), "Http handler should not exist in Console-only program");
assert!(!js_code.contains("Dom:"), "Dom handler should not exist in Console-only program");
}
#[test]

View File

@@ -688,15 +688,17 @@ impl Formatter {
.join(", ")
)
}
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
parts.push(format!("...{}", self.format_expr(spread_expr)));
}
for (name, val) in fields {
parts.push(format!("{}: {}", name.name, self.format_expr(val)));
}
format!("{{ {} }}", parts.join(", "))
}
Expr::EffectOp { effect, operation, args, .. } => {
format!(
@@ -753,6 +755,7 @@ impl Formatter {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => "++",
BinaryOp::Pipe => "|>",
}
}

View File

@@ -74,6 +74,9 @@ pub enum BuiltinFn {
MathFloor,
MathCeil,
MathRound,
MathSin,
MathCos,
MathAtan2,
// Additional List operations
ListIsEmpty,
@@ -97,7 +100,9 @@ pub enum BuiltinFn {
// Int/Float operations
IntToString,
IntToFloat,
FloatToString,
FloatToInt,
// JSON operations
JsonParse,
@@ -119,6 +124,20 @@ pub enum BuiltinFn {
JsonString,
JsonArray,
JsonObject,
// Map operations
MapNew,
MapSet,
MapGet,
MapContains,
MapRemove,
MapKeys,
MapValues,
MapSize,
MapIsEmpty,
MapFromList,
MapToList,
MapMerge,
}
/// Runtime value
@@ -133,6 +152,7 @@ pub enum Value {
List(Vec<Value>),
Tuple(Vec<Value>),
Record(HashMap<String, Value>),
Map(HashMap<String, Value>),
Function(Rc<Closure>),
Handler(Rc<HandlerValue>),
/// Built-in function
@@ -164,6 +184,7 @@ impl Value {
Value::List(_) => "List",
Value::Tuple(_) => "Tuple",
Value::Record(_) => "Record",
Value::Map(_) => "Map",
Value::Function(_) => "Function",
Value::Handler(_) => "Handler",
Value::Builtin(_) => "Function",
@@ -212,6 +233,11 @@ impl Value {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Map(xs), Value::Map(ys)) => {
xs.len() == ys.len() && xs.iter().all(|(k, v)| {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Constructor { name: n1, fields: f1 }, Value::Constructor { name: n2, fields: f2 }) => {
n1 == n2 && f1.len() == f2.len() && f1.iter().zip(f2.iter()).all(|(x, y)| Value::values_equal(x, y))
}
@@ -282,6 +308,16 @@ impl TryFromValue for Vec<Value> {
}
}
impl TryFromValue for HashMap<String, Value> {
const TYPE_NAME: &'static str = "Map";
fn try_from_value(value: &Value) -> Option<Self> {
match value {
Value::Map(m) => Some(m.clone()),
_ => None,
}
}
}
impl TryFromValue for Value {
const TYPE_NAME: &'static str = "any";
fn try_from_value(value: &Value) -> Option<Self> {
@@ -328,6 +364,18 @@ impl fmt::Display for Value {
}
write!(f, " }}")
}
Value::Map(entries) => {
write!(f, "Map {{")?;
let mut sorted: Vec<_> = entries.iter().collect();
sorted.sort_by_key(|(k, _)| (*k).clone());
for (i, (key, value)) in sorted.iter().enumerate() {
if i > 0 {
write!(f, ", ")?;
}
write!(f, "\"{}\": {}", key, value)?;
}
write!(f, "}}")
}
Value::Function(_) => write!(f, "<function>"),
Value::Builtin(b) => write!(f, "<builtin:{:?}>", b),
Value::Handler(_) => write!(f, "<handler>"),
@@ -1072,18 +1120,23 @@ impl Interpreter {
("floor".to_string(), Value::Builtin(BuiltinFn::MathFloor)),
("ceil".to_string(), Value::Builtin(BuiltinFn::MathCeil)),
("round".to_string(), Value::Builtin(BuiltinFn::MathRound)),
("sin".to_string(), Value::Builtin(BuiltinFn::MathSin)),
("cos".to_string(), Value::Builtin(BuiltinFn::MathCos)),
("atan2".to_string(), Value::Builtin(BuiltinFn::MathAtan2)),
]));
env.define("Math", math_module);
// Int module
let int_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::IntToString)),
("toFloat".to_string(), Value::Builtin(BuiltinFn::IntToFloat)),
]));
env.define("Int", int_module);
// Float module
let float_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::FloatToString)),
("toInt".to_string(), Value::Builtin(BuiltinFn::FloatToInt)),
]));
env.define("Float", float_module);
@@ -1110,16 +1163,72 @@ impl Interpreter {
("object".to_string(), Value::Builtin(BuiltinFn::JsonObject)),
]));
env.define("Json", json_module);
// Map module
let map_module = Value::Record(HashMap::from([
("new".to_string(), Value::Builtin(BuiltinFn::MapNew)),
("set".to_string(), Value::Builtin(BuiltinFn::MapSet)),
("get".to_string(), Value::Builtin(BuiltinFn::MapGet)),
("contains".to_string(), Value::Builtin(BuiltinFn::MapContains)),
("remove".to_string(), Value::Builtin(BuiltinFn::MapRemove)),
("keys".to_string(), Value::Builtin(BuiltinFn::MapKeys)),
("values".to_string(), Value::Builtin(BuiltinFn::MapValues)),
("size".to_string(), Value::Builtin(BuiltinFn::MapSize)),
("isEmpty".to_string(), Value::Builtin(BuiltinFn::MapIsEmpty)),
("fromList".to_string(), Value::Builtin(BuiltinFn::MapFromList)),
("toList".to_string(), Value::Builtin(BuiltinFn::MapToList)),
("merge".to_string(), Value::Builtin(BuiltinFn::MapMerge)),
]));
env.define("Map", map_module);
}
/// Execute a program
pub fn run(&mut self, program: &Program) -> Result<Value, RuntimeError> {
let mut last_value = Value::Unit;
let mut has_main_let = false;
for decl in &program.declarations {
// Track if there's a top-level `let main = ...`
if let Declaration::Let(let_decl) = decl {
if let_decl.name.name == "main" {
has_main_let = true;
}
}
last_value = self.eval_declaration(decl)?;
}
// Auto-invoke main if it was defined as a let binding with a function value
if has_main_let {
if let Some(main_val) = self.global_env.get("main") {
if let Value::Function(ref closure) = main_val {
if closure.params.is_empty() {
let span = Span { start: 0, end: 0 };
let mut result = self.eval_call(main_val.clone(), vec![], span)?;
// Trampoline loop
loop {
match result {
EvalResult::Value(v) => {
last_value = v;
break;
}
EvalResult::Effect(req) => {
last_value = self.handle_effect(req)?;
break;
}
EvalResult::TailCall { func, args, span } => {
result = self.eval_call(func, args, span)?;
}
EvalResult::Resume(v) => {
last_value = v;
break;
}
}
}
}
}
}
}
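The trampoline loop used when auto-invoking `main` can be sketched in isolation (the `Step` enum and helper names here are hypothetical, simplified from the real `EvalResult`): instead of recursing on each tail call, the loop rebinds the result until a final value appears, so the host stack stays flat no matter how deep the tail recursion goes.

```rust
// Hypothetical sketch of the trampoline pattern: tail calls are returned as
// data and re-dispatched by the loop rather than via host-stack recursion.
enum Step {
    Value(i64),
    TailCall(i64),
}

fn eval_call(n: i64) -> Step {
    if n == 0 { Step::Value(42) } else { Step::TailCall(n - 1) }
}

fn run(start: i64) -> i64 {
    let mut result = eval_call(start);
    loop {
        match result {
            Step::Value(v) => return v,
            // Re-enter evaluation with the tail call's arguments.
            Step::TailCall(n) => result = eval_call(n),
        }
    }
}

fn main() {
    // A million nested "calls" without growing the host stack.
    println!("{}", run(1_000_000));
}
```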
Ok(last_value)
}
@@ -1525,8 +1634,28 @@ impl Interpreter {
self.eval_expr_tail(result, &block_env, tail)
}
Expr::Record {
spread, fields, ..
} => {
let mut record = HashMap::new();
// If there's a spread, evaluate it and start with its fields
if let Some(spread_expr) = spread {
let spread_val = self.eval_expr(spread_expr, env)?;
if let Value::Record(spread_fields) = spread_val {
record = spread_fields;
} else {
return Err(RuntimeError {
message: format!(
"Spread expression must evaluate to a record, got {}",
spread_val.type_name()
),
span: Some(expr.span()),
});
}
}
// Override with explicit fields
for (name, expr) in fields {
let val = self.eval_expr(expr, env)?;
record.insert(name.name.clone(), val);
@@ -1599,6 +1728,18 @@ impl Interpreter {
span: Some(span),
}),
},
BinaryOp::Concat => match (left, right) {
(Value::String(a), Value::String(b)) => Ok(Value::String(a + &b)),
(Value::List(a), Value::List(b)) => {
let mut result = a;
result.extend(b);
Ok(Value::List(result))
}
(l, r) => Err(RuntimeError {
message: format!("Cannot concatenate {} and {}", l.type_name(), r.type_name()),
span: Some(span),
}),
},
BinaryOp::Sub => match (left, right) {
(Value::Int(a), Value::Int(b)) => Ok(Value::Int(a - b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Float(a - b)),
@@ -2287,6 +2428,26 @@ impl Interpreter {
}
}
BuiltinFn::IntToFloat => {
if args.len() != 1 {
return Err(err("Int.toFloat requires 1 argument"));
}
match &args[0] {
Value::Int(n) => Ok(EvalResult::Value(Value::Float(*n as f64))),
v => Err(err(&format!("Int.toFloat expects Int, got {}", v.type_name()))),
}
}
BuiltinFn::FloatToInt => {
if args.len() != 1 {
return Err(err("Float.toInt requires 1 argument"));
}
match &args[0] {
Value::Float(f) => Ok(EvalResult::Value(Value::Int(*f as i64))),
v => Err(err(&format!("Float.toInt expects Float, got {}", v.type_name()))),
}
}
BuiltinFn::TypeOf => {
if args.len() != 1 {
return Err(err("typeOf requires 1 argument"));
@@ -2463,6 +2624,45 @@ impl Interpreter {
}
}
BuiltinFn::MathSin => {
if args.len() != 1 {
return Err(err("Math.sin requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.sin()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).sin()))),
v => Err(err(&format!("Math.sin expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathCos => {
if args.len() != 1 {
return Err(err("Math.cos requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.cos()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).cos()))),
v => Err(err(&format!("Math.cos expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathAtan2 => {
if args.len() != 2 {
return Err(err("Math.atan2 requires 2 arguments: y, x"));
}
let y = match &args[0] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
let x = match &args[1] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
Ok(EvalResult::Value(Value::Float(y.atan2(x))))
}
// Additional List operations
BuiltinFn::ListIsEmpty => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.isEmpty", span)?;
@@ -2952,6 +3152,128 @@ impl Interpreter {
}
Ok(EvalResult::Value(Value::Json(serde_json::Value::Object(map))))
}
// Map operations
BuiltinFn::MapNew => {
Ok(EvalResult::Value(Value::Map(HashMap::new())))
}
BuiltinFn::MapSet => {
if args.len() != 3 {
return Err(err("Map.set requires 3 arguments: map, key, value"));
}
let mut map = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.set expects Map as first argument, got {}", v.type_name()))),
};
let key = match &args[1] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.set expects String key, got {}", v.type_name()))),
};
map.insert(key, args[2].clone());
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapGet => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.get", span)?;
match map.get(&key) {
Some(v) => Ok(EvalResult::Value(Value::Constructor {
name: "Some".to_string(),
fields: vec![v.clone()],
})),
None => Ok(EvalResult::Value(Value::Constructor {
name: "None".to_string(),
fields: vec![],
})),
}
}
BuiltinFn::MapContains => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.contains", span)?;
Ok(EvalResult::Value(Value::Bool(map.contains_key(&key))))
}
BuiltinFn::MapRemove => {
let (mut map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.remove", span)?;
map.remove(&key);
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapKeys => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.keys", span)?;
let mut keys: Vec<String> = map.keys().cloned().collect();
keys.sort();
Ok(EvalResult::Value(Value::List(
keys.into_iter().map(Value::String).collect(),
)))
}
BuiltinFn::MapValues => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.values", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries.into_iter().map(|(_, v)| v).collect(),
)))
}
BuiltinFn::MapSize => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.size", span)?;
Ok(EvalResult::Value(Value::Int(map.len() as i64)))
}
BuiltinFn::MapIsEmpty => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.isEmpty", span)?;
Ok(EvalResult::Value(Value::Bool(map.is_empty())))
}
BuiltinFn::MapFromList => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "Map.fromList", span)?;
let mut map = HashMap::new();
for item in list {
match item {
Value::Tuple(fields) if fields.len() == 2 => {
let key = match &fields[0] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.fromList expects (String, V) tuples, got {} key", v.type_name()))),
};
map.insert(key, fields[1].clone());
}
_ => return Err(err("Map.fromList expects List<(String, V)>")),
}
}
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapToList => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.toList", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries
.into_iter()
.map(|(k, v)| Value::Tuple(vec![Value::String(k), v]))
.collect(),
)))
}
BuiltinFn::MapMerge => {
if args.len() != 2 {
return Err(err("Map.merge requires 2 arguments: map1, map2"));
}
let mut map1 = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as first argument, got {}", v.type_name()))),
};
let map2 = match &args[1] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as second argument, got {}", v.type_name()))),
};
for (k, v) in map2 {
map1.insert(k, v);
}
Ok(EvalResult::Value(Value::Map(map1)))
}
}
}
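The Map builtins above all follow one pattern worth calling out: every "mutating" operation clones the incoming `HashMap` and mutates the clone, so Lux maps behave as immutable values even though the host representation is mutable. A hypothetical standalone sketch (the free functions `map_set` and `map_merge` are invented for illustration):

```rust
use std::collections::HashMap;

// Sketch of the clone-then-mutate semantics of Map.set / Map.merge: the
// original map is never modified, mirroring the interpreter's builtins.
fn map_set(map: &HashMap<String, i64>, key: &str, value: i64) -> HashMap<String, i64> {
    let mut out = map.clone();
    out.insert(key.to_string(), value);
    out
}

fn map_merge(m1: &HashMap<String, i64>, m2: &HashMap<String, i64>) -> HashMap<String, i64> {
    let mut out = m1.clone();
    for (k, v) in m2 {
        out.insert(k.clone(), *v); // the second map wins on key collisions
    }
    out
}

fn main() {
    let empty = HashMap::new();
    let a = map_set(&empty, "x", 1);
    let b = map_set(&a, "x", 2);
    let merged = map_merge(&a, &b);
    // empty is still empty, a still maps x to 1, merge takes b's value.
    println!("{} {} {}", empty.len(), a["x"], merged["x"]);
}
```

The same choice explains why `Map.keys`, `Map.values`, and `Map.toList` sort their output: `HashMap` iteration order is nondeterministic, and sorting keeps results stable across runs and backends.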
@@ -3117,6 +3439,11 @@ impl Interpreter {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(Value::Map(a), Value::Map(b)) => {
a.len() == b.len() && a.iter().all(|(k, v)| {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(
Value::Constructor {
name: n1,
@@ -3537,6 +3864,30 @@ impl Interpreter {
}
}
("File", "copy") => {
let source = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.copy requires a string source path".to_string(),
span: None,
}),
};
let dest = match request.args.get(1) {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.copy requires a string destination path".to_string(),
span: None,
}),
};
match std::fs::copy(&source, &dest) {
Ok(_) => Ok(Value::Unit),
Err(e) => Err(RuntimeError {
message: format!("Failed to copy '{}' to '{}': {}", source, dest, e),
span: None,
}),
}
}
// ===== Process Effect =====
("Process", "exec") => {
use std::process::Command;
@@ -5044,6 +5395,7 @@ mod tests {
// Create a simple migration that adds a field
// Migration: old.name -> { name: old.name, email: "unknown" }
let migration_body = Expr::Record {
spread: None,
fields: vec![
(
Ident::new("name", Span::default()),

View File

@@ -42,6 +42,7 @@ pub enum TokenKind {
Effect,
Handler,
Run,
Handle,
Resume,
Type,
True,
@@ -70,6 +71,7 @@ pub enum TokenKind {
// Operators
Plus, // +
PlusPlus, // ++
Minus, // -
Star, // *
Slash, // /
@@ -89,6 +91,7 @@ pub enum TokenKind {
Arrow, // =>
ThinArrow, // ->
Dot, // .
DotDotDot, // ...
Colon, // :
ColonColon, // ::
Comma, // ,
@@ -138,6 +141,7 @@ impl fmt::Display for TokenKind {
TokenKind::Effect => write!(f, "effect"),
TokenKind::Handler => write!(f, "handler"),
TokenKind::Run => write!(f, "run"),
TokenKind::Handle => write!(f, "handle"),
TokenKind::Resume => write!(f, "resume"),
TokenKind::Type => write!(f, "type"),
TokenKind::Import => write!(f, "import"),
@@ -160,6 +164,7 @@ impl fmt::Display for TokenKind {
TokenKind::True => write!(f, "true"),
TokenKind::False => write!(f, "false"),
TokenKind::Plus => write!(f, "+"),
TokenKind::PlusPlus => write!(f, "++"),
TokenKind::Minus => write!(f, "-"),
TokenKind::Star => write!(f, "*"),
TokenKind::Slash => write!(f, "/"),
@@ -179,6 +184,7 @@ impl fmt::Display for TokenKind {
TokenKind::Arrow => write!(f, "=>"),
TokenKind::ThinArrow => write!(f, "->"),
TokenKind::Dot => write!(f, "."),
TokenKind::DotDotDot => write!(f, "..."),
TokenKind::Colon => write!(f, ":"),
TokenKind::ColonColon => write!(f, "::"),
TokenKind::Comma => write!(f, ","),
@@ -268,7 +274,14 @@ impl<'a> Lexer<'a> {
let kind = match c {
// Single-character tokens
'+' => {
if self.peek() == Some('+') {
self.advance();
TokenKind::PlusPlus
} else {
TokenKind::Plus
}
}
'*' => TokenKind::Star,
'%' => TokenKind::Percent,
'(' => TokenKind::LParen,
@@ -364,7 +377,22 @@ impl<'a> Lexer<'a> {
TokenKind::Pipe
}
}
'.' => {
if self.peek() == Some('.') {
// Check for ... (need to peek past second dot)
// We look at source directly since we can only peek one ahead
let next_next = self.source[self.pos..].chars().nth(1);
if next_next == Some('.') {
self.advance(); // consume second '.'
self.advance(); // consume third '.'
TokenKind::DotDotDot
} else {
TokenKind::Dot
}
} else {
TokenKind::Dot
}
}
':' => {
if self.peek() == Some(':') {
self.advance();
@@ -745,6 +773,7 @@ impl<'a> Lexer<'a> {
"effect" => TokenKind::Effect,
"handler" => TokenKind::Handler,
"run" => TokenKind::Run,
"handle" => TokenKind::Handle,
"resume" => TokenKind::Resume,
"type" => TokenKind::Type,
"import" => TokenKind::Import,
@@ -763,6 +792,8 @@ impl<'a> Lexer<'a> {
"commutative" => TokenKind::Commutative, "commutative" => TokenKind::Commutative,
"where" => TokenKind::Where, "where" => TokenKind::Where,
"assume" => TokenKind::Assume, "assume" => TokenKind::Assume,
"and" => TokenKind::And,
"or" => TokenKind::Or,
"true" => TokenKind::Bool(true), "true" => TokenKind::Bool(true),
"false" => TokenKind::Bool(false), "false" => TokenKind::Bool(false),
_ => TokenKind::Ident(ident.to_string()), _ => TokenKind::Ident(ident.to_string()),

View File

@@ -513,7 +513,10 @@ impl Linter {
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
self.collect_refs_expr(object);
}
-Expr::Record { fields, .. } => {
+Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.collect_refs_expr(spread_expr);
}
for (_, val) in fields {
self.collect_refs_expr(val);
}

View File

@@ -1571,7 +1571,10 @@ fn collect_call_site_hints(
collect_call_site_hints(source, e, param_names, hints);
}
}
-Expr::Record { fields, .. } => {
+Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
collect_call_site_hints(source, spread_expr, param_names, hints);
}
for (_, e) in fields {
collect_call_site_hints(source, e, param_names, hints);
}

View File

@@ -37,7 +37,7 @@ use std::borrow::Cow;
use std::collections::HashSet;
use typechecker::TypeChecker;
-const VERSION: &str = "0.1.0";
+const VERSION: &str = env!("CARGO_PKG_VERSION");
const HELP: &str = r#"
Lux - A functional language with first-class effects
@@ -193,10 +193,12 @@ fn main() {
eprintln!(" lux compile <file.lux> --run");
eprintln!(" lux compile <file.lux> --emit-c [-o file.c]");
eprintln!(" lux compile <file.lux> --target js [-o file.js]");
eprintln!(" lux compile <file.lux> --watch");
std::process::exit(1);
}
let run_after = args.iter().any(|a| a == "--run");
let emit_c = args.iter().any(|a| a == "--emit-c");
let watch = args.iter().any(|a| a == "--watch");
let target_js = args.iter()
.position(|a| a == "--target")
.and_then(|i| args.get(i + 1))
@@ -212,6 +214,16 @@ fn main() {
} else {
compile_to_c(&args[2], output_path, run_after, emit_c);
}
if watch {
// Build the args to replay for each recompilation (without --watch)
let compile_args: Vec<String> = args.iter()
.skip(1)
.filter(|a| a.as_str() != "--watch")
.cloned()
.collect();
watch_and_rerun(&args[2], &compile_args);
}
}
"repl" => {
// Start REPL
@@ -902,6 +914,7 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
.args(["-O2", "-o"])
.arg(&output_bin)
.arg(&temp_c)
.arg("-lm")
.output();
match compile_result {
@@ -1350,6 +1363,64 @@ fn watch_file(path: &str) {
}
}
fn watch_and_rerun(path: &str, compile_args: &[String]) {
use std::time::{Duration, SystemTime};
use std::path::Path;
let file_path = Path::new(path);
if !file_path.exists() {
eprintln!("File not found: {}", path);
std::process::exit(1);
}
println!();
println!("Watching {} for changes (Ctrl+C to stop)...", path);
let mut last_modified = std::fs::metadata(file_path)
.and_then(|m| m.modified())
.unwrap_or(SystemTime::UNIX_EPOCH);
loop {
std::thread::sleep(Duration::from_millis(500));
let modified = match std::fs::metadata(file_path).and_then(|m| m.modified()) {
Ok(m) => m,
Err(_) => continue,
};
if modified > last_modified {
last_modified = modified;
// Clear screen
print!("\x1B[2J\x1B[H");
println!("=== Compiling {} ===", path);
println!();
let result = std::process::Command::new(std::env::current_exe().unwrap())
.args(compile_args)
.status();
match result {
Ok(status) if status.success() => {
println!();
println!("=== Success ===");
}
Ok(_) => {
println!();
println!("=== Failed ===");
}
Err(e) => {
eprintln!("Error running compiler: {}", e);
}
}
println!();
println!("Watching for changes...");
}
}
}
fn serve_static_files(dir: &str, port: u16) {
use std::io::{Write, BufRead, BufReader};
use std::net::TcpListener;
@@ -5440,4 +5511,173 @@ c")"#;
check_file("projects/rest-api/main.lux").unwrap();
}
}
// === Map type tests ===
#[test]
fn test_map_new_and_size() {
let source = r#"
let m = Map.new()
let result = Map.size(m)
"#;
assert_eq!(eval(source).unwrap(), "0");
}
#[test]
fn test_map_set_and_get() {
let source = r#"
let m = Map.new()
let m2 = Map.set(m, "name", "Alice")
let result = Map.get(m2, "name")
"#;
assert_eq!(eval(source).unwrap(), "Some(\"Alice\")");
}
#[test]
fn test_map_get_missing() {
let source = r#"
let m = Map.new()
let result = Map.get(m, "missing")
"#;
assert_eq!(eval(source).unwrap(), "None");
}
#[test]
fn test_map_contains() {
let source = r#"
let m = Map.set(Map.new(), "x", 1)
let result = (Map.contains(m, "x"), Map.contains(m, "y"))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_remove() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "a", 1), "b", 2)
let m2 = Map.remove(m, "a")
let result = (Map.size(m2), Map.contains(m2, "a"), Map.contains(m2, "b"))
"#;
assert_eq!(eval(source).unwrap(), "(1, false, true)");
}
#[test]
fn test_map_keys_and_values() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.keys(m)
"#;
assert_eq!(eval(source).unwrap(), "[\"a\", \"b\"]");
}
#[test]
fn test_map_from_list() {
let source = r#"
let m = Map.fromList([("x", 10), ("y", 20)])
let result = (Map.get(m, "x"), Map.size(m))
"#;
assert_eq!(eval(source).unwrap(), "(Some(10), 2)");
}
#[test]
fn test_map_to_list() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.toList(m)
"#;
assert_eq!(eval(source).unwrap(), "[(\"a\", 1), (\"b\", 2)]");
}
#[test]
fn test_map_merge() {
let source = r#"
let m1 = Map.fromList([("a", 1), ("b", 2)])
let m2 = Map.fromList([("b", 3), ("c", 4)])
let merged = Map.merge(m1, m2)
let result = (Map.get(merged, "a"), Map.get(merged, "b"), Map.get(merged, "c"))
"#;
assert_eq!(eval(source).unwrap(), "(Some(1), Some(3), Some(4))");
}
#[test]
fn test_map_immutability() {
let source = r#"
let m1 = Map.fromList([("a", 1)])
let m2 = Map.set(m1, "b", 2)
let result = (Map.size(m1), Map.size(m2))
"#;
assert_eq!(eval(source).unwrap(), "(1, 2)");
}
#[test]
fn test_map_is_empty() {
let source = r#"
let m1 = Map.new()
let m2 = Map.set(m1, "x", 1)
let result = (Map.isEmpty(m1), Map.isEmpty(m2))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_type_annotation() {
let source = r#"
fn lookup(m: Map<String, Int>, key: String): Option<Int> =
Map.get(m, key)
let m = Map.fromList([("age", 30)])
let result = lookup(m, "age")
"#;
assert_eq!(eval(source).unwrap(), "Some(30)");
}
#[test]
fn test_file_copy() {
use std::io::Write;
// Create a temp file, copy it, verify contents
let dir = std::env::temp_dir().join("lux_test_file_copy");
let _ = std::fs::create_dir_all(&dir);
let src = dir.join("src.txt");
let dst = dir.join("dst.txt");
std::fs::File::create(&src).unwrap().write_all(b"hello copy").unwrap();
let _ = std::fs::remove_file(&dst);
let source = format!(r#"
fn main(): Unit with {{File}} =
File.copy("{}", "{}")
let _ = run main() with {{}}
let result = "done"
"#, src.display(), dst.display());
let result = eval(&source);
assert!(result.is_ok(), "File.copy failed: {:?}", result);
let contents = std::fs::read_to_string(&dst).unwrap();
assert_eq!(contents, "hello copy");
// Cleanup
let _ = std::fs::remove_dir_all(&dir);
}
#[test]
fn test_effectful_callback_propagation() {
// WISH-7: effectful callbacks in List.forEach should propagate effects
// This should type-check successfully because Console effect is inferred
let source = r#"
fn printAll(items: List<String>): Unit =
List.forEach(items, fn(x: String): Unit => Console.print(x))
let result = "ok"
"#;
let result = eval(source);
assert!(result.is_ok(), "Effectful callback should type-check: {:?}", result);
}
#[test]
fn test_effectful_callback_in_map() {
// Effectful callback in List.map should propagate effects
let source = r#"
fn readAll(paths: List<String>): List<String> =
List.map(paths, fn(p: String): String => File.read(p))
let result = "ok"
"#;
let result = eval(source);
assert!(result.is_ok(), "Effectful callback in map should type-check: {:?}", result);
}
}

View File

@@ -245,6 +245,7 @@ impl Parser {
TokenKind::Trait => Ok(Declaration::Trait(self.parse_trait_decl(visibility, doc)?)),
TokenKind::Impl => Ok(Declaration::Impl(self.parse_impl_decl()?)),
TokenKind::Run => Err(self.error("Bare 'run' expressions are not allowed at top level. Use 'let _ = run ...' or 'let result = run ...'")),
TokenKind::Handle => Err(self.error("Bare 'handle' expressions are not allowed at top level. Use 'let _ = handle ...' or 'let result = handle ...'")),
_ => Err(self.error("Expected declaration (fn, effect, handler, type, trait, impl, or let)")),
}
}
@@ -1558,6 +1559,7 @@ impl Parser {
loop {
let op = match self.peek_kind() {
TokenKind::Plus => BinaryOp::Add,
TokenKind::PlusPlus => BinaryOp::Concat,
TokenKind::Minus => BinaryOp::Sub,
_ => break,
};
@@ -1774,6 +1776,7 @@ impl Parser {
TokenKind::Let => self.parse_let_expr(),
TokenKind::Fn => self.parse_lambda_expr(),
TokenKind::Run => self.parse_run_expr(),
TokenKind::Handle => self.parse_handle_expr(),
TokenKind::Resume => self.parse_resume_expr(),
// Delimiters
@@ -1791,6 +1794,7 @@ impl Parser {
let condition = Box::new(self.parse_expr()?);
self.skip_newlines();
self.expect(TokenKind::Then)?;
self.skip_newlines();
let then_branch = Box::new(self.parse_expr()?);
@@ -2149,6 +2153,40 @@ impl Parser {
})
}
fn parse_handle_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Handle)?;
let expr = Box::new(self.parse_call_expr()?);
self.expect(TokenKind::With)?;
self.expect(TokenKind::LBrace)?;
self.skip_newlines();
let mut handlers = Vec::new();
while !self.check(TokenKind::RBrace) {
let effect = self.parse_ident()?;
self.expect(TokenKind::Eq)?;
let handler = self.parse_expr()?;
handlers.push((effect, handler));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
let end = self.current_span();
self.expect(TokenKind::RBrace)?;
Ok(Expr::Run {
expr,
handlers,
span: start.merge(end),
})
}
fn parse_resume_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Resume)?;
@@ -2207,6 +2245,11 @@ impl Parser {
}));
}
// Check for record spread: { ...expr, field: val }
if matches!(self.peek_kind(), TokenKind::DotDotDot) {
return self.parse_record_expr_rest(start);
}
// Check if it's a record (ident: expr) or block
if matches!(self.peek_kind(), TokenKind::Ident(_)) {
let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
@@ -2221,6 +2264,20 @@ impl Parser {
fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
let mut fields = Vec::new();
let mut spread = None;
// Check for spread: { ...expr, ... }
if self.check(TokenKind::DotDotDot) {
self.advance(); // consume ...
let spread_expr = self.parse_expr()?;
spread = Some(Box::new(spread_expr));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
while !self.check(TokenKind::RBrace) {
let name = self.parse_ident()?;
@@ -2237,7 +2294,11 @@ impl Parser {
self.expect(TokenKind::RBrace)?;
let span = start.merge(self.previous_span());
-Ok(Expr::Record { fields, span })
+Ok(Expr::Record {
spread,
fields,
span,
})
}
fn parse_block_rest(&mut self, start: Span) -> Result<Expr, ParseError> {

View File

@@ -527,7 +527,10 @@ impl SymbolTable {
self.visit_expr(e, scope_idx);
}
}
-Expr::Record { fields, .. } => {
+Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.visit_expr(spread_expr, scope_idx);
}
for (_, e) in fields {
self.visit_expr(e, scope_idx);
}

View File

@@ -339,7 +339,10 @@ fn references_params(expr: &Expr, params: &[&str]) -> bool {
Expr::Lambda { body, .. } => references_params(body, params),
Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
-Expr::Record { fields, .. } => fields.iter().any(|(_, e)| references_params(e, params)),
+Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| references_params(s, params))
|| fields.iter().any(|(_, e)| references_params(e, params))
}
Expr::Match { scrutinee, arms, .. } => {
references_params(scrutinee, params)
|| arms.iter().any(|a| references_params(&a.body, params))
@@ -516,8 +519,9 @@ fn has_recursive_calls(func_name: &str, body: &Expr) -> bool {
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
elements.iter().any(|e| has_recursive_calls(func_name, e))
}
-Expr::Record { fields, .. } => {
-fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
+Expr::Record { spread, fields, .. } => {
+spread.as_ref().is_some_and(|s| has_recursive_calls(func_name, s))
|| fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
}
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => has_recursive_calls(func_name, object),
Expr::Let { value, body, .. } => {
@@ -672,6 +676,7 @@ fn generate_auto_migration_expr(
// Build the record expression
Some(Expr::Record {
spread: None,
fields: field_exprs,
span,
})
@@ -1536,7 +1541,7 @@ impl TypeChecker {
// Use the declared type if present, otherwise use inferred
let final_type = if let Some(ref type_expr) = let_decl.typ {
let declared = self.resolve_type(type_expr);
-if let Err(e) = unify(&inferred, &declared) {
+if let Err(e) = unify_with_env(&inferred, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -1744,7 +1749,11 @@ impl TypeChecker {
span,
} => self.infer_block(statements, result, *span),
-Expr::Record { fields, span } => self.infer_record(fields, *span),
+Expr::Record {
spread,
fields,
span,
} => self.infer_record(spread.as_deref(), fields, *span),
Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
@@ -1783,7 +1792,7 @@ impl TypeChecker {
match op {
BinaryOp::Add => {
// Add supports both numeric types and string concatenation
-if let Err(e) = unify(&left_type, &right_type) {
+if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1804,9 +1813,32 @@ impl TypeChecker {
}
}
BinaryOp::Concat => {
// Concat (++) supports strings and lists
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '++' must have same type: {}", e),
span,
});
}
match &left_type {
Type::String | Type::List(_) | Type::Var(_) => left_type,
_ => {
self.errors.push(TypeError {
message: format!(
"Operator '++' requires String or List operands, got {}",
left_type
),
span,
});
Type::Error
}
}
}
BinaryOp::Sub | BinaryOp::Mul | BinaryOp::Div | BinaryOp::Mod => {
// Arithmetic: both operands must be same numeric type
-if let Err(e) = unify(&left_type, &right_type) {
+if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1830,7 +1862,7 @@ impl TypeChecker {
BinaryOp::Eq | BinaryOp::Ne => {
// Equality: operands must have same type
-if let Err(e) = unify(&left_type, &right_type) {
+if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1841,7 +1873,7 @@ impl TypeChecker {
BinaryOp::Lt | BinaryOp::Le | BinaryOp::Gt | BinaryOp::Ge => {
// Comparison: operands must be same orderable type
-if let Err(e) = unify(&left_type, &right_type) {
+if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1852,13 +1884,13 @@ impl TypeChecker {
BinaryOp::And | BinaryOp::Or => {
// Logical: both must be Bool
-if let Err(e) = unify(&left_type, &Type::Bool) {
+if let Err(e) = unify_with_env(&left_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Left operand of '{}' must be Bool: {}", op, e),
span: left.span(),
});
}
-if let Err(e) = unify(&right_type, &Type::Bool) {
+if let Err(e) = unify_with_env(&right_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Right operand of '{}' must be Bool: {}", op, e),
span: right.span(),
@@ -1872,7 +1904,7 @@ impl TypeChecker {
// right must be a function that accepts left's type
let result_type = Type::var();
let expected_fn = Type::function(vec![left_type.clone()], result_type.clone());
-if let Err(e) = unify(&right_type, &expected_fn) {
+if let Err(e) = unify_with_env(&right_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Pipe target must be a function accepting {}: {}",
@@ -1904,7 +1936,7 @@ impl TypeChecker {
}
},
UnaryOp::Not => {
-if let Err(e) = unify(&operand_type, &Type::Bool) {
+if let Err(e) = unify_with_env(&operand_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Operator '!' requires Bool operand: {}", e),
span,
@@ -1919,6 +1951,17 @@ impl TypeChecker {
let func_type = self.infer_expr(func);
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
// Check property constraints from where clauses
if let Expr::Var(func_id) = func {
if let Some(constraints) = self.property_constraints.get(&func_id.name).cloned() {
@@ -1955,7 +1998,7 @@ impl TypeChecker {
self.current_effects.clone(),
);
-match unify(&func_type, &expected_fn) {
+match unify_with_env(&func_type, &expected_fn, &self.env) {
Ok(subst) => result_type.apply(&subst),
Err(e) => {
// Provide more detailed error message based on the type of mismatch
@@ -2029,10 +2072,22 @@ impl TypeChecker {
if let Some((_, field_type)) = fields.iter().find(|(n, _)| n == &operation.name) {
// It's a function call on a module field
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
let result_type = Type::var();
let expected_fn = Type::function(arg_types, result_type.clone());
-if let Err(e) = unify(field_type, &expected_fn) {
+if let Err(e) = unify_with_env(field_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Type mismatch in {}.{} call: {}",
@@ -2088,6 +2143,17 @@ impl TypeChecker {
// Check argument types
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
if arg_types.len() != op.params.len() {
self.errors.push(TypeError {
message: format!(
@@ -2104,7 +2170,7 @@ impl TypeChecker {
for (i, (arg_type, (_, param_type))) in
arg_types.iter().zip(op.params.iter()).enumerate()
{
-if let Err(e) = unify(arg_type, param_type) {
+if let Err(e) = unify_with_env(arg_type, param_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Argument {} of '{}.{}' has type {}, expected {}: {}",
@@ -2137,6 +2203,7 @@ impl TypeChecker {
fn infer_field(&mut self, object: &Expr, field: &Ident, span: Span) -> Type {
let object_type = self.infer_expr(object);
let object_type = self.env.expand_type_alias(&object_type);
match &object_type { match &object_type {
Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) { Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) {
@@ -2217,7 +2284,7 @@ impl TypeChecker {
// Check return type if specified // Check return type if specified
let ret_type = if let Some(rt) = return_type { let ret_type = if let Some(rt) = return_type {
let declared = self.resolve_type(rt); let declared = self.resolve_type(rt);
if let Err(e) = unify(&body_type, &declared) { if let Err(e) = unify_with_env(&body_type, &declared, &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!( message: format!(
"Lambda body type {} doesn't match declared {}: {}", "Lambda body type {} doesn't match declared {}: {}",
@@ -2283,7 +2350,7 @@ impl TypeChecker {
span: Span, span: Span,
) -> Type { ) -> Type {
let cond_type = self.infer_expr(condition); let cond_type = self.infer_expr(condition);
if let Err(e) = unify(&cond_type, &Type::Bool) { if let Err(e) = unify_with_env(&cond_type, &Type::Bool, &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!("If condition must be Bool, got {}: {}", cond_type, e), message: format!("If condition must be Bool, got {}: {}", cond_type, e),
span: condition.span(), span: condition.span(),
@@ -2293,7 +2360,7 @@ impl TypeChecker {
let then_type = self.infer_expr(then_branch); let then_type = self.infer_expr(then_branch);
let else_type = self.infer_expr(else_branch); let else_type = self.infer_expr(else_branch);
match unify(&then_type, &else_type) { match unify_with_env(&then_type, &else_type, &self.env) {
Ok(subst) => then_type.apply(&subst), Ok(subst) => then_type.apply(&subst),
Err(e) => { Err(e) => {
self.errors.push(TypeError { self.errors.push(TypeError {
@@ -2334,7 +2401,7 @@ impl TypeChecker {
// Check guard if present // Check guard if present
if let Some(ref guard) = arm.guard { if let Some(ref guard) = arm.guard {
let guard_type = self.infer_expr(guard); let guard_type = self.infer_expr(guard);
if let Err(e) = unify(&guard_type, &Type::Bool) { if let Err(e) = unify_with_env(&guard_type, &Type::Bool, &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!("Match guard must be Bool: {}", e), message: format!("Match guard must be Bool: {}", e),
span: guard.span(), span: guard.span(),
@@ -2350,7 +2417,7 @@ impl TypeChecker {
match &result_type { match &result_type {
None => result_type = Some(body_type), None => result_type = Some(body_type),
Some(prev) => { Some(prev) => {
if let Err(e) = unify(prev, &body_type) { if let Err(e) = unify_with_env(prev, &body_type, &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!( message: format!(
"Match arm has incompatible type: expected {}, got {}: {}", "Match arm has incompatible type: expected {}, got {}: {}",
@@ -2400,7 +2467,7 @@ impl TypeChecker {
Pattern::Literal(lit) => { Pattern::Literal(lit) => {
let lit_type = self.infer_literal(lit); let lit_type = self.infer_literal(lit);
if let Err(e) = unify(&lit_type, expected) { if let Err(e) = unify_with_env(&lit_type, expected, &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!("Pattern literal type mismatch: {}", e), message: format!("Pattern literal type mismatch: {}", e),
span: lit.span, span: lit.span,
@@ -2414,7 +2481,7 @@ impl TypeChecker {
// For now, handle Option specially // For now, handle Option specially
match name.name.as_str() { match name.name.as_str() {
"None" => { "None" => {
if let Err(e) = unify(expected, &Type::Option(Box::new(Type::var()))) { if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(Type::var())), &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!( message: format!(
"None pattern doesn't match type {}: {}", "None pattern doesn't match type {}: {}",
@@ -2427,7 +2494,7 @@ impl TypeChecker {
} }
"Some" => { "Some" => {
let inner_type = Type::var(); let inner_type = Type::var();
if let Err(e) = unify(expected, &Type::Option(Box::new(inner_type.clone()))) if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(inner_type.clone())), &self.env)
{ {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!( message: format!(
@@ -2456,7 +2523,7 @@ impl TypeChecker {
Pattern::Tuple { elements, span } => { Pattern::Tuple { elements, span } => {
let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect(); let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect();
if let Err(e) = unify(expected, &Type::Tuple(element_types.clone())) { if let Err(e) = unify_with_env(expected, &Type::Tuple(element_types.clone()), &self.env) {
self.errors.push(TypeError { self.errors.push(TypeError {
message: format!("Tuple pattern doesn't match type {}: {}", expected, e), message: format!("Tuple pattern doesn't match type {}: {}", expected, e),
span: *span, span: *span,
@@ -2506,7 +2573,7 @@ impl TypeChecker {
         if let Some(type_expr) = typ {
             let declared = self.resolve_type(type_expr);
-            if let Err(e) = unify(&value_type, &declared) {
+            if let Err(e) = unify_with_env(&value_type, &declared, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Variable '{}' has type {}, but declared type is {}: {}",
@@ -2527,12 +2594,47 @@ impl TypeChecker {
         self.infer_expr(result)
     }

-    fn infer_record(&mut self, fields: &[(Ident, Expr)], _span: Span) -> Type {
-        let field_types: Vec<(String, Type)> = fields
+    fn infer_record(
+        &mut self,
+        spread: Option<&Expr>,
+        fields: &[(Ident, Expr)],
+        span: Span,
+    ) -> Type {
+        // Start with spread fields if present
+        let mut field_types: Vec<(String, Type)> = if let Some(spread_expr) = spread {
+            let spread_type = self.infer_expr(spread_expr);
+            let spread_type = self.env.expand_type_alias(&spread_type);
+            match spread_type {
+                Type::Record(spread_fields) => spread_fields,
+                _ => {
+                    self.errors.push(TypeError {
+                        message: format!(
+                            "Spread expression must be a record type, got {}",
+                            spread_type
+                        ),
+                        span,
+                    });
+                    Vec::new()
+                }
+            }
+        } else {
+            Vec::new()
+        };
+
+        // Apply explicit field overrides
+        let explicit_types: Vec<(String, Type)> = fields
             .iter()
             .map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
             .collect();
+        for (name, typ) in explicit_types {
+            if let Some(existing) = field_types.iter_mut().find(|(n, _)| n == &name) {
+                existing.1 = typ;
+            } else {
+                field_types.push((name, typ));
+            }
+        }
+
         Type::Record(field_types)
     }
@@ -2549,7 +2651,7 @@ impl TypeChecker {
         let first_type = self.infer_expr(&elements[0]);
         for elem in &elements[1..] {
             let elem_type = self.infer_expr(elem);
-            if let Err(e) = unify(&first_type, &elem_type) {
+            if let Err(e) = unify_with_env(&first_type, &elem_type, &self.env) {
                 self.errors.push(TypeError {
                     message: format!("List elements must have same type: {}", e),
                     span,
@@ -2855,7 +2957,7 @@ impl TypeChecker {
         // Check return type matches if specified
         if let Some(ref return_type_expr) = impl_method.return_type {
             let return_type = self.resolve_type(return_type_expr);
-            if let Err(e) = unify(&body_type, &return_type) {
+            if let Err(e) = unify_with_env(&body_type, &return_type, &self.env) {
                 self.errors.push(TypeError {
                     message: format!(
                         "Method '{}' body has type {}, but declared return type is {}: {}",
@@ -2898,6 +3000,9 @@ impl TypeChecker {
             "Option" if resolved_args.len() == 1 => {
                 return Type::Option(Box::new(resolved_args[0].clone()));
             }
+            "Map" if resolved_args.len() == 2 => {
+                return Type::Map(Box::new(resolved_args[0].clone()), Box::new(resolved_args[1].clone()));
+            }
             _ => {}
         }
     }


@@ -47,6 +47,8 @@ pub enum Type {
     List(Box<Type>),
     /// Option type (sugar for App(Option, [T]))
     Option(Box<Type>),
+    /// Map type (sugar for App(Map, [K, V]))
+    Map(Box<Type>, Box<Type>),
     /// Versioned type (e.g., User @v2)
     Versioned {
         base: Box<Type>,
@@ -119,6 +121,7 @@ impl Type {
             Type::Tuple(elements) => elements.iter().any(|e| e.contains_var(var)),
             Type::Record(fields) => fields.iter().any(|(_, t)| t.contains_var(var)),
             Type::List(inner) | Type::Option(inner) => inner.contains_var(var),
+            Type::Map(k, v) => k.contains_var(var) || v.contains_var(var),
             Type::Versioned { base, .. } => base.contains_var(var),
             _ => false,
         }
@@ -158,6 +161,7 @@ impl Type {
             ),
             Type::List(inner) => Type::List(Box::new(inner.apply(subst))),
             Type::Option(inner) => Type::Option(Box::new(inner.apply(subst))),
+            Type::Map(k, v) => Type::Map(Box::new(k.apply(subst)), Box::new(v.apply(subst))),
             Type::Versioned { base, version } => Type::Versioned {
                 base: Box::new(base.apply(subst)),
                 version: version.clone(),
@@ -208,6 +212,11 @@ impl Type {
                 vars
             }
             Type::List(inner) | Type::Option(inner) => inner.free_vars(),
+            Type::Map(k, v) => {
+                let mut vars = k.free_vars();
+                vars.extend(v.free_vars());
+                vars
+            }
             Type::Versioned { base, .. } => base.free_vars(),
             _ => HashSet::new(),
         }
@@ -279,6 +288,7 @@ impl fmt::Display for Type {
             }
             Type::List(inner) => write!(f, "List<{}>", inner),
             Type::Option(inner) => write!(f, "Option<{}>", inner),
+            Type::Map(k, v) => write!(f, "Map<{}, {}>", k, v),
             Type::Versioned { base, version } => {
                 write!(f, "{} {}", base, version)
             }
@@ -946,6 +956,14 @@ impl TypeEnv {
                     params: vec![("path".to_string(), Type::String)],
                     return_type: Type::Unit,
                 },
+                EffectOpDef {
+                    name: "copy".to_string(),
+                    params: vec![
+                        ("source".to_string(), Type::String),
+                        ("dest".to_string(), Type::String),
+                    ],
+                    return_type: Type::Unit,
+                },
             ],
         },
     );
@@ -1775,6 +1793,73 @@ impl TypeEnv {
         ]);
         env.bind("Option", TypeScheme::mono(option_module_type));

+        // Map module
+        let map_v = || Type::var();
+        let map_type = || Type::Map(Box::new(Type::String), Box::new(Type::var()));
+        let map_module_type = Type::Record(vec![
+            (
+                "new".to_string(),
+                Type::function(vec![], map_type()),
+            ),
+            (
+                "set".to_string(),
+                Type::function(
+                    vec![map_type(), Type::String, map_v()],
+                    map_type(),
+                ),
+            ),
+            (
+                "get".to_string(),
+                Type::function(
+                    vec![map_type(), Type::String],
+                    Type::Option(Box::new(map_v())),
+                ),
+            ),
+            (
+                "contains".to_string(),
+                Type::function(vec![map_type(), Type::String], Type::Bool),
+            ),
+            (
+                "remove".to_string(),
+                Type::function(vec![map_type(), Type::String], map_type()),
+            ),
+            (
+                "keys".to_string(),
+                Type::function(vec![map_type()], Type::List(Box::new(Type::String))),
+            ),
+            (
+                "values".to_string(),
+                Type::function(vec![map_type()], Type::List(Box::new(map_v()))),
+            ),
+            (
+                "size".to_string(),
+                Type::function(vec![map_type()], Type::Int),
+            ),
+            (
+                "isEmpty".to_string(),
+                Type::function(vec![map_type()], Type::Bool),
+            ),
+            (
+                "fromList".to_string(),
+                Type::function(
+                    vec![Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()])))],
+                    map_type(),
+                ),
+            ),
+            (
+                "toList".to_string(),
+                Type::function(
+                    vec![map_type()],
+                    Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()]))),
+                ),
+            ),
+            (
+                "merge".to_string(),
+                Type::function(vec![map_type(), map_type()], map_type()),
+            ),
+        ]);
+        env.bind("Map", TypeScheme::mono(map_module_type));
+
         // Result module
         let result_type = Type::App {
             constructor: Box::new(Type::Named("Result".to_string())),
@@ -1887,6 +1972,18 @@ impl TypeEnv {
                 "round".to_string(),
                 Type::function(vec![Type::var()], Type::Int),
             ),
+            (
+                "sin".to_string(),
+                Type::function(vec![Type::Float], Type::Float),
+            ),
+            (
+                "cos".to_string(),
+                Type::function(vec![Type::Float], Type::Float),
+            ),
+            (
+                "atan2".to_string(),
+                Type::function(vec![Type::Float, Type::Float], Type::Float),
+            ),
         ]);
         env.bind("Math", TypeScheme::mono(math_module_type));
@@ -1896,6 +1993,10 @@ impl TypeEnv {
                 "toString".to_string(),
                 Type::function(vec![Type::Int], Type::String),
             ),
+            (
+                "toFloat".to_string(),
+                Type::function(vec![Type::Int], Type::Float),
+            ),
         ]);
         env.bind("Int", TypeScheme::mono(int_module_type));
@@ -1905,6 +2006,10 @@ impl TypeEnv {
                 "toString".to_string(),
                 Type::function(vec![Type::Float], Type::String),
             ),
+            (
+                "toInt".to_string(),
+                Type::function(vec![Type::Float], Type::Int),
+            ),
         ]);
         env.bind("Float", TypeScheme::mono(float_module_type));
@@ -1991,6 +2096,9 @@ impl TypeEnv {
             Type::Option(inner) => {
                 Type::Option(Box::new(self.expand_type_alias(inner)))
             }
+            Type::Map(k, v) => {
+                Type::Map(Box::new(self.expand_type_alias(k)), Box::new(self.expand_type_alias(v)))
+            }
             Type::Versioned { base, version } => {
                 Type::Versioned {
                     base: Box::new(self.expand_type_alias(base)),
@@ -2151,6 +2259,13 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
         // Option
         (Type::Option(a), Type::Option(b)) => unify(a, b),
+        // Map
+        (Type::Map(k1, v1), Type::Map(k2, v2)) => {
+            let s1 = unify(k1, k2)?;
+            let s2 = unify(&v1.apply(&s1), &v2.apply(&s1))?;
+            Ok(s1.compose(&s2))
+        }
         // Versioned types
         (
             Type::Versioned {