44 Commits

Author SHA1 Message Date
fd5ed53b29 chore: bump version to 0.1.6 2026-02-19 15:22:32 -05:00
2800ce4e2d chore: sync Cargo.lock
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 09:26:20 -05:00
ec365ebb3f feat: add File.copy and propagate effectful callback effects (WISH-7, WISH-14)
File.copy(source, dest) copies files via interpreter (std::fs::copy) and
C backend (fread/fwrite). Effectful callbacks passed to higher-order
functions like List.map/forEach now propagate their effects to the
enclosing function's inferred effect set.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 09:24:28 -05:00
52dcc88051 chore: bump version to 0.1.5 2026-02-19 03:47:28 -05:00
1842b668e5 chore: sync Cargo.lock with version 0.1.4
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 03:47:11 -05:00
c67e3f31c3 feat: add and/or keywords, handle alias, --watch flag, JS tree-shaking
- WISH-008: `and`/`or` as aliases for `&&`/`||` boolean operators
- WISH-006: `handle` as alias for `run ... with` (same AST output)
- WISH-005: `--watch` flag for `lux compile` recompiles on file change
- WISH-009: Tree-shake unused runtime sections from JS output based on
  which effects are actually used (Console, Random, Time, Http, Dom)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 03:35:47 -05:00
b0ccde749c chore: bump version to 0.1.4 2026-02-19 02:48:56 -05:00
4ba7a23ae3 feat: add comprehensive compilation checks to validate.sh
Adds interpreter, JS compilation, and C compilation checks for all
examples, showcase programs, standard examples, and projects (113 total
checks). Skip lists exclude programs requiring unsupported effects or
interactive I/O.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:43:46 -05:00
89741b4a32 fix: move top-level let initialization into main() in C backend
Top-level let bindings with function calls (e.g., `let result = factorial(10)`)
were emitted as static initializers, which is invalid C since function calls
aren't compile-time constants. Now globals are declared with zero-init and
initialized inside main() before any run expressions execute.

Also fixes validate.sh to use exit codes instead of grep for cargo check/build.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:31:49 -05:00
3a2376cd49 feat: port AST definitions to Lux (self-hosting)
Translate all 30+ type definitions from src/ast.rs (727 lines of Rust)
into Lux ADTs in projects/lux-compiler/ast.lux.

Types ported: Span, Ident, Visibility, Version, VersionConstraint,
BehavioralProperty, WhereClause, ModulePath, ImportDecl, Program,
Declaration, FunctionDecl, Parameter, EffectDecl, EffectOp, TypeDecl,
TypeDef, RecordField, Variant, VariantFields, Migration, HandlerDecl,
HandlerImpl, LetDecl, TraitDecl, TraitMethod, TraitBound, ImplDecl,
TraitConstraint, ImplMethod, TypeExpr, Expr (19 variants), Literal,
LiteralKind, BinaryOp, UnaryOp, Statement, MatchArm, Pattern.

Passes `lux check` and `lux run`.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:07:30 -05:00
4dfb04a1b6 chore: sync Cargo.lock with version 0.1.3
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:51 -05:00
3cdde02eb2 feat: add Int.toFloat/Float.toInt JS backend support and fix Map C codegen
- JS backend: Add Int/Float module dispatch in both Call and EffectOp paths
  for toFloat, toInt, and toString operations
- C backend: Fix lux_strdup → lux_string_dup in Map module codegen

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:40 -05:00
a5762d0397 feat: add built-in Map type with String keys
Add Map<String, V> as a first-class built-in type for key-value storage,
needed for self-hosting the compiler (parser/typechecker/interpreter all
rely heavily on hashmaps).

- types.rs: Type::Map(K,V) variant, all match arms (unify, apply, etc.)
- interpreter.rs: Value::Map, 12 BuiltinFn variants (new/set/get/contains/
  remove/keys/values/size/isEmpty/fromList/toList/merge), immutable semantics
- typechecker.rs: Map<K,V> resolution in resolve_type
- js_backend.rs: Map as JS Map with emit_map_operation()
- c_backend.rs: LuxMap struct (linear-scan), runtime fns, emit_map_operation()
- main.rs: 12 tests covering all Map operations
- validate.sh: now checks all projects/ directories too

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:45:13 -05:00
1132c621c6 fix: allow newlines before then in if/then/else expressions
The parser now skips newlines between the condition and `then` keyword,
enabling multiline if expressions like:
  if long_condition
    then expr1
    else expr2

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:38:05 -05:00
a0fff1814e fix: JS backend scoping for let/match/if inside closures
Three related bugs fixed:
- BUG-009: let bindings inside lambdas hoisted to top-level
- BUG-011: match expressions inside lambdas hoisted to top-level
- BUG-012: variable name deduplication leaked across function scopes

Root cause: emit_expr() uses writeln() for statements, but lambdas
captured only the return value, not the emitted statements. Also,
var_substitutions from emit_function() leaked to subsequent code.

Fix: Lambda handler now captures all output emitted during body
evaluation and places it inside the function body. Both emit_function
and Lambda save/restore var_substitutions to prevent cross-scope leaks.
Lambda params are registered as identity substitutions to override any
outer bindings with the same name.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:10:55 -05:00
4e9e823246 fix: record spread works with named type aliases
Resolve type aliases (e.g. Player -> { pos: Vec2, speed: Float })
before checking if spread expression is a record type. Previously
{ ...p, field: val } failed with "must be a record type, got Player"
when the variable had a named type annotation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 00:01:20 -05:00
6a2e4a7ac1 chore: bump version to 0.1.3 2026-02-18 23:06:10 -05:00
3d706cb32b feat: add record spread syntax { ...base, field: val }
Adds spread operator for records, allowing concise record updates:
  let p2 = { ...p, x: 5.0 }

Changes across the full pipeline:
- Lexer: new DotDotDot (...) token
- AST: optional spread field on Record variant
- Parser: detect ... at start of record expression
- Typechecker: merge spread record fields with explicit overrides
- Interpreter: evaluate spread, overlay explicit fields
- JS backend: emit native JS spread syntax
- C backend: copy spread into temp, assign overrides
- Formatter, linter, LSP, symbol table: propagate spread

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:27 -05:00
7c3bfa9301 feat: add Math.sin, Math.cos, Math.atan2 trig functions
Adds trigonometric functions to the Math module across interpreter,
type system, and C backend. JS backend already supported them.
Also adds #include <math.h> to C preamble and handles Math module
calls through both Call and EffectOp paths in C backend.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:12 -05:00
b56c5461f1 fix: JS const _ duplication and hardcoded version string
- JS backend now emits wildcard let bindings as side-effect statements
  instead of const _ declarations, fixing SyntaxError on multiple let _ = ...
- Version string now uses env!("CARGO_PKG_VERSION") to auto-sync with Cargo.toml
- Add -lm linker flag for math library support

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:03 -05:00
61e1469845 feat: add ++ concat operator and auto-invoke main
BUG-004: Add ++ operator for string and list concatenation across all
backends (interpreter, C, JS) with type checking and formatting support.

BUG-001: Auto-invoke top-level `let main = fn () => ...` when main is
a zero-parameter function, instead of just printing the function value.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 22:01:41 -05:00
bb0a288210 chore: bump version to 0.1.2 2026-02-18 21:16:44 -05:00
5d7f4633e1 docs: add explicit commit instructions to CLAUDE.md
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:27 -05:00
d05b13d840 fix: JS backend compiles print() to console.log()
Bare `print()` calls in Lux now emit `console.log()` in JS output
instead of undefined `print()`. Fixes BUG-006.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:07 -05:00
0ee3050704 chore: bump version to 0.1.1 2026-02-18 20:41:43 -05:00
80b1276f9f fix: release script auto-bumps patch by default
Release script now supports: patch (default), minor, major, or explicit
version. Auto-updates Cargo.toml and flake.nix before building.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:41:29 -05:00
bd843d2219 fix: record type aliases now work for unification and field access
Expand type aliases via unify_with_env() everywhere in the type checker,
not just in a few places. This fixes named record types like
`type Vec2 = { x: Float, y: Float }` — they now properly unify with
anonymous records and support field access (v.x, v.y).

Also adds scripts/validate.sh for automated full-suite regression
testing (Rust tests + all 5 package test suites + type checking).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:21:29 -05:00
d76aa17b38 feat: static binary builds and automated release script
Switch reqwest from native-tls (openssl) to rustls-tls for a pure-Rust
TLS stack, enabling fully static musl builds. Add `nix build .#static`
for portable Linux binaries and `scripts/release.sh` for automated
Gitea releases with changelog generation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 19:09:32 -05:00
c23d9c7078 fix: test runner now supports module imports
The `lux test` command used Parser::parse_source() and
check_program() directly, which meant test files with `import`
statements would fail with type errors. Now uses ModuleLoader
and check_program_with_modules() to properly resolve imports,
and run_with_modules() for execution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 17:11:16 -05:00
fffacd2467 feat: C backend module import support, Int/Float.toString, Test.assertEqualMsg
The C backend can now compile programs that import user-defined modules.
Module-qualified calls like `mymodule.func(args)` are resolved to prefixed
C functions (e.g., `mymodule_func_lux`), with full support for transitive
imports and effect-passing. Also adds Int.toString/Float.toString to type
system, interpreter, and C backend, and Test.assertEqualMsg for labeled
test assertions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 16:35:24 -05:00
2ae2c132e5 docs: add language philosophy document and compiler integration
Write comprehensive PHILOSOPHY.md covering Lux's six core principles
(explicit over implicit, composition over configuration, safety without
ceremony, practical over academic, one right way, tools are the language)
with detailed comparisons against JS/TS, Python, Rust, Go, Java/C#,
Haskell/Elm, and Gleam/Elixir. Includes tooling audit and improvement
suggestions.

Add `lux philosophy` command to the compiler, update help screen with
abbreviated philosophy, and link from README.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 10:19:29 -05:00
4909ff9fff docs: add package ecosystem plan and error documentation workflow
Add PACKAGES.md analyzing the Lux package ecosystem gaps vs stdlib,
with prioritized implementation plans for markdown, xml, rss, frontmatter,
path, and sitemap packages. Add CLAUDE.md instructions for documenting
Lux language errors in ISSUES.md during every major task.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 10:01:56 -05:00
8e788c8a9f fix: embed C compiler path at build time for self-contained binary
build.rs captures the absolute path to cc/gcc/clang during compilation
and bakes it into the binary. On Nix systems this embeds the full
/nix/store path so `lux compile` works without cc on PATH.

Lookup order: $CC env var > embedded build-time path > PATH search.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 08:12:18 -05:00
dbdd3cca57 chore: move blu-site to its own repo at ~/src/blu-site
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 07:57:55 -05:00
3ac022c04a chore: gitignore build output (_site/, docs/)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 07:48:51 -05:00
6bedd37ac7 fix: show help menu when running lux with no arguments
Previously `lux` with no args entered the REPL. Now it shows the help
menu. Use `lux repl` to start the REPL explicitly.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 07:34:09 -05:00
2909bf14b6 fix: eliminate all non-json C backend errors (79→0)
Second round of C backend fixes, building on d8871ac which reduced
errors from 286 to 111. This eliminates all 79 non-json errors:

- Fix function references as values (wrap in LuxClosure*)
- Fix fold/map/filter with type-aware calling conventions
- Add String.indexOf/lastIndexOf emission and C runtime functions
- Add File.readDir with dirent.h implementation
- Fix string concat in closure bodies
- Exclude ADT constructors from closure free variable capture
- Fix match result type inference (prioritize pattern binding types)
- Fix Option inner type inference (usage-based for List.head)
- Fix void* to struct cast (dereference through pointer)
- Handle constructors in emit_expr_with_env

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 05:56:21 -05:00
d8871acf7e fix: improve C backend robustness, reduce compilation errors by 61%
- Fix closure captured variable types: look up actual types from var_types
  instead of hardcoding LuxInt for all captured variables
- Register function parameters in var_types so closures can find their types
- Replace is_string_expr() with infer_expr_type() for more accurate string
  detection in binary ops (concat, comparison)
- Add missing String operations to infer_expr_type (substring, indexOf, etc.)
- Add module method call type inference (String.*, List.*, Int.*, Float.*)
- Add built-in Result type (Ok/Err) to C prelude alongside Option
- Register Ok/Err/Some/None in variant_to_type and variant_field_types
- Fix variable scoping: use if-statement pattern instead of ternary when
  branches emit statements (prevents redefinition of h2/h3 etc.)
- Add RC scope management for if-else branches and match arms to prevent
  undeclared variable errors from cleanup code
- Add infer_pattern_binding_type for better match result type inference
- Add expr_emits_statements helper to detect statement-emitting expressions
- Add infer_option_inner_type for String.indexOf (returns Option<Int>)

Reduces blu-site compilation errors from 286 to 111 (remaining are mostly
unsupported json effect and function-as-value references).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 17:56:27 -05:00
73b5eee664 docs: add commit-after-every-piece-of-work instruction to CLAUDE.md
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 16:21:54 -05:00
542255780d feat: add tuple index access, multiline args, and effect unification fix
- Tuple index: `pair.0`, `pair.1` syntax across parser, typechecker,
  interpreter, C/JS backends, formatter, linter, and symbol table
- Multi-line function args: allow newlines inside argument lists
- Fix effect unification for callback parameters (empty expected
  effects means "no constraint", not "must be pure")

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 16:21:48 -05:00
bac63bab2a feat: add blu-site static site generator and fix language issues
Build a complete static site generator in Lux that faithfully clones
blu.cx (elmstatic). Generates 14 post pages, section indexes, tag pages,
and a home page with snippets grid from markdown content.

Language fixes discovered during development:
- Add \{ and \} escape sequences in string literals (lexer)
- Register String.indexOf and String.lastIndexOf in type checker
- Fix formatter to preserve brace escapes in string literals
- Improve LSP hover to show documentation for let bindings and functions

ISSUES.md documents 15 Lux language limitations found during the project.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 15:43:05 -05:00
db82ca1a1c fix: improve LSP hover to show function info when cursor is on fn keyword
When hovering on declaration keywords (fn, type, effect, let, trait),
look ahead to find the declaration name and show that symbol's full
info from the symbol table instead of generic keyword documentation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 08:32:01 -05:00
98605d2b70 feat: add self-hosted Lux lexer as first step toward bootstrapping
The lexer tokenizes Lux source code written entirely in Lux itself.
Supports all token types: keywords, operators, literals, behavioral
properties, doc comments, and delimiters.

This is the first component of the Lux-in-Lux compiler, demonstrating
that Lux's pattern matching, recursion, and string handling are
sufficient for compiler construction.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 08:25:22 -05:00
e3b6f4322a fix: add Char pattern matching and Char comparison operators
- Parser: support Char literals in match patterns (e.g., 'x' => ...)
- Interpreter: add Char comparison for <, <=, >, >= operators
  Previously only Int, Float, and String supported ordering comparisons.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 08:25:15 -05:00
27 changed files with 5883 additions and 496 deletions

.gitignore

@@ -4,6 +4,11 @@
# Claude Code project instructions
CLAUDE.md
# Build output
_site/
docs/*.html
docs/*.css
# Test binaries
hello
test_rc


@@ -42,15 +42,46 @@ When making changes:
7. **Fix language limitations**: If you encounter parser/type system limitations, fix them (without regressions on guarantees or speed)
8. **Git commits**: Always use `--no-gpg-sign` flag
### Post-work checklist (run after each major piece of work)
### Post-work checklist (run after each committable change)
**MANDATORY: Run the full validation script after every committable change:**
```bash
./scripts/validate.sh
```
This script runs ALL of the following checks and will fail if any regress:
1. `cargo check` — no Rust compilation errors
2. `cargo test` — all Rust tests pass (currently 387)
3. `cargo build --release` — release binary builds
4. `lux test` on every package (path, frontmatter, xml, rss, markdown) — all 286 package tests pass
5. `lux check` on every package — type checking + lint passes
If `validate.sh` is not available or you need to run manually:
```bash
nix develop --command cargo check # No Rust errors
nix develop --command cargo test # All tests pass (currently 381)
./target/release/lux check # Type check + lint all .lux files
./target/release/lux fmt # Format all .lux files
./target/release/lux lint # Standalone lint pass
nix develop --command cargo test # All Rust tests pass
nix develop --command cargo build --release # Build release binary
cd ../packages/path && ../../lang/target/release/lux test # Package tests
cd ../packages/frontmatter && ../../lang/target/release/lux test
cd ../packages/xml && ../../lang/target/release/lux test
cd ../packages/rss && ../../lang/target/release/lux test
cd ../packages/markdown && ../../lang/target/release/lux test
```
**Do NOT commit if any check fails.** Fix the issue first.
### Commit after every piece of work
**After completing each logical unit of work, commit immediately.** This is NOT optional — every fix, feature, or change MUST be committed right away. Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
**Commit workflow:**
1. Make the change
2. Run `./scripts/validate.sh` (all 13 checks must pass)
3. `git add` the relevant files
4. `git commit --no-gpg-sign -m "type: description"` (use conventional commits: fix/feat/chore/docs)
5. Move on to the next task
**Never skip committing.** If you fixed a bug, commit it. If you added a feature, commit it. If you updated docs, commit it. Do not batch unrelated changes into one commit.
**IMPORTANT: Always verify Lux code you write:**
- Run with interpreter: `./target/release/lux file.lux`
- Compile to binary: `./target/release/lux compile file.lux`
@@ -68,10 +99,45 @@ nix develop --command cargo test # All tests pass (currently 381)
| `lux serve` | `lux s` | Static file server |
| `lux compile` | `lux c` | Compile to binary |
## Documenting Lux Language Errors
When working on any major task that involves writing Lux code, **document every language error, limitation, or surprising behavior** you encounter. This log is optimized for LLM consumption so future sessions can avoid repeating mistakes.
**File:** Maintain an `ISSUES.md` in the relevant project directory (e.g., `~/src/blu-site/ISSUES.md`).
**Format for each entry:**
```markdown
## Issue N: <Short descriptive title>
**Category**: Parser limitation | Type checker gap | Missing feature | Runtime error | Documentation gap
**Severity**: High | Medium | Low
**Status**: Open | **Fixed** (commit hash or version)
<1-2 sentence description of the problem>
**Reproduction:**
```lux
// Minimal code that triggers the issue
```
**Error message:** `<exact error text>`
**Workaround:** <how to accomplish the goal despite the limitation>
**Fix:** <if fixed, what was changed and where>
```
**Rules:**
- Add new issues as you encounter them during any task
- When a previously documented issue gets fixed, update its status to **Fixed** and note the commit/version
- Remove entries that are no longer relevant (e.g., the feature was redesigned entirely)
- Keep the summary table at the bottom of ISSUES.md in sync with the entries
- Do NOT duplicate issues already documented -- check existing entries first
## Code Quality
- Fix all compiler warnings before committing
- Ensure all tests pass (currently 381 tests)
- Ensure all tests pass (currently 387 tests)
- Add new tests when adding features
- Keep examples and documentation in sync

Cargo.lock

@@ -135,16 +135,6 @@ dependencies = [
"libc",
]
[[package]]
name = "core-foundation"
version = "0.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b2a6cd9ae233e7f62ba4e9353e81a88df7fc8a5987b8d445b4d90c879bd156f6"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]]
name = "core-foundation-sys"
version = "0.8.7"
@@ -235,7 +225,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "39cab71617ae0d63f51a36d69f866391735b51691dbda63cf6f96d042b63efeb"
dependencies = [
"libc",
"windows-sys 0.61.2",
"windows-sys 0.59.0",
]
[[package]]
@@ -297,21 +287,6 @@ version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
[[package]]
name = "foreign-types"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
dependencies = [
"foreign-types-shared",
]
[[package]]
name = "foreign-types-shared"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
[[package]]
name = "form_urlencoded"
version = "1.2.2"
@@ -552,16 +527,17 @@ dependencies = [
]
[[package]]
name = "hyper-tls"
version = "0.5.0"
name = "hyper-rustls"
version = "0.24.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d6183ddfa99b85da61a140bea0efc93fdf56ceaa041b37d553518030827f9905"
checksum = "ec3efd23720e2049821a693cbc7e65ea87c72f1c58ff2f9522ff332b1491e590"
dependencies = [
"bytes",
"futures-util",
"http",
"hyper",
"native-tls",
"rustls",
"tokio",
"tokio-native-tls",
"tokio-rustls",
]
[[package]]
@@ -794,7 +770,7 @@ dependencies = [
[[package]]
name = "lux"
version = "0.1.0"
version = "0.1.5"
dependencies = [
"lsp-server",
"lsp-types",
@@ -843,23 +819,6 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "native-tls"
version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d5d26952a508f321b4d3d2e80e78fc2603eaefcdf0c30783867f19586518bdc"
dependencies = [
"libc",
"log",
"openssl",
"openssl-probe",
"openssl-sys",
"schannel",
"security-framework",
"security-framework-sys",
"tempfile",
]
[[package]]
name = "nibble_vec"
version = "0.1.0"
@@ -905,50 +864,6 @@ version = "1.21.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42f5e15c9953c5e4ccceeb2e7382a716482c34515315f7b03532b8b4e8393d2d"
[[package]]
name = "openssl"
version = "0.10.75"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "08838db121398ad17ab8531ce9de97b244589089e290a384c900cb9ff7434328"
dependencies = [
"bitflags 2.10.0",
"cfg-if",
"foreign-types",
"libc",
"once_cell",
"openssl-macros",
"openssl-sys",
]
[[package]]
name = "openssl-macros"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "openssl-probe"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c87def4c32ab89d880effc9e097653c8da5d6ef28e6b539d313baaacfbafcbe"
[[package]]
name = "openssl-sys"
version = "0.9.111"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82cab2d520aa75e3c58898289429321eb788c3106963d0dc886ec7a5f4adc321"
dependencies = [
"cc",
"libc",
"pkg-config",
"vcpkg",
]
[[package]]
name = "parking_lot"
version = "0.12.5"
@@ -1203,15 +1118,15 @@ dependencies = [
"http",
"http-body",
"hyper",
"hyper-tls",
"hyper-rustls",
"ipnet",
"js-sys",
"log",
"mime",
"native-tls",
"once_cell",
"percent-encoding",
"pin-project-lite",
"rustls",
"rustls-pemfile",
"serde",
"serde_json",
@@ -1219,15 +1134,30 @@ dependencies = [
"sync_wrapper",
"system-configuration",
"tokio",
"tokio-native-tls",
"tokio-rustls",
"tower-service",
"url",
"wasm-bindgen",
"wasm-bindgen-futures",
"web-sys",
"webpki-roots",
"winreg",
]
[[package]]
name = "ring"
version = "0.17.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a4689e6c2294d81e88dc6261c768b63bc4fcdb852be6d1352498b114f61383b7"
dependencies = [
"cc",
"cfg-if",
"getrandom 0.2.17",
"libc",
"untrusted",
"windows-sys 0.52.0",
]
[[package]]
name = "rusqlite"
version = "0.31.0"
@@ -1252,7 +1182,19 @@ dependencies = [
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.61.2",
"windows-sys 0.59.0",
]
[[package]]
name = "rustls"
version = "0.21.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f56a14d1f48b391359b22f731fd4bd7e43c97f3c50eee276f3aa09c94784d3e"
dependencies = [
"log",
"ring",
"rustls-webpki",
"sct",
]
[[package]]
@@ -1264,6 +1206,16 @@ dependencies = [
"base64 0.21.7",
]
[[package]]
name = "rustls-webpki"
version = "0.101.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8b6275d1ee7a1cd780b64aca7726599a1dbc893b1e64144529e55c3c2f745765"
dependencies = [
"ring",
"untrusted",
]
[[package]]
name = "rustversion"
version = "1.0.22"
@@ -1298,15 +1250,6 @@ version = "1.0.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9774ba4a74de5f7b1c1451ed6cd5285a32eddb5cccb8cc655a4e50009e06477f"
[[package]]
name = "schannel"
version = "0.1.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "scopeguard"
version = "1.2.0"
@@ -1314,26 +1257,13 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "security-framework"
version = "3.6.0"
name = "sct"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d17b898a6d6948c3a8ee4372c17cb384f90d2e6e912ef00895b14fd7ab54ec38"
checksum = "da046153aa2352493d6cb7da4b6e5c0c057d8a1d0a9aa8560baffdd945acd414"
dependencies = [
"bitflags 2.10.0",
"core-foundation 0.10.1",
"core-foundation-sys",
"libc",
"security-framework-sys",
]
[[package]]
name = "security-framework-sys"
version = "2.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "321c8673b092a9a42605034a9879d73cb79101ed5fd117bc9a597b89b4e9e61a"
dependencies = [
"core-foundation-sys",
"libc",
"ring",
"untrusted",
]
[[package]]
@@ -1521,7 +1451,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ba3a3adc5c275d719af8cb4272ea1c4a6d668a777f37e115f6d11ddbc1c8e0e7"
dependencies = [
"bitflags 1.3.2",
"core-foundation 0.9.4",
"core-foundation",
"system-configuration-sys",
]
@@ -1545,7 +1475,7 @@ dependencies = [
"getrandom 0.4.1",
"once_cell",
"rustix",
"windows-sys 0.61.2",
"windows-sys 0.59.0",
]
[[package]]
@@ -1619,16 +1549,6 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "tokio-native-tls"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
dependencies = [
"native-tls",
"tokio",
]
[[package]]
name = "tokio-postgres"
version = "0.7.16"
@@ -1655,6 +1575,16 @@ dependencies = [
"whoami",
]
[[package]]
name = "tokio-rustls"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c28327cf380ac148141087fbfb9de9d7bd4e84ab5d2c28fbc911d753de8a7081"
dependencies = [
"rustls",
"tokio",
]
[[package]]
name = "tokio-util"
version = "0.7.18"
@@ -1750,6 +1680,12 @@ version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
[[package]]
name = "untrusted"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8ecb6da28b8a351d773b68d5825ac39017e680750f980f3a1a85cd8dd28a47c1"
[[package]]
name = "url"
version = "2.5.8"
@@ -1941,6 +1877,12 @@ dependencies = [
"wasm-bindgen",
]
[[package]]
name = "webpki-roots"
version = "0.25.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5f20c57d8d7db6d3b86154206ae5d8fba62dd39573114de97c2cb0578251f8e1"
[[package]]
name = "whoami"
version = "2.1.1"


@@ -1,6 +1,6 @@
[package]
name = "lux"
version = "0.1.0"
version = "0.1.6"
edition = "2021"
description = "A functional programming language with first-class effects, schema evolution, and behavioral types"
license = "MIT"
@@ -13,7 +13,7 @@ lsp-types = "0.94"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
rand = "0.8"
reqwest = { version = "0.11", features = ["blocking", "json"] }
reqwest = { version = "0.11", default-features = false, features = ["blocking", "json", "rustls-tls"] }
tiny_http = "0.12"
rusqlite = { version = "0.31", features = ["bundled"] }
postgres = "0.19"

PACKAGES.md

@@ -0,0 +1,367 @@
# Lux Package Ecosystem Plan
## Current State
### Stdlib (built-in)
| Module | Coverage |
|--------|----------|
| String | Comprehensive (split, join, trim, indexOf, replace, etc.) |
| List | Good (map, filter, fold, head, tail, concat, range, find, any, all, take, drop) |
| Option | Basic (map, flatMap, getOrElse, isSome, isNone) |
| Result | Basic (map, flatMap, getOrElse, isOk, isErr) |
| Math | Basic (abs, min, max, sqrt, pow, floor, ceil, round) |
| Json | Comprehensive (parse, stringify, get, typed extractors, constructors) |
| File | Good (read, write, append, exists, delete, readDir, isDir, mkdir) |
| Console | Good (print, read, readLine, readInt) |
| Process | Good (exec, execStatus, env, args, exit, cwd) |
| Http | Basic (get, post, put, delete, setHeader) |
| HttpServer | Basic (listen, accept, respond) |
| Time | Minimal (now, sleep) |
| Random | Basic (int, float, bool) |
| Sql | Good (SQLite: open, query, execute, transactions) |
| Postgres | Good (connect, query, execute, transactions) |
| Schema | Niche (versioned data migration) |
| Test | Good (assert, assertEqual, assertTrue) |
| Concurrent | Experimental (spawn, await, yield, cancel) |
| Channel | Experimental (create, send, receive) |
### Registry (pkgs.lux) - 3 packages
| Package | Version | Notes |
|---------|---------|-------|
| json | 1.0.0 | Wraps stdlib Json with convenience functions (getPath, getString, etc.) |
| http-client | 0.1.0 | Wraps stdlib Http with JSON helpers, URL encoding |
| testing | 0.1.0 | Wraps stdlib Test with describe/it structure |
---
## Gap Analysis
### What's Missing vs Other Languages
Compared to ecosystems like Rust/cargo, Go, Python, Elm, Gleam:
| Category | Gap | Impact | Notes |
|----------|-----|--------|-------|
| **Collections** | No HashMap, Set, Queue, Stack | Critical | List-of-pairs with O(n) lookup is the only option |
| **Sorting** | No List.sort or List.sortBy | High | Must implement insertion sort manually |
| **Date/Time** | Only `Time.now()` (epoch ms), no parsing/formatting | High | blu-site does string-based date formatting manually |
| **Markdown** | No markdown parser | High | blu-site has 300+ lines of hand-rolled markdown |
| **XML/RSS** | No XML generation | High | Can't generate RSS feeds or sitemaps |
| **Regex** | No pattern matching on strings | High | Character-by-character scanning required |
| **Path** | No file path utilities | Medium | basename/dirname manually reimplemented |
| **YAML/TOML** | No config file parsing (beyond JSON) | Medium | Frontmatter parsing is manual |
| **Template** | No string templating | Medium | HTML built via raw string concatenation |
| **URL** | No URL parsing/encoding | Medium | http-client has basic urlEncode but no parser |
| **Crypto** | No hashing (SHA256, etc.) | Medium | Can't do checksums, content hashing |
| **Base64** | No encoding/decoding | Low | Needed for data URIs, some auth |
| **CSV** | No CSV parsing | Low | Common data format |
| **UUID** | No UUID generation | Low | Useful for IDs |
| **Logging** | No structured logging | Low | Just Console.print |
| **CLI** | No argument parsing library | Low | Manual arg handling |
### What Should Be Stdlib vs Package
**Should be stdlib additions** (too fundamental to be packages):
- HashMap / Map type (requires runtime support)
- List.sort / List.sortBy (fundamental operation)
- Better Time module (date parsing, formatting)
- Regex (needs runtime/C support for performance)
- Path module (cross-platform file path handling)
**Should be packages** (application-level, opinionated, composable):
- markdown
- xml
- rss/atom
- frontmatter
- template
- csv
- crypto
- ssg (static site generator framework)
---
## Priority Package Plans
Ordered by what unblocks blu-site fixes first, then general ecosystem value.
---
### Package 1: `markdown` (Priority: HIGHEST)
**Why:** The 300-line markdown parser in blu-site's main.lux is general-purpose code that belongs in a reusable package. It's also the most complex part of blu-site and has known bugs (e.g., `### ` inside list items renders literally).
**Scope:**
```
markdown/
  lux.toml
  lib.lux          # Public API: parse, parseInline
  src/
    inline.lux     # Inline parsing (bold, italic, links, images, code)
    block.lux      # Block parsing (headings, lists, code blocks, blockquotes, hr)
    types.lux      # AST types (optional - could emit HTML directly)
```
**Public API:**
```lux
// Convert markdown string to HTML string
pub fn toHtml(markdown: String): String
// Convert inline markdown only (no blocks)
pub fn inlineToHtml(text: String): String
// Escape HTML entities
pub fn escapeHtml(s: String): String
```
**Improvements over current blu-site code:**
- Fix heading-inside-list-item rendering (`- ### Title` should work)
- Support nested lists (currently flat only)
- Support reference-style links `[text][ref]`
- Handle edge cases (empty lines in code blocks, nested blockquotes)
- Proper HTML entity escaping in more contexts
**Depends on:** Nothing (pure string processing)
**Estimated size:** ~400-500 lines of Lux
---
### Package 2: `xml` (Priority: HIGH)
**Why:** Needed for RSS/Atom feed generation, sitemap.xml, and robots.txt generation. A general-purpose XML builder that only emits XML; it doesn't try to parse it (which would need regex).
**Scope:**
```
xml/
  lux.toml
  lib.lux  # Public API: element, document, serialize
```
**Public API:**
```lux
type XmlNode =
  | Element(String, List<XmlAttr>, List<XmlNode>)
  | Text(String)
  | CData(String)
  | Comment(String)
  | Declaration(String, String)  // version, encoding

type XmlAttr =
  | Attr(String, String)
// Build an XML element
pub fn element(tag: String, attrs: List<XmlAttr>, children: List<XmlNode>): XmlNode
// Build a text node (auto-escapes)
pub fn text(content: String): XmlNode
// Build a CDATA section
pub fn cdata(content: String): XmlNode
// Serialize XML tree to string
pub fn serialize(node: XmlNode): String
// Serialize with XML declaration header
pub fn document(version: String, encoding: String, root: XmlNode): String
// Convenience: self-closing element
pub fn selfClosing(tag: String, attrs: List<XmlAttr>): XmlNode
```
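A hypothetical call site for this API, to show how the constructors compose. Nothing here is implemented yet; the list-literal syntax follows the `["Alice", "Bob"]` form used elsewhere in these docs:

```lux
// Build <channel><title>My Feed</title><link href="https://example.com"/></channel>
let node = element("channel", [], [
  element("title", [], [text("My Feed")]),
  selfClosing("link", [Attr("href", "https://example.com")])
])
let xmlString = serialize(node)
```

Since `text` auto-escapes, callers never hand-write `&amp;`-style entities; `cdata` is the escape hatch for raw content.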
**Depends on:** Nothing
**Estimated size:** ~150-200 lines
---
### Package 3: `rss` (Priority: HIGH)
**Why:** Directly needed for blu-site's #6 priority fix (add RSS feed). Builds on `xml` package.
**Scope:**
```
rss/
  lux.toml  # depends on xml
  lib.lux   # Public API: feed, item, toXml, toAtom
```
**Public API:**
```lux
type FeedInfo =
  | FeedInfo(String, String, String, String, String)
  // title, link, description, language, lastBuildDate

type FeedItem =
  | FeedItem(String, String, String, String, String, String)
  // title, link, description, pubDate, guid, categories (comma-separated)
// Generate RSS 2.0 XML string
pub fn toRss(info: FeedInfo, items: List<FeedItem>): String
// Generate Atom 1.0 XML string
pub fn toAtom(info: FeedInfo, items: List<FeedItem>): String
```
**Depends on:** `xml`
**Estimated size:** ~100-150 lines
---
### Package 4: `frontmatter` (Priority: HIGH)
**Why:** blu-site has ~50 lines of fragile frontmatter parsing. This is a common need for any content-driven Lux project. The current parser uses `String.indexOf(line, ": ")`, which breaks on values containing `: `.
**Scope:**
```
frontmatter/
  lux.toml
  lib.lux  # Public API: parse
```
**Public API:**
```lux
type FrontmatterResult =
  | FrontmatterResult(List<(String, String)>, String)
  // key-value pairs, remaining body
// Parse frontmatter from a string (--- delimited YAML-like header)
pub fn parse(content: String): FrontmatterResult
// Get a value by key from parsed frontmatter
pub fn get(pairs: List<(String, String)>, key: String): Option<String>
// Get a value or default
pub fn getOrDefault(pairs: List<(String, String)>, key: String, default: String): String
// Parse a space-separated tag string into a list
pub fn parseTags(tagString: String): List<String>
```
**Improvements over current blu-site code:**
- Handle values with `: ` in them (only split on first `: `)
- Handle multi-line values (indented continuation)
- Handle quoted values with embedded newlines
- Strip quotes from values consistently
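A sketch of the first fix: split only on the first `: ` by splitting on every occurrence and re-joining the tail. The argument orders of `Option.getOrElse` and `String.join`, `List.head` returning an `Option`, and the tuple literal are all assumptions about stdlib details, not confirmed API:

```lux
// Split "title: Notes: a year in review" into ("title", "Notes: a year in review").
// Re-joining the tail on ": " keeps later colons intact.
fn splitFirst(line: String): (String, String) = {
  let parts = String.split(line, ": ")
  let key = Option.getOrElse(List.head(parts), line)
  let value = String.join(List.tail(parts), ": ")
  (key, value)
}
```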
**Depends on:** Nothing
**Estimated size:** ~100-150 lines
---
### Package 5: `path` (Priority: MEDIUM)
**Why:** blu-site manually implements `basename` and `dirname`. Any file-processing Lux program needs these. Tiny but universally useful.
**Scope:**
```
path/
  lux.toml
  lib.lux
```
**Public API:**
```lux
// Get filename from path: "/foo/bar.txt" -> "bar.txt"
pub fn basename(p: String): String
// Get directory from path: "/foo/bar.txt" -> "/foo"
pub fn dirname(p: String): String
// Get file extension: "file.txt" -> "txt", "file" -> ""
pub fn extension(p: String): String
// Remove file extension: "file.txt" -> "file"
pub fn stem(p: String): String
// Join path segments: join("foo", "bar") -> "foo/bar"
pub fn join(a: String, b: String): String
// Normalize path: "foo//bar/../baz" -> "foo/baz"
pub fn normalize(p: String): String
// Check if path is absolute
pub fn isAbsolute(p: String): Bool
```
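Most of these are one-liners over `String.split`. A sketch of `basename` under stated assumptions: `List.last` is not in the stdlib table, so it is hypothetical here (a fold over the segments would substitute), as is `Option.getOrElse`'s argument order:

```lux
// "/foo/bar.txt" -> "bar.txt"; a path with no "/" returns itself.
// (Trailing-slash handling omitted for brevity.)
fn basename(p: String): String =
  Option.getOrElse(List.last(String.split(p, "/")), p)
```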
**Depends on:** Nothing
**Estimated size:** ~80-120 lines
---
### Package 6: `sitemap` (Priority: MEDIUM)
**Why:** Directly needed for blu-site's #9 priority fix. Simple package that generates sitemap.xml.
**Scope:**
```
sitemap/
  lux.toml  # depends on xml
  lib.lux
```
**Public API:**
```lux
type SitemapEntry =
  | SitemapEntry(String, String, String, String)
  // url, lastmod (ISO date), changefreq, priority
// Generate sitemap.xml string
pub fn generate(entries: List<SitemapEntry>): String
// Generate a simple robots.txt pointing to the sitemap
pub fn robotsTxt(sitemapUrl: String): String
```
**Depends on:** `xml`
**Estimated size:** ~50-70 lines
---
### Package 7: `ssg` (Priority: LOW - future)
**Why:** Once markdown, frontmatter, rss, sitemap, and path packages exist, the remaining logic in blu-site's main.lux is generic SSG framework code: read content dirs, parse posts, sort by date, generate section indexes, generate tag pages, copy static assets. This could be extracted into a framework package that other Lux users could use to build their own static sites.
**This should wait** until the foundation packages above are stable and battle-tested through blu-site usage.
---
## Non-Package Stdlib Improvements Needed
These gaps are too fundamental to be packages and should be added to the Lux language itself:
### HashMap (Critical)
Every package above that needs key-value lookups (frontmatter, xml attributes, etc.) is working around the lack of HashMap with `List<(String, String)>`. This is O(n) per lookup and makes code verbose. A stdlib `Map` module would transform the ecosystem.
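The workaround in question looks like this — a sketch built only from functions the stdlib table lists (`List.find`, `Option.map`), with tuple-field access (`.0`/`.1`) and argument orders assumed:

```lux
// O(n) per lookup: scan the whole pair list for a matching key.
fn lookup(pairs: List<(String, String)>, key: String): Option<String> =
  Option.map(
    List.find(pairs, fn(p: (String, String)): Bool => p.0 == key),
    fn(p: (String, String)): String => p.1
  )
```

A stdlib `Map` would collapse this to a single `Map.get(m, key)` with hashed lookup.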
### List.sort / List.sortBy (High)
blu-site implements insertion sort manually. Every content-driven app needs sorting. This should be a stdlib function.
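Hypothetical signatures for the proposed additions (illustrative only — neither exists today, and the `fn(T): Int` function-type syntax is an assumption, since the docs only show lambda literals):

```lux
// Proposed stdlib additions.
pub fn sort(list: List<Int>): List<Int>
// sortBy sidesteps comparability constraints by mapping elements to Int keys.
pub fn sortBy<T>(list: List<T>, key: fn(T): Int): List<T>
```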
### Time.format / Time.parse (High)
blu-site manually parses "2025-01-15" by substring extraction and maps month numbers to names. A proper date/time library (even just ISO 8601 parsing and basic formatting) would help every package above.
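In the same spirit, hypothetical `Time` additions that would replace the manual substring work (names and shapes are illustrative, not planned API):

```lux
// ISO 8601 date handling — proposed, not implemented.
pub fn parseIso(date: String): Option<Int>   // "2025-01-15" -> epoch ms at midnight UTC
pub fn formatIso(epochMs: Int): String       // epoch ms -> "2025-01-15"
pub fn monthName(month: Int): String         // 1 -> "January"
```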
---
## Implementation Order
```
Phase 1 (unblock blu-site fixes):
  1. markdown    - extract from blu-site, fix bugs, publish
  2. frontmatter - extract from blu-site, improve robustness
  3. path        - tiny, universally useful
  4. xml         - needed by rss and sitemap

Phase 2 (complete blu-site features):
  5. rss         - depends on xml
  6. sitemap     - depends on xml

Phase 3 (ecosystem growth):
  7. template    - string templating (mustache-like)
  8. csv         - data processing
  9. cli         - argument parsing
  10. ssg        - framework extraction from blu-site
```
Each package should be developed in its own directory under `~/src/`, published to the git.qrty.ink registry, and tested by integrating it into blu-site.

README.md

@@ -2,15 +2,22 @@
 A functional programming language with first-class effects, schema evolution, and behavioral types.
-## Vision
+## Philosophy
-Most programming languages treat three critical concerns as afterthoughts:
+**Make the important things visible.**
-1. **Effects** — What can this code do? (Hidden, untraceable, untestable)
-2. **Data Evolution** — Types change, data persists. (Manual migrations, runtime failures)
-3. **Behavioral Properties** — Is this idempotent? Does it terminate? (Comments and hope)
+Most languages hide what matters most: what code can do (effects), how data changes over time (schema evolution), and what guarantees functions provide (behavioral properties). Lux makes all three first-class, compiler-checked language features.
-Lux makes these first-class language features. The compiler knows what your code does, how your data evolves, and what properties your functions guarantee.
+| Principle | What it means |
+|-----------|--------------|
+| **Explicit over implicit** | Effects in types — see what code does |
+| **Composition over configuration** | No DI frameworks — effects compose naturally |
+| **Safety without ceremony** | Type inference + explicit signatures where they matter |
+| **Practical over academic** | Familiar syntax, ML semantics, no monads |
+| **One right way** | Opinionated formatter, integrated tooling, built-in test framework |
+| **Tools are the language** | `lux fmt/lint/check/test/compile` — one binary, not seven tools |
+See [docs/PHILOSOPHY.md](./docs/PHILOSOPHY.md) for the full philosophy with language comparisons and design rationale.
 ## Core Principles

build.rs Normal file

@@ -0,0 +1,38 @@
use std::path::PathBuf;
fn main() {
    // Capture the absolute C compiler path at build time so the binary is self-contained.
    // This is critical for Nix builds where cc/gcc live in /nix/store paths.
    let cc_path = std::env::var("CC").ok()
        .filter(|s| !s.is_empty())
        .and_then(|s| resolve_absolute(&s))
        .or_else(|| find_in_path("cc"))
        .or_else(|| find_in_path("gcc"))
        .or_else(|| find_in_path("clang"))
        .unwrap_or_default();

    println!("cargo:rustc-env=LUX_CC_PATH={}", cc_path);
    println!("cargo:rerun-if-env-changed=CC");
    println!("cargo:rerun-if-env-changed=PATH");
}

/// Resolve a command name to its absolute path by searching PATH.
fn find_in_path(cmd: &str) -> Option<String> {
    let path_var = std::env::var("PATH").ok()?;
    for dir in path_var.split(':') {
        let candidate = PathBuf::from(dir).join(cmd);
        if candidate.is_file() {
            return Some(candidate.to_string_lossy().into_owned());
        }
    }
    None
}

/// If the path is already absolute and exists, return it. Otherwise search PATH.
fn resolve_absolute(cmd: &str) -> Option<String> {
    let p = PathBuf::from(cmd);
    if p.is_absolute() && p.is_file() {
        return Some(cmd.to_string());
    }
    find_in_path(cmd)
}

docs/PHILOSOPHY.md Normal file

@@ -0,0 +1,449 @@
# The Lux Philosophy
## In One Sentence
**Make the important things visible.**
## The Three Pillars
Most programming languages hide the things that matter most in production:
1. **What can this code do?** — Side effects are invisible in function signatures
2. **How does data change over time?** — Schema evolution is a deployment problem, not a language one
3. **What guarantees does this code provide?** — Properties like idempotency live in comments and hope
Lux makes all three first-class, compiler-checked language features.
---
## Core Principles
### 1. Explicit Over Implicit
Every function signature tells you what it does:
```lux
fn processOrder(order: Order): Receipt with {Database, Email, Logger}
```
You don't need to read the body, trace call chains, or check documentation. The signature *is* the documentation. Code review becomes: "should this function really send emails?"
**What this means in practice:**
- Effects are declared in types, not hidden behind interfaces
- No dependency injection frameworks — just swap handlers
- No mocking libraries — test with different effect implementations
- No "spooky action at a distance" — if a function can fail, its type says so
**How this compares:**
| Language | Side effects | Lux equivalent |
|----------|-------------|----------------|
| JavaScript | Anything, anywhere, silently | `with {Console, Http, File}` |
| Python | Implicit, discovered by reading code | Effect declarations in signature |
| Java | Checked exceptions (partial), DI frameworks | Effects + handlers |
| Go | Return error values (partial) | `with {Fail}` or `Result` |
| Rust | `unsafe` blocks, `Result`/`Option` | Effects for I/O, Result for values |
| Haskell | Monad transformers (explicit but heavy) | Effects (explicit and lightweight) |
| Koka | Algebraic effects (similar) | Same family, more familiar syntax |
### 2. Composition Over Configuration
Things combine naturally without glue code:
```lux
// Multiple effects compose by listing them
fn sync(id: UserId): User with {Database, Http, Logger} = ...
// Handlers compose by providing them
run sync(id) with {
  Database = postgres(conn),
  Http = realHttp,
  Logger = consoleLogger
}
```
No monad transformers. No middleware stacks. No factory factories. Effects are sets; they union naturally.
**What this means in practice:**
- Functions compose with `|>` (pipes)
- Effects compose by set union
- Types compose via generics and ADTs
- Tests compose by handler substitution
### 3. Safety Without Ceremony
The type system catches errors at compile time, but doesn't make you fight it:
```lux
// Type inference keeps code clean
let x = 42 // Int, inferred
let names = ["Alice", "Bob"] // List<String>, inferred
// But function signatures are always explicit
fn greet(name: String): String = "Hello, {name}"
```
**The balance:**
- Function signatures: always annotated (documentation + API contract)
- Local bindings: inferred (reduces noise in implementation)
- Effects: declared or inferred (explicit at boundaries, lightweight inside)
- Behavioral properties: opt-in (`is pure`, `is total` — add when valuable)
### 4. Practical Over Academic
Lux borrows from the best of programming language research, but wraps it in familiar syntax:
```lux
// This is algebraic effects. But it reads like normal code.
fn main(): Unit with {Console} = {
  Console.print("What's your name?")
  let name = Console.readLine()
  Console.print("Hello, {name}!")
}
```
Compare with Haskell's equivalent:
```haskell
main :: IO ()
main = do
  putStrLn "What's your name?"
  name <- getLine
  putStrLn ("Hello, " ++ name ++ "!")
```
Both are explicit about effects. Lux chooses syntax that reads like imperative code while maintaining the same guarantees.
**What this means in practice:**
- ML-family semantics, C-family appearance
- No monads to learn (effects replace them)
- No category theory prerequisites
- The learning curve is: functions → types → effects (days, not months)
### 5. One Right Way
Like Go and Python, Lux favors having one obvious way to do things:
- **One formatter** (`lux fmt`) — opinionated, not configurable, ends all style debates
- **One test framework** (built-in `Test` effect) — no framework shopping
- **One way to handle effects** — declare, handle, compose
- **One package manager** (`lux pkg`) — integrated, not bolted on
This is a deliberate rejection of the JavaScript/Ruby approach where every project assembles its own stack from dozens of competing libraries.
### 6. Tools Are Part of the Language
The compiler, linter, formatter, LSP, package manager, and test runner are one thing, not seven:
```bash
lux fmt # Format
lux lint # Lint (with --explain for education)
lux check # Type check + lint
lux test # Run tests
lux compile # Build a binary
lux serve # Serve files
lux --lsp # Editor integration
```
This follows Go's philosophy: a language is its toolchain. The formatter knows the AST. The linter knows the type system. The LSP knows the effects. They're not afterthoughts.
---
## Design Decisions and Their Reasons
### Why algebraic effects instead of monads?
Monads are powerful but have poor ergonomics for composition. Combining `IO`, `State`, and `Error` in Haskell requires monad transformers — a notoriously difficult concept. Effects compose naturally:
```lux
// Just list the effects you need. No transformers.
fn app(): Unit with {Console, File, Http, Time} = ...
```
### Why not just `async/await`?
`async/await` solves one effect (concurrency). Effects solve all of them: I/O, state, randomness, failure, concurrency, logging, databases. One mechanism, universally applicable.
### Why require function type annotations?
Three reasons:
1. **Documentation**: Every function signature is self-documenting
2. **Error messages**: Inference failures produce confusing errors; annotations localize them
3. **API stability**: Changing a function body shouldn't silently change its type
### Why an opinionated formatter?
Style debates waste engineering time. `gofmt` proved that an opinionated, non-configurable formatter eliminates an entire category of bikeshedding. `lux fmt` does the same.
### Why immutable by default?
Mutable state is the root of most concurrency bugs and many logic bugs. Immutability makes code easier to reason about. When you need state, the `State` effect makes it explicit and trackable.
### Why behavioral types?
Properties like "this function is idempotent" or "this function always terminates" are critical for correctness but typically live in comments. Making them part of the type system means:
- The compiler can verify them (or generate property tests)
- Callers can require them (`where F is idempotent`)
- They serve as machine-readable documentation
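Putting the two quoted forms together — `is` declarations on definitions and `where` requirements at call boundaries. The `...` elision follows this document's other sketches; the generic-parameter shape is an assumption:

```lux
// A property declared where the function is defined...
fn chargeCard(req: ChargeRequest): Receipt with {Http} is idempotent = ...

// ...and required where it is used: retrying is only safe if op is idempotent.
fn withRetry<F>(op: F, attempts: Int): Receipt with {Http} where F is idempotent = ...
```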
---
## Comparison with Popular Languages
### JavaScript / TypeScript (SO #1 / #6 by usage)
| Aspect | JavaScript/TypeScript | Lux |
|--------|----------------------|-----|
| **Type system** | Optional/gradual (TS) | Required, Hindley-Milner |
| **Side effects** | Anywhere, implicit | Declared in types |
| **Testing** | Mock libraries (Jest, etc.) | Swap effect handlers |
| **Formatting** | Prettier (configurable) | `lux fmt` (opinionated) |
| **Package management** | npm (massive ecosystem) | `lux pkg` (small ecosystem) |
| **Paradigm** | Multi-paradigm | Functional-first |
| **Null safety** | Optional chaining (partial) | `Option<T>`, no null |
| **Error handling** | try/catch (unchecked) | `Result<T, E>` + `Fail` effect |
| **Shared** | Familiar syntax, first-class functions, closures, string interpolation |
**What Lux learns from JS/TS:** Familiar syntax matters. String interpolation, arrow functions, and readable code lower the barrier to entry.
**What Lux rejects:** Implicit `any`, unchecked exceptions, the "pick your own adventure" toolchain.
### Python (SO #4 by usage, #1 most desired)
| Aspect | Python | Lux |
|--------|--------|-----|
| **Type system** | Optional (type hints) | Required, static |
| **Side effects** | Implicit | Explicit |
| **Performance** | Slow (interpreted) | Faster (compiled to C) |
| **Syntax** | Whitespace-significant | Braces/keywords |
| **Immutability** | Mutable by default | Immutable by default |
| **Tooling** | Fragmented (black, ruff, mypy, pytest...) | Unified (`lux` binary) |
| **Shared** | Clean syntax philosophy, "one way to do it", readability focus |
**What Lux learns from Python:** Readability counts. The Zen of Python's emphasis on one obvious way to do things resonates with Lux's design.
**What Lux rejects:** Dynamic typing, mutable-by-default, fragmented tooling.
### Rust (SO #1 most admired)
| Aspect | Rust | Lux |
|--------|------|-----|
| **Memory** | Ownership/borrowing (manual) | Reference counting (automatic) |
| **Type system** | Traits, generics, lifetimes | ADTs, effects, generics |
| **Side effects** | Implicit (except `unsafe`) | Explicit (effect system) |
| **Error handling** | `Result<T, E>` + `?` | `Result<T, E>` + `Fail` effect |
| **Performance** | Zero-cost, systems-level | Good, not systems-level |
| **Learning curve** | Steep (ownership) | Moderate (effects) |
| **Pattern matching** | Excellent, exhaustive | Excellent, exhaustive |
| **Shared** | ADTs, pattern matching, `Option`/`Result`, no null, immutable by default, strong type system |
**What Lux learns from Rust:** ADTs with exhaustive matching, `Option`/`Result` instead of null/exceptions, excellent error messages, integrated tooling (cargo model).
**What Lux rejects:** Ownership complexity (Lux uses GC/RC instead), lifetimes, `unsafe`.
### Go (SO #13 by usage, #11 most admired)
| Aspect | Go | Lux |
|--------|-----|-----|
| **Type system** | Structural, simple | HM inference, ADTs |
| **Side effects** | Implicit | Explicit |
| **Error handling** | Multiple returns (`val, err`) | `Result<T, E>` + effects |
| **Formatting** | `gofmt` (opinionated) | `lux fmt` (opinionated) |
| **Tooling** | All-in-one (`go` binary) | All-in-one (`lux` binary) |
| **Concurrency** | Goroutines + channels | `Concurrent` + `Channel` effects |
| **Generics** | Added late, limited | First-class from day one |
| **Shared** | Opinionated formatter, unified tooling, practical philosophy |
**What Lux learns from Go:** Unified toolchain, opinionated formatting, simplicity as a feature, fast compilation.
**What Lux rejects:** Verbose error handling (`if err != nil`), no ADTs, no generics (historically), nil.
### Java / C# (SO #7 / #8 by usage)
| Aspect | Java/C# | Lux |
|--------|---------|-----|
| **Paradigm** | OOP-first | FP-first |
| **Effects** | DI frameworks (Spring, etc.) | Language-level effects |
| **Testing** | Mock frameworks (Mockito, etc.) | Handler swapping |
| **Null safety** | Nullable (Java), nullable ref types (C#) | `Option<T>` |
| **Boilerplate** | High (getters, setters, factories) | Low (records, inference) |
| **Shared** | Static typing, generics, pattern matching (recent), established ecosystems |
**What Lux learns from Java/C#:** Enterprise needs (database effects, HTTP, serialization) matter. Testability is a first-class concern.
**What Lux rejects:** OOP ceremony, DI frameworks, null, boilerplate.
### Haskell / OCaml / Elm (FP family)
| Aspect | Haskell | Elm | Lux |
|--------|---------|-----|-----|
| **Effects** | Monads + transformers | Cmd/Sub (Elm Architecture) | Algebraic effects |
| **Learning curve** | Steep | Moderate | Moderate |
| **Error messages** | Improving | Excellent | Good (aspiring to Elm-quality) |
| **Practical focus** | Academic-leaning | Web-focused | General-purpose |
| **Syntax** | Unique | Unique | Familiar (C-family feel) |
| **Shared** | Immutability, ADTs, pattern matching, type inference, no null |
**What Lux learns from Haskell:** Effects must be explicit. Types must be powerful. Purity matters.
**What Lux learns from Elm:** Error messages should teach. Tooling should be integrated. Simplicity beats power.
**What Lux rejects (from Haskell):** Monad transformers, academic syntax, steep learning curve.
### Gleam / Elixir (SO #2 / #3 most admired, 2025)
| Aspect | Gleam | Elixir | Lux |
|--------|-------|--------|-----|
| **Type system** | Static, HM | Dynamic | Static, HM |
| **Effects** | No special tracking | Implicit | First-class |
| **Concurrency** | BEAM (built-in) | BEAM (built-in) | Effect-based |
| **Error handling** | `Result` | Pattern matching on tuples | `Result` + `Fail` effect |
| **Shared** | Friendly errors, pipe operator, functional style, immutability |
**What Lux learns from Gleam:** Friendly developer experience, clear error messages, and pragmatic FP resonate with developers.
---
## Tooling Philosophy Audit
### Does the linter follow the philosophy?
**Yes, strongly.** The linter embodies "make the important things visible":
- `could-be-pure`: Nudges users toward declaring purity — making guarantees visible
- `could-be-total`: Same for termination
- `unnecessary-effect-decl`: Keeps effect signatures honest — don't claim effects you don't use
- `unused-variable/import/function`: Keeps code focused — everything visible should be meaningful
- `single-arm-match` / `manual-map-option`: Teaches idiomatic patterns
The category system (correctness > suspicious > idiom > style > pedantic) reflects the philosophy of being practical, not academic: real bugs are errors, style preferences are opt-in.
### Does the formatter follow the philosophy?
**Yes, with one gap.** The formatter is opinionated and non-configurable, matching the "one right way" principle. It enforces consistent style across all Lux code.
**Gap:** `max_width` and `trailing_commas` are declared in `FormatConfig` but never used. This is harmless but inconsistent — either remove the unused config or implement line wrapping.
### Does the type checker follow the philosophy?
**Yes.** The type checker embodies every core principle:
- Effects are tracked and verified in function types
- Behavioral properties are checked where possible
- Error messages include context and suggestions
- Type inference reduces ceremony while maintaining safety
---
## What Could Be Improved
### High-value additions (improve experience significantly, low verbosity cost)
1. **Pipe-friendly standard library**
- Currently: `List.map(myList, fn(x: Int): Int => x * 2)`
- Better: Allow `myList |> List.map(fn(x: Int): Int => x * 2)`
- Many languages (Elixir, F#, Gleam) make the pipe operator the primary composition tool. If the first argument of stdlib functions is always the data, pipes become natural. This is a **library convention**, not a language change.
- **LLM impact:** Pipe chains are easier for LLMs to generate and read — linear data flow with no nesting.
- **Human impact:** Reduces cognitive load. Reading left-to-right matches how humans think about data transformation.
2. **Exhaustive `match` warnings for non-enum types**
- The linter warns about `wildcard-on-small-enum`, but could also warn when a match on `Option` or `Result` uses a wildcard instead of handling both cases explicitly.
- **Both audiences:** Prevents subtle bugs where new variants are silently caught by `_`.
3. **Error message improvements toward Elm quality**
- Current errors show the right information but could be more conversational and suggest fixes more consistently.
- Example improvement: When a function is called with wrong argument count, show the expected signature and highlight which argument is wrong.
- **LLM impact:** Structured error messages with clear "expected X, got Y" patterns are easier for LLMs to parse and fix.
- **Human impact:** Friendly errors reduce frustration, especially for beginners.
4. **`let ... else` for fallible destructuring**
- Rust's `let ... else` pattern handles the "unwrap or bail" case elegantly:
```lux
let Some(value) = maybeValue else return defaultValue
```
- Currently requires a full `match` expression for this common pattern.
- **Both audiences:** Reduces boilerplate for the most common Option/Result handling pattern.
5. **Trait/typeclass system for overloading**
- Currently `toString`, `==`, and similar operations are built-in. A trait system would let users define their own:
```lux
trait Show<T> { fn show(value: T): String }
impl Show<User> { fn show(u: User): String = "User({u.name})" }
```
- **Note:** This exists partially. Expanding it would enable more generic programming without losing explicitness.
- **LLM impact:** Traits provide clear, greppable contracts. LLMs can generate trait impls from examples.
### Medium-value additions (good improvements, some verbosity cost)
6. **Named arguments or builder pattern for records**
- When functions take many parameters, the linter already warns at 5+. Named arguments or record-punning would help:
```lux
fn createUser({ name, email, age }: UserConfig): User = ...
createUser({ name: "Alice", email: "alice@ex.com", age: 30 })
```
- **Trade-off:** Adds syntax, but the linter already pushes users toward records for many params.
7. **Async/concurrent effect sugar**
- The `Concurrent` effect exists but could benefit from syntactic sugar:
```lux
let (a, b) = concurrent {
  fetch("/api/users"),
  fetch("/api/posts")
}
```
- **Trade-off:** Adds syntax, but concurrent code is important enough to warrant it.
8. **Module-level documentation with `///` doc comments**
- The `missing-doc-comment` lint exists, but the doc generation system could be enhanced with richer doc comments that include examples, parameter descriptions, and effect documentation.
- **LLM impact:** Structured documentation is the single highest-value feature for LLM code understanding.
### Lower-value or risky additions (consider carefully)
9. **Type inference for function return types**
- Would reduce ceremony: `fn double(x: Int) = x * 2` instead of `fn double(x: Int): Int = x * 2`
- **Risk:** Violates the "function signatures are documentation" principle. A body change could silently change the API. Current approach is the right trade-off.
10. **Operator overloading**
- Tempting for numeric types, but quickly leads to the C++ problem where `+` could mean anything.
- **Risk:** Violates "make the important things visible" — you can't tell what `a + b` does.
- **Better:** Keep operators for built-in numeric types. Use named functions for everything else.
11. **Macros**
- Powerful, but macros drastically complicate tooling, error messages, and readability.
- **Risk:** Rust's macro system demonstrates the trade-off: enormous expressive power, but some of the worst error messages in the language.
- **Better:** Solve specific problems with targeted language features (effects, generics) rather than a general metaprogramming escape hatch.
---
## The LLM Perspective
Lux has several properties that make it unusually well-suited for LLM-assisted programming:
1. **Effect signatures are machine-readable contracts.** An LLM reading `fn f(): T with {Database, Logger}` knows exactly what capabilities to provide when generating handler code.
2. **Behavioral properties are verifiable assertions.** `is pure`, `is idempotent` give LLMs clear constraints to check their own output against.
3. **The opinionated formatter eliminates style ambiguity.** LLMs don't need to guess indentation, brace style, or naming conventions — `lux fmt` handles it.
4. **Exhaustive pattern matching forces completeness.** LLMs that generate `match` expressions are reminded by the compiler when they miss cases.
5. **Small, consistent standard library.** `List.map`, `String.split`, `Option.map` — the uniform `Module.function` convention is easy to learn from a few examples.
6. **Effect-based testing needs no framework knowledge.** An LLM doesn't need to know Jest, pytest, or JUnit — just swap handlers.
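One concrete way point 2 pays off: a declared property like `is idempotent` doubles as a test oracle. A minimal sketch in Python (the function and harness are illustrative stand-ins, not part of the Lux toolchain):

```python
def normalize_whitespace(s: str) -> str:
    # Candidate function that a generator claims is idempotent.
    return " ".join(s.split())

def check_idempotent(f, samples) -> bool:
    # Idempotence oracle: f(f(x)) must equal f(x) for every sample.
    return all(f(f(x)) == f(x) for x in samples)

samples = ["  a   b ", "x", "", " spaced  out  text "]
assert check_idempotent(normalize_whitespace, samples)
```

An LLM that emits `is idempotent` on a function can run exactly this check against its own output before proposing the code.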
**What would help LLMs more:**
- Structured error output (JSON mode) for programmatic error fixing
- Example-rich documentation that LLMs can learn patterns from
- A canonical set of "Lux patterns" (like Go's proverbs) that encode best practices in memorable form
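For the first point, a structured error mode could be as simple as one JSON object per diagnostic. A hypothetical shape, sketched in Python (every field name here is invented for illustration; no such `lux` output mode exists today):

```python
import json

# Hypothetical machine-readable diagnostic. The field names are
# assumptions for illustration, not an existing Lux format.
diagnostic = {
    "severity": "error",
    "code": "E0042",
    "message": "effect 'Database' used but not declared in signature",
    "span": {"file": "main.lux", "start": 120, "end": 135},
    "suggestion": "add 'with {Database}' to the signature",
}

# One JSON object per line is trivial for an LLM or editor to parse and act on.
print(json.dumps(diagnostic, sort_keys=True))
```

With stable codes and byte spans, a tool can map each diagnostic straight back to the source and apply the suggestion mechanically.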
---
## Summary
Lux's philosophy can be compressed to five words: **Make the important things visible.**
This manifests as:
- **Effects in types** — see what code does
- **Properties in types** — see what code guarantees
- **Versions in types** — see how data evolves
- **One tool for everything** — see how to build
- **One format for all** — see consistent style
The language is in the sweet spot between Haskell's rigor and Python's practicality, with Go's tooling philosophy and Elm's developer experience aspirations. It doesn't try to be everything — it tries to make the things that matter most in real software visible, composable, and verifiable.


@@ -14,6 +14,7 @@
pkgs = import nixpkgs { inherit system overlays; };
rustToolchain = pkgs.rust-bin.stable.latest.default.override {
extensions = [ "rust-src" "rust-analyzer" ];
targets = [ "x86_64-unknown-linux-musl" ];
};
in
{
@@ -22,8 +23,8 @@
rustToolchain
cargo-watch
cargo-edit
pkg-config
openssl
# Static builds
pkgsStatic.stdenv.cc
# Benchmark tools
hyperfine
poop
@@ -43,7 +44,7 @@
printf "\n"
printf " \033[1;35m \033[0m\n"
printf " \033[1;35m \033[0m\n"
printf " \033[1;35m \033[0m v0.1.0\n"
printf " \033[1;35m \033[0m v0.1.6\n"
printf "\n"
printf " Functional language with first-class effects\n"
printf "\n"
@@ -61,18 +62,47 @@
packages.default = pkgs.rustPlatform.buildRustPackage {
pname = "lux";
version = "0.1.0";
version = "0.1.6";
src = ./.;
cargoLock.lockFile = ./Cargo.lock;
nativeBuildInputs = [ pkgs.pkg-config ];
buildInputs = [ pkgs.openssl ];
doCheck = false;
};
# Benchmark scripts
packages.static = let
muslPkgs = import nixpkgs {
inherit system;
crossSystem = {
config = "x86_64-unknown-linux-musl";
isStatic = true;
};
};
in muslPkgs.rustPlatform.buildRustPackage {
pname = "lux";
version = "0.1.6";
src = ./.;
cargoLock.lockFile = ./Cargo.lock;
CARGO_BUILD_TARGET = "x86_64-unknown-linux-musl";
CARGO_BUILD_RUSTFLAGS = "-C target-feature=+crt-static";
doCheck = false;
postInstall = ''
$STRIP $out/bin/lux 2>/dev/null || true
'';
};
apps = {
# Release automation
release = {
type = "app";
program = toString (pkgs.writeShellScript "lux-release" ''
exec ${self}/scripts/release.sh "$@"
'');
};
# Benchmark scripts
# Run hyperfine benchmark comparison
bench = {
type = "app";


@@ -0,0 +1,225 @@
// Lux AST — Self-hosted Abstract Syntax Tree definitions
//
// Direct translation of src/ast.rs into Lux ADTs.
// These types represent the parsed structure of a Lux program.
//
// Naming conventions to avoid collisions:
// Ex = Expr variant, Pat = Pattern, Te = TypeExpr
// Td = TypeDef, Vf = VariantFields, Op = Operator
// Decl = Declaration, St = Statement
// === Source Location ===
type Span = | Span(Int, Int)
// === Identifiers ===
type Ident = | Ident(String, Span)
// === Visibility ===
type Visibility = | Public | Private
// === Schema Evolution ===
type Version = | Version(Int, Span)
type VersionConstraint =
| VcExact(Version)
| VcAtLeast(Version)
| VcLatest(Span)
// === Behavioral Types ===
type BehavioralProperty =
| BpPure
| BpTotal
| BpIdempotent
| BpDeterministic
| BpCommutative
// === Trait Bound (needed before WhereClause) ===
type TraitBound = | TraitBound(Ident, List<TypeExpr>, Span)
// === Trait Constraint (needed before WhereClause) ===
type TraitConstraint = | TraitConstraint(Ident, List<TraitBound>, Span)
// === Where Clauses ===
type WhereClause =
| WcProperty(Ident, BehavioralProperty, Span)
| WcResult(Expr, Span)
| WcTrait(TraitConstraint)
// === Module Path ===
type ModulePath = | ModulePath(List<Ident>, Span)
// === Import ===
// path, alias, items, wildcard, span
type ImportDecl = | ImportDecl(ModulePath, Option<Ident>, Option<List<Ident>>, Bool, Span)
// === Program ===
type Program = | Program(List<ImportDecl>, List<Declaration>)
// === Declarations ===
type Declaration =
| DeclFunction(FunctionDecl)
| DeclEffect(EffectDecl)
| DeclType(TypeDecl)
| DeclHandler(HandlerDecl)
| DeclLet(LetDecl)
| DeclTrait(TraitDecl)
| DeclImpl(ImplDecl)
// === Parameter ===
type Parameter = | Parameter(Ident, TypeExpr, Span)
// === Effect Operation ===
type EffectOp = | EffectOp(Ident, List<Parameter>, TypeExpr, Span)
// === Record Field ===
type RecordField = | RecordField(Ident, TypeExpr, Span)
// === Variant Fields ===
type VariantFields =
| VfUnit
| VfTuple(List<TypeExpr>)
| VfRecord(List<RecordField>)
// === Variant ===
type Variant = | Variant(Ident, VariantFields, Span)
// === Migration ===
type Migration = | Migration(Version, Expr, Span)
// === Handler Impl ===
// op_name, params, resume, body, span
type HandlerImpl = | HandlerImpl(Ident, List<Ident>, Option<Ident>, Expr, Span)
// === Impl Method ===
// name, params, return_type, body, span
type ImplMethod = | ImplMethod(Ident, List<Parameter>, Option<TypeExpr>, Expr, Span)
// === Trait Method ===
// name, type_params, params, return_type, default_impl, span
type TraitMethod = | TraitMethod(Ident, List<Ident>, List<Parameter>, TypeExpr, Option<Expr>, Span)
// === Type Expressions ===
type TypeExpr =
| TeNamed(Ident)
| TeApp(TypeExpr, List<TypeExpr>)
| TeFunction(List<TypeExpr>, TypeExpr, List<Ident>)
| TeTuple(List<TypeExpr>)
| TeRecord(List<RecordField>)
| TeUnit
| TeVersioned(TypeExpr, VersionConstraint)
// === Literal ===
type LiteralKind =
| LitInt(Int)
| LitFloat(String)
| LitString(String)
| LitChar(Char)
| LitBool(Bool)
| LitUnit
type Literal = | Literal(LiteralKind, Span)
// === Binary Operators ===
type BinaryOp =
| OpAdd | OpSub | OpMul | OpDiv | OpMod
| OpEq | OpNe | OpLt | OpLe | OpGt | OpGe
| OpAnd | OpOr
| OpPipe | OpConcat
// === Unary Operators ===
type UnaryOp = | OpNeg | OpNot
// === Statements ===
type Statement =
| StExpr(Expr)
| StLet(Ident, Option<TypeExpr>, Expr, Span)
// === Match Arms ===
type MatchArm = | MatchArm(Pattern, Option<Expr>, Expr, Span)
// === Patterns ===
type Pattern =
| PatWildcard(Span)
| PatVar(Ident)
| PatLiteral(Literal)
| PatConstructor(Ident, List<Pattern>, Span)
| PatRecord(List<(Ident, Pattern)>, Span)
| PatTuple(List<Pattern>, Span)
// === Function Declaration ===
// visibility, doc, name, type_params, params, return_type, effects, properties, where_clauses, body, span
type FunctionDecl = | FunctionDecl(Visibility, Option<String>, Ident, List<Ident>, List<Parameter>, TypeExpr, List<Ident>, List<BehavioralProperty>, List<WhereClause>, Expr, Span)
// === Effect Declaration ===
// doc, name, type_params, operations, span
type EffectDecl = | EffectDecl(Option<String>, Ident, List<Ident>, List<EffectOp>, Span)
// === Type Declaration ===
// visibility, doc, name, type_params, version, definition, migrations, span
type TypeDecl = | TypeDecl(Visibility, Option<String>, Ident, List<Ident>, Option<Version>, TypeDef, List<Migration>, Span)
// === Handler Declaration ===
// name, params, effect, implementations, span
type HandlerDecl = | HandlerDecl(Ident, List<Parameter>, Ident, List<HandlerImpl>, Span)
// === Let Declaration ===
// visibility, doc, name, typ, value, span
type LetDecl = | LetDecl(Visibility, Option<String>, Ident, Option<TypeExpr>, Expr, Span)
// === Trait Declaration ===
// visibility, doc, name, type_params, super_traits, methods, span
type TraitDecl = | TraitDecl(Visibility, Option<String>, Ident, List<Ident>, List<TraitBound>, List<TraitMethod>, Span)
// === Impl Declaration ===
// type_params, constraints, trait_name, trait_args, target_type, methods, span
type ImplDecl = | ImplDecl(List<Ident>, List<TraitConstraint>, Ident, List<TypeExpr>, TypeExpr, List<ImplMethod>, Span)
// === Expressions ===
type Expr =
| ExLiteral(Literal)
| ExVar(Ident)
| ExBinaryOp(BinaryOp, Expr, Expr, Span)
| ExUnaryOp(UnaryOp, Expr, Span)
| ExCall(Expr, List<Expr>, Span)
| ExEffectOp(Ident, Ident, List<Expr>, Span)
| ExField(Expr, Ident, Span)
| ExTupleIndex(Expr, Int, Span)
| ExLambda(List<Parameter>, Option<TypeExpr>, List<Ident>, Expr, Span)
| ExLet(Ident, Option<TypeExpr>, Expr, Expr, Span)
| ExIf(Expr, Expr, Expr, Span)
| ExMatch(Expr, List<MatchArm>, Span)
| ExBlock(List<Statement>, Expr, Span)
| ExRecord(Option<Expr>, List<(Ident, Expr)>, Span)
| ExTuple(List<Expr>, Span)
| ExList(List<Expr>, Span)
| ExRun(Expr, List<(Ident, Expr)>, Span)
| ExResume(Expr, Span)


@@ -0,0 +1,512 @@
// Lux Lexer — Self-hosted lexer for the Lux language
//
// This is the first component of the Lux-in-Lux compiler.
// It tokenizes Lux source code into a list of tokens.
//
// Design:
// - Recursive descent character scanning
// - Immutable state (ParseState tracks chars + position)
// - Pattern matching for all token types
// === Token types ===
type TokenKind =
// Literals
| TkInt(Int)
| TkFloat(String)
| TkString(String)
| TkChar(Char)
| TkBool(Bool)
// Identifiers
| TkIdent(String)
// Keywords
| TkFn | TkLet | TkIf | TkThen | TkElse | TkMatch
| TkWith | TkEffect | TkHandler | TkRun | TkResume
| TkType | TkImport | TkPub | TkAs | TkFrom
| TkTrait | TkImpl | TkFor
// Behavioral
| TkIs | TkPure | TkTotal | TkIdempotent
| TkDeterministic | TkCommutative
| TkWhere | TkAssume
// Operators
| TkPlus | TkMinus | TkStar | TkSlash | TkPercent
| TkEq | TkEqEq | TkNe | TkLt | TkLe | TkGt | TkGe
| TkAnd | TkOr | TkNot
| TkPipe | TkPipeGt | TkArrow | TkThinArrow
| TkDot | TkColon | TkColonColon | TkComma | TkSemi | TkAt
// Delimiters
| TkLParen | TkRParen | TkLBrace | TkRBrace
| TkLBracket | TkRBracket
// Special
| TkUnderscore | TkNewline | TkEof
// Doc comment
| TkDocComment(String)
type Token =
| Token(TokenKind, Int, Int) // kind, start, end
type LexState =
| LexState(List<Char>, Int) // chars, position
type LexResult =
| LexOk(Token, LexState)
| LexErr(String, Int)
// === Character utilities ===
fn peek(state: LexState): Option<Char> =
match state {
LexState(chars, pos) => List.get(chars, pos)
}
fn peekAt(state: LexState, offset: Int): Option<Char> =
match state {
LexState(chars, pos) => List.get(chars, pos + offset)
}
fn advance(state: LexState): LexState =
match state {
LexState(chars, pos) => LexState(chars, pos + 1)
}
fn position(state: LexState): Int =
match state { LexState(_, pos) => pos }
fn isDigit(c: Char): Bool =
c == '0' || c == '1' || c == '2' || c == '3' || c == '4' ||
c == '5' || c == '6' || c == '7' || c == '8' || c == '9'
fn isAlpha(c: Char): Bool =
(c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z') || c == '_'
fn isAlphaNumeric(c: Char): Bool =
isAlpha(c) || isDigit(c)
fn isWhitespace(c: Char): Bool =
c == ' ' || c == '\t' || c == '\r'
// === Core lexing ===
fn skipLineComment(state: LexState): LexState =
match peek(state) {
None => state,
Some(c) =>
if c == '\n' then state
else skipLineComment(advance(state))
}
fn skipWhitespaceAndComments(state: LexState): LexState =
match peek(state) {
None => state,
Some(c) =>
if isWhitespace(c) then
skipWhitespaceAndComments(advance(state))
else if c == '/' then
match peekAt(state, 1) {
Some('/') =>
// Check for doc comment (///)
match peekAt(state, 2) {
Some('/') => state, // Don't skip doc comments
_ => skipWhitespaceAndComments(skipLineComment(advance(advance(state))))
},
_ => state
}
else state
}
// Collect identifier characters
fn collectIdent(state: LexState, acc: List<Char>): (List<Char>, LexState) =
match peek(state) {
None => (acc, state),
Some(c) =>
if isAlphaNumeric(c) then
collectIdent(advance(state), List.concat(acc, [c]))
else (acc, state)
}
// Collect number characters (digits only)
fn collectDigits(state: LexState, acc: List<Char>): (List<Char>, LexState) =
match peek(state) {
None => (acc, state),
Some(c) =>
if isDigit(c) then
collectDigits(advance(state), List.concat(acc, [c]))
else (acc, state)
}
// Convert list of digit chars to int
fn charsToInt(chars: List<Char>): Int =
List.fold(chars, 0, fn(acc, c) => acc * 10 + charToDigit(c))
fn charToDigit(c: Char): Int =
if c == '0' then 0
else if c == '1' then 1
else if c == '2' then 2
else if c == '3' then 3
else if c == '4' then 4
else if c == '5' then 5
else if c == '6' then 6
else if c == '7' then 7
else if c == '8' then 8
else 9
// Map identifier string to keyword token or ident
fn identToToken(name: String): TokenKind =
if name == "fn" then TkFn
else if name == "let" then TkLet
else if name == "if" then TkIf
else if name == "then" then TkThen
else if name == "else" then TkElse
else if name == "match" then TkMatch
else if name == "with" then TkWith
else if name == "effect" then TkEffect
else if name == "handler" then TkHandler
else if name == "run" then TkRun
else if name == "resume" then TkResume
else if name == "type" then TkType
else if name == "true" then TkBool(true)
else if name == "false" then TkBool(false)
else if name == "import" then TkImport
else if name == "pub" then TkPub
else if name == "as" then TkAs
else if name == "from" then TkFrom
else if name == "trait" then TkTrait
else if name == "impl" then TkImpl
else if name == "for" then TkFor
else if name == "is" then TkIs
else if name == "pure" then TkPure
else if name == "total" then TkTotal
else if name == "idempotent" then TkIdempotent
else if name == "deterministic" then TkDeterministic
else if name == "commutative" then TkCommutative
else if name == "where" then TkWhere
else if name == "assume" then TkAssume
else TkIdent(name)
// Lex a string literal (after opening quote consumed)
fn lexStringBody(state: LexState, acc: List<Char>): (List<Char>, LexState) =
match peek(state) {
None => (acc, state),
Some(c) =>
if c == '"' then (acc, advance(state))
else if c == '\\' then
match peekAt(state, 1) {
Some('n') => lexStringBody(advance(advance(state)), List.concat(acc, ['\n'])),
Some('t') => lexStringBody(advance(advance(state)), List.concat(acc, ['\t'])),
Some('\\') => lexStringBody(advance(advance(state)), List.concat(acc, ['\\'])),
Some('"') => lexStringBody(advance(advance(state)), List.concat(acc, ['"'])),
_ => lexStringBody(advance(state), List.concat(acc, [c]))
}
else lexStringBody(advance(state), List.concat(acc, [c]))
}
// Lex a char literal (after opening quote consumed)
fn lexCharLiteral(state: LexState): LexResult =
let start = position(state) - 1;
match peek(state) {
None => LexErr("Unexpected end of input in char literal", start),
Some(c) =>
if c == '\\' then
match peekAt(state, 1) {
Some('n') =>
match peekAt(state, 2) {
Some('\'') => LexOk(Token(TkChar('\n'), start, position(state) + 3), advance(advance(advance(state)))),
_ => LexErr("Expected closing quote", position(state))
},
Some('t') =>
match peekAt(state, 2) {
Some('\'') => LexOk(Token(TkChar('\t'), start, position(state) + 3), advance(advance(advance(state)))),
_ => LexErr("Expected closing quote", position(state))
},
Some('\\') =>
match peekAt(state, 2) {
Some('\'') => LexOk(Token(TkChar('\\'), start, position(state) + 3), advance(advance(advance(state)))),
_ => LexErr("Expected closing quote", position(state))
},
_ => LexErr("Unknown escape sequence", position(state))
}
else
match peekAt(state, 1) {
Some('\'') => LexOk(Token(TkChar(c), start, position(state) + 2), advance(advance(state))),
_ => LexErr("Expected closing quote", position(state))
}
}
// Collect doc comment text (after /// consumed)
fn collectDocComment(state: LexState, acc: List<Char>): (List<Char>, LexState) =
match peek(state) {
None => (acc, state),
Some(c) =>
if c == '\n' then (acc, state)
else collectDocComment(advance(state), List.concat(acc, [c]))
}
// Lex a single token
fn lexToken(state: LexState): LexResult =
let state = skipWhitespaceAndComments(state);
let start = position(state);
match peek(state) {
None => LexOk(Token(TkEof, start, start), state),
Some(c) =>
if c == '\n' then
LexOk(Token(TkNewline, start, start + 1), advance(state))
// Numbers
else if isDigit(c) then
let result = collectDigits(state, []);
match result {
(digits, nextState) =>
// Check for float
match peek(nextState) {
Some('.') =>
match peekAt(nextState, 1) {
Some(d) =>
if isDigit(d) then
let fracResult = collectDigits(advance(nextState), []);
match fracResult {
(fracDigits, finalState) =>
let intPart = String.join(List.map(digits, fn(ch) => String.fromChar(ch)), "");
let fracPart = String.join(List.map(fracDigits, fn(ch) => String.fromChar(ch)), "");
LexOk(Token(TkFloat(intPart + "." + fracPart), start, position(finalState)), finalState)
}
else
LexOk(Token(TkInt(charsToInt(digits)), start, position(nextState)), nextState),
None =>
LexOk(Token(TkInt(charsToInt(digits)), start, position(nextState)), nextState)
},
_ => LexOk(Token(TkInt(charsToInt(digits)), start, position(nextState)), nextState)
}
}
// Identifiers and keywords
else if isAlpha(c) then
let result = collectIdent(state, []);
match result {
(chars, nextState) =>
let name = String.join(List.map(chars, fn(ch) => String.fromChar(ch)), "");
LexOk(Token(identToToken(name), start, position(nextState)), nextState)
}
// String literals
else if c == '"' then
let result = lexStringBody(advance(state), []);
match result {
(chars, nextState) =>
let str = String.join(List.map(chars, fn(ch) => String.fromChar(ch)), "");
LexOk(Token(TkString(str), start, position(nextState)), nextState)
}
// Char literals
else if c == '\'' then
lexCharLiteral(advance(state))
// Doc comments (///)
else if c == '/' then
match peekAt(state, 1) {
Some('/') =>
match peekAt(state, 2) {
Some('/') =>
// Skip the "/// " prefix
let docState = advance(advance(advance(state)));
let docState = match peek(docState) {
Some(' ') => advance(docState),
_ => docState
};
let result = collectDocComment(docState, []);
match result {
(chars, nextState) =>
let text = String.join(List.map(chars, fn(ch) => String.fromChar(ch)), "");
LexOk(Token(TkDocComment(text), start, position(nextState)), nextState)
},
_ => LexOk(Token(TkSlash, start, start + 1), advance(state))
},
_ => LexOk(Token(TkSlash, start, start + 1), advance(state))
}
// Two-character operators
else if c == '=' then
match peekAt(state, 1) {
Some('=') => LexOk(Token(TkEqEq, start, start + 2), advance(advance(state))),
Some('>') => LexOk(Token(TkArrow, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkEq, start, start + 1), advance(state))
}
else if c == '!' then
match peekAt(state, 1) {
Some('=') => LexOk(Token(TkNe, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkNot, start, start + 1), advance(state))
}
else if c == '<' then
match peekAt(state, 1) {
Some('=') => LexOk(Token(TkLe, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkLt, start, start + 1), advance(state))
}
else if c == '>' then
match peekAt(state, 1) {
Some('=') => LexOk(Token(TkGe, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkGt, start, start + 1), advance(state))
}
else if c == '&' then
match peekAt(state, 1) {
Some('&') => LexOk(Token(TkAnd, start, start + 2), advance(advance(state))),
_ => LexErr("Expected '&&'", start)
}
else if c == '|' then
match peekAt(state, 1) {
Some('|') => LexOk(Token(TkOr, start, start + 2), advance(advance(state))),
Some('>') => LexOk(Token(TkPipeGt, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkPipe, start, start + 1), advance(state))
}
else if c == '-' then
match peekAt(state, 1) {
Some('>') => LexOk(Token(TkThinArrow, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkMinus, start, start + 1), advance(state))
}
else if c == ':' then
match peekAt(state, 1) {
Some(':') => LexOk(Token(TkColonColon, start, start + 2), advance(advance(state))),
_ => LexOk(Token(TkColon, start, start + 1), advance(state))
}
// Single-character tokens
else if c == '+' then LexOk(Token(TkPlus, start, start + 1), advance(state))
else if c == '*' then LexOk(Token(TkStar, start, start + 1), advance(state))
else if c == '%' then LexOk(Token(TkPercent, start, start + 1), advance(state))
else if c == '.' then LexOk(Token(TkDot, start, start + 1), advance(state))
else if c == ',' then LexOk(Token(TkComma, start, start + 1), advance(state))
else if c == ';' then LexOk(Token(TkSemi, start, start + 1), advance(state))
else if c == '@' then LexOk(Token(TkAt, start, start + 1), advance(state))
else if c == '(' then LexOk(Token(TkLParen, start, start + 1), advance(state))
else if c == ')' then LexOk(Token(TkRParen, start, start + 1), advance(state))
else if c == '{' then LexOk(Token(TkLBrace, start, start + 1), advance(state))
else if c == '}' then LexOk(Token(TkRBrace, start, start + 1), advance(state))
else if c == '[' then LexOk(Token(TkLBracket, start, start + 1), advance(state))
else if c == ']' then LexOk(Token(TkRBracket, start, start + 1), advance(state))
else if c == '_' then
// Check if it's just underscore or start of ident
match peekAt(state, 1) {
Some(next) =>
if isAlphaNumeric(next) then
let result = collectIdent(state, []);
match result {
(chars, nextState) =>
let name = String.join(List.map(chars, fn(ch) => String.fromChar(ch)), "");
LexOk(Token(TkIdent(name), start, position(nextState)), nextState)
}
else LexOk(Token(TkUnderscore, start, start + 1), advance(state)),
None => LexOk(Token(TkUnderscore, start, start + 1), advance(state))
}
else LexErr("Unexpected character: " + String.fromChar(c), start)
}
// Lex all tokens from source
fn lexAll(state: LexState, acc: List<Token>): List<Token> =
match lexToken(state) {
LexErr(msg, pos) =>
// On error, skip the character and continue
List.concat(acc, [Token(TkEof, pos, pos)]),
LexOk(token, nextState) =>
match token {
Token(TkEof, _, _) => List.concat(acc, [token]),
Token(TkNewline, _, _) =>
// Keep newline tokens; the parser treats them as statement separators
lexAll(nextState, List.concat(acc, [token])),
_ => lexAll(nextState, List.concat(acc, [token]))
}
}
// Public API: tokenize a source string
fn tokenize(source: String): List<Token> =
let chars = String.chars(source);
let state = LexState(chars, 0);
lexAll(state, [])
// === Token display ===
fn tokenKindToString(kind: TokenKind): String =
match kind {
TkInt(n) => "Int(" + toString(n) + ")",
TkFloat(s) => "Float(" + s + ")",
TkString(s) => "String(\"" + s + "\")",
TkChar(c) => "Char('" + String.fromChar(c) + "')",
TkBool(b) => if b then "true" else "false",
TkIdent(name) => "Ident(" + name + ")",
TkFn => "fn", TkLet => "let", TkIf => "if",
TkThen => "then", TkElse => "else", TkMatch => "match",
TkWith => "with", TkEffect => "effect", TkHandler => "handler",
TkRun => "run", TkResume => "resume", TkType => "type",
TkImport => "import", TkPub => "pub", TkAs => "as",
TkFrom => "from", TkTrait => "trait", TkImpl => "impl", TkFor => "for",
TkIs => "is", TkPure => "pure", TkTotal => "total",
TkIdempotent => "idempotent", TkDeterministic => "deterministic",
TkCommutative => "commutative", TkWhere => "where", TkAssume => "assume",
TkPlus => "+", TkMinus => "-", TkStar => "*", TkSlash => "/",
TkPercent => "%", TkEq => "=", TkEqEq => "==", TkNe => "!=",
TkLt => "<", TkLe => "<=", TkGt => ">", TkGe => ">=",
TkAnd => "&&", TkOr => "||", TkNot => "!",
TkPipe => "|", TkPipeGt => "|>",
TkArrow => "=>", TkThinArrow => "->",
TkDot => ".", TkColon => ":", TkColonColon => "::",
TkComma => ",", TkSemi => ";", TkAt => "@",
TkLParen => "(", TkRParen => ")", TkLBrace => "{", TkRBrace => "}",
TkLBracket => "[", TkRBracket => "]",
TkUnderscore => "_", TkNewline => "\\n", TkEof => "EOF",
TkDocComment(text) => "DocComment(\"" + text + "\")",
_ => "?"
}
fn tokenToString(token: Token): String =
match token {
Token(kind, start, end) =>
tokenKindToString(kind) + " [" + toString(start) + ".." + toString(end) + "]"
}
// === Tests ===
fn printTokens(tokens: List<Token>): Unit with {Console} =
match List.head(tokens) {
None => Console.print(""),
Some(t) => {
Console.print(" " + tokenToString(t));
match List.tail(tokens) {
Some(rest) => printTokens(rest),
None => Console.print("")
}
}
}
fn testLexer(label: String, source: String): Unit with {Console} = {
Console.print("--- " + label + " ---");
Console.print(" Input: \"" + source + "\"");
let tokens = tokenize(source);
printTokens(tokens)
}
fn main(): Unit with {Console} = {
Console.print("=== Lux Self-Hosted Lexer ===");
Console.print("");
// Basic tokens
testLexer("numbers", "42 3");
Console.print("");
// Identifiers and keywords
testLexer("keywords", "fn main let x");
Console.print("");
// Operators
testLexer("operators", "a + b == c");
Console.print("");
// String literal
testLexer("string", "\"hello world\"");
Console.print("");
// Function declaration
testLexer("function", "fn add(a: Int, b: Int): Int = a + b");
Console.print("");
// Behavioral properties
testLexer("behavioral", "fn add(a: Int): Int is pure = a");
Console.print("");
// Complex expression
testLexer("complex", "let result = if x > 0 then x else 0 - x");
Console.print("");
Console.print("=== Lexer test complete ===")
}
let _ = run main() with {}

scripts/release.sh Executable file

@@ -0,0 +1,213 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Release Script
# Builds a static binary, generates changelog, and creates a Gitea release.
#
# Usage:
# ./scripts/release.sh # auto-bump patch (0.2.0 → 0.2.1)
# ./scripts/release.sh patch # same as above
# ./scripts/release.sh minor # bump minor (0.2.0 → 0.3.0)
# ./scripts/release.sh major # bump major (0.2.0 → 1.0.0)
# ./scripts/release.sh v1.2.3 # explicit version
#
# Environment:
# GITEA_TOKEN - API token for git.qrty.ink (prompted if not set)
# GITEA_URL - Gitea instance URL (default: https://git.qrty.ink)
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
GITEA_URL="${GITEA_URL:-https://git.qrty.ink}"
REPO_OWNER="blu"
REPO_NAME="lux"
API_BASE="$GITEA_URL/api/v1"
# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
info() { printf "${CYAN}::${NC} %s\n" "$1"; }
ok() { printf "${GREEN}ok${NC} %s\n" "$1"; }
warn() { printf "${YELLOW}!!${NC} %s\n" "$1"; }
err() { printf "${RED}error:${NC} %s\n" "$1" >&2; exit 1; }
# --- Determine version ---
CURRENT=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
BUMP="${1:-patch}"
bump_version() {
local ver="$1" part="$2"
IFS='.' read -r major minor patch <<< "$ver"
case "$part" in
major) echo "$((major + 1)).0.0" ;;
minor) echo "$major.$((minor + 1)).0" ;;
patch) echo "$major.$minor.$((patch + 1))" ;;
*) echo "$part" ;; # treat as explicit version
esac
}
case "$BUMP" in
major|minor|patch)
VERSION=$(bump_version "$CURRENT" "$BUMP")
info "Bumping $BUMP: $CURRENT → $VERSION"
;;
*)
# Explicit version — strip v prefix if present
VERSION="${BUMP#v}"
info "Explicit version: $VERSION"
;;
esac
TAG="v$VERSION"
# --- Check for clean working tree ---
if [ -n "$(git status --porcelain)" ]; then
warn "Working tree has uncommitted changes:"
git status --short
printf "\n"
read -rp "Continue anyway? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || exit 1
fi
# --- Check if tag already exists ---
if git rev-parse "$TAG" >/dev/null 2>&1; then
err "Tag $TAG already exists. Choose a different version."
fi
# --- Update version in source files ---
if [ "$VERSION" != "$CURRENT" ]; then
info "Updating version in Cargo.toml and flake.nix..."
sed -i "0,/^version = \"$CURRENT\"/s//version = \"$VERSION\"/" Cargo.toml
sed -i "s/version = \"$CURRENT\";/version = \"$VERSION\";/g" flake.nix
sed -i "s/v$CURRENT/v$VERSION/g" flake.nix
git add Cargo.toml flake.nix
git commit --no-gpg-sign -m "chore: bump version to $VERSION"
ok "Version updated and committed"
fi
# --- Generate changelog ---
info "Generating changelog..."
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "")
if [ -n "$LAST_TAG" ]; then
RANGE="$LAST_TAG..HEAD"
info "Changes since $LAST_TAG:"
else
RANGE="HEAD"
info "First release — summarizing recent commits:"
fi
CHANGELOG=$(git log "$RANGE" --pretty=format:"- %s" --no-merges 2>/dev/null | head -50 || true)
if [ -z "$CHANGELOG" ]; then
CHANGELOG="- Initial release"
fi
# --- Build static binary ---
info "Building static binary (nix build .#static)..."
nix build .#static
BINARY="result/bin/lux"
if [ ! -f "$BINARY" ]; then
err "Static binary not found at $BINARY"
fi
BINARY_SIZE=$(ls -lh "$BINARY" | awk '{print $5}')
BINARY_TYPE=$(file "$BINARY" | sed 's/.*: //')
ok "Binary: $BINARY_SIZE, $BINARY_TYPE"
# --- Prepare release artifact ---
ARTIFACT="/tmp/lux-${TAG}-linux-x86_64"
cp "$BINARY" "$ARTIFACT"
chmod +x "$ARTIFACT"
# --- Show release summary ---
printf "\n"
printf "${BOLD}═══ Release Summary ═══${NC}\n"
printf "\n"
printf " ${BOLD}Tag:${NC} %s\n" "$TAG"
printf " ${BOLD}Binary:${NC} %s (%s)\n" "lux-${TAG}-linux-x86_64" "$BINARY_SIZE"
printf " ${BOLD}Commit:${NC} %s\n" "$(git rev-parse --short HEAD)"
printf "\n"
printf "${BOLD}Changelog:${NC}\n"
printf "%s\n" "$CHANGELOG"
printf "\n"
# --- Confirm ---
read -rp "Create release $TAG? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || { info "Aborted."; exit 0; }
# --- Get Gitea token ---
if [ -z "${GITEA_TOKEN:-}" ]; then
printf "\n"
info "Gitea API token required (create at $GITEA_URL/user/settings/applications)"
read -rsp "Token: " GITEA_TOKEN
printf "\n"
fi
if [ -z "$GITEA_TOKEN" ]; then
err "No token provided"
fi
# --- Create and push tag ---
info "Creating tag $TAG..."
git tag -a "$TAG" -m "Release $TAG" --no-sign
ok "Tag created"
info "Pushing tag to origin..."
git push origin "$TAG"
ok "Tag pushed"
# --- Create Gitea release ---
info "Creating release on Gitea..."
RELEASE_BODY=$(printf "## Lux %s\n\n### Changes\n\n%s\n\n### Installation\n\n\`\`\`bash\ncurl -Lo lux %s/%s/%s/releases/download/%s/lux-linux-x86_64\nchmod +x lux\n./lux --version\n\`\`\`" \
"$TAG" "$CHANGELOG" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG")
RELEASE_JSON=$(jq -n \
--arg tag "$TAG" \
--arg name "Lux $TAG" \
--arg body "$RELEASE_BODY" \
'{tag_name: $tag, name: $name, body: $body, draft: false, prerelease: false}')
RELEASE_RESPONSE=$(curl -s -X POST \
"$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases" \
-H "Authorization: token $GITEA_TOKEN" \
-H "Content-Type: application/json" \
-d "$RELEASE_JSON")
RELEASE_ID=$(echo "$RELEASE_RESPONSE" | jq -r '.id // empty')
if [ -z "$RELEASE_ID" ]; then
echo "$RELEASE_RESPONSE" | jq . 2>/dev/null || echo "$RELEASE_RESPONSE"
err "Failed to create release"
fi
ok "Release created (id: $RELEASE_ID)"
# --- Upload binary ---
info "Uploading binary..."
UPLOAD_RESPONSE=$(curl -s -X POST \
"$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases/$RELEASE_ID/assets?name=lux-linux-x86_64" \
-H "Authorization: token $GITEA_TOKEN" \
-H "Content-Type: application/octet-stream" \
--data-binary "@$ARTIFACT")
ASSET_NAME=$(echo "$UPLOAD_RESPONSE" | jq -r '.name // empty')
if [ -z "$ASSET_NAME" ]; then
echo "$UPLOAD_RESPONSE" | jq . 2>/dev/null || echo "$UPLOAD_RESPONSE"
err "Failed to upload binary"
fi
ok "Binary uploaded: $ASSET_NAME"
# --- Done ---
printf "\n"
printf "${GREEN}${BOLD}Release $TAG published!${NC}\n"
printf "\n"
printf " ${BOLD}URL:${NC} %s/%s/%s/releases/tag/%s\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf " ${BOLD}Download:${NC} %s/%s/%s/releases/download/%s/lux-linux-x86_64\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "\n"
# Cleanup
rm -f "$ARTIFACT"
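The asset download URL is assembled piecewise in several printf calls above. A small helper could keep the pieces in one place — this is a sketch, not part of the script, and `release_url` plus the example values are hypothetical:

```shell
# Hypothetical helper: assemble a Gitea release-asset download URL
# from its parts, mirroring the printf calls in the script above.
release_url() {
  local base="$1" owner="$2" repo="$3" tag="$4" asset="$5"
  printf '%s/%s/%s/releases/download/%s/%s\n' \
    "$base" "$owner" "$repo" "$tag" "$asset"
}

# Example (illustrative values, not the real repo):
release_url "https://git.example.com" "lux" "lux" "v0.1.6" "lux-linux-x86_64"
```

Centralizing the URL shape also keeps the release body, summary output, and upload endpoint from drifting apart if the path format ever changes.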

scripts/validate.sh (new executable file, 211 lines)

@@ -0,0 +1,211 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Full Validation Script
# Runs all checks: Rust tests, package tests, type checking, example compilation.
# Run after every committable change to ensure no regressions.
# cd to the repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
LUX="$(pwd)/target/release/lux"
PACKAGES_DIR="$(pwd)/../packages"
PROJECTS_DIR="$(pwd)/projects"
EXAMPLES_DIR="$(pwd)/examples"
RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
FAILED=0
TOTAL=0
step() {
TOTAL=$((TOTAL + 1))
printf "${CYAN}[%d]${NC} %s... " "$TOTAL" "$1"
}
ok() { printf "${GREEN}ok${NC} %s\n" "${1:-}"; }
fail() { printf "${RED}FAIL${NC} %s\n" "${1:-}"; FAILED=$((FAILED + 1)); }
# --- Rust checks ---
step "cargo check"
if nix develop --command cargo check 2>/dev/null; then ok; else fail; fi
step "cargo test"
OUTPUT=$(nix develop --command cargo test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "test result:" || echo "no result")
if echo "$RESULT" | grep -q "0 failed"; then ok "$RESULT"; else fail "$RESULT"; fi
# --- Build release binary ---
step "cargo build --release"
if nix develop --command cargo build --release 2>/dev/null; then ok; else fail; fi
# --- Package tests ---
for pkg in path frontmatter xml rss markdown; do
PKG_DIR="$PACKAGES_DIR/$pkg"
if [ -d "$PKG_DIR" ]; then
step "lux test ($pkg)"
OUTPUT=$(cd "$PKG_DIR" && "$LUX" test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
if echo "$RESULT" | grep -q "passed"; then ok "$RESULT"; else fail "$RESULT"; fi
fi
done
# --- Lux check on packages ---
for pkg in path frontmatter xml rss markdown; do
PKG_DIR="$PACKAGES_DIR/$pkg"
if [ -d "$PKG_DIR" ]; then
step "lux check ($pkg)"
OUTPUT=$(cd "$PKG_DIR" && "$LUX" check 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
if echo "$RESULT" | grep -q "passed"; then ok; else fail "$RESULT"; fi
fi
done
# --- Project checks ---
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
if [ -f "$proj_dir/main.lux" ]; then
step "lux check (project: $proj)"
OUTPUT=$("$LUX" check "$proj_dir/main.lux" 2>&1 || true)
if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
fi
# Check any standalone .lux files in the project
for lux_file in "$proj_dir"/*.lux; do
[ -f "$lux_file" ] || continue
fname=$(basename "$lux_file")
[ "$fname" = "main.lux" ] && continue
step "lux check (project: $proj/$fname)"
OUTPUT=$("$LUX" check "$lux_file" 2>&1 || true)
if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
done
done
# === Compilation & Interpreter Checks ===
# --- Interpreter: examples ---
# Skip: http_api, http, http_router, http_server (network), postgres_demo (db),
# random, property_testing (Random effect), shell (Process), json (File I/O),
# file_io (File I/O), test_math, test_lists (Test effect), stress_shared_rc,
# test_rc_comparison (internal tests), modules/* (need cwd)
INTERP_SKIP="http_api http http_router http_server postgres_demo random property_testing shell json file_io test_math test_lists stress_shared_rc test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
name=$(basename "$f" .lux)
skip=false
for s in $INTERP_SKIP; do [ "$name" = "$s" ] && skip=true; done
$skip && continue
step "interpreter (examples/$name)"
if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: examples/standard ---
# Skip: guessing_game (reads stdin)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "guessing_game" ] && continue
step "interpreter (standard/$name)"
if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: examples/showcase ---
# Skip: task_manager (parse error in current version)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "task_manager" ] && continue
step "interpreter (showcase/$name)"
if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: projects ---
# Skip: guessing-game (Random), rest-api (HttpServer)
PROJ_INTERP_SKIP="guessing-game rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
[ -f "$proj_dir/main.lux" ] || continue
skip=false
for s in $PROJ_INTERP_SKIP; do [ "$proj" = "$s" ] && skip=true; done
$skip && continue
step "interpreter (project: $proj)"
if timeout 10 "$LUX" "$proj_dir/main.lux" >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples ---
# Skip files that fail JS compilation (unsupported features)
JS_SKIP="http_api http http_router postgres_demo property_testing json test_lists test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
name=$(basename "$f" .lux)
skip=false
for s in $JS_SKIP; do [ "$name" = "$s" ] && skip=true; done
$skip && continue
step "compile JS (examples/$name)"
if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples/standard ---
# Skip: stdlib_demo (uses String.toUpper not in JS backend)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "stdlib_demo" ] && continue
step "compile JS (standard/$name)"
if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples/showcase ---
# Skip: task_manager (unsupported features)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "task_manager" ] && continue
step "compile JS (showcase/$name)"
if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: projects ---
# Skip: json-parser, rest-api (unsupported features)
JS_PROJ_SKIP="json-parser rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
[ -f "$proj_dir/main.lux" ] || continue
skip=false
for s in $JS_PROJ_SKIP; do [ "$proj" = "$s" ] && skip=true; done
$skip && continue
step "compile JS (project: $proj)"
if "$LUX" compile "$proj_dir/main.lux" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- C compilation: examples ---
# Only compile examples known to work with C backend
C_EXAMPLES="hello factorial pipelines tailcall jit_test"
for name in $C_EXAMPLES; do
f="$EXAMPLES_DIR/$name.lux"
[ -f "$f" ] || continue
step "compile C (examples/$name)"
if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done
# --- C compilation: examples/standard ---
C_STD_EXAMPLES="hello_world factorial fizzbuzz primes guessing_game"
for name in $C_STD_EXAMPLES; do
f="$EXAMPLES_DIR/standard/$name.lux"
[ -f "$f" ] || continue
step "compile C (standard/$name)"
if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done
# --- Cleanup ---
rm -f /tmp/lux_validate.js /tmp/lux_validate_bin
# --- Summary ---
printf "\n${BOLD}═══ Validation Summary ═══${NC}\n"
if [ $FAILED -eq 0 ]; then
printf "${GREEN}All %d checks passed.${NC}\n" "$TOTAL"
else
printf "${RED}%d/%d checks failed.${NC}\n" "$FAILED" "$TOTAL"
exit 1
fi
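The inner `for s in $SKIP` membership test above is repeated for each skip list. It could be factored into a helper along these lines — a sketch under the same word-splitting assumptions, with `in_skip_list` as a hypothetical name:

```shell
# Hypothetical helper: succeed (return 0) when $1 appears in the
# whitespace-separated list $2 (relies on word splitting, like the script).
in_skip_list() {
  local name="$1" s
  for s in $2; do
    [ "$name" = "$s" ] && return 0
  done
  return 1
}

# Example: "http" is in the list, "hello" is not.
if in_skip_list "http" "http_api http random"; then echo "skip"; else echo "run"; fi   # prints "skip"
if in_skip_list "hello" "http_api http random"; then echo "skip"; else echo "run"; fi  # prints "run"
```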


@@ -499,6 +499,12 @@ pub enum Expr {
field: Ident,
span: Span,
},
/// Tuple index access: tuple.0, tuple.1
TupleIndex {
object: Box<Expr>,
index: usize,
span: Span,
},
/// Lambda: fn(x, y) => x + y or fn(x: Int): Int => x + 1
Lambda {
params: Vec<Parameter>,
@@ -535,7 +541,9 @@ pub enum Expr {
span: Span,
},
/// Record literal: { name: "Alice", age: 30 }
/// With optional spread: { ...base, name: "Bob" }
Record {
spread: Option<Box<Expr>>,
fields: Vec<(Ident, Expr)>,
span: Span,
},
@@ -563,6 +571,7 @@ impl Expr {
Expr::Call { span, .. } => *span,
Expr::EffectOp { span, .. } => *span,
Expr::Field { span, .. } => *span,
Expr::TupleIndex { span, .. } => *span,
Expr::Lambda { span, .. } => *span,
Expr::Let { span, .. } => *span,
Expr::If { span, .. } => *span,
@@ -614,7 +623,8 @@ pub enum BinaryOp {
And,
Or,
// Other
Pipe, // |>
Concat, // ++
}
impl fmt::Display for BinaryOp {
@@ -634,6 +644,7 @@ impl fmt::Display for BinaryOp {
BinaryOp::And => write!(f, "&&"),
BinaryOp::Or => write!(f, "||"),
BinaryOp::Pipe => write!(f, "|>"),
BinaryOp::Concat => write!(f, "++"),
}
}
}

(File diff suppressed because it is too large)


@@ -69,6 +69,8 @@ pub struct JsBackend {
has_handlers: bool,
/// Variable substitutions for let binding
var_substitutions: HashMap<String, String>,
/// Effects actually used in the program (for tree-shaking runtime)
used_effects: HashSet<String>,
}
impl JsBackend {
@@ -90,6 +92,7 @@ impl JsBackend {
effectful_functions: HashSet::new(),
has_handlers: false,
var_substitutions: HashMap::new(),
used_effects: HashSet::new(),
}
}
@@ -97,9 +100,6 @@ impl JsBackend {
pub fn generate(&mut self, program: &Program) -> Result<String, JsGenError> {
self.output.clear();
// Emit runtime helpers
self.emit_runtime();
// First pass: collect all function names, types, and effects
for decl in &program.declarations {
match decl {
@@ -116,6 +116,12 @@ impl JsBackend {
}
}
// Collect used effects for tree-shaking
self.collect_used_effects(program);
// Emit runtime helpers (tree-shaken based on used effects)
self.emit_runtime();
// Emit type constructors
for decl in &program.declarations {
if let Declaration::Type(t) = decl {
@@ -163,32 +169,181 @@ impl JsBackend {
Ok(self.output.clone())
}
/// Collect all effects used in the program for runtime tree-shaking
fn collect_used_effects(&mut self, program: &Program) {
for decl in &program.declarations {
match decl {
Declaration::Function(f) => {
for effect in &f.effects {
self.used_effects.insert(effect.name.clone());
}
self.collect_effects_from_expr(&f.body);
}
Declaration::Let(l) => {
self.collect_effects_from_expr(&l.value);
}
Declaration::Handler(h) => {
self.used_effects.insert(h.effect.name.clone());
for imp in &h.implementations {
self.collect_effects_from_expr(&imp.body);
}
}
_ => {}
}
}
}
/// Recursively collect effect names from an expression
fn collect_effects_from_expr(&mut self, expr: &Expr) {
match expr {
Expr::EffectOp { effect, args, .. } => {
self.used_effects.insert(effect.name.clone());
for arg in args {
self.collect_effects_from_expr(arg);
}
}
Expr::Run { expr, handlers, .. } => {
self.collect_effects_from_expr(expr);
for (effect, handler) in handlers {
self.used_effects.insert(effect.name.clone());
self.collect_effects_from_expr(handler);
}
}
Expr::Call { func, args, .. } => {
self.collect_effects_from_expr(func);
for arg in args {
self.collect_effects_from_expr(arg);
}
}
Expr::Lambda { body, effects, .. } => {
for effect in effects {
self.used_effects.insert(effect.name.clone());
}
self.collect_effects_from_expr(body);
}
Expr::Let { value, body, .. } => {
self.collect_effects_from_expr(value);
self.collect_effects_from_expr(body);
}
Expr::If { condition, then_branch, else_branch, .. } => {
self.collect_effects_from_expr(condition);
self.collect_effects_from_expr(then_branch);
self.collect_effects_from_expr(else_branch);
}
Expr::Match { scrutinee, arms, .. } => {
self.collect_effects_from_expr(scrutinee);
for arm in arms {
self.collect_effects_from_expr(&arm.body);
if let Some(guard) = &arm.guard {
self.collect_effects_from_expr(guard);
}
}
}
Expr::Block { statements, result, .. } => {
for stmt in statements {
match stmt {
Statement::Expr(e) => self.collect_effects_from_expr(e),
Statement::Let { value, .. } => self.collect_effects_from_expr(value),
}
}
self.collect_effects_from_expr(result);
}
Expr::BinaryOp { left, right, .. } => {
self.collect_effects_from_expr(left);
self.collect_effects_from_expr(right);
}
Expr::UnaryOp { operand, .. } => {
self.collect_effects_from_expr(operand);
}
Expr::Field { object, .. } => {
self.collect_effects_from_expr(object);
}
Expr::TupleIndex { object, .. } => {
self.collect_effects_from_expr(object);
}
Expr::Record { spread, fields, .. } => {
if let Some(s) = spread {
self.collect_effects_from_expr(s);
}
for (_, expr) in fields {
self.collect_effects_from_expr(expr);
}
}
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
for el in elements {
self.collect_effects_from_expr(el);
}
}
Expr::Resume { value, .. } => {
self.collect_effects_from_expr(value);
}
Expr::Literal(_) | Expr::Var(_) => {}
}
}
/// Emit the Lux runtime, tree-shaken based on used effects
fn emit_runtime(&mut self) {
let uses_console = self.used_effects.contains("Console");
let uses_random = self.used_effects.contains("Random");
let uses_time = self.used_effects.contains("Time");
let uses_http = self.used_effects.contains("Http");
let uses_dom = self.used_effects.contains("Dom");
let uses_html = self.used_effects.contains("Html") || uses_dom;
self.writeln("// Lux Runtime");
self.writeln("const Lux = {");
self.indent += 1;
// Core helpers — always emitted
self.writeln("Some: (value) => ({ tag: \"Some\", value }),");
self.writeln("None: () => ({ tag: \"None\" }),");
self.writeln("");
// Result helpers
self.writeln("Ok: (value) => ({ tag: \"Ok\", value }),");
self.writeln("Err: (error) => ({ tag: \"Err\", error }),");
self.writeln("");
// List helpers
self.writeln("Cons: (head, tail) => [head, ...tail],");
self.writeln("Nil: () => [],");
self.writeln("");
// Default handlers — only include effects that are used
self.writeln("defaultHandlers: {");
self.indent += 1;
// Console effect
if uses_console {
self.emit_console_handler();
}
if uses_random {
self.emit_random_handler();
}
if uses_time {
self.emit_time_handler();
}
if uses_http {
self.emit_http_handler();
}
if uses_dom {
self.emit_dom_handler();
}
self.indent -= 1;
self.writeln("},");
// HTML rendering — only if Html or Dom effects are used
if uses_html {
self.emit_html_helpers();
}
// TEA runtime — only if Dom is used
if uses_dom {
self.emit_tea_runtime();
}
self.indent -= 1;
self.writeln("};");
self.writeln("");
}
fn emit_console_handler(&mut self) {
self.writeln("Console: {");
self.indent += 1;
self.writeln("print: (msg) => console.log(msg),");
@@ -207,8 +362,9 @@ impl JsBackend {
self.writeln("readInt: () => parseInt(Lux.defaultHandlers.Console.readLine(), 10)");
self.indent -= 1;
self.writeln("},");
}
// Random effect
fn emit_random_handler(&mut self) {
self.writeln("Random: {");
self.indent += 1;
self.writeln("int: (min, max) => Math.floor(Math.random() * (max - min + 1)) + min,");
@@ -216,16 +372,18 @@ impl JsBackend {
self.writeln("float: () => Math.random()");
self.indent -= 1;
self.writeln("},");
}
// Time effect
fn emit_time_handler(&mut self) {
self.writeln("Time: {");
self.indent += 1;
self.writeln("now: () => Date.now(),");
self.writeln("sleep: (ms) => new Promise(resolve => setTimeout(resolve, ms))");
self.indent -= 1;
self.writeln("},");
}
// Http effect (browser/Node compatible)
fn emit_http_handler(&mut self) {
self.writeln("Http: {");
self.indent += 1;
self.writeln("get: async (url) => {");
@@ -287,8 +445,9 @@ impl JsBackend {
self.writeln("}");
self.indent -= 1;
self.writeln("},");
}
// Dom effect (browser only - stubs for Node.js)
fn emit_dom_handler(&mut self) {
self.writeln("Dom: {");
self.indent += 1;
@@ -316,7 +475,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Element creation
self.writeln("createElement: (tag) => {");
self.indent += 1;
self.writeln("if (typeof document === 'undefined') return null;");
@@ -331,7 +489,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// DOM manipulation
self.writeln("appendChild: (parent, child) => {");
self.indent += 1;
self.writeln("if (parent && child) parent.appendChild(child);");
@@ -356,7 +513,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Content
self.writeln("setTextContent: (el, text) => {");
self.indent += 1;
self.writeln("if (el) el.textContent = text;");
@@ -381,7 +537,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Attributes
self.writeln("setAttribute: (el, name, value) => {");
self.indent += 1;
self.writeln("if (el) el.setAttribute(name, value);");
@@ -408,7 +563,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Classes
self.writeln("addClass: (el, className) => {");
self.indent += 1;
self.writeln("if (el) el.classList.add(className);");
@@ -433,7 +587,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Styles
self.writeln("setStyle: (el, property, value) => {");
self.indent += 1;
self.writeln("if (el) el.style[property] = value;");
@@ -446,7 +599,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Form elements
self.writeln("getValue: (el) => {");
self.indent += 1;
self.writeln("return el ? el.value : '';");
@@ -471,7 +623,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Events
self.writeln("addEventListener: (el, event, handler) => {");
self.indent += 1;
self.writeln("if (el) el.addEventListener(event, handler);");
@@ -484,7 +635,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Focus
self.writeln("focus: (el) => {");
self.indent += 1;
self.writeln("if (el && el.focus) el.focus();");
@@ -497,7 +647,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Document
self.writeln("getBody: () => {");
self.indent += 1;
self.writeln("if (typeof document === 'undefined') return null;");
@@ -512,7 +661,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Window
self.writeln("getWindow: () => {");
self.indent += 1;
self.writeln("if (typeof window === 'undefined') return null;");
@@ -545,7 +693,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Scroll
self.writeln("scrollTo: (x, y) => {");
self.indent += 1;
self.writeln("if (typeof window !== 'undefined') window.scrollTo(x, y);");
@@ -558,7 +705,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Dimensions
self.writeln("getBoundingClientRect: (el) => {");
self.indent += 1;
self.writeln("if (!el) return { top: 0, left: 0, width: 0, height: 0, right: 0, bottom: 0 };");
@@ -574,13 +720,11 @@ impl JsBackend {
self.indent -= 1;
self.writeln("}");
self.indent -= 1;
self.writeln("}");
self.indent -= 1;
self.writeln("},");
}
// HTML rendering helpers
fn emit_html_helpers(&mut self) {
self.writeln("");
self.writeln("// HTML rendering");
self.writeln("renderHtml: (node) => {");
@@ -682,8 +826,9 @@ impl JsBackend {
self.writeln("return el;");
self.indent -= 1;
self.writeln("},");
}
// TEA (The Elm Architecture) runtime
fn emit_tea_runtime(&mut self) {
self.writeln("");
self.writeln("// The Elm Architecture (TEA) runtime");
self.writeln("app: (config) => {");
@@ -727,7 +872,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Simple app (for string-based views like the counter example)
self.writeln("");
self.writeln("// Simple TEA app (string-based view)");
self.writeln("simpleApp: (config) => {");
@@ -757,7 +901,6 @@ impl JsBackend {
self.indent -= 1;
self.writeln("},");
// Diff and patch (basic implementation for view_deps optimization)
self.writeln("");
self.writeln("// Basic diff - checks if model fields changed");
self.writeln("hasChanged: (oldModel, newModel, ...paths) => {");
@@ -777,11 +920,7 @@ impl JsBackend {
self.writeln("}");
self.writeln("return false;");
self.indent -= 1;
self.writeln("}");
self.writeln("},");
}
/// Collect type information from a type declaration
@@ -888,7 +1027,8 @@ impl JsBackend {
let prev_has_handlers = self.has_handlers;
self.has_handlers = is_effectful;
// Save and clear var substitutions for this function scope
let saved_substitutions = self.var_substitutions.clone();
self.var_substitutions.clear();
// Emit function body
@@ -896,6 +1036,7 @@ impl JsBackend {
self.writeln(&format!("return {};", body_code));
self.has_handlers = prev_has_handlers;
self.var_substitutions = saved_substitutions;
self.indent -= 1;
self.writeln("}");
@@ -909,13 +1050,16 @@ impl JsBackend {
let val = self.emit_expr(&let_decl.value)?;
let var_name = &let_decl.name.name;
if var_name == "_" {
// Wildcard binding: just execute for side effects
self.writeln(&format!("{};", val));
} else {
self.writeln(&format!("const {} = {};", var_name, val));
// Register the variable for future use
self.var_substitutions
.insert(var_name.clone(), var_name.clone());
}
Ok(())
}
@@ -954,12 +1098,17 @@ impl JsBackend {
let r = self.emit_expr(right)?;
// Check for string concatenation
if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
if self.is_string_expr(left) || self.is_string_expr(right) {
return Ok(format!("({} + {})", l, r));
}
}
// ++ on lists: use .concat()
if matches!(op, BinaryOp::Concat) {
return Ok(format!("{}.concat({})", l, r));
}
let op_str = match op {
BinaryOp::Add => "+",
BinaryOp::Sub => "-",
@@ -974,6 +1123,7 @@ impl JsBackend {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => unreachable!("handled above"),
BinaryOp::Pipe => {
// Pipe operator: x |> f becomes f(x)
return Ok(format!("{}({})", r, l));
@@ -1034,18 +1184,26 @@ impl JsBackend {
name, value, body, ..
} => {
let val = self.emit_expr(value)?;
if name.name == "_" {
// Wildcard binding: just execute for side effects
self.writeln(&format!("{};", val));
} else {
let var_name = format!("{}_{}", name.name, self.fresh_name());
self.writeln(&format!("const {} = {};", var_name, val));
// Add substitution
self.var_substitutions
.insert(name.name.clone(), var_name.clone());
}
let body_result = self.emit_expr(body)?;
// Remove substitution
if name.name != "_" {
self.var_substitutions.remove(&name.name);
}
Ok(body_result)
}
@@ -1057,6 +1215,31 @@ impl JsBackend {
if module_name.name == "List" {
return self.emit_list_operation(&field.name, args);
}
if module_name.name == "Map" {
return self.emit_map_operation(&field.name, args);
}
}
}
// Int/Float module operations
if let Expr::Field { object, field, .. } = func.as_ref() {
if let Expr::Var(module_name) = object.as_ref() {
if module_name.name == "Int" {
let arg = self.emit_expr(&args[0])?;
match field.name.as_str() {
"toFloat" => return Ok(arg),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
if module_name.name == "Float" {
let arg = self.emit_expr(&args[0])?;
match field.name.as_str() {
"toInt" => return Ok(format!("Math.trunc({})", arg)),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
}
}
@@ -1066,6 +1249,10 @@ impl JsBackend {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("String({})", arg));
}
if ident.name == "print" {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("console.log({})", arg));
}
}
let arg_strs: Result<Vec<_>, _> = args.iter().map(|a| self.emit_expr(a)).collect();
@@ -1142,6 +1329,26 @@ impl JsBackend {
return self.emit_math_operation(&operation.name, args);
}
// Special case: Int module operations
if effect.name == "Int" {
let arg = self.emit_expr(&args[0])?;
match operation.name.as_str() {
"toFloat" => return Ok(arg), // JS numbers are already floats
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
// Special case: Float module operations
if effect.name == "Float" {
let arg = self.emit_expr(&args[0])?;
match operation.name.as_str() {
"toInt" => return Ok(format!("Math.trunc({})", arg)),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
// Special case: Result module operations (not an effect)
if effect.name == "Result" {
return self.emit_result_operation(&operation.name, args);
@@ -1152,6 +1359,11 @@ impl JsBackend {
return self.emit_json_operation(&operation.name, args);
}
// Special case: Map module operations (not an effect)
if effect.name == "Map" {
return self.emit_map_operation(&operation.name, args);
}
// Special case: Html module operations (not an effect)
if effect.name == "Html" {
return self.emit_html_operation(&operation.name, args);
@@ -1197,18 +1409,39 @@ impl JsBackend {
param_names
};
// Save state
let prev_has_handlers = self.has_handlers;
let saved_substitutions = self.var_substitutions.clone();
self.has_handlers = !effects.is_empty();
// Register lambda params as themselves (override any outer substitutions)
for p in &all_params {
self.var_substitutions.insert(p.clone(), p.clone());
}
// Capture any statements emitted during body evaluation
let output_start = self.output.len();
let prev_indent = self.indent;
self.indent += 1;
let body_code = self.emit_expr(body)?;
self.writeln(&format!("return {};", body_code));
// Extract body statements and restore output
let body_statements = self.output[output_start..].to_string();
self.output.truncate(output_start);
self.indent = prev_indent;
// Restore state
self.has_handlers = prev_has_handlers;
self.var_substitutions = saved_substitutions;
let indent_str = " ".repeat(self.indent);
Ok(format!(
"(function({}) {{ return {}; }})",
"(function({}) {{\n{}{}}})",
all_params.join(", "),
body_statements,
indent_str,
))
}
@@ -1228,10 +1461,15 @@ impl JsBackend {
}
Statement::Let { name, value, .. } => {
let val = self.emit_expr(value)?;
if name.name == "_" {
self.writeln(&format!("{};", val));
} else {
let var_name =
format!("{}_{}", name.name, self.fresh_name());
self.writeln(&format!("const {} = {};", var_name, val));
self.var_substitutions
.insert(name.name.clone(), var_name.clone());
}
}
}
}
@@ -1240,15 +1478,19 @@ impl JsBackend {
self.emit_expr(result)
}
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
let spread_code = self.emit_expr(spread_expr)?;
parts.push(format!("...{}", spread_code));
}
for (name, expr) in fields {
let val = self.emit_expr(expr)?;
parts.push(format!("{}: {}", name.name, val));
}
Ok(format!("{{ {} }}", parts.join(", ")))
}
Expr::Tuple { elements, .. } => {
@@ -1268,6 +1510,11 @@ impl JsBackend {
Ok(format!("{}.{}", obj, field.name))
}
Expr::TupleIndex { object, index, .. } => {
let obj = self.emit_expr(object)?;
Ok(format!("{}[{}]", obj, index))
}
Expr::Run {
expr, handlers, ..
} => {
@@ -2062,6 +2309,86 @@ impl JsBackend {
}
}
/// Emit Map module operations using JS Map
fn emit_map_operation(
&mut self,
operation: &str,
args: &[Expr],
) -> Result<String, JsGenError> {
match operation {
"new" => Ok("new Map()".to_string()),
"set" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let val = self.emit_expr(&args[2])?;
Ok(format!(
"(function() {{ var m = new Map({}); m.set({}, {}); return m; }})()",
map, key, val
))
}
"get" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!(
"({0}.has({1}) ? Lux.Some({0}.get({1})) : Lux.None())",
map, key
))
}
"contains" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!("{}.has({})", map, key))
}
"remove" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!(
"(function() {{ var m = new Map({}); m.delete({}); return m; }})()",
map, key
))
}
"keys" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("Array.from({}.keys()).sort()", map))
}
"values" => {
let map = self.emit_expr(&args[0])?;
Ok(format!(
"Array.from({0}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }}).map(function(e) {{ return e[1]; }})",
map
))
}
"size" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("{}.size", map))
}
"isEmpty" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("({}.size === 0)", map))
}
"fromList" => {
let list = self.emit_expr(&args[0])?;
Ok(format!("new Map({}.map(function(t) {{ return [t[0], t[1]]; }}))", list))
}
"toList" => {
let map = self.emit_expr(&args[0])?;
Ok(format!(
"Array.from({}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }})",
map
))
}
"merge" => {
let m1 = self.emit_expr(&args[0])?;
let m2 = self.emit_expr(&args[1])?;
Ok(format!("new Map([...{}, ...{}])", m1, m2))
}
_ => Err(JsGenError {
message: format!("Unknown Map operation: {}", operation),
span: None,
}),
}
}
/// Emit Html module operations for type-safe HTML construction
fn emit_html_operation(
&mut self,
@@ -2333,7 +2660,7 @@ impl JsBackend {
}
}
Expr::BinaryOp { op, left, right, .. } => {
matches!(op, BinaryOp::Add | BinaryOp::Concat)
&& (self.is_string_expr(left) || self.is_string_expr(right))
}
_ => false,
@@ -3732,7 +4059,7 @@ line3"
#[test]
fn test_js_runtime_generated() {
// Test that the Lux runtime core is always generated
use crate::parser::Parser;
let source = r#"
@@ -3743,21 +4070,51 @@ line3"
let mut backend = JsBackend::new();
let js_code = backend.generate(&program).expect("Should generate");
// Core runtime is always present
assert!(js_code.contains("const Lux = {"), "Lux object should be defined");
assert!(js_code.contains("Some:"), "Option Some should be defined");
assert!(js_code.contains("None:"), "Option None should be defined");
assert!(js_code.contains("renderHtml:"), "renderHtml should be defined");
assert!(js_code.contains("renderToDom:"), "renderToDom should be defined");
assert!(js_code.contains("escapeHtml:"), "escapeHtml should be defined");
assert!(js_code.contains("app:"), "TEA app should be defined");
assert!(js_code.contains("simpleApp:"), "simpleApp should be defined");
assert!(js_code.contains("hasChanged:"), "hasChanged should be defined");
// Console-only program should NOT include Dom, Html, or TEA sections
assert!(!js_code.contains("Dom:"), "Dom handler should not be in Console-only program");
assert!(!js_code.contains("renderHtml:"), "renderHtml should not be in Console-only program");
assert!(!js_code.contains("app:"), "TEA app should not be in Console-only program");
assert!(!js_code.contains("Http:"), "Http should not be in Console-only program");
// Console should be present
assert!(js_code.contains("Console:"), "Console handler should exist");
}
#[test]
fn test_js_runtime_tree_shaking_all_effects() {
// Test that all effects are included when all are used
use crate::parser::Parser;
let source = r#"
fn main(): Unit with {Console, Dom} = {
Console.print("Hello")
let _ = Dom.getElementById("app")
()
}
"#;
let program = Parser::parse_source(source).expect("Should parse");
let mut backend = JsBackend::new();
let js_code = backend.generate(&program).expect("Should generate");
assert!(js_code.contains("Console:"), "Console handler should exist");
assert!(js_code.contains("Dom:"), "Dom handler should exist");
assert!(js_code.contains("renderHtml:"), "renderHtml should be defined when Dom is used");
assert!(js_code.contains("renderToDom:"), "renderToDom should be defined when Dom is used");
assert!(js_code.contains("escapeHtml:"), "escapeHtml should be defined when Dom is used");
assert!(js_code.contains("app:"), "TEA app should be defined when Dom is used");
assert!(js_code.contains("simpleApp:"), "simpleApp should be defined when Dom is used");
assert!(js_code.contains("hasChanged:"), "hasChanged should be defined when Dom is used");
}
#[test]
fn test_js_runtime_default_handlers() {
// Test that only used effect handlers are generated
use crate::parser::Parser;
let source = r#"
@@ -3768,12 +4125,12 @@ line3"
let mut backend = JsBackend::new();
let js_code = backend.generate(&program).expect("Should generate");
// Only Console should be present
assert!(js_code.contains("Console:"), "Console handler should exist");
assert!(!js_code.contains("Random:"), "Random handler should not exist in Console-only program");
assert!(!js_code.contains("Time:"), "Time handler should not exist in Console-only program");
assert!(!js_code.contains("Http:"), "Http handler should not exist in Console-only program");
assert!(!js_code.contains("Dom:"), "Dom handler should not exist in Console-only program");
}
#[test]

View File

@@ -598,6 +598,9 @@ impl Formatter {
Expr::Field { object, field, .. } => {
format!("{}.{}", self.format_expr(object), field.name)
}
Expr::TupleIndex { object, index, .. } => {
format!("{}.{}", self.format_expr(object), index)
}
Expr::If { condition, then_branch, else_branch, .. } => {
format!(
"if {} then {} else {}",
@@ -685,15 +688,17 @@ impl Formatter {
.join(", ")
)
}
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
parts.push(format!("...{}", self.format_expr(spread_expr)));
}
for (name, val) in fields {
parts.push(format!("{}: {}", name.name, self.format_expr(val)));
}
format!("{{ {} }}", parts.join(", "))
}
Expr::EffectOp { effect, operation, args, .. } => {
format!(
@@ -728,7 +733,7 @@ impl Formatter {
match &lit.kind {
LiteralKind::Int(n) => n.to_string(),
LiteralKind::Float(f) => format!("{}", f),
LiteralKind::String(s) => format!("\"{}\"", s.replace('\\', "\\\\").replace('"', "\\\"").replace('{', "\\{").replace('}', "\\}")),
LiteralKind::Char(c) => format!("'{}'", c),
LiteralKind::Bool(b) => b.to_string(),
LiteralKind::Unit => "()".to_string(),
@@ -750,6 +755,7 @@ impl Formatter {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => "++",
BinaryOp::Pipe => "|>",
}
}

View File

@@ -74,6 +74,9 @@ pub enum BuiltinFn {
MathFloor,
MathCeil,
MathRound,
MathSin,
MathCos,
MathAtan2,
// Additional List operations
ListIsEmpty,
@@ -95,6 +98,12 @@ pub enum BuiltinFn {
StringLastIndexOf,
StringRepeat,
// Int/Float operations
IntToString,
IntToFloat,
FloatToString,
FloatToInt,
// JSON operations
JsonParse,
JsonStringify,
@@ -115,6 +124,20 @@ pub enum BuiltinFn {
JsonString,
JsonArray,
JsonObject,
// Map operations
MapNew,
MapSet,
MapGet,
MapContains,
MapRemove,
MapKeys,
MapValues,
MapSize,
MapIsEmpty,
MapFromList,
MapToList,
MapMerge,
}
/// Runtime value
@@ -129,6 +152,7 @@ pub enum Value {
List(Vec<Value>),
Tuple(Vec<Value>),
Record(HashMap<String, Value>),
Map(HashMap<String, Value>),
Function(Rc<Closure>),
Handler(Rc<HandlerValue>),
/// Built-in function
@@ -160,6 +184,7 @@ impl Value {
Value::List(_) => "List",
Value::Tuple(_) => "Tuple",
Value::Record(_) => "Record",
Value::Map(_) => "Map",
Value::Function(_) => "Function",
Value::Handler(_) => "Handler",
Value::Builtin(_) => "Function",
@@ -208,6 +233,11 @@ impl Value {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Map(xs), Value::Map(ys)) => {
xs.len() == ys.len() && xs.iter().all(|(k, v)| {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Constructor { name: n1, fields: f1 }, Value::Constructor { name: n2, fields: f2 }) => {
n1 == n2 && f1.len() == f2.len() && f1.iter().zip(f2.iter()).all(|(x, y)| Value::values_equal(x, y))
}
@@ -278,6 +308,16 @@ impl TryFromValue for Vec<Value> {
}
}
impl TryFromValue for HashMap<String, Value> {
const TYPE_NAME: &'static str = "Map";
fn try_from_value(value: &Value) -> Option<Self> {
match value {
Value::Map(m) => Some(m.clone()),
_ => None,
}
}
}
impl TryFromValue for Value {
const TYPE_NAME: &'static str = "any";
fn try_from_value(value: &Value) -> Option<Self> {
@@ -324,6 +364,18 @@ impl fmt::Display for Value {
}
write!(f, " }}")
}
Value::Map(entries) => {
write!(f, "Map {{")?;
let mut sorted: Vec<_> = entries.iter().collect();
sorted.sort_by_key(|(k, _)| (*k).clone());
for (i, (key, value)) in sorted.iter().enumerate() {
if i > 0 {
write!(f, ", ")?;
}
write!(f, "\"{}\": {}", key, value)?;
}
write!(f, "}}")
}
Value::Function(_) => write!(f, "<function>"),
Value::Builtin(b) => write!(f, "<builtin:{:?}>", b),
Value::Handler(_) => write!(f, "<handler>"),
@@ -1068,9 +1120,26 @@ impl Interpreter {
("floor".to_string(), Value::Builtin(BuiltinFn::MathFloor)),
("ceil".to_string(), Value::Builtin(BuiltinFn::MathCeil)),
("round".to_string(), Value::Builtin(BuiltinFn::MathRound)),
("sin".to_string(), Value::Builtin(BuiltinFn::MathSin)),
("cos".to_string(), Value::Builtin(BuiltinFn::MathCos)),
("atan2".to_string(), Value::Builtin(BuiltinFn::MathAtan2)),
]));
env.define("Math", math_module);
// Int module
let int_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::IntToString)),
("toFloat".to_string(), Value::Builtin(BuiltinFn::IntToFloat)),
]));
env.define("Int", int_module);
// Float module
let float_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::FloatToString)),
("toInt".to_string(), Value::Builtin(BuiltinFn::FloatToInt)),
]));
env.define("Float", float_module);
// JSON module
let json_module = Value::Record(HashMap::from([
("parse".to_string(), Value::Builtin(BuiltinFn::JsonParse)),
@@ -1094,16 +1163,72 @@ impl Interpreter {
("object".to_string(), Value::Builtin(BuiltinFn::JsonObject)),
]));
env.define("Json", json_module);
// Map module
let map_module = Value::Record(HashMap::from([
("new".to_string(), Value::Builtin(BuiltinFn::MapNew)),
("set".to_string(), Value::Builtin(BuiltinFn::MapSet)),
("get".to_string(), Value::Builtin(BuiltinFn::MapGet)),
("contains".to_string(), Value::Builtin(BuiltinFn::MapContains)),
("remove".to_string(), Value::Builtin(BuiltinFn::MapRemove)),
("keys".to_string(), Value::Builtin(BuiltinFn::MapKeys)),
("values".to_string(), Value::Builtin(BuiltinFn::MapValues)),
("size".to_string(), Value::Builtin(BuiltinFn::MapSize)),
("isEmpty".to_string(), Value::Builtin(BuiltinFn::MapIsEmpty)),
("fromList".to_string(), Value::Builtin(BuiltinFn::MapFromList)),
("toList".to_string(), Value::Builtin(BuiltinFn::MapToList)),
("merge".to_string(), Value::Builtin(BuiltinFn::MapMerge)),
]));
env.define("Map", map_module);
}
/// Execute a program
pub fn run(&mut self, program: &Program) -> Result<Value, RuntimeError> {
let mut last_value = Value::Unit;
let mut has_main_let = false;
for decl in &program.declarations {
// Track if there's a top-level `let main = ...`
if let Declaration::Let(let_decl) = decl {
if let_decl.name.name == "main" {
has_main_let = true;
}
}
last_value = self.eval_declaration(decl)?;
}
// Auto-invoke main if it was defined as a let binding with a function value
if has_main_let {
if let Some(main_val) = self.global_env.get("main") {
if let Value::Function(ref closure) = main_val {
if closure.params.is_empty() {
let span = Span { start: 0, end: 0 };
let mut result = self.eval_call(main_val.clone(), vec![], span)?;
// Trampoline loop
loop {
match result {
EvalResult::Value(v) => {
last_value = v;
break;
}
EvalResult::Effect(req) => {
last_value = self.handle_effect(req)?;
break;
}
EvalResult::TailCall { func, args, span } => {
result = self.eval_call(func, args, span)?;
}
EvalResult::Resume(v) => {
last_value = v;
break;
}
}
}
}
}
}
}
Ok(last_value)
}
@@ -1415,6 +1540,34 @@ impl Interpreter {
}
}
Expr::TupleIndex {
object,
index,
span,
} => {
let obj_val = self.eval_expr(object, env)?;
match obj_val {
Value::Tuple(elements) => {
if *index < elements.len() {
Ok(EvalResult::Value(elements[*index].clone()))
} else {
Err(RuntimeError {
message: format!(
"Tuple index {} out of bounds for tuple with {} elements",
index,
elements.len()
),
span: Some(*span),
})
}
}
_ => Err(RuntimeError {
message: format!("Cannot use tuple index on {}", obj_val.type_name()),
span: Some(*span),
}),
}
}
Expr::Lambda { params, body, .. } => {
let closure = Closure {
params: params.iter().map(|p| p.name.name.clone()).collect(),
@@ -1481,8 +1634,28 @@ impl Interpreter {
self.eval_expr_tail(result, &block_env, tail)
}
Expr::Record {
spread, fields, ..
} => {
let mut record = HashMap::new();
// If there's a spread, evaluate it and start with its fields
if let Some(spread_expr) = spread {
let spread_val = self.eval_expr(spread_expr, env)?;
if let Value::Record(spread_fields) = spread_val {
record = spread_fields;
} else {
return Err(RuntimeError {
message: format!(
"Spread expression must evaluate to a record, got {}",
spread_val.type_name()
),
span: Some(expr.span()),
});
}
}
// Override with explicit fields
for (name, expr) in fields {
let val = self.eval_expr(expr, env)?;
record.insert(name.name.clone(), val);
@@ -1555,6 +1728,18 @@ impl Interpreter {
span: Some(span),
}),
},
BinaryOp::Concat => match (left, right) {
(Value::String(a), Value::String(b)) => Ok(Value::String(a + &b)),
(Value::List(a), Value::List(b)) => {
let mut result = a;
result.extend(b);
Ok(Value::List(result))
}
(l, r) => Err(RuntimeError {
message: format!("Cannot concatenate {} and {}", l.type_name(), r.type_name()),
span: Some(span),
}),
},
BinaryOp::Sub => match (left, right) {
(Value::Int(a), Value::Int(b)) => Ok(Value::Int(a - b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Float(a - b)),
@@ -1610,6 +1795,7 @@ impl Interpreter {
(Value::Int(a), Value::Int(b)) => Ok(Value::Bool(a < b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Bool(a < b)),
(Value::String(a), Value::String(b)) => Ok(Value::Bool(a < b)),
(Value::Char(a), Value::Char(b)) => Ok(Value::Bool(a < b)),
(l, r) => Err(RuntimeError {
message: format!("Cannot compare {} and {}", l.type_name(), r.type_name()),
span: Some(span),
@@ -1619,6 +1805,7 @@ impl Interpreter {
(Value::Int(a), Value::Int(b)) => Ok(Value::Bool(a <= b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Bool(a <= b)),
(Value::String(a), Value::String(b)) => Ok(Value::Bool(a <= b)),
(Value::Char(a), Value::Char(b)) => Ok(Value::Bool(a <= b)),
(l, r) => Err(RuntimeError {
message: format!("Cannot compare {} and {}", l.type_name(), r.type_name()),
span: Some(span),
@@ -1628,6 +1815,7 @@ impl Interpreter {
(Value::Int(a), Value::Int(b)) => Ok(Value::Bool(a > b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Bool(a > b)),
(Value::String(a), Value::String(b)) => Ok(Value::Bool(a > b)),
(Value::Char(a), Value::Char(b)) => Ok(Value::Bool(a > b)),
(l, r) => Err(RuntimeError {
message: format!("Cannot compare {} and {}", l.type_name(), r.type_name()),
span: Some(span),
@@ -1637,6 +1825,7 @@ impl Interpreter {
(Value::Int(a), Value::Int(b)) => Ok(Value::Bool(a >= b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Bool(a >= b)),
(Value::String(a), Value::String(b)) => Ok(Value::Bool(a >= b)),
(Value::Char(a), Value::Char(b)) => Ok(Value::Bool(a >= b)),
(l, r) => Err(RuntimeError {
message: format!("Cannot compare {} and {}", l.type_name(), r.type_name()),
span: Some(span),
@@ -2219,6 +2408,46 @@ impl Interpreter {
Ok(EvalResult::Value(Value::String(result)))
}
BuiltinFn::IntToString => {
if args.len() != 1 {
return Err(err("Int.toString requires 1 argument"));
}
match &args[0] {
Value::Int(n) => Ok(EvalResult::Value(Value::String(format!("{}", n)))),
v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
}
}
BuiltinFn::FloatToString => {
if args.len() != 1 {
return Err(err("Float.toString requires 1 argument"));
}
match &args[0] {
Value::Float(f) => Ok(EvalResult::Value(Value::String(format!("{}", f)))),
v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
}
}
BuiltinFn::IntToFloat => {
if args.len() != 1 {
return Err(err("Int.toFloat requires 1 argument"));
}
match &args[0] {
Value::Int(n) => Ok(EvalResult::Value(Value::Float(*n as f64))),
v => Err(err(&format!("Int.toFloat expects Int, got {}", v.type_name()))),
}
}
BuiltinFn::FloatToInt => {
if args.len() != 1 {
return Err(err("Float.toInt requires 1 argument"));
}
match &args[0] {
Value::Float(f) => Ok(EvalResult::Value(Value::Int(*f as i64))),
v => Err(err(&format!("Float.toInt expects Float, got {}", v.type_name()))),
}
}
BuiltinFn::TypeOf => {
if args.len() != 1 {
return Err(err("typeOf requires 1 argument"));
@@ -2395,6 +2624,45 @@ impl Interpreter {
}
}
BuiltinFn::MathSin => {
if args.len() != 1 {
return Err(err("Math.sin requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.sin()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).sin()))),
v => Err(err(&format!("Math.sin expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathCos => {
if args.len() != 1 {
return Err(err("Math.cos requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.cos()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).cos()))),
v => Err(err(&format!("Math.cos expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathAtan2 => {
if args.len() != 2 {
return Err(err("Math.atan2 requires 2 arguments: y, x"));
}
let y = match &args[0] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
let x = match &args[1] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
Ok(EvalResult::Value(Value::Float(y.atan2(x))))
}
// Additional List operations
BuiltinFn::ListIsEmpty => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.isEmpty", span)?;
@@ -2884,6 +3152,128 @@ impl Interpreter {
}
Ok(EvalResult::Value(Value::Json(serde_json::Value::Object(map))))
}
// Map operations
BuiltinFn::MapNew => {
Ok(EvalResult::Value(Value::Map(HashMap::new())))
}
BuiltinFn::MapSet => {
if args.len() != 3 {
return Err(err("Map.set requires 3 arguments: map, key, value"));
}
let mut map = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.set expects Map as first argument, got {}", v.type_name()))),
};
let key = match &args[1] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.set expects String key, got {}", v.type_name()))),
};
map.insert(key, args[2].clone());
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapGet => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.get", span)?;
match map.get(&key) {
Some(v) => Ok(EvalResult::Value(Value::Constructor {
name: "Some".to_string(),
fields: vec![v.clone()],
})),
None => Ok(EvalResult::Value(Value::Constructor {
name: "None".to_string(),
fields: vec![],
})),
}
}
BuiltinFn::MapContains => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.contains", span)?;
Ok(EvalResult::Value(Value::Bool(map.contains_key(&key))))
}
BuiltinFn::MapRemove => {
let (mut map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.remove", span)?;
map.remove(&key);
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapKeys => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.keys", span)?;
let mut keys: Vec<String> = map.keys().cloned().collect();
keys.sort();
Ok(EvalResult::Value(Value::List(
keys.into_iter().map(Value::String).collect(),
)))
}
BuiltinFn::MapValues => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.values", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries.into_iter().map(|(_, v)| v).collect(),
)))
}
BuiltinFn::MapSize => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.size", span)?;
Ok(EvalResult::Value(Value::Int(map.len() as i64)))
}
BuiltinFn::MapIsEmpty => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.isEmpty", span)?;
Ok(EvalResult::Value(Value::Bool(map.is_empty())))
}
BuiltinFn::MapFromList => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "Map.fromList", span)?;
let mut map = HashMap::new();
for item in list {
match item {
Value::Tuple(fields) if fields.len() == 2 => {
let key = match &fields[0] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.fromList expects (String, V) tuples, got {} key", v.type_name()))),
};
map.insert(key, fields[1].clone());
}
_ => return Err(err("Map.fromList expects List<(String, V)>")),
}
}
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapToList => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.toList", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries
.into_iter()
.map(|(k, v)| Value::Tuple(vec![Value::String(k), v]))
.collect(),
)))
}
BuiltinFn::MapMerge => {
if args.len() != 2 {
return Err(err("Map.merge requires 2 arguments: map1, map2"));
}
let mut map1 = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as first argument, got {}", v.type_name()))),
};
let map2 = match &args[1] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as second argument, got {}", v.type_name()))),
};
for (k, v) in map2 {
map1.insert(k, v);
}
Ok(EvalResult::Value(Value::Map(map1)))
}
}
}
@@ -3049,6 +3439,11 @@ impl Interpreter {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(Value::Map(a), Value::Map(b)) => {
a.len() == b.len() && a.iter().all(|(k, v)| {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(
Value::Constructor {
name: n1,
@@ -3469,6 +3864,30 @@ impl Interpreter {
}
}
("File", "copy") => {
let source = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.copy requires a string source path".to_string(),
span: None,
}),
};
let dest = match request.args.get(1) {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.copy requires a string destination path".to_string(),
span: None,
}),
};
match std::fs::copy(&source, &dest) {
Ok(_) => Ok(Value::Unit),
Err(e) => Err(RuntimeError {
message: format!("Failed to copy '{}' to '{}': {}", source, dest, e),
span: None,
}),
}
}
// ===== Process Effect =====
("Process", "exec") => {
use std::process::Command;
@@ -3824,6 +4243,26 @@ impl Interpreter {
}
Ok(Value::Unit)
}
("Test", "assertEqualMsg") => {
let expected = request.args.first().cloned().unwrap_or(Value::Unit);
let actual = request.args.get(1).cloned().unwrap_or(Value::Unit);
let label = match request.args.get(2) {
Some(Value::String(s)) => s.clone(),
_ => "Values not equal".to_string(),
};
if Value::values_equal(&expected, &actual) {
self.test_results.borrow_mut().passed += 1;
} else {
self.test_results.borrow_mut().failed += 1;
self.test_results.borrow_mut().failures.push(TestFailure {
message: label,
expected: Some(format!("{}", expected)),
actual: Some(format!("{}", actual)),
});
}
Ok(Value::Unit)
}
("Test", "assertNotEqual") => {
let a = request.args.first().cloned().unwrap_or(Value::Unit);
let b = request.args.get(1).cloned().unwrap_or(Value::Unit);
@@ -4956,6 +5395,7 @@ mod tests {
// Create a simple migration that adds a field
// Migration: old.name -> { name: old.name, email: "unknown" }
let migration_body = Expr::Record {
spread: None,
fields: vec![
(
Ident::new("name", Span::default()),

View File

@@ -42,6 +42,7 @@ pub enum TokenKind {
Effect,
Handler,
Run,
Handle,
Resume,
Type,
True,
@@ -70,6 +71,7 @@ pub enum TokenKind {
// Operators
Plus, // +
PlusPlus, // ++
Minus, // -
Star, // *
Slash, // /
@@ -89,6 +91,7 @@ pub enum TokenKind {
Arrow, // =>
ThinArrow, // ->
Dot, // .
DotDotDot, // ...
Colon, // :
ColonColon, // ::
Comma, // ,
@@ -138,6 +141,7 @@ impl fmt::Display for TokenKind {
TokenKind::Effect => write!(f, "effect"),
TokenKind::Handler => write!(f, "handler"),
TokenKind::Run => write!(f, "run"),
TokenKind::Handle => write!(f, "handle"),
TokenKind::Resume => write!(f, "resume"),
TokenKind::Type => write!(f, "type"),
TokenKind::Import => write!(f, "import"),
@@ -160,6 +164,7 @@ impl fmt::Display for TokenKind {
TokenKind::True => write!(f, "true"),
TokenKind::False => write!(f, "false"),
TokenKind::Plus => write!(f, "+"),
TokenKind::PlusPlus => write!(f, "++"),
TokenKind::Minus => write!(f, "-"),
TokenKind::Star => write!(f, "*"),
TokenKind::Slash => write!(f, "/"),
@@ -179,6 +184,7 @@ impl fmt::Display for TokenKind {
TokenKind::Arrow => write!(f, "=>"),
TokenKind::ThinArrow => write!(f, "->"),
TokenKind::Dot => write!(f, "."),
TokenKind::DotDotDot => write!(f, "..."),
TokenKind::Colon => write!(f, ":"),
TokenKind::ColonColon => write!(f, "::"),
TokenKind::Comma => write!(f, ","),
@@ -268,7 +274,14 @@ impl<'a> Lexer<'a> {
let kind = match c {
// Single-character tokens
'+' => {
if self.peek() == Some('+') {
self.advance();
TokenKind::PlusPlus
} else {
TokenKind::Plus
}
}
'*' => TokenKind::Star,
'%' => TokenKind::Percent,
'(' => TokenKind::LParen,
@@ -364,7 +377,22 @@ impl<'a> Lexer<'a> {
TokenKind::Pipe
}
}
'.' => {
if self.peek() == Some('.') {
// Check for ... (need to peek past second dot)
// We look at source directly since we can only peek one ahead
let next_next = self.source[self.pos..].chars().nth(1);
if next_next == Some('.') {
self.advance(); // consume second '.'
self.advance(); // consume third '.'
TokenKind::DotDotDot
} else {
TokenKind::Dot
}
} else {
TokenKind::Dot
}
}
':' => {
if self.peek() == Some(':') {
self.advance();
@@ -493,6 +521,8 @@ impl<'a> Lexer<'a> {
Some('"') => '"',
Some('0') => '\0',
Some('\'') => '\'',
Some('{') => '{',
Some('}') => '}',
Some('x') => {
// Hex escape \xNN
let h1 = self.advance().and_then(|c| c.to_digit(16));
@@ -743,6 +773,7 @@ impl<'a> Lexer<'a> {
"effect" => TokenKind::Effect,
"handler" => TokenKind::Handler,
"run" => TokenKind::Run,
"handle" => TokenKind::Handle,
"resume" => TokenKind::Resume,
"type" => TokenKind::Type,
"import" => TokenKind::Import,
@@ -761,6 +792,8 @@ impl<'a> Lexer<'a> {
"commutative" => TokenKind::Commutative,
"where" => TokenKind::Where,
"assume" => TokenKind::Assume,
"and" => TokenKind::And,
"or" => TokenKind::Or,
"true" => TokenKind::Bool(true),
"false" => TokenKind::Bool(false),
_ => TokenKind::Ident(ident.to_string()),

View File

@@ -510,10 +510,13 @@ impl Linter {
self.collect_refs_expr(&arm.body);
}
}
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
self.collect_refs_expr(object);
}
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.collect_refs_expr(spread_expr);
}
for (_, val) in fields {
self.collect_refs_expr(val);
}

View File

@@ -317,66 +317,227 @@ impl LspServer {
let doc = self.documents.get(&uri)?;
let source = &doc.text;
// Try to get info from symbol table first (position-based lookup)
if let Some(ref table) = doc.symbol_table {
let offset = self.position_to_offset(source, position);
if let Some(symbol) = table.definition_at_position(offset) {
return Some(self.format_symbol_hover(symbol));
}
}
// Get the word under cursor
let word = self.get_word_at_position(source, position)?;
// When hovering on a keyword like 'fn', 'type', 'effect', 'let', 'trait',
// look ahead to find the declaration name and show that symbol's info
if let Some(ref table) = doc.symbol_table {
if matches!(word.as_str(), "fn" | "type" | "effect" | "let" | "trait" | "handler" | "impl") {
let offset = self.position_to_offset(source, position);
if let Some(name) = self.find_next_ident(source, offset + word.len()) {
for sym in table.global_symbols() {
if sym.name == name {
return Some(self.format_symbol_hover(sym));
}
}
}
}
// Try name-based lookup in symbol table (for usage sites)
for sym in table.global_symbols() {
if sym.name == word {
return Some(self.format_symbol_hover(sym));
}
}
}
// Check for module names (Console, List, String, etc.)
if let Some(hover) = self.get_module_hover(&word) {
return Some(hover);
}
// Rich documentation for behavioral property keywords
if let Some((signature, doc_text)) = self.get_rich_symbol_info(&word) {
return Some(Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
}),
range: None,
});
}
// Builtin keyword/function info
if let Some((signature, doc_text)) = self.get_symbol_info(&word) {
return Some(Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
}),
range: None,
});
}
None
}
/// Format a symbol into a hover response
fn format_symbol_hover(&self, symbol: &crate::symbol_table::Symbol) -> Hover {
let signature = symbol.type_signature.as_ref()
.map(|s| s.as_str())
.unwrap_or(&symbol.name);
let kind_str = match symbol.kind {
SymbolKind::Function => "function",
SymbolKind::Variable => "variable",
SymbolKind::Parameter => "parameter",
SymbolKind::Type => "type",
SymbolKind::TypeParameter => "type parameter",
SymbolKind::Variant => "variant",
SymbolKind::Effect => "effect",
SymbolKind::EffectOperation => "effect operation",
SymbolKind::Field => "field",
SymbolKind::Module => "module",
};
let doc_str = symbol.documentation.as_ref()
.map(|d| format!("\n\n{}", d))
.unwrap_or_default();
let formatted_sig = format_signature_for_hover(signature);
let property_docs = extract_property_docs(signature);
Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!(
"```lux\n{}\n```\n*{}*{}{}",
formatted_sig, kind_str, property_docs, doc_str
),
}),
range: None,
}
}
/// Get hover info for built-in module names
fn get_module_hover(&self, name: &str) -> Option<Hover> {
let (sig, doc) = match name {
"Console" => (
"effect Console",
"**Console I/O**\n\n\
- `Console.print(msg: String): Unit` — print to stdout\n\
- `Console.readLine(): String` — read a line from stdin\n\
- `Console.readInt(): Int` — read an integer from stdin",
),
"File" => (
"effect File",
"**File System**\n\n\
- `File.read(path: String): String` — read file contents\n\
- `File.write(path: String, content: String): Unit` — write to file\n\
- `File.append(path: String, content: String): Unit` — append to file\n\
- `File.exists(path: String): Bool` — check if file exists\n\
- `File.delete(path: String): Unit` — delete a file\n\
- `File.list(path: String): List<String>` — list directory",
),
"Http" => (
"effect Http",
"**HTTP Client**\n\n\
- `Http.get(url: String): String` — GET request\n\
- `Http.post(url: String, body: String): String` — POST request\n\
- `Http.put(url: String, body: String): String` — PUT request\n\
- `Http.delete(url: String): String` — DELETE request",
),
"Sql" => (
"effect Sql",
"**SQL Database**\n\n\
- `Sql.open(path: String): Connection` — open database\n\
- `Sql.execute(conn: Connection, sql: String): Unit` — execute SQL\n\
- `Sql.query(conn: Connection, sql: String): List<Row>` — query rows\n\
- `Sql.close(conn: Connection): Unit` — close connection",
),
"Random" => (
"effect Random",
"**Random Number Generation**\n\n\
- `Random.int(min: Int, max: Int): Int` — random integer\n\
- `Random.float(): Float` — random float 0.0–1.0\n\
- `Random.bool(): Bool` — random boolean",
),
"Time" => (
"effect Time",
"**Time**\n\n\
- `Time.now(): Int` — current Unix timestamp (ms)\n\
- `Time.sleep(ms: Int): Unit` — sleep for milliseconds",
),
"Process" => (
"effect Process",
"**Process / System**\n\n\
- `Process.exec(cmd: String): String` — run shell command\n\
- `Process.env(name: String): String` — get env variable\n\
- `Process.args(): List<String>` — command-line arguments\n\
- `Process.exit(code: Int): Unit` — exit with code",
),
"Math" => (
"module Math",
"**Math Functions**\n\n\
- `Math.abs(n: Int): Int` — absolute value\n\
- `Math.min(a: Int, b: Int): Int` — minimum\n\
- `Math.max(a: Int, b: Int): Int` — maximum\n\
- `Math.sqrt(n: Float): Float` — square root\n\
- `Math.pow(base: Float, exp: Float): Float` — power\n\
- `Math.floor(n: Float): Int` — round down\n\
- `Math.ceil(n: Float): Int` — round up",
),
"List" => (
"module List",
"**List Operations**\n\n\
- `List.map(list, f)` — transform each element\n\
- `List.filter(list, p)` — keep matching elements\n\
- `List.fold(list, init, f)` — reduce to single value\n\
- `List.head(list)` — first element (Option)\n\
- `List.tail(list)` — all except first (Option)\n\
- `List.length(list)` — number of elements\n\
- `List.concat(a, b)` — concatenate lists\n\
- `List.range(start, end)` — integer range\n\
- `List.reverse(list)` — reverse order\n\
- `List.get(list, i)` — element at index (Option)",
),
"String" => (
"module String",
"**String Operations**\n\n\
- `String.length(s)` — string length\n\
- `String.split(s, delim)` — split by delimiter\n\
- `String.join(list, delim)` — join with delimiter\n\
- `String.trim(s)` — trim whitespace\n\
- `String.contains(s, sub)` — check substring\n\
- `String.replace(s, from, to)` — replace occurrences\n\
- `String.startsWith(s, prefix)` — check prefix\n\
- `String.endsWith(s, suffix)` — check suffix\n\
- `String.substring(s, start, end)` — extract range\n\
- `String.chars(s)` — list of characters",
),
"Option" => (
"type Option<A> = Some(A) | None",
"**Optional Value**\n\n\
- `Option.isSome(opt)` — has a value?\n\
- `Option.isNone(opt)` — is empty?\n\
- `Option.getOrElse(opt, default)` — unwrap or default\n\
- `Option.map(opt, f)` — transform if present\n\
- `Option.flatMap(opt, f)` — chain operations",
),
"Result" => (
"type Result<A, E> = Ok(A) | Err(E)",
"**Result of Fallible Operation**\n\n\
- `Result.isOk(r)` — succeeded?\n\
- `Result.isErr(r)` — failed?\n\
- `Result.map(r, f)` — transform success value\n\
- `Result.mapErr(r, f)` — transform error value",
),
_ => return None,
};
Some(Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!("```lux\n{}\n```\n{}", sig, doc),
}),
range: None,
})
}
fn get_word_at_position(&self, source: &str, position: Position) -> Option<String> {
@@ -402,6 +563,26 @@ impl LspServer {
}
}
/// Find the next identifier in source after the given offset (skipping whitespace)
fn find_next_ident(&self, source: &str, start: usize) -> Option<String> {
let chars: Vec<char> = source.chars().collect();
let mut pos = start;
// Skip whitespace
while pos < chars.len() && (chars[pos] == ' ' || chars[pos] == '\t' || chars[pos] == '\n' || chars[pos] == '\r') {
pos += 1;
}
// Collect identifier
let ident_start = pos;
while pos < chars.len() && (chars[pos].is_alphanumeric() || chars[pos] == '_') {
pos += 1;
}
if pos > ident_start {
Some(chars[ident_start..pos].iter().collect())
} else {
None
}
}
fn get_symbol_info(&self, word: &str) -> Option<(&'static str, &'static str)> {
match word {
// Keywords
@@ -607,17 +788,11 @@ impl LspServer {
fn position_to_offset(&self, source: &str, position: Position) -> usize {
let mut offset = 0;
let mut line = 0u32;
for (i, c) in source.char_indices() {
if line == position.line {
let col = i - offset;
return offset + (position.character as usize).min(col + 1);
}
if c == '\n' {
line += 1;
offset = i + 1;
for (line_idx, line) in source.lines().enumerate() {
if line_idx == position.line as usize {
return offset + (position.character as usize).min(line.len());
}
offset += line.len() + 1; // +1 for newline
}
source.len()
}
@@ -1396,12 +1571,15 @@ fn collect_call_site_hints(
collect_call_site_hints(source, e, param_names, hints);
}
}
Expr::Record { fields, .. } => {
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
collect_call_site_hints(source, spread_expr, param_names, hints);
}
for (_, e) in fields {
collect_call_site_hints(source, e, param_names, hints);
}
}
Expr::Field { object, .. } => {
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
collect_call_site_hints(source, object, param_names, hints);
}
Expr::Run { expr, handlers, .. } => {


@@ -1,4 +1,7 @@
//! Lux - A functional programming language with first-class effects
//! Lux — Make the important things visible.
//!
//! A functional programming language with first-class effects, schema evolution,
//! and behavioral types. See `lux philosophy` or docs/PHILOSOPHY.md.
mod analysis;
mod ast;
@@ -34,7 +37,7 @@ use std::borrow::Cow;
use std::collections::HashSet;
use typechecker::TypeChecker;
const VERSION: &str = "0.1.0";
const VERSION: &str = env!("CARGO_PKG_VERSION");
const HELP: &str = r#"
Lux - A functional language with first-class effects
@@ -171,9 +174,14 @@ fn main() {
.and_then(|s| s.parse::<u16>().ok())
.unwrap_or(8080);
let dir = args.get(2)
.filter(|a| !a.starts_with('-'))
.map(|s| s.as_str())
let port_value_idx = args.iter()
.position(|a| a == "--port" || a == "-p")
.map(|i| i + 1);
let dir = args.iter().enumerate()
.skip(2)
.filter(|(i, a)| !a.starts_with('-') && Some(*i) != port_value_idx)
.map(|(_, a)| a.as_str())
.next()
.unwrap_or(".");
serve_static_files(dir, port);
@@ -185,10 +193,12 @@ fn main() {
eprintln!(" lux compile <file.lux> --run");
eprintln!(" lux compile <file.lux> --emit-c [-o file.c]");
eprintln!(" lux compile <file.lux> --target js [-o file.js]");
eprintln!(" lux compile <file.lux> --watch");
std::process::exit(1);
}
let run_after = args.iter().any(|a| a == "--run");
let emit_c = args.iter().any(|a| a == "--emit-c");
let watch = args.iter().any(|a| a == "--watch");
let target_js = args.iter()
.position(|a| a == "--target")
.and_then(|i| args.get(i + 1))
@@ -204,17 +214,34 @@ fn main() {
} else {
compile_to_c(&args[2], output_path, run_after, emit_c);
}
if watch {
// Build the args to replay for each recompilation (without --watch)
let compile_args: Vec<String> = args.iter()
.skip(1)
.filter(|a| a.as_str() != "--watch")
.cloned()
.collect();
watch_and_rerun(&args[2], &compile_args);
}
}
"repl" => {
// Start REPL
run_repl();
}
"doc" => {
// Generate API documentation
generate_docs(&args[2..]);
}
"philosophy" => {
print_philosophy();
}
cmd => {
// Check if it looks like a command typo
if !std::path::Path::new(cmd).exists() && !cmd.starts_with('-') && !cmd.contains('.') && !cmd.contains('/') {
let known_commands = vec![
"fmt", "lint", "test", "watch", "init", "check", "debug",
"pkg", "registry", "serve", "compile", "doc",
"pkg", "registry", "serve", "compile", "doc", "repl", "philosophy",
];
let suggestions = diagnostics::find_similar_names(cmd, known_commands.into_iter(), 2);
if !suggestions.is_empty() {
@@ -229,18 +256,24 @@ fn main() {
}
}
} else {
// Start REPL
run_repl();
// No arguments — show help
print_help();
}
}
fn print_help() {
println!("{}", bc(colors::GREEN, &format!("Lux {}", VERSION)));
println!("{}", c(colors::DIM, "A functional language with first-class effects"));
println!("{}", c(colors::DIM, "Make the important things visible."));
println!();
println!(" {} Effects in types — see what code does", c(colors::DIM, "·"));
println!(" {} Composition over configuration — no DI frameworks", c(colors::DIM, "·"));
println!(" {} Safety without ceremony — inference where it helps", c(colors::DIM, "·"));
println!(" {} One right way — opinionated formatter, integrated tools", c(colors::DIM, "·"));
println!();
println!("{}", bc("", "Usage:"));
println!();
println!(" {} Start the REPL", bc(colors::CYAN, "lux"));
println!(" {} Show this help", bc(colors::CYAN, "lux"));
println!(" {} Start the REPL", bc(colors::CYAN, "lux repl"));
println!(" {} {} Run a file (interpreter)", bc(colors::CYAN, "lux"), c(colors::YELLOW, "<file.lux>"));
println!(" {} {} {} Compile to native binary", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<file.lux>"));
println!(" {} {} {} {} Compile with output name", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<f>"), c(colors::YELLOW, "-o app"));
@@ -275,6 +308,8 @@ fn print_help() {
c(colors::DIM, "(alias: s)"));
println!(" {} {} {} Generate API documentation",
bc(colors::CYAN, "lux"), bc(colors::CYAN, "doc"), c(colors::YELLOW, "[file] [-o dir]"));
println!(" {} {} Show language philosophy",
bc(colors::CYAN, "lux"), bc(colors::CYAN, "philosophy"));
println!(" {} {} Start LSP server",
bc(colors::CYAN, "lux"), c(colors::YELLOW, "--lsp"));
println!(" {} {} Show this help",
@@ -283,6 +318,36 @@ fn print_help() {
bc(colors::CYAN, "lux"), c(colors::YELLOW, "--version"));
}
fn print_philosophy() {
println!("{}", bc(colors::GREEN, "The Lux Philosophy"));
println!();
println!(" {}", bc("", "Make the important things visible."));
println!();
println!(" Most languages hide what matters most in production: what code");
println!(" can do, how data changes over time, and what guarantees functions");
println!(" provide. Lux makes all three first-class, compiler-checked features.");
println!();
println!(" {} {}", bc(colors::CYAN, "1. Explicit over implicit"), c(colors::DIM, "— effects in types, not hidden behind interfaces"));
println!(" fn processOrder(order: Order): Receipt {} {}", c(colors::YELLOW, "with {Database, Email}"), c(colors::DIM, "// signature IS documentation"));
println!();
println!(" {} {}", bc(colors::CYAN, "2. Composition over configuration"), c(colors::DIM, "— no DI frameworks, no monad transformers"));
println!(" run app() {} {}", c(colors::YELLOW, "with { Database = mock, Http = mock }"), c(colors::DIM, "// swap handlers, not libraries"));
println!();
println!(" {} {}", bc(colors::CYAN, "3. Safety without ceremony"), c(colors::DIM, "— type inference where it helps, annotations where they document"));
println!(" let x = 42 {}", c(colors::DIM, "// inferred"));
println!(" fn f(x: Int): Int = x * 2 {}", c(colors::DIM, "// annotated: API contract"));
println!();
println!(" {} {}", bc(colors::CYAN, "4. Practical over academic"), c(colors::DIM, "— ML semantics in C-family syntax, no monads to learn"));
println!(" {} {} {}", c(colors::DIM, "fn main(): Unit"), c(colors::YELLOW, "with {Console}"), c(colors::DIM, "= Console.print(\"Hello!\")"));
println!();
println!(" {} {}", bc(colors::CYAN, "5. One right way"), c(colors::DIM, "— opinionated formatter, integrated tooling, built-in testing"));
println!(" lux fmt | lux lint | lux check | lux test | lux compile");
println!();
println!(" {} {}", bc(colors::CYAN, "6. Tools are the language"), c(colors::DIM, "— formatter knows the AST, linter knows the types, LSP knows the effects"));
println!();
println!(" See {} for the full philosophy with language comparisons.", c(colors::CYAN, "docs/PHILOSOPHY.md"));
}
fn format_files(args: &[String]) {
use formatter::{format, FormatConfig};
use std::path::Path;
@@ -721,6 +786,36 @@ fn collect_lux_files_nonrecursive(dir: &str, pattern: Option<&str>, files: &mut
}
}
/// Find a C compiler. Priority: $CC env var, build-time embedded path, PATH search.
fn find_c_compiler() -> String {
// 1. Explicit env var
if let Ok(cc) = std::env::var("CC") {
if !cc.is_empty() {
return cc;
}
}
// 2. Path captured at build time (e.g. absolute nix store path)
let built_in = env!("LUX_CC_PATH");
if !built_in.is_empty() && std::path::Path::new(built_in).exists() {
return built_in.to_string();
}
// 3. Search PATH
for name in &["cc", "gcc", "clang"] {
if let Ok(output) = std::process::Command::new("which").arg(name).output() {
if output.status.success() {
if let Ok(p) = String::from_utf8(output.stdout) {
let p = p.trim();
if !p.is_empty() {
return p.to_string();
}
}
}
}
}
// 4. Last resort
"cc".to_string()
}
fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c: bool) {
use codegen::c_backend::CBackend;
use modules::ModuleLoader;
@@ -764,7 +859,7 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
// Generate C code
let mut backend = CBackend::new();
let c_code = match backend.generate(&program) {
let c_code = match backend.generate(&program, loader.module_cache()) {
Ok(code) => code,
Err(e) => {
eprintln!("{} C codegen: {}", c(colors::RED, "error:"), e);
@@ -812,13 +907,14 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
std::process::exit(1);
}
// Find C compiler
let cc = std::env::var("CC").unwrap_or_else(|_| "cc".to_string());
// Find C compiler: $CC env var > embedded build-time path > PATH search
let cc = find_c_compiler();
let compile_result = Command::new(&cc)
.args(["-O2", "-o"])
.arg(&output_bin)
.arg(&temp_c)
.arg("-lm")
.output();
match compile_result {
@@ -1002,7 +1098,7 @@ fn run_tests(args: &[String]) {
for test_file in &test_files {
let path_str = test_file.to_string_lossy().to_string();
// Read and parse the file
// Read and parse the file (with module loading)
let source = match fs::read_to_string(test_file) {
Ok(s) => s,
Err(e) => {
@@ -1012,7 +1108,13 @@ fn run_tests(args: &[String]) {
}
};
let program = match Parser::parse_source(&source) {
use modules::ModuleLoader;
let mut loader = ModuleLoader::new();
if let Some(parent) = test_file.parent() {
loader.add_search_path(parent.to_path_buf());
}
let program = match loader.load_source(&source, Some(test_file.as_path())) {
Ok(p) => p,
Err(e) => {
println!(" {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, &format!("parse error: {}", e)));
@@ -1021,9 +1123,9 @@ fn run_tests(args: &[String]) {
}
};
// Type check
// Type check with module support
let mut checker = typechecker::TypeChecker::new();
if let Err(errors) = checker.check_program(&program) {
if let Err(errors) = checker.check_program_with_modules(&program, &loader) {
println!(" {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, "type error"));
for err in errors {
eprintln!(" {}", err);
@@ -1051,7 +1153,7 @@ fn run_tests(args: &[String]) {
interp.register_auto_migrations(&auto_migrations);
interp.reset_test_results();
match interp.run(&program) {
match interp.run_with_modules(&program, &loader) {
Ok(_) => {
let results = interp.get_test_results();
if results.failed == 0 && results.passed == 0 {
@@ -1085,8 +1187,8 @@ fn run_tests(args: &[String]) {
interp.register_auto_migrations(&auto_migrations);
interp.reset_test_results();
// First run the file to define all functions
if let Err(e) = interp.run(&program) {
// First run the file to define all functions and load imports
if let Err(e) = interp.run_with_modules(&program, &loader) {
println!(" {} {} {}", c(colors::RED, "\u{2717}"), test_name, c(colors::RED, &e.to_string()));
total_failed += 1;
continue;
@@ -1261,6 +1363,64 @@ fn watch_file(path: &str) {
}
}
fn watch_and_rerun(path: &str, compile_args: &[String]) {
use std::time::{Duration, SystemTime};
use std::path::Path;
let file_path = Path::new(path);
if !file_path.exists() {
eprintln!("File not found: {}", path);
std::process::exit(1);
}
println!();
println!("Watching {} for changes (Ctrl+C to stop)...", path);
let mut last_modified = std::fs::metadata(file_path)
.and_then(|m| m.modified())
.unwrap_or(SystemTime::UNIX_EPOCH);
loop {
std::thread::sleep(Duration::from_millis(500));
let modified = match std::fs::metadata(file_path).and_then(|m| m.modified()) {
Ok(m) => m,
Err(_) => continue,
};
if modified > last_modified {
last_modified = modified;
// Clear screen
print!("\x1B[2J\x1B[H");
println!("=== Compiling {} ===", path);
println!();
let result = std::process::Command::new(std::env::current_exe().unwrap())
.args(compile_args)
.status();
match result {
Ok(status) if status.success() => {
println!();
println!("=== Success ===");
}
Ok(_) => {
println!();
println!("=== Failed ===");
}
Err(e) => {
eprintln!("Error running compiler: {}", e);
}
}
println!();
println!("Watching for changes...");
}
}
}
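The `--watch` loop above polls every 500 ms and re-triggers only when the file's mtime moves forward, updating its watermark so the same save is not rebuilt twice. The core detection step can be isolated into a pure function (hypothetical `poll_changed` helper, extracted here for illustration; the real loop also clears the screen and re-runs the compiler binary):

```rust
use std::time::{Duration, SystemTime};

/// One step of the polling loop: report whether `modified` is newer than
/// the last-seen timestamp, advancing the watermark on a change.
fn poll_changed(modified: SystemTime, last: &mut SystemTime) -> bool {
    if modified > *last {
        *last = modified;
        true
    } else {
        false
    }
}

fn main() {
    let t0 = SystemTime::UNIX_EPOCH;
    let mut last = t0;
    let t1 = t0 + Duration::from_secs(1);
    assert!(poll_changed(t1, &mut last));  // first change detected
    assert!(!poll_changed(t1, &mut last)); // same mtime: no re-trigger
    assert!(poll_changed(t1 + Duration::from_secs(1), &mut last));
}
```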
fn serve_static_files(dir: &str, port: u16) {
use std::io::{Write, BufRead, BufReader};
use std::net::TcpListener;
@@ -4831,6 +4991,71 @@ c")"#;
}
}
// ============ Multi-line Arguments Tests ============
#[test]
fn test_multiline_function_args() {
let source = r#"
fn add(a: Int, b: Int): Int = a + b
let result = add(
1,
2
)
"#;
assert_eq!(eval(source).unwrap(), "3");
}
#[test]
fn test_multiline_function_args_with_lambda() {
let source = r#"
let xs = List.map(
[1, 2, 3],
fn(x) => x * 2
)
"#;
assert_eq!(eval(source).unwrap(), "[2, 4, 6]");
}
// ============ Tuple Index Tests ============
#[test]
fn test_tuple_index_access() {
let source = r#"
let pair = (42, "hello")
let first = pair.0
"#;
assert_eq!(eval(source).unwrap(), "42");
}
#[test]
fn test_tuple_index_access_second() {
let source = r#"
let pair = (42, "hello")
let second = pair.1
"#;
assert_eq!(eval(source).unwrap(), "\"hello\"");
}
#[test]
fn test_tuple_index_triple() {
let source = r#"
let triple = (1, 2, 3)
let sum = triple.0 + triple.1 + triple.2
"#;
assert_eq!(eval(source).unwrap(), "6");
}
#[test]
fn test_tuple_index_in_function() {
let source = r#"
fn first(pair: (Int, String)): Int = pair.0
fn second(pair: (Int, String)): String = pair.1
let p = (42, "hello")
let result = first(p)
"#;
assert_eq!(eval(source).unwrap(), "42");
}
// Exhaustiveness checking tests
mod exhaustiveness_tests {
use super::*;
@@ -5286,4 +5511,173 @@ c")"#;
check_file("projects/rest-api/main.lux").unwrap();
}
}
// === Map type tests ===
#[test]
fn test_map_new_and_size() {
let source = r#"
let m = Map.new()
let result = Map.size(m)
"#;
assert_eq!(eval(source).unwrap(), "0");
}
#[test]
fn test_map_set_and_get() {
let source = r#"
let m = Map.new()
let m2 = Map.set(m, "name", "Alice")
let result = Map.get(m2, "name")
"#;
assert_eq!(eval(source).unwrap(), "Some(\"Alice\")");
}
#[test]
fn test_map_get_missing() {
let source = r#"
let m = Map.new()
let result = Map.get(m, "missing")
"#;
assert_eq!(eval(source).unwrap(), "None");
}
#[test]
fn test_map_contains() {
let source = r#"
let m = Map.set(Map.new(), "x", 1)
let result = (Map.contains(m, "x"), Map.contains(m, "y"))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_remove() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "a", 1), "b", 2)
let m2 = Map.remove(m, "a")
let result = (Map.size(m2), Map.contains(m2, "a"), Map.contains(m2, "b"))
"#;
assert_eq!(eval(source).unwrap(), "(1, false, true)");
}
#[test]
fn test_map_keys_and_values() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.keys(m)
"#;
assert_eq!(eval(source).unwrap(), "[\"a\", \"b\"]");
}
#[test]
fn test_map_from_list() {
let source = r#"
let m = Map.fromList([("x", 10), ("y", 20)])
let result = (Map.get(m, "x"), Map.size(m))
"#;
assert_eq!(eval(source).unwrap(), "(Some(10), 2)");
}
#[test]
fn test_map_to_list() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.toList(m)
"#;
assert_eq!(eval(source).unwrap(), "[(\"a\", 1), (\"b\", 2)]");
}
#[test]
fn test_map_merge() {
let source = r#"
let m1 = Map.fromList([("a", 1), ("b", 2)])
let m2 = Map.fromList([("b", 3), ("c", 4)])
let merged = Map.merge(m1, m2)
let result = (Map.get(merged, "a"), Map.get(merged, "b"), Map.get(merged, "c"))
"#;
assert_eq!(eval(source).unwrap(), "(Some(1), Some(3), Some(4))");
}
#[test]
fn test_map_immutability() {
let source = r#"
let m1 = Map.fromList([("a", 1)])
let m2 = Map.set(m1, "b", 2)
let result = (Map.size(m1), Map.size(m2))
"#;
assert_eq!(eval(source).unwrap(), "(1, 2)");
}
#[test]
fn test_map_is_empty() {
let source = r#"
let m1 = Map.new()
let m2 = Map.set(m1, "x", 1)
let result = (Map.isEmpty(m1), Map.isEmpty(m2))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_type_annotation() {
let source = r#"
fn lookup(m: Map<String, Int>, key: String): Option<Int> =
Map.get(m, key)
let m = Map.fromList([("age", 30)])
let result = lookup(m, "age")
"#;
assert_eq!(eval(source).unwrap(), "Some(30)");
}
#[test]
fn test_file_copy() {
use std::io::Write;
// Create a temp file, copy it, verify contents
let dir = std::env::temp_dir().join("lux_test_file_copy");
let _ = std::fs::create_dir_all(&dir);
let src = dir.join("src.txt");
let dst = dir.join("dst.txt");
std::fs::File::create(&src).unwrap().write_all(b"hello copy").unwrap();
let _ = std::fs::remove_file(&dst);
let source = format!(r#"
fn main(): Unit with {{File}} =
File.copy("{}", "{}")
let _ = run main() with {{}}
let result = "done"
"#, src.display(), dst.display());
let result = eval(&source);
assert!(result.is_ok(), "File.copy failed: {:?}", result);
let contents = std::fs::read_to_string(&dst).unwrap();
assert_eq!(contents, "hello copy");
// Cleanup
let _ = std::fs::remove_dir_all(&dir);
}
#[test]
fn test_effectful_callback_propagation() {
// WISH-7: effectful callbacks in List.forEach should propagate effects
// This should type-check successfully because Console effect is inferred
let source = r#"
fn printAll(items: List<String>): Unit =
List.forEach(items, fn(x: String): Unit => Console.print(x))
let result = "ok"
"#;
let result = eval(source);
assert!(result.is_ok(), "Effectful callback should type-check: {:?}", result);
}
#[test]
fn test_effectful_callback_in_map() {
// Effectful callback in List.map should propagate effects
let source = r#"
fn readAll(paths: List<String>): List<String> =
List.map(paths, fn(p: String): String => File.read(p))
let result = "ok"
"#;
let result = eval(source);
assert!(result.is_ok(), "Effectful callback in map should type-check: {:?}", result);
}
}


@@ -305,6 +305,11 @@ impl ModuleLoader {
self.cache.iter()
}
/// Get the module cache (for passing to C backend)
pub fn module_cache(&self) -> &HashMap<String, Module> {
&self.cache
}
/// Clear the module cache
pub fn clear_cache(&mut self) {
self.cache.clear();


@@ -245,6 +245,7 @@ impl Parser {
TokenKind::Trait => Ok(Declaration::Trait(self.parse_trait_decl(visibility, doc)?)),
TokenKind::Impl => Ok(Declaration::Impl(self.parse_impl_decl()?)),
TokenKind::Run => Err(self.error("Bare 'run' expressions are not allowed at top level. Use 'let _ = run ...' or 'let result = run ...'")),
TokenKind::Handle => Err(self.error("Bare 'handle' expressions are not allowed at top level. Use 'let _ = handle ...' or 'let result = handle ...'")),
_ => Err(self.error("Expected declaration (fn, effect, handler, type, trait, impl, or let)")),
}
}
@@ -1558,6 +1559,7 @@ impl Parser {
loop {
let op = match self.peek_kind() {
TokenKind::Plus => BinaryOp::Add,
TokenKind::PlusPlus => BinaryOp::Concat,
TokenKind::Minus => BinaryOp::Sub,
_ => break,
};
@@ -1646,6 +1648,20 @@ impl Parser {
} else if self.check(TokenKind::Dot) {
let start = expr.span();
self.advance();
// Check for tuple index access: expr.0, expr.1, etc.
if let TokenKind::Int(n) = self.peek_kind() {
let index = n as usize;
self.advance();
let span = start.merge(self.previous_span());
expr = Expr::TupleIndex {
object: Box::new(expr),
index,
span,
};
continue;
}
let field = self.parse_ident()?;
// Check if this is an effect operation: Effect.operation(args)
@@ -1681,11 +1697,14 @@ impl Parser {
fn parse_args(&mut self) -> Result<Vec<Expr>, ParseError> {
let mut args = Vec::new();
self.skip_newlines();
while !self.check(TokenKind::RParen) {
args.push(self.parse_expr()?);
self.skip_newlines();
if !self.check(TokenKind::RParen) {
self.expect(TokenKind::Comma)?;
self.skip_newlines();
}
}
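The `parse_args` change above interleaves `skip_newlines` with the comma handling, so call arguments may span multiple lines and end with a trailing comma before the closing paren. The same loop shape on a toy token stream (hypothetical sketch using plain string tokens with `"\n"` standing in for newline tokens, rather than the parser's `TokenKind`):

```rust
/// Collect comma-separated items until ")", skipping newline tokens and
/// tolerating a trailing comma -- mirroring the multi-line argument fix.
fn parse_args(tokens: &[&str]) -> Result<Vec<String>, String> {
    let mut pos = 0;
    let mut args = Vec::new();
    let skip_nl = |pos: &mut usize| while tokens.get(*pos) == Some(&"\n") { *pos += 1 };
    skip_nl(&mut pos);
    while tokens.get(pos) != Some(&")") {
        let t = tokens.get(pos).ok_or("unexpected end of input")?;
        args.push(t.to_string());
        pos += 1;
        skip_nl(&mut pos);
        if tokens.get(pos) != Some(&")") {
            if tokens.get(pos) != Some(&",") {
                return Err("expected ','".into());
            }
            pos += 1; // consume comma
            skip_nl(&mut pos); // trailing comma + newline before ")" is fine
        }
    }
    Ok(args)
}

fn main() {
    // Arguments split across lines.
    assert_eq!(parse_args(&["1", ",", "\n", "2", "\n", ")"]).unwrap(), vec!["1", "2"]);
    // Trailing comma before the closing paren.
    assert_eq!(parse_args(&["\n", "1", ",", "\n", ")"]).unwrap(), vec!["1"]);
    // Missing comma is still an error.
    assert!(parse_args(&["1", "2", ")"]).is_err());
}
```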
@@ -1757,6 +1776,7 @@ impl Parser {
TokenKind::Let => self.parse_let_expr(),
TokenKind::Fn => self.parse_lambda_expr(),
TokenKind::Run => self.parse_run_expr(),
TokenKind::Handle => self.parse_handle_expr(),
TokenKind::Resume => self.parse_resume_expr(),
// Delimiters
@@ -1774,6 +1794,7 @@ impl Parser {
let condition = Box::new(self.parse_expr()?);
self.skip_newlines();
self.expect(TokenKind::Then)?;
self.skip_newlines();
let then_branch = Box::new(self.parse_expr()?);
@@ -1887,6 +1908,14 @@ impl Parser {
span: token.span,
}))
}
TokenKind::Char(c) => {
let c = *c;
self.advance();
Ok(Pattern::Literal(Literal {
kind: LiteralKind::Char(c),
span: token.span,
}))
}
TokenKind::Ident(name) => {
// Check if it starts with uppercase (constructor) or lowercase (variable)
if name.chars().next().map_or(false, |c| c.is_uppercase()) {
@@ -2124,6 +2153,40 @@ impl Parser {
})
}
fn parse_handle_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Handle)?;
let expr = Box::new(self.parse_call_expr()?);
self.expect(TokenKind::With)?;
self.expect(TokenKind::LBrace)?;
self.skip_newlines();
let mut handlers = Vec::new();
while !self.check(TokenKind::RBrace) {
let effect = self.parse_ident()?;
self.expect(TokenKind::Eq)?;
let handler = self.parse_expr()?;
handlers.push((effect, handler));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
let end = self.current_span();
self.expect(TokenKind::RBrace)?;
Ok(Expr::Run {
expr,
handlers,
span: start.merge(end),
})
}
fn parse_resume_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Resume)?;
@@ -2182,6 +2245,11 @@ impl Parser {
}));
}
// Check for record spread: { ...expr, field: val }
if matches!(self.peek_kind(), TokenKind::DotDotDot) {
return self.parse_record_expr_rest(start);
}
// Check if it's a record (ident: expr) or block
if matches!(self.peek_kind(), TokenKind::Ident(_)) {
let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
@@ -2196,6 +2264,20 @@ impl Parser {
fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
let mut fields = Vec::new();
let mut spread = None;
// Check for spread: { ...expr, ... }
if self.check(TokenKind::DotDotDot) {
self.advance(); // consume ...
let spread_expr = self.parse_expr()?;
spread = Some(Box::new(spread_expr));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
while !self.check(TokenKind::RBrace) {
let name = self.parse_ident()?;
@@ -2212,7 +2294,11 @@ impl Parser {
self.expect(TokenKind::RBrace)?;
let span = start.merge(self.previous_span());
Ok(Expr::Record { fields, span })
Ok(Expr::Record {
spread,
fields,
span,
})
}
fn parse_block_rest(&mut self, start: Span) -> Result<Expr, ParseError> {


@@ -228,13 +228,14 @@ impl SymbolTable {
Declaration::Let(let_decl) => {
let is_public = matches!(let_decl.visibility, Visibility::Public);
let type_sig = let_decl.typ.as_ref().map(|t| self.type_expr_to_string(t));
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
let_decl.name.name.clone(),
SymbolKind::Variable,
let_decl.span,
type_sig,
is_public,
);
symbol.documentation = let_decl.doc.clone();
let id = self.add_symbol(scope_idx, symbol);
self.add_reference(id, let_decl.name.span, true, true);
@@ -279,13 +280,14 @@ impl SymbolTable {
};
let type_sig = format!("fn {}({}): {}{}{}", f.name.name, param_types.join(", "), return_type, properties, effects);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
f.name.name.clone(),
SymbolKind::Function,
f.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = f.doc.clone();
let fn_id = self.add_symbol(scope_idx, symbol);
self.add_reference(fn_id, f.name.span, true, false);
@@ -326,13 +328,14 @@ impl SymbolTable {
let is_public = matches!(t.visibility, Visibility::Public);
let type_sig = format!("type {}", t.name.name);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
t.name.name.clone(),
SymbolKind::Type,
t.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = t.doc.clone();
let type_id = self.add_symbol(scope_idx, symbol);
self.add_reference(type_id, t.name.span, true, false);
@@ -372,13 +375,14 @@ impl SymbolTable {
let is_public = true; // Effects are typically public
let type_sig = format!("effect {}", e.name.name);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
e.name.name.clone(),
SymbolKind::Effect,
e.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = e.doc.clone();
let effect_id = self.add_symbol(scope_idx, symbol);
// Add operations
@@ -409,13 +413,14 @@ impl SymbolTable {
let is_public = matches!(t.visibility, Visibility::Public);
let type_sig = format!("trait {}", t.name.name);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
t.name.name.clone(),
SymbolKind::Type, // Traits are like types
t.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = t.doc.clone();
self.add_symbol(scope_idx, symbol);
}
@@ -479,7 +484,7 @@ impl SymbolTable {
self.visit_expr(arg, scope_idx);
}
}
Expr::Field { object, .. } => {
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
self.visit_expr(object, scope_idx);
}
Expr::If { condition, then_branch, else_branch, .. } => {
@@ -522,7 +527,10 @@ impl SymbolTable {
self.visit_expr(e, scope_idx);
}
}
Expr::Record { fields, .. } => {
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.visit_expr(spread_expr, scope_idx);
}
for (_, e) in fields {
self.visit_expr(e, scope_idx);
}


@@ -335,11 +335,14 @@ fn references_params(expr: &Expr, params: &[&str]) -> bool {
Statement::Expr(e) => references_params(e, params),
}) || references_params(result, params)
}
Expr::Field { object, .. } => references_params(object, params),
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => references_params(object, params),
Expr::Lambda { body, .. } => references_params(body, params),
Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::Record { fields, .. } => fields.iter().any(|(_, e)| references_params(e, params)),
Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| references_params(s, params))
|| fields.iter().any(|(_, e)| references_params(e, params))
}
Expr::Match { scrutinee, arms, .. } => {
references_params(scrutinee, params)
|| arms.iter().any(|a| references_params(&a.body, params))
@@ -516,10 +519,11 @@ fn has_recursive_calls(func_name: &str, body: &Expr) -> bool {
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
elements.iter().any(|e| has_recursive_calls(func_name, e))
}
Expr::Record { fields, .. } => {
fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| has_recursive_calls(func_name, s))
|| fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
}
Expr::Field { object, .. } => has_recursive_calls(func_name, object),
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => has_recursive_calls(func_name, object),
Expr::Let { value, body, .. } => {
has_recursive_calls(func_name, value) || has_recursive_calls(func_name, body)
}
@@ -672,6 +676,7 @@ fn generate_auto_migration_expr(
// Build the record expression
Some(Expr::Record {
spread: None,
fields: field_exprs,
span,
})
@@ -1536,7 +1541,7 @@ impl TypeChecker {
// Use the declared type if present, otherwise use inferred
let final_type = if let Some(ref type_expr) = let_decl.typ {
let declared = self.resolve_type(type_expr);
if let Err(e) = unify(&inferred, &declared) {
if let Err(e) = unify_with_env(&inferred, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -1673,6 +1678,42 @@ impl TypeChecker {
span,
} => self.infer_field(object, field, *span),
Expr::TupleIndex {
object,
index,
span,
} => {
let object_type = self.infer_expr(object);
match &object_type {
Type::Tuple(types) => {
if *index < types.len() {
types[*index].clone()
} else {
self.errors.push(TypeError {
message: format!(
"Tuple index {} out of bounds for tuple with {} elements",
index,
types.len()
),
span: *span,
});
Type::Error
}
}
Type::Var(_) => Type::var(),
_ => {
self.errors.push(TypeError {
message: format!(
"Cannot use tuple index on non-tuple type {}",
object_type
),
span: *span,
});
Type::Error
}
}
}
Expr::Lambda {
params,
return_type,
@@ -1708,7 +1749,11 @@ impl TypeChecker {
span,
} => self.infer_block(statements, result, *span),
Expr::Record { fields, span } => self.infer_record(fields, *span),
Expr::Record {
spread,
fields,
span,
} => self.infer_record(spread.as_deref(), fields, *span),
Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
@@ -1747,7 +1792,7 @@ impl TypeChecker {
match op {
BinaryOp::Add => {
// Add supports both numeric types and string concatenation
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1768,9 +1813,32 @@ impl TypeChecker {
}
}
BinaryOp::Concat => {
// Concat (++) supports strings and lists
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '++' must have same type: {}", e),
span,
});
}
match &left_type {
Type::String | Type::List(_) | Type::Var(_) => left_type,
_ => {
self.errors.push(TypeError {
message: format!(
"Operator '++' requires String or List operands, got {}",
left_type
),
span,
});
Type::Error
}
}
}
BinaryOp::Sub | BinaryOp::Mul | BinaryOp::Div | BinaryOp::Mod => {
// Arithmetic: both operands must be same numeric type
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1794,7 +1862,7 @@ impl TypeChecker {
BinaryOp::Eq | BinaryOp::Ne => {
// Equality: operands must have same type
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1805,7 +1873,7 @@ impl TypeChecker {
BinaryOp::Lt | BinaryOp::Le | BinaryOp::Gt | BinaryOp::Ge => {
// Comparison: operands must be same orderable type
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1816,13 +1884,13 @@ impl TypeChecker {
BinaryOp::And | BinaryOp::Or => {
// Logical: both must be Bool
if let Err(e) = unify_with_env(&left_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Left operand of '{}' must be Bool: {}", op, e),
span: left.span(),
});
}
if let Err(e) = unify_with_env(&right_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Right operand of '{}' must be Bool: {}", op, e),
span: right.span(),
@@ -1836,7 +1904,7 @@ impl TypeChecker {
// right must be a function that accepts left's type
let result_type = Type::var();
let expected_fn = Type::function(vec![left_type.clone()], result_type.clone());
if let Err(e) = unify_with_env(&right_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Pipe target must be a function accepting {}: {}",
@@ -1868,7 +1936,7 @@ impl TypeChecker {
}
},
UnaryOp::Not => {
if let Err(e) = unify_with_env(&operand_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Operator '!' requires Bool operand: {}", e),
span,
@@ -1883,6 +1951,17 @@ impl TypeChecker {
let func_type = self.infer_expr(func);
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
// Check property constraints from where clauses
if let Expr::Var(func_id) = func {
if let Some(constraints) = self.property_constraints.get(&func_id.name).cloned() {
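The effect-propagation loop added above (and repeated in the two call paths further down) implements WISH-14 from the commit log: when an argument to a call is itself a function type, its effects join the enclosing function's inferred effect set. A self-contained model of just that loop — `ArgTy` and `propagate` are illustrative stand-ins, not the compiler's types:

```rust
use std::collections::HashSet;

// Stand-in for the checker's argument types: either a plain value
// or a function carrying an effect row.
enum ArgTy {
    Int,
    Function { effects: Vec<String> },
}

// Mirror of the loop in the diff: collect every effect appearing on
// a function-typed argument into the inferred set.
fn propagate(args: &[ArgTy], inferred: &mut HashSet<String>) {
    for arg in args {
        if let ArgTy::Function { effects } = arg {
            for e in effects {
                inferred.insert(e.clone());
            }
        }
    }
}

fn main() {
    // e.g. List.forEach(xs, fn(x) { Console.print(x) }) — the callback's
    // Console effect must surface on the enclosing function.
    let args = vec![
        ArgTy::Int,
        ArgTy::Function { effects: vec!["Console".to_string()] },
    ];
    let mut inferred = HashSet::new();
    propagate(&args, &mut inferred);
    assert!(inferred.contains("Console"));
    println!("{}", inferred.len());
}
```

In the real checker the insertion is additionally gated on `self.inferring_effects`, so effects only accumulate while a function signature is being inferred rather than checked against a declared effect row.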
@@ -1919,7 +1998,7 @@ impl TypeChecker {
self.current_effects.clone(),
);
match unify_with_env(&func_type, &expected_fn, &self.env) {
Ok(subst) => result_type.apply(&subst),
Err(e) => {
// Provide more detailed error message based on the type of mismatch
@@ -1993,10 +2072,22 @@ impl TypeChecker {
if let Some((_, field_type)) = fields.iter().find(|(n, _)| n == &operation.name) {
// It's a function call on a module field
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
let result_type = Type::var();
let expected_fn = Type::function(arg_types, result_type.clone());
if let Err(e) = unify_with_env(field_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Type mismatch in {}.{} call: {}",
@@ -2052,6 +2143,17 @@ impl TypeChecker {
// Check argument types
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
if arg_types.len() != op.params.len() {
self.errors.push(TypeError {
message: format!(
@@ -2068,7 +2170,7 @@ impl TypeChecker {
for (i, (arg_type, (_, param_type))) in
arg_types.iter().zip(op.params.iter()).enumerate()
{
if let Err(e) = unify_with_env(arg_type, param_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Argument {} of '{}.{}' has type {}, expected {}: {}",
@@ -2101,6 +2203,7 @@ impl TypeChecker {
fn infer_field(&mut self, object: &Expr, field: &Ident, span: Span) -> Type {
let object_type = self.infer_expr(object);
let object_type = self.env.expand_type_alias(&object_type);
match &object_type {
Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) {
@@ -2181,7 +2284,7 @@ impl TypeChecker {
// Check return type if specified
let ret_type = if let Some(rt) = return_type {
let declared = self.resolve_type(rt);
if let Err(e) = unify_with_env(&body_type, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Lambda body type {} doesn't match declared {}: {}",
@@ -2247,7 +2350,7 @@ impl TypeChecker {
span: Span,
) -> Type {
let cond_type = self.infer_expr(condition);
if let Err(e) = unify_with_env(&cond_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("If condition must be Bool, got {}: {}", cond_type, e),
span: condition.span(),
@@ -2257,7 +2360,7 @@ impl TypeChecker {
let then_type = self.infer_expr(then_branch);
let else_type = self.infer_expr(else_branch);
match unify_with_env(&then_type, &else_type, &self.env) {
Ok(subst) => then_type.apply(&subst),
Err(e) => {
self.errors.push(TypeError {
@@ -2298,7 +2401,7 @@ impl TypeChecker {
// Check guard if present
if let Some(ref guard) = arm.guard {
let guard_type = self.infer_expr(guard);
if let Err(e) = unify_with_env(&guard_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Match guard must be Bool: {}", e),
span: guard.span(),
@@ -2314,7 +2417,7 @@ impl TypeChecker {
match &result_type {
None => result_type = Some(body_type),
Some(prev) => {
if let Err(e) = unify_with_env(prev, &body_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Match arm has incompatible type: expected {}, got {}: {}",
@@ -2364,7 +2467,7 @@ impl TypeChecker {
Pattern::Literal(lit) => {
let lit_type = self.infer_literal(lit);
if let Err(e) = unify_with_env(&lit_type, expected, &self.env) {
self.errors.push(TypeError {
message: format!("Pattern literal type mismatch: {}", e),
span: lit.span,
@@ -2378,7 +2481,7 @@ impl TypeChecker {
// For now, handle Option specially
match name.name.as_str() {
"None" => {
if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(Type::var())), &self.env) {
self.errors.push(TypeError {
message: format!(
"None pattern doesn't match type {}: {}",
@@ -2391,7 +2494,7 @@ impl TypeChecker {
}
"Some" => {
let inner_type = Type::var();
if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(inner_type.clone())), &self.env)
{
self.errors.push(TypeError {
message: format!(
@@ -2420,7 +2523,7 @@ impl TypeChecker {
Pattern::Tuple { elements, span } => {
let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect();
if let Err(e) = unify_with_env(expected, &Type::Tuple(element_types.clone()), &self.env) {
self.errors.push(TypeError {
message: format!("Tuple pattern doesn't match type {}: {}", expected, e),
span: *span,
@@ -2470,7 +2573,7 @@ impl TypeChecker {
if let Some(type_expr) = typ {
let declared = self.resolve_type(type_expr);
if let Err(e) = unify_with_env(&value_type, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -2491,12 +2594,47 @@ impl TypeChecker {
self.infer_expr(result)
}
fn infer_record(
&mut self,
spread: Option<&Expr>,
fields: &[(Ident, Expr)],
span: Span,
) -> Type {
// Start with spread fields if present
let mut field_types: Vec<(String, Type)> = if let Some(spread_expr) = spread {
let spread_type = self.infer_expr(spread_expr);
let spread_type = self.env.expand_type_alias(&spread_type);
match spread_type {
Type::Record(spread_fields) => spread_fields,
_ => {
self.errors.push(TypeError {
message: format!(
"Spread expression must be a record type, got {}",
spread_type
),
span,
});
Vec::new()
}
}
} else {
Vec::new()
};
// Apply explicit field overrides
let explicit_types: Vec<(String, Type)> = fields
.iter()
.map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
.collect();
for (name, typ) in explicit_types {
if let Some(existing) = field_types.iter_mut().find(|(n, _)| n == &name) {
existing.1 = typ;
} else {
field_types.push((name, typ));
}
}
Type::Record(field_types)
}
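The new `infer_record` gives record spread last-write-wins semantics: fields from the spread expression seed the result, and explicit fields either override a same-named entry or append a new one. A minimal sketch of that merge (field types shown as strings purely for illustration; `merge_fields` is a hypothetical helper, not the compiler's):

```rust
// Spread-then-override, as in infer_record above: spread fields come
// first, explicit fields replace same-named entries or append.
fn merge_fields(
    spread: Vec<(String, String)>,
    explicit: Vec<(String, String)>,
) -> Vec<(String, String)> {
    let mut out = spread;
    for (name, ty) in explicit {
        if let Some(slot) = out.iter_mut().find(|(n, _)| n == &name) {
            slot.1 = ty; // override the spread field's type
        } else {
            out.push((name, ty)); // brand-new field
        }
    }
    out
}

fn main() {
    // { ...base, age: Float } where base : { name: String, age: Int }
    let base = vec![
        ("name".to_string(), "String".to_string()),
        ("age".to_string(), "Int".to_string()),
    ];
    let explicit = vec![("age".to_string(), "Float".to_string())];
    let merged = merge_fields(base, explicit);
    assert_eq!(merged.len(), 2);
    assert_eq!(merged[1], ("age".to_string(), "Float".to_string()));
    println!("{:?}", merged);
}
```

Note that overriding in place (rather than pushing a duplicate) also preserves the original field order of the spread record, which keeps the resulting `Type::Record` stable for display and later unification.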
@@ -2513,7 +2651,7 @@ impl TypeChecker {
let first_type = self.infer_expr(&elements[0]);
for elem in &elements[1..] {
let elem_type = self.infer_expr(elem);
if let Err(e) = unify_with_env(&first_type, &elem_type, &self.env) {
self.errors.push(TypeError {
message: format!("List elements must have same type: {}", e),
span,
@@ -2819,7 +2957,7 @@ impl TypeChecker {
// Check return type matches if specified
if let Some(ref return_type_expr) = impl_method.return_type {
let return_type = self.resolve_type(return_type_expr);
if let Err(e) = unify_with_env(&body_type, &return_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Method '{}' body has type {}, but declared return type is {}: {}",
@@ -2862,6 +3000,9 @@ impl TypeChecker {
"Option" if resolved_args.len() == 1 => {
return Type::Option(Box::new(resolved_args[0].clone()));
}
"Map" if resolved_args.len() == 2 => {
return Type::Map(Box::new(resolved_args[0].clone()), Box::new(resolved_args[1].clone()));
}
_ => {}
}
}


@@ -47,6 +47,8 @@ pub enum Type {
List(Box<Type>),
/// Option type (sugar for App(Option, [T]))
Option(Box<Type>),
/// Map type (sugar for App(Map, [K, V]))
Map(Box<Type>, Box<Type>),
/// Versioned type (e.g., User @v2)
Versioned {
base: Box<Type>,
@@ -119,6 +121,7 @@ impl Type {
Type::Tuple(elements) => elements.iter().any(|e| e.contains_var(var)),
Type::Record(fields) => fields.iter().any(|(_, t)| t.contains_var(var)),
Type::List(inner) | Type::Option(inner) => inner.contains_var(var),
Type::Map(k, v) => k.contains_var(var) || v.contains_var(var),
Type::Versioned { base, .. } => base.contains_var(var),
_ => false,
}
@@ -158,6 +161,7 @@ impl Type {
),
Type::List(inner) => Type::List(Box::new(inner.apply(subst))),
Type::Option(inner) => Type::Option(Box::new(inner.apply(subst))),
Type::Map(k, v) => Type::Map(Box::new(k.apply(subst)), Box::new(v.apply(subst))),
Type::Versioned { base, version } => Type::Versioned {
base: Box::new(base.apply(subst)),
version: version.clone(),
@@ -208,6 +212,11 @@ impl Type {
vars
}
Type::List(inner) | Type::Option(inner) => inner.free_vars(),
Type::Map(k, v) => {
let mut vars = k.free_vars();
vars.extend(v.free_vars());
vars
}
Type::Versioned { base, .. } => base.free_vars(),
_ => HashSet::new(),
}
@@ -279,6 +288,7 @@ impl fmt::Display for Type {
}
Type::List(inner) => write!(f, "List<{}>", inner),
Type::Option(inner) => write!(f, "Option<{}>", inner),
Type::Map(k, v) => write!(f, "Map<{}, {}>", k, v),
Type::Versioned { base, version } => {
write!(f, "{} {}", base, version)
}
@@ -946,6 +956,14 @@ impl TypeEnv {
params: vec![("path".to_string(), Type::String)],
return_type: Type::Unit,
},
EffectOpDef {
name: "copy".to_string(),
params: vec![
("source".to_string(), Type::String),
("dest".to_string(), Type::String),
],
return_type: Type::Unit,
},
],
},
);
@@ -1146,6 +1164,15 @@ impl TypeEnv {
],
return_type: Type::Unit,
},
EffectOpDef {
name: "assertEqualMsg".to_string(),
params: vec![
("expected".to_string(), Type::Var(0)),
("actual".to_string(), Type::Var(0)),
("label".to_string(), Type::String),
],
return_type: Type::Unit,
},
EffectOpDef {
name: "assertNotEqual".to_string(),
params: vec![
@@ -1599,6 +1626,14 @@ impl TypeEnv {
"parseFloat".to_string(),
Type::function(vec![Type::String], Type::Option(Box::new(Type::Float))),
),
(
"indexOf".to_string(),
Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
),
(
"lastIndexOf".to_string(),
Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
),
]);
env.bind("String", TypeScheme::mono(string_module_type));
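The new `indexOf`/`lastIndexOf` bindings return `Option<Int>` rather than a sentinel like `-1`. A sketch of the likely runtime behavior, assuming the interpreter delegates to Rust's `str::find`/`str::rfind` (whether the offsets are byte- or character-based in Lux is not specified by the signatures, so that remains an assumption):

```rust
// Substring search returning Option to model "not found", matching
// the Option<Int> signatures declared for String.indexOf/lastIndexOf.
fn index_of(haystack: &str, needle: &str) -> Option<i64> {
    haystack.find(needle).map(|i| i as i64)
}

fn last_index_of(haystack: &str, needle: &str) -> Option<i64> {
    haystack.rfind(needle).map(|i| i as i64)
}

fn main() {
    assert_eq!(index_of("banana", "an"), Some(1));
    assert_eq!(last_index_of("banana", "an"), Some(3));
    assert_eq!(index_of("banana", "xyz"), None);
    println!("ok");
}
```

Using `Option<Int>` forces callers to pattern-match the miss case, which fits the language's existing `Some`/`None` pattern support in the checker above.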
@@ -1758,6 +1793,73 @@ impl TypeEnv {
]);
env.bind("Option", TypeScheme::mono(option_module_type));
// Map module
let map_v = || Type::var();
let map_type = || Type::Map(Box::new(Type::String), Box::new(Type::var()));
let map_module_type = Type::Record(vec![
(
"new".to_string(),
Type::function(vec![], map_type()),
),
(
"set".to_string(),
Type::function(
vec![map_type(), Type::String, map_v()],
map_type(),
),
),
(
"get".to_string(),
Type::function(
vec![map_type(), Type::String],
Type::Option(Box::new(map_v())),
),
),
(
"contains".to_string(),
Type::function(vec![map_type(), Type::String], Type::Bool),
),
(
"remove".to_string(),
Type::function(vec![map_type(), Type::String], map_type()),
),
(
"keys".to_string(),
Type::function(vec![map_type()], Type::List(Box::new(Type::String))),
),
(
"values".to_string(),
Type::function(vec![map_type()], Type::List(Box::new(map_v()))),
),
(
"size".to_string(),
Type::function(vec![map_type()], Type::Int),
),
(
"isEmpty".to_string(),
Type::function(vec![map_type()], Type::Bool),
),
(
"fromList".to_string(),
Type::function(
vec![Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()])))],
map_type(),
),
),
(
"toList".to_string(),
Type::function(
vec![map_type()],
Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()]))),
),
),
(
"merge".to_string(),
Type::function(vec![map_type(), map_type()], map_type()),
),
]);
env.bind("Map", TypeScheme::mono(map_module_type));
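The Map module signatures above suggest value semantics: `set` and `remove` take a map and return a map, rather than returning `Unit` and mutating. A sketch of that persistent-style API over a Rust `BTreeMap` (the fixed `i64` value type and the clone-on-write strategy are simplifications for illustration; the real module is polymorphic in its value type, and whether the runtime actually copies is an assumption):

```rust
use std::collections::BTreeMap;

// String-keyed map, as in the module type above (keys are Type::String).
type Map = BTreeMap<String, i64>;

// set : (Map, String, v) -> Map — returns a new map, input untouched.
fn set(m: &Map, k: &str, v: i64) -> Map {
    let mut out = m.clone();
    out.insert(k.to_string(), v);
    out
}

// get : (Map, String) -> Option<v>
fn get(m: &Map, k: &str) -> Option<i64> {
    m.get(k).copied()
}

fn main() {
    let empty = Map::new();
    let m1 = set(&empty, "a", 1);
    let m2 = set(&m1, "a", 2);
    assert_eq!(get(&m1, "a"), Some(1)); // original binding unchanged
    assert_eq!(get(&m2, "a"), Some(2));
    assert_eq!(get(&empty, "a"), None);
    println!("size {}", m2.len());
}
```

One caveat visible in the bindings themselves: each call to `map_v()` mints a fresh type variable, so `set`'s value type and `get`'s value type are not linked through the module record; any such linkage has to come from unification at the call site.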
// Result module
let result_type = Type::App {
constructor: Box::new(Type::Named("Result".to_string())),
@@ -1870,9 +1972,47 @@ impl TypeEnv {
"round".to_string(),
Type::function(vec![Type::var()], Type::Int),
),
(
"sin".to_string(),
Type::function(vec![Type::Float], Type::Float),
),
(
"cos".to_string(),
Type::function(vec![Type::Float], Type::Float),
),
(
"atan2".to_string(),
Type::function(vec![Type::Float, Type::Float], Type::Float),
),
]);
env.bind("Math", TypeScheme::mono(math_module_type));
// Int module
let int_module_type = Type::Record(vec![
(
"toString".to_string(),
Type::function(vec![Type::Int], Type::String),
),
(
"toFloat".to_string(),
Type::function(vec![Type::Int], Type::Float),
),
]);
env.bind("Int", TypeScheme::mono(int_module_type));
// Float module
let float_module_type = Type::Record(vec![
(
"toString".to_string(),
Type::function(vec![Type::Float], Type::String),
),
(
"toInt".to_string(),
Type::function(vec![Type::Float], Type::Int),
),
]);
env.bind("Float", TypeScheme::mono(float_module_type));
env
}
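`Math.atan2` is typed above as `(Float, Float) -> Float`. If the interpreter delegates to Rust's `f64::atan2` (an assumption; the runtime side is not in this diff), the conventional argument order is `atan2(y, x)`:

```rust
use std::f64::consts::PI;

fn main() {
    // y = 1, x = 1 → angle of the point (1, 1), i.e. π/4.
    let angle = 1.0_f64.atan2(1.0);
    assert!((angle - PI / 4.0).abs() < 1e-12);
    println!("{:.4}", angle);
}
```

Documenting that order matters because `atan2(x, y)` mistakes produce angles reflected about the diagonal and are hard to spot in tests near 45°.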
@@ -1956,6 +2096,9 @@ impl TypeEnv {
Type::Option(inner) => {
Type::Option(Box::new(self.expand_type_alias(inner)))
}
Type::Map(k, v) => {
Type::Map(Box::new(self.expand_type_alias(k)), Box::new(self.expand_type_alias(v)))
}
Type::Versioned { base, version } => {
Type::Versioned {
base: Box::new(self.expand_type_alias(base)),
@@ -2032,7 +2175,9 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
// Function's required effects (e1) must be a subset of available effects (e2)
// A pure function (empty effects) can be called anywhere
// A function requiring {Logger} can be called in context with {Logger} or {Logger, Console}
// When expected effects (e2) are empty, it means "no constraint" (e.g., callback parameter)
// so we allow any actual effects through
if !e2.is_empty() && !e1.is_subset(&e2) {
return Err(format!(
"Effect mismatch: expected {{{}}}, got {{{}}}",
e1, e2
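The relaxed check above can be stated in one line: required effects must be a subset of available effects, except that an empty expected set means "unconstrained" and admits anything. A standalone sketch (`effects_ok` is an illustrative name, not the compiler's):

```rust
use std::collections::HashSet;

// Mirror of the diff's condition: !e2.is_empty() && !e1.is_subset(&e2)
// is the failure case, so success is the negation below.
fn effects_ok(required: &HashSet<&str>, expected: &HashSet<&str>) -> bool {
    expected.is_empty() || required.is_subset(expected)
}

fn main() {
    let logger: HashSet<&str> = ["Logger"].into_iter().collect();
    let both: HashSet<&str> = ["Logger", "Console"].into_iter().collect();
    let none: HashSet<&str> = HashSet::new();

    assert!(effects_ok(&logger, &both));  // {Logger} ⊆ {Logger, Console}
    assert!(!effects_ok(&both, &logger)); // Console not available
    assert!(effects_ok(&both, &none));    // empty = unconstrained callback slot
    println!("ok");
}
```

The empty-means-unconstrained case is what lets an effectful callback flow into a higher-order parameter typed without an effect row; the effect still reaches the caller via the propagation loops added earlier in this diff.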
@@ -2114,6 +2259,13 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
// Option
(Type::Option(a), Type::Option(b)) => unify(a, b),
// Map
(Type::Map(k1, v1), Type::Map(k2, v2)) => {
let s1 = unify(k1, k2)?;
let s2 = unify(&v1.apply(&s1), &v2.apply(&s1))?;
Ok(s1.compose(&s2))
}
// Versioned types
(
Type::Versioned {