71 Commits

Author SHA1 Message Date
81e58cf3d5 Fix Option<String> pattern match double-dereference in C codegen
LuxString is a typedef for char*, but the codegen treated it as a struct
type, generating *(LuxString*)(field0) instead of (LuxString)(field0).
This caused a heap-buffer-overflow on any Option<String> pattern match,
since it read the string contents as a memory address.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 23:59:47 -05:00
92d443e475 chore: bump version to 0.1.13 2026-02-20 20:41:01 -05:00
fe30206cd0 add cargo lock 2026-02-20 20:40:55 -05:00
563d62f526 feat: add module import support to JS backend
The JS backend now processes imported modules, emitting their type
constructors and functions with module-prefixed mangled names. Module
function calls (both via Expr::Call with Expr::Field and via
Expr::EffectOp) are resolved to the correct mangled names.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 20:38:36 -05:00
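A sketch of the kind of output this describes. The module-prefixed mangled name below follows the `mymodule_func_lux` convention mentioned elsewhere in this log; the `math` module and `square` function are made-up examples, not actual emitted code.

```javascript
// Hypothetical emission for `import math` plus a call to math.square(3):
// the imported function is emitted with a module-prefixed mangled name,
// and the call site is resolved to that name.
function math_square_lux(x) {
  return x * x;
}

const answer = math_square_lux(3);
```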
e9ec1bb84d feat: add handler declaration codegen to JS backend
Handler declarations now emit as JavaScript objects with operation
methods. Each operation defines resume as an identity function,
matching the simple handler model used by the interpreter.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 20:31:10 -05:00
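An assumed shape for such a handler object: each operation is a method and resume is an identity function, per the simple handler model described above. The `Logger_lux`/`log` names and the body are hypothetical.

```javascript
// Handler declaration emitted as a plain object with operation methods;
// resume is defined as an identity function inside each operation.
const Logger_lux = {
  log(msg) {
    const resume = (x) => x; // identity resume
    return resume(msg.toUpperCase());
  },
};
```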
e46afd98eb feat: auto-invoke let main in JS backend
The JS backend now detects `let main = fn() => ...` patterns and
auto-invokes them at the end of the generated code, matching the
interpreter's behavior.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 20:24:47 -05:00
64f33e4e4b feat: add List.get support to JS backend
List.get(list, index) now correctly compiles to JavaScript, returning
Lux.Some(value) for valid indices and Lux.None() for out-of-bounds.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 20:22:15 -05:00
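The described semantics, sketched with minimal stand-ins for the `Lux.Some`/`Lux.None` runtime constructors (their real runtime representation is an assumption here):

```javascript
// Minimal Option constructors standing in for the Lux JS runtime.
const Lux = {
  Some: (value) => ({ tag: "Some", value }),
  None: () => ({ tag: "None" }),
};

// List.get(list, index): Some(value) for valid indices, None otherwise.
function listGet(list, index) {
  return index >= 0 && index < list.length
    ? Lux.Some(list[index])
    : Lux.None();
}
```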
293635f415 chore: bump version to 0.1.12 2026-02-20 20:03:04 -05:00
694e4ec999 feat: add Ref cells for mutable state (Ref.new, Ref.get, Ref.set, Ref.update)
Implements WISH-013 mutable state primitives. Ref<T> is a mutable container
using existing module call syntax. Supported across interpreter, JS, and C backends.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 20:01:29 -05:00
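The four Ref operations modeled as a single-field mutable box in JS; this is a semantic sketch, not the actual backend representation.

```javascript
// Ref<T> as a one-field mutable container.
const Ref = {
  new: (v) => ({ value: v }),
  get: (r) => r.value,
  set: (r, v) => { r.value = v; },
  update: (r, f) => { r.value = f(r.value); },
};

const counter = Ref.new(0);
Ref.set(counter, 10);
Ref.update(counter, (n) => n + 1);
```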
78879ca94e chore: bump version to 0.1.11 2026-02-20 19:36:11 -05:00
01474b401f chore: bump version to 0.1.10 2026-02-20 19:32:56 -05:00
169de0b3c8 chore: update Cargo.lock
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 19:32:27 -05:00
667a94b4dc feat: add extern let declarations for JS FFI
Add support for `extern let name: Type` and `extern let name: Type = "jsName"`
syntax for declaring external JavaScript values. This follows the same pattern
as extern fn across all compiler passes: parser, typechecker, interpreter
(runtime error placeholder), JS backend (emits JS name directly without
mangling), formatter, linter, modules, and symbol table.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 19:29:44 -05:00
1b629aaae4 feat: add 10 missing List operations to JS backend
Add find, findIndex, any, all, zip, flatten, contains, take, drop,
and forEach to the JS backend's emit_list_operation function. These
operations previously worked in the interpreter and C backend but
caused "Unknown List operation" errors when compiled to JS.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 19:21:26 -05:00
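Plausible one-line JS emissions for a few of the operations named above. These are sketches of the semantics; the backend's actual output shape is an assumption.

```javascript
// take/drop map onto Array.prototype.slice, flatten onto flat(),
// zip truncates to the shorter list, contains onto includes().
const take = (list, n) => list.slice(0, n);
const drop = (list, n) => list.slice(n);
const flatten = (lists) => lists.flat();
const zip = (a, b) =>
  a.slice(0, Math.min(a.length, b.length)).map((x, i) => [x, b[i]]);
const contains = (list, el) => list.includes(el);
```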
0f8babfd8b chore: bump version to 0.1.9 2026-02-20 18:46:51 -05:00
582d603513 chore: update Cargo.lock for v0.1.8
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 18:42:29 -05:00
fbb7ddb6c3 feat: add extern fn declarations for JS FFI
Adds `extern fn` syntax for declaring external JavaScript functions:
  extern fn getElementById(id: String): Element
  extern fn getContext(el: Element, kind: String): CanvasCtx = "getContext"
  pub extern fn alert(msg: String): Unit

Changes across 11 files:
- Lexer: `extern` keyword
- AST: `ExternFnDecl` struct + `Declaration::ExternFn` variant
- Parser: parse `extern fn` with optional `= "jsName"` override
- Typechecker: register extern fn type signatures
- Interpreter: ExternFn value with clear error on call
- JS backend: emit extern fn calls using JS name (no _lux suffix)
- C backend: silently skips extern fns
- Formatter, linter, modules, symbol_table: handle new variant

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 18:38:42 -05:00
400acc3f35 feat: add deep path record update syntax
Adds parser desugaring for `{ ...base, pos.x: val, pos.y: val2 }` which
expands to `{ ...base, pos: { ...base.pos, x: val, y: val2 } }`.
Supports arbitrary nesting depth (e.g. world.physics.gravity.y).
Detects conflicts between flat and deep path fields.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 18:13:11 -05:00
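The desugaring rule above can be expressed directly in JS spread syntax, since both languages share the spread form; `base` and `pos` are example names.

```javascript
const base = { pos: { x: 1.0, y: 2.0 }, speed: 5.0 };

// { ...base, pos.x: 9.0, pos.y: 10.0 }  desugars to:
const updated = { ...base, pos: { ...base.pos, x: 9.0, y: 10.0 } };
```

Note that the expansion re-reads `base.pos` for the inner spread, so untouched nested fields survive and `base` itself is never mutated.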
ea3a7ca2dd chore: bump version to 0.1.8 2026-02-20 16:45:49 -05:00
7b40421a6a feat: add List.findIndex, List.zip, List.flatten, List.contains
Add missing List operations requested by ergon game engine project:
- findIndex(list, predicate) -> Option<Int>
- zip(list1, list2) -> List<(A, B)>
- flatten(listOfLists) -> List<A>
- contains(list, element) -> Bool

Resolves ergon porting blocker #4.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 16:40:32 -05:00
26b94935e9 feat: add File.tryRead, File.tryWrite, File.tryDelete returning Result
Add safe variants of File operations that return Result<T, String> instead
of crashing with RuntimeError. This prevents server crashes when a file
is missing or unwritable.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 11:04:33 -05:00
018a799c05 chore: bump version to 0.1.7 2026-02-20 10:38:37 -05:00
ec78286165 feat: enhance Html and Http stdlib modules
Html: add RawHtml, Attribute, meta/link/script/iframe/figure/figcaption
elements, attr() helper, rawHtml() helper, seoDocument() for SEO meta
tags, fix document() to use Attribute instead of DataAttr for standard
HTML attributes.

Http: add serveStaticFile(), parseFormBody(), getFormField(),
sendResponse() convenience helpers.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 10:36:56 -05:00
f2688072ac feat: add File.glob for file pattern matching (issue 15)
Add File.glob(pattern) effect operation that returns a list of file
paths matching a glob pattern (e.g., "src/**/*.lux"). Implemented
across interpreter (using glob crate), JS backend (handler-based),
and C backend (using POSIX glob.h).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 10:33:59 -05:00
746643527d feat: add triple-quoted multiline string literals (issue 12)
Support """...""" syntax for multiline strings with:
- Automatic indent stripping (based on minimum indentation)
- Leading newline after opening """ is skipped
- Trailing whitespace-only line before closing """ is stripped
- String interpolation ({expr}) support
- All escape sequences supported
- Formatter outputs multiline strings for strings containing newlines

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 10:22:52 -05:00
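A model of the stripping rules listed above (skip the leading newline, drop the trailing whitespace-only line, dedent by minimum indentation). The real lexer operates on tokens during scanning, so this is only a sketch of the string-level behavior.

```javascript
function dedentTriple(raw) {
  // 1. Skip a leading newline right after the opening """.
  // 2. Strip a trailing whitespace-only line before the closing """.
  let s = raw.replace(/^\n/, "").replace(/\n[ \t]*$/, "");
  const lines = s.split("\n");
  // 3. Dedent by the minimum indentation of non-blank lines.
  const indented = lines.filter((l) => l.trim().length > 0);
  const min = indented.length
    ? Math.min(...indented.map((l) => l.match(/^[ \t]*/)[0].length))
    : 0;
  return lines.map((l) => l.slice(min)).join("\n");
}
```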
091ff1e422 feat: add List.sort and List.sortBy functions (issue 9)
Add sorting support to the List module across all backends:
- List.sort for natural ordering (Int, Float, String, Bool, Char)
- List.sortBy for custom comparator-based sorting

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 10:02:21 -05:00
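JS sketches of the two operations. An explicit three-way comparator is needed because `Array.prototype.sort` defaults to string comparison, and the copies keep the input list unmodified (assumed to match Lux's immutable semantics).

```javascript
// List.sort: natural ordering via a generic three-way comparator.
const listSort = (list) =>
  [...list].sort((a, b) => (a < b ? -1 : a > b ? 1 : 0));

// List.sortBy: caller-supplied comparator.
const listSortBy = (list, cmp) => [...list].sort(cmp);
```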
1fc472a54c feat: support module-qualified constructor patterns in match expressions (issue 3)
Added module: Option<Ident> to Pattern::Constructor, updated parser to
handle module.Constructor(args) syntax in patterns, exported ADT
constructors from modules, and copied type definitions during module
import so types like Shape are usable in importing files.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-20 09:46:51 -05:00
caabaeeb9c fix: allow multi-line function params, lambda params, tuples, and patterns
Added skip_newlines() calls throughout the parser so that newlines are
properly handled in parameter lists, tuple expressions, and pattern
matching constructs. Fixes Issue 5 and Issue 6 from ISSUES.md.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 23:49:47 -05:00
4e43d3d50d fix: C backend String.indexOf/lastIndexOf compilation (issue 8)
Three bugs fixed:
- Global let bindings always typed as LuxInt; now inferred from value
- Option inner type not tracked for function params; added
  var_option_inner_types map so match extraction uses correct type
- indexOf/lastIndexOf stored ints as (void*)(intptr_t) but extraction
  expected boxed pointers; now uses lux_box_int consistently

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 21:10:52 -05:00
fd5ed53b29 chore: bump version to 0.1.6 2026-02-19 15:22:32 -05:00
2800ce4e2d chore: sync Cargo.lock
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 09:26:20 -05:00
ec365ebb3f feat: add File.copy and propagate effectful callback effects (WISH-7, WISH-14)
File.copy(source, dest) copies files via interpreter (std::fs::copy) and
C backend (fread/fwrite). Effectful callbacks passed to higher-order
functions like List.map/forEach now propagate their effects to the
enclosing function's inferred effect set.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 09:24:28 -05:00
52dcc88051 chore: bump version to 0.1.5 2026-02-19 03:47:28 -05:00
1842b668e5 chore: sync Cargo.lock with version 0.1.4
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 03:47:11 -05:00
c67e3f31c3 feat: add and/or keywords, handle alias, --watch flag, JS tree-shaking
- WISH-008: `and`/`or` as aliases for `&&`/`||` boolean operators
- WISH-006: `handle` as alias for `run ... with` (same AST output)
- WISH-005: `--watch` flag for `lux compile` recompiles on file change
- WISH-009: Tree-shake unused runtime sections from JS output based on
  which effects are actually used (Console, Random, Time, Http, Dom)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 03:35:47 -05:00
b0ccde749c chore: bump version to 0.1.4 2026-02-19 02:48:56 -05:00
4ba7a23ae3 feat: add comprehensive compilation checks to validate.sh
Adds interpreter, JS compilation, and C compilation checks for all
examples, showcase programs, standard examples, and projects (113 total
checks). Skip lists exclude programs requiring unsupported effects or
interactive I/O.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:43:46 -05:00
89741b4a32 fix: move top-level let initialization into main() in C backend
Top-level let bindings with function calls (e.g., `let result = factorial(10)`)
were emitted as static initializers, which is invalid C since function calls
aren't compile-time constants. Now globals are declared with zero-init and
initialized inside main() before any run expressions execute.

Also fixes validate.sh to use exit codes instead of grep for cargo check/build.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:31:49 -05:00
3a2376cd49 feat: port AST definitions to Lux (self-hosting)
Translate all 30+ type definitions from src/ast.rs (727 lines Rust)
into Lux ADTs in projects/lux-compiler/ast.lux.

Types ported: Span, Ident, Visibility, Version, VersionConstraint,
BehavioralProperty, WhereClause, ModulePath, ImportDecl, Program,
Declaration, FunctionDecl, Parameter, EffectDecl, EffectOp, TypeDecl,
TypeDef, RecordField, Variant, VariantFields, Migration, HandlerDecl,
HandlerImpl, LetDecl, TraitDecl, TraitMethod, TraitBound, ImplDecl,
TraitConstraint, ImplMethod, TypeExpr, Expr (19 variants), Literal,
LiteralKind, BinaryOp, UnaryOp, Statement, MatchArm, Pattern.

Passes `lux check` and `lux run`.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:07:30 -05:00
4dfb04a1b6 chore: sync Cargo.lock with version 0.1.3
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:51 -05:00
3cdde02eb2 feat: add Int.toFloat/Float.toInt JS backend support and fix Map C codegen
- JS backend: Add Int/Float module dispatch in both Call and EffectOp paths
  for toFloat, toInt, and toString operations
- C backend: Fix lux_strdup → lux_string_dup in Map module codegen

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:40 -05:00
a5762d0397 feat: add built-in Map type with String keys
Add Map<String, V> as a first-class built-in type for key-value storage,
needed for self-hosting the compiler (parser/typechecker/interpreter all
rely heavily on hashmaps).

- types.rs: Type::Map(K,V) variant, all match arms (unify, apply, etc.)
- interpreter.rs: Value::Map, 12 BuiltinFn variants (new/set/get/contains/
  remove/keys/values/size/isEmpty/fromList/toList/merge), immutable semantics
- typechecker.rs: Map<K,V> resolution in resolve_type
- js_backend.rs: Map as JS Map with emit_map_operation()
- c_backend.rs: LuxMap struct (linear-scan), runtime fns, emit_map_operation()
- main.rs: 12 tests covering all Map operations
- validate.sh: now checks all projects/ directories too

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:45:13 -05:00
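A sketch of the immutable semantics on top of a native JS Map: updates copy rather than mutate. The helper names are illustrative, not the backend's actual `emit_map_operation` output.

```javascript
const mapNew = () => new Map();

// set returns a fresh Map; the original is untouched.
const mapSet = (m, k, v) => {
  const copy = new Map(m);
  copy.set(k, v);
  return copy;
};

// get returns an Option-shaped value (assumed tag representation).
const mapGet = (m, k) =>
  m.has(k) ? { tag: "Some", value: m.get(k) } : { tag: "None" };
```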
1132c621c6 fix: allow newlines before then in if/then/else expressions
The parser now skips newlines between the condition and `then` keyword,
enabling multiline if expressions like:
  if long_condition
    then expr1
    else expr2

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:38:05 -05:00
a0fff1814e fix: JS backend scoping for let/match/if inside closures
Three related bugs fixed:
- BUG-009: let bindings inside lambdas hoisted to top-level
- BUG-011: match expressions inside lambdas hoisted to top-level
- BUG-012: variable name deduplication leaked across function scopes

Root cause: emit_expr() uses writeln() for statements, but lambdas
captured only the return value, not the emitted statements. Also,
var_substitutions from emit_function() leaked to subsequent code.

Fix: Lambda handler now captures all output emitted during body
evaluation and places it inside the function body. Both emit_function
and Lambda save/restore var_substitutions to prevent cross-scope leaks.
Lambda params are registered as identity substitutions to override any
outer bindings with the same name.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:10:55 -05:00
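A before/after sketch of the hoisting bug: statements produced while emitting the lambda body must land inside the emitted function, not at top level. The names and exact output shape are illustrative, not the backend's literal emission.

```javascript
// Buggy emission (conceptually):
//   const tmp = x + 1;        // leaked to top level: x is not in scope here
//   const inc = (x) => tmp;
//
// Fixed emission: statements captured during body evaluation are placed
// inside the function body.
const inc = (x) => {
  const tmp = x + 1;
  return tmp;
};
```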
4e9e823246 fix: record spread works with named type aliases
Resolve type aliases (e.g. Player -> { pos: Vec2, speed: Float })
before checking if spread expression is a record type. Previously
{ ...p, field: val } failed with "must be a record type, got Player"
when the variable had a named type annotation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 00:01:20 -05:00
6a2e4a7ac1 chore: bump version to 0.1.3 2026-02-18 23:06:10 -05:00
3d706cb32b feat: add record spread syntax { ...base, field: val }
Adds spread operator for records, allowing concise record updates:
  let p2 = { ...p, x: 5.0 }

Changes across the full pipeline:
- Lexer: new DotDotDot (...) token
- AST: optional spread field on Record variant
- Parser: detect ... at start of record expression
- Typechecker: merge spread record fields with explicit overrides
- Interpreter: evaluate spread, overlay explicit fields
- JS backend: emit native JS spread syntax
- C backend: copy spread into temp, assign overrides
- Formatter, linter, LSP, symbol table: propagate spread

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:27 -05:00
7c3bfa9301 feat: add Math.sin, Math.cos, Math.atan2 trig functions
Adds trigonometric functions to the Math module across interpreter,
type system, and C backend. JS backend already supported them.
Also adds #include <math.h> to C preamble and handles Math module
calls through both Call and EffectOp paths in C backend.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:12 -05:00
b56c5461f1 fix: JS const _ duplication and hardcoded version string
- JS backend now emits wildcard let bindings as side-effect statements
  instead of const _ declarations, fixing SyntaxError on multiple let _ = ...
- Version string now uses env!("CARGO_PKG_VERSION") to auto-sync with Cargo.toml
- Add -lm linker flag for math library support

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:03 -05:00
61e1469845 feat: add ++ concat operator and auto-invoke main
BUG-004: Add ++ operator for string and list concatenation across all
backends (interpreter, C, JS) with type checking and formatting support.

BUG-001: Auto-invoke top-level `let main = fn () => ...` when main is
a zero-parameter function, instead of just printing the function value.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 22:01:41 -05:00
bb0a288210 chore: bump version to 0.1.2 2026-02-18 21:16:44 -05:00
5d7f4633e1 docs: add explicit commit instructions to CLAUDE.md
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:27 -05:00
d05b13d840 fix: JS backend compiles print() to console.log()
Bare `print()` calls in Lux now emit `console.log()` in JS output
instead of undefined `print()`. Fixes BUG-006.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:07 -05:00
0ee3050704 chore: bump version to 0.1.1 2026-02-18 20:41:43 -05:00
80b1276f9f fix: release script auto-bumps patch by default
Release script now supports: patch (default), minor, major, or explicit
version. Auto-updates Cargo.toml and flake.nix before building.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:41:29 -05:00
bd843d2219 fix: record type aliases now work for unification and field access
Expand type aliases via unify_with_env() everywhere in the type checker,
not just in a few places. This fixes named record types like
`type Vec2 = { x: Float, y: Float }` — they now properly unify with
anonymous records and support field access (v.x, v.y).

Also adds scripts/validate.sh for automated full-suite regression
testing (Rust tests + all 5 package test suites + type checking).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:21:29 -05:00
d76aa17b38 feat: static binary builds and automated release script
Switch reqwest from native-tls (openssl) to rustls-tls for a pure-Rust
TLS stack, enabling fully static musl builds. Add `nix build .#static`
for portable Linux binaries and `scripts/release.sh` for automated
Gitea releases with changelog generation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 19:09:32 -05:00
c23d9c7078 fix: test runner now supports module imports
The `lux test` command used Parser::parse_source() and
check_program() directly, which meant test files with `import`
statements would fail with type errors. Now uses ModuleLoader
and check_program_with_modules() to properly resolve imports,
and run_with_modules() for execution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 17:11:16 -05:00
fffacd2467 feat: C backend module import support, Int/Float.toString, Test.assertEqualMsg
The C backend can now compile programs that import user-defined modules.
Module-qualified calls like `mymodule.func(args)` are resolved to prefixed
C functions (e.g., `mymodule_func_lux`), with full support for transitive
imports and effect-passing. Also adds Int.toString/Float.toString to type
system, interpreter, and C backend, and Test.assertEqualMsg for labeled
test assertions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 16:35:24 -05:00
2ae2c132e5 docs: add language philosophy document and compiler integration
Write comprehensive PHILOSOPHY.md covering Lux's six core principles
(explicit over implicit, composition over configuration, safety without
ceremony, practical over academic, one right way, tools are the language)
with detailed comparisons against JS/TS, Python, Rust, Go, Java/C#,
Haskell/Elm, and Gleam/Elixir. Includes tooling audit and improvement
suggestions.

Add `lux philosophy` command to the compiler, update help screen with
abbreviated philosophy, and link from README.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 10:19:29 -05:00
4909ff9fff docs: add package ecosystem plan and error documentation workflow
Add PACKAGES.md analyzing the Lux package ecosystem gaps vs stdlib,
with prioritized implementation plans for markdown, xml, rss, frontmatter,
path, and sitemap packages. Add CLAUDE.md instructions for documenting
Lux language errors in ISSUES.md during every major task.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 10:01:56 -05:00
8e788c8a9f fix: embed C compiler path at build time for self-contained binary
build.rs captures the absolute path to cc/gcc/clang during compilation
and bakes it into the binary. On Nix systems this embeds the full
/nix/store path so `lux compile` works without cc on PATH.

Lookup order: $CC env var > embedded build-time path > PATH search.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 08:12:18 -05:00
dbdd3cca57 chore: move blu-site to its own repo at ~/src/blu-site
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 07:57:55 -05:00
3ac022c04a chore: gitignore build output (_site/, docs/)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 07:48:51 -05:00
6bedd37ac7 fix: show help menu when running lux with no arguments
Previously `lux` with no args entered the REPL. Now it shows the help
menu. Use `lux repl` to start the REPL explicitly.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 07:34:09 -05:00
2909bf14b6 fix: eliminate all non-json C backend errors (79→0)
Second round of C backend fixes, building on d8871ac which reduced
errors from 286 to 111. This eliminates all 79 non-json errors:

- Fix function references as values (wrap in LuxClosure*)
- Fix fold/map/filter with type-aware calling conventions
- Add String.indexOf/lastIndexOf emission and C runtime functions
- Add File.readDir with dirent.h implementation
- Fix string concat in closure bodies
- Exclude ADT constructors from closure free variable capture
- Fix match result type inference (prioritize pattern binding types)
- Fix Option inner type inference (usage-based for List.head)
- Fix void* to struct cast (dereference through pointer)
- Handle constructors in emit_expr_with_env

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 05:56:21 -05:00
d8871acf7e fix: improve C backend robustness, reduce compilation errors by 61%
- Fix closure captured variable types: look up actual types from var_types
  instead of hardcoding LuxInt for all captured variables
- Register function parameters in var_types so closures can find their types
- Replace is_string_expr() with infer_expr_type() for more accurate string
  detection in binary ops (concat, comparison)
- Add missing String operations to infer_expr_type (substring, indexOf, etc.)
- Add module method call type inference (String.*, List.*, Int.*, Float.*)
- Add built-in Result type (Ok/Err) to C prelude alongside Option
- Register Ok/Err/Some/None in variant_to_type and variant_field_types
- Fix variable scoping: use if-statement pattern instead of ternary when
  branches emit statements (prevents redefinition of h2/h3 etc.)
- Add RC scope management for if-else branches and match arms to prevent
  undeclared variable errors from cleanup code
- Add infer_pattern_binding_type for better match result type inference
- Add expr_emits_statements helper to detect statement-emitting expressions
- Add infer_option_inner_type for String.indexOf (returns Option<Int>)

Reduces blu-site compilation errors from 286 to 111 (remaining are mostly
unsupported json effect and function-as-value references).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 17:56:27 -05:00
73b5eee664 docs: add commit-after-every-piece-of-work instruction to CLAUDE.md
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 16:21:54 -05:00
542255780d feat: add tuple index access, multiline args, and effect unification fix
- Tuple index: `pair.0`, `pair.1` syntax across parser, typechecker,
  interpreter, C/JS backends, formatter, linter, and symbol table
- Multi-line function args: allow newlines inside argument lists
- Fix effect unification for callback parameters (empty expected
  effects means "no constraint", not "must be pure")

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 16:21:48 -05:00
bac63bab2a feat: add blu-site static site generator and fix language issues
Build a complete static site generator in Lux that faithfully clones
blu.cx (elmstatic). Generates 14 post pages, section indexes, tag pages,
and a home page with snippets grid from markdown content.

Language fixes discovered during development:
- Add \{ and \} escape sequences in string literals (lexer)
- Register String.indexOf and String.lastIndexOf in type checker
- Fix formatter to preserve brace escapes in string literals
- Improve LSP hover to show documentation for let bindings and functions

ISSUES.md documents 15 Lux language limitations found during the project.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 15:43:05 -05:00
db82ca1a1c fix: improve LSP hover to show function info when cursor is on fn keyword
When hovering on declaration keywords (fn, type, effect, let, trait),
look ahead to find the declaration name and show that symbol's full
info from the symbol table instead of generic keyword documentation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-17 08:32:01 -05:00
29 changed files with 8049 additions and 530 deletions

.gitignore

@@ -4,6 +4,11 @@
# Claude Code project instructions
CLAUDE.md
# Build output
_site/
docs/*.html
docs/*.css
# Test binaries
hello
test_rc

CLAUDE.md

@@ -42,15 +42,46 @@ When making changes:
7. **Fix language limitations**: If you encounter parser/type system limitations, fix them (without regressions on guarantees or speed)
8. **Git commits**: Always use `--no-gpg-sign` flag
### Post-work checklist (run after each major piece of work)
### Post-work checklist (run after each committable change)
**MANDATORY: Run the full validation script after every committable change:**
```bash
./scripts/validate.sh
```
This script runs ALL of the following checks and will fail if any regress:
1. `cargo check` — no Rust compilation errors
2. `cargo test` — all Rust tests pass (currently 387)
3. `cargo build --release` — release binary builds
4. `lux test` on every package (path, frontmatter, xml, rss, markdown) — all 286 package tests pass
5. `lux check` on every package — type checking + lint passes
If `validate.sh` is not available or you need to run manually:
```bash
nix develop --command cargo check # No Rust errors
nix develop --command cargo test # All tests pass (currently 381)
./target/release/lux check # Type check + lint all .lux files
./target/release/lux fmt # Format all .lux files
./target/release/lux lint # Standalone lint pass
nix develop --command cargo test # All Rust tests pass
nix develop --command cargo build --release # Build release binary
cd ../packages/path && ../../lang/target/release/lux test # Package tests
cd ../packages/frontmatter && ../../lang/target/release/lux test
cd ../packages/xml && ../../lang/target/release/lux test
cd ../packages/rss && ../../lang/target/release/lux test
cd ../packages/markdown && ../../lang/target/release/lux test
```
**Do NOT commit if any check fails.** Fix the issue first.
### Commit after every piece of work
**After completing each logical unit of work, commit immediately.** This is NOT optional — every fix, feature, or change MUST be committed right away. Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
**Commit workflow:**
1. Make the change
2. Run `./scripts/validate.sh` (all 13 checks must pass)
3. `git add` the relevant files
4. `git commit --no-gpg-sign -m "type: description"` (use conventional commits: fix/feat/chore/docs)
5. Move on to the next task
**Never skip committing.** If you fixed a bug, commit it. If you added a feature, commit it. If you updated docs, commit it. Do not batch unrelated changes into one commit.
**IMPORTANT: Always verify Lux code you write:**
- Run with interpreter: `./target/release/lux file.lux`
- Compile to binary: `./target/release/lux compile file.lux`
@@ -68,10 +99,45 @@ nix develop --command cargo test # All tests pass (currently 381)
| `lux serve` | `lux s` | Static file server |
| `lux compile` | `lux c` | Compile to binary |
## Documenting Lux Language Errors
When working on any major task that involves writing Lux code, **document every language error, limitation, or surprising behavior** you encounter. This log is optimized for LLM consumption so future sessions can avoid repeating mistakes.
**File:** Maintain an `ISSUES.md` in the relevant project directory (e.g., `~/src/blu-site/ISSUES.md`).
**Format for each entry:**
```markdown
## Issue N: <Short descriptive title>
**Category**: Parser limitation | Type checker gap | Missing feature | Runtime error | Documentation gap
**Severity**: High | Medium | Low
**Status**: Open | **Fixed** (commit hash or version)
<1-2 sentence description of the problem>
**Reproduction:**
```lux
// Minimal code that triggers the issue
```
**Error message:** `<exact error text>`
**Workaround:** <how to accomplish the goal despite the limitation>
**Fix:** <if fixed, what was changed and where>
```
**Rules:**
- Add new issues as you encounter them during any task
- When a previously documented issue gets fixed, update its status to **Fixed** and note the commit/version
- Remove entries that are no longer relevant (e.g., the feature was redesigned entirely)
- Keep the summary table at the bottom of ISSUES.md in sync with the entries
- Do NOT duplicate issues already documented -- check existing entries first
## Code Quality
- Fix all compiler warnings before committing
- Ensure all tests pass (currently 381 tests)
- Ensure all tests pass (currently 387 tests)
- Add new tests when adding features
- Keep examples and documentation in sync

Cargo.lock (generated)

@@ -135,16 +135,6 @@ dependencies = [
"libc",
]
[[package]]
name = "core-foundation"
version = "0.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b2a6cd9ae233e7f62ba4e9353e81a88df7fc8a5987b8d445b4d90c879bd156f6"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]]
name = "core-foundation-sys"
version = "0.8.7"
@@ -297,21 +287,6 @@ version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9c4f5dac5e15c24eb999c26181a6ca40b39fe946cbe4c263c7209467bc83af2"
[[package]]
name = "foreign-types"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
dependencies = [
"foreign-types-shared",
]
[[package]]
name = "foreign-types-shared"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
[[package]]
name = "form_urlencoded"
version = "1.2.2"
@@ -417,6 +392,12 @@ dependencies = [
"wasip3",
]
[[package]]
name = "glob"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0cc23270f6e1808e30a928bdc84dea0b9b4136a8bc82338574f23baf47bbd280"
[[package]]
name = "h2"
version = "0.3.27"
@@ -552,16 +533,17 @@ dependencies = [
]
[[package]]
name = "hyper-tls"
version = "0.5.0"
name = "hyper-rustls"
version = "0.24.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d6183ddfa99b85da61a140bea0efc93fdf56ceaa041b37d553518030827f9905"
checksum = "ec3efd23720e2049821a693cbc7e65ea87c72f1c58ff2f9522ff332b1491e590"
dependencies = [
"bytes",
"futures-util",
"http",
"hyper",
"native-tls",
"rustls",
"tokio",
"tokio-native-tls",
"tokio-rustls",
]
[[package]]
@@ -794,8 +776,9 @@ dependencies = [
[[package]]
name = "lux"
version = "0.1.0"
version = "0.1.12"
dependencies = [
"glob",
"lsp-server",
"lsp-types",
"postgres",
@@ -843,23 +826,6 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "native-tls"
version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d5d26952a508f321b4d3d2e80e78fc2603eaefcdf0c30783867f19586518bdc"
dependencies = [
"libc",
"log",
"openssl",
"openssl-probe",
"openssl-sys",
"schannel",
"security-framework",
"security-framework-sys",
"tempfile",
]
[[package]]
name = "nibble_vec"
version = "0.1.0"
@@ -905,50 +871,6 @@ version = "1.21.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42f5e15c9953c5e4ccceeb2e7382a716482c34515315f7b03532b8b4e8393d2d"
[[package]]
name = "openssl"
version = "0.10.75"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "08838db121398ad17ab8531ce9de97b244589089e290a384c900cb9ff7434328"
dependencies = [
"bitflags 2.10.0",
"cfg-if",
"foreign-types",
"libc",
"once_cell",
"openssl-macros",
"openssl-sys",
]
[[package]]
name = "openssl-macros"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "openssl-probe"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c87def4c32ab89d880effc9e097653c8da5d6ef28e6b539d313baaacfbafcbe"
[[package]]
name = "openssl-sys"
version = "0.9.111"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82cab2d520aa75e3c58898289429321eb788c3106963d0dc886ec7a5f4adc321"
dependencies = [
"cc",
"libc",
"pkg-config",
"vcpkg",
]
[[package]]
name = "parking_lot"
version = "0.12.5"
@@ -1203,15 +1125,15 @@ dependencies = [
"http",
"http-body",
"hyper",
"hyper-tls",
"hyper-rustls",
"ipnet",
"js-sys",
"log",
"mime",
"native-tls",
"once_cell",
"percent-encoding",
"pin-project-lite",
"rustls",
"rustls-pemfile",
"serde",
"serde_json",
@@ -1219,15 +1141,30 @@ dependencies = [
"sync_wrapper",
"system-configuration",
"tokio",
"tokio-native-tls",
"tokio-rustls",
"tower-service",
"url",
"wasm-bindgen",
"wasm-bindgen-futures",
"web-sys",
"webpki-roots",
"winreg",
]
[[package]]
name = "ring"
version = "0.17.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a4689e6c2294d81e88dc6261c768b63bc4fcdb852be6d1352498b114f61383b7"
dependencies = [
"cc",
"cfg-if",
"getrandom 0.2.17",
"libc",
"untrusted",
"windows-sys 0.52.0",
]
[[package]]
name = "rusqlite"
version = "0.31.0"
@@ -1255,6 +1192,18 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "rustls"
version = "0.21.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f56a14d1f48b391359b22f731fd4bd7e43c97f3c50eee276f3aa09c94784d3e"
dependencies = [
"log",
"ring",
"rustls-webpki",
"sct",
]
[[package]]
name = "rustls-pemfile"
version = "1.0.4"
@@ -1264,6 +1213,16 @@ dependencies = [
"base64 0.21.7",
]
[[package]]
name = "rustls-webpki"
version = "0.101.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8b6275d1ee7a1cd780b64aca7726599a1dbc893b1e64144529e55c3c2f745765"
dependencies = [
"ring",
"untrusted",
]
[[package]]
name = "rustversion"
version = "1.0.22"
@@ -1298,15 +1257,6 @@ version = "1.0.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9774ba4a74de5f7b1c1451ed6cd5285a32eddb5cccb8cc655a4e50009e06477f"
[[package]]
name = "schannel"
version = "0.1.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "scopeguard"
version = "1.2.0"
@@ -1314,26 +1264,13 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "security-framework"
version = "3.6.0"
name = "sct"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d17b898a6d6948c3a8ee4372c17cb384f90d2e6e912ef00895b14fd7ab54ec38"
checksum = "da046153aa2352493d6cb7da4b6e5c0c057d8a1d0a9aa8560baffdd945acd414"
dependencies = [
"bitflags 2.10.0",
"core-foundation 0.10.1",
"core-foundation-sys",
"libc",
"security-framework-sys",
]
[[package]]
name = "security-framework-sys"
version = "2.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "321c8673b092a9a42605034a9879d73cb79101ed5fd117bc9a597b89b4e9e61a"
dependencies = [
"core-foundation-sys",
"libc",
"ring",
"untrusted",
]
[[package]]
@@ -1521,7 +1458,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ba3a3adc5c275d719af8cb4272ea1c4a6d668a777f37e115f6d11ddbc1c8e0e7"
dependencies = [
"bitflags 1.3.2",
"core-foundation 0.9.4",
"core-foundation",
"system-configuration-sys",
]
@@ -1619,16 +1556,6 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "tokio-native-tls"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
dependencies = [
"native-tls",
"tokio",
]
[[package]]
name = "tokio-postgres"
version = "0.7.16"
@@ -1655,6 +1582,16 @@ dependencies = [
"whoami",
]
[[package]]
name = "tokio-rustls"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c28327cf380ac148141087fbfb9de9d7bd4e84ab5d2c28fbc911d753de8a7081"
dependencies = [
"rustls",
"tokio",
]
[[package]]
name = "tokio-util"
version = "0.7.18"
@@ -1750,6 +1687,12 @@ version = "0.2.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
[[package]]
name = "untrusted"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8ecb6da28b8a351d773b68d5825ac39017e680750f980f3a1a85cd8dd28a47c1"
[[package]]
name = "url"
version = "2.5.8"
@@ -1941,6 +1884,12 @@ dependencies = [
"wasm-bindgen",
]
[[package]]
name = "webpki-roots"
version = "0.25.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5f20c57d8d7db6d3b86154206ae5d8fba62dd39573114de97c2cb0578251f8e1"
[[package]]
name = "whoami"
version = "2.1.1"


@@ -1,6 +1,6 @@
[package]
name = "lux"
version = "0.1.0"
version = "0.1.13"
edition = "2021"
description = "A functional programming language with first-class effects, schema evolution, and behavioral types"
license = "MIT"
@@ -13,10 +13,11 @@ lsp-types = "0.94"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
rand = "0.8"
reqwest = { version = "0.11", features = ["blocking", "json"] }
reqwest = { version = "0.11", default-features = false, features = ["blocking", "json", "rustls-tls"] }
tiny_http = "0.12"
rusqlite = { version = "0.31", features = ["bundled"] }
postgres = "0.19"
glob = "0.3"
[dev-dependencies]

PACKAGES.md Normal file

@@ -0,0 +1,367 @@
# Lux Package Ecosystem Plan
## Current State
### Stdlib (built-in)
| Module | Coverage |
|--------|----------|
| String | Comprehensive (split, join, trim, indexOf, replace, etc.) |
| List | Good (map, filter, fold, head, tail, concat, range, find, any, all, take, drop) |
| Option | Basic (map, flatMap, getOrElse, isSome, isNone) |
| Result | Basic (map, flatMap, getOrElse, isOk, isErr) |
| Math | Basic (abs, min, max, sqrt, pow, floor, ceil, round) |
| Json | Comprehensive (parse, stringify, get, typed extractors, constructors) |
| File | Good (read, write, append, exists, delete, readDir, isDir, mkdir) |
| Console | Good (print, read, readLine, readInt) |
| Process | Good (exec, execStatus, env, args, exit, cwd) |
| Http | Basic (get, post, put, delete, setHeader) |
| HttpServer | Basic (listen, accept, respond) |
| Time | Minimal (now, sleep) |
| Random | Basic (int, float, bool) |
| Sql | Good (SQLite: open, query, execute, transactions) |
| Postgres | Good (connect, query, execute, transactions) |
| Schema | Niche (versioned data migration) |
| Test | Good (assert, assertEqual, assertTrue) |
| Concurrent | Experimental (spawn, await, yield, cancel) |
| Channel | Experimental (create, send, receive) |
### Registry (pkgs.lux) - 3 packages
| Package | Version | Notes |
|---------|---------|-------|
| json | 1.0.0 | Wraps stdlib Json with convenience functions (getPath, getString, etc.) |
| http-client | 0.1.0 | Wraps stdlib Http with JSON helpers, URL encoding |
| testing | 0.1.0 | Wraps stdlib Test with describe/it structure |
---
## Gap Analysis
### What's Missing vs Other Languages
Compared to ecosystems like Rust/cargo, Go, Python, Elm, Gleam:
| Category | Gap | Impact | Notes |
|----------|-----|--------|-------|
| **Collections** | No HashMap, Set, Queue, Stack | Critical | List-of-pairs with O(n) lookup is the only option |
| **Sorting** | No List.sort or List.sortBy | High | Must implement insertion sort manually |
| **Date/Time** | Only `Time.now()` (epoch ms), no parsing/formatting | High | blu-site does string-based date formatting manually |
| **Markdown** | No markdown parser | High | blu-site has 300+ lines of hand-rolled markdown |
| **XML/RSS** | No XML generation | High | Can't generate RSS feeds or sitemaps |
| **Regex** | No pattern matching on strings | High | Character-by-character scanning required |
| **Path** | No file path utilities | Medium | basename/dirname manually reimplemented |
| **YAML/TOML** | No config file parsing (beyond JSON) | Medium | Frontmatter parsing is manual |
| **Template** | No string templating | Medium | HTML built via raw string concatenation |
| **URL** | No URL parsing/encoding | Medium | http-client has basic urlEncode but no parser |
| **Crypto** | No hashing (SHA256, etc.) | Medium | Can't do checksums, content hashing |
| **Base64** | No encoding/decoding | Low | Needed for data URIs, some auth |
| **CSV** | No CSV parsing | Low | Common data format |
| **UUID** | No UUID generation | Low | Useful for IDs |
| **Logging** | No structured logging | Low | Just Console.print |
| **CLI** | No argument parsing library | Low | Manual arg handling |
### What Should Be Stdlib vs Package
**Should be stdlib additions** (too fundamental to be packages):
- HashMap / Map type (requires runtime support)
- List.sort / List.sortBy (fundamental operation)
- Better Time module (date parsing, formatting)
- Regex (needs runtime/C support for performance)
- Path module (cross-platform file path handling)
**Should be packages** (application-level, opinionated, composable):
- markdown
- xml
- rss/atom
- frontmatter
- template
- csv
- crypto
- ssg (static site generator framework)
---
## Priority Package Plans
Ordered by what unblocks blu-site fixes first, then general ecosystem value.
---
### Package 1: `markdown` (Priority: HIGHEST)
**Why:** The 300-line markdown parser in blu-site's main.lux is general-purpose code that belongs in a reusable package. It's also the most complex part of blu-site and has known bugs (e.g., `### ` inside list items renders literally).
**Scope:**
```
markdown/
lux.toml
lib.lux # Public API: parse, parseInline
src/
inline.lux # Inline parsing (bold, italic, links, images, code)
block.lux # Block parsing (headings, lists, code blocks, blockquotes, hr)
types.lux # AST types (optional - could emit HTML directly)
```
**Public API:**
```lux
// Convert markdown string to HTML string
pub fn toHtml(markdown: String): String
// Convert inline markdown only (no blocks)
pub fn inlineToHtml(text: String): String
// Escape HTML entities
pub fn escapeHtml(s: String): String
```
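A hypothetical call site for the API above (the `markdown` module name and import mechanics are assumptions; only the three functions are part of the planned API):
```lux
// Illustrative only: assumes the package is in scope as `markdown`
let html = markdown.toHtml("# Hello\n\nSome *emphasis* and `code`.")
let snippet = markdown.inlineToHtml("a [link](https://example.com) in a sentence")
let safe = markdown.escapeHtml("<script>alert(1)</script>")
```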
**Improvements over current blu-site code:**
- Fix heading-inside-list-item rendering (`- ### Title` should work)
- Support nested lists (currently flat only)
- Support reference-style links `[text][ref]`
- Handle edge cases (empty lines in code blocks, nested blockquotes)
- Proper HTML entity escaping in more contexts
**Depends on:** Nothing (pure string processing)
**Estimated size:** ~400-500 lines of Lux
---
### Package 2: `xml` (Priority: HIGH)
**Why:** Needed for RSS/Atom feed generation, sitemap.xml, and robots.txt generation. A general-purpose XML builder that only emits XML; it doesn't try to parse it (which would need regex support).
**Scope:**
```
xml/
lux.toml
lib.lux # Public API: element, document, serialize
```
**Public API:**
```lux
type XmlNode =
| Element(String, List<XmlAttr>, List<XmlNode>)
| Text(String)
| CData(String)
| Comment(String)
| Declaration(String, String) // version, encoding
type XmlAttr =
| Attr(String, String)
// Build an XML element
pub fn element(tag: String, attrs: List<XmlAttr>, children: List<XmlNode>): XmlNode
// Build a text node (auto-escapes)
pub fn text(content: String): XmlNode
// Build a CDATA section
pub fn cdata(content: String): XmlNode
// Serialize XML tree to string
pub fn serialize(node: XmlNode): String
// Serialize with XML declaration header
pub fn document(version: String, encoding: String, root: XmlNode): String
// Convenience: self-closing element
pub fn selfClosing(tag: String, attrs: List<XmlAttr>): XmlNode
```
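To make the builder shape concrete, a sketch of assembling and serializing a small tree with the functions above (the element names and URL are invented for illustration):
```lux
// Sketch: an RSS-style channel fragment built with the API above
let node =
  element("channel", [], [
    element("title", [], [text("Blu")]),
    selfClosing("link", [Attr("rel", "self"), Attr("href", "https://example.com/feed.xml")])
  ])
let out = serialize(node)
```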
**Depends on:** Nothing
**Estimated size:** ~150-200 lines
---
### Package 3: `rss` (Priority: HIGH)
**Why:** Directly needed for blu-site's #6 priority fix (add RSS feed). Builds on `xml` package.
**Scope:**
```
rss/
lux.toml # depends on xml
lib.lux # Public API: feed, item, toXml, toAtom
```
**Public API:**
```lux
type FeedInfo =
| FeedInfo(String, String, String, String, String)
// title, link, description, language, lastBuildDate
type FeedItem =
| FeedItem(String, String, String, String, String, String)
// title, link, description, pubDate, guid, categories (comma-separated)
// Generate RSS 2.0 XML string
pub fn toRss(info: FeedInfo, items: List<FeedItem>): String
// Generate Atom 1.0 XML string
pub fn toAtom(info: FeedInfo, items: List<FeedItem>): String
```
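A usage sketch with the constructors above (all field values and URLs are invented; field order follows the comments in the API listing):
```lux
// Illustrative only: constructor argument order matches the comments above
let info = FeedInfo("Blu", "https://example.com", "A personal blog", "en",
  "Sun, 22 Feb 2026 00:00:00 GMT")
let item = FeedItem("Hello", "https://example.com/posts/hello", "First post",
  "Sun, 22 Feb 2026 00:00:00 GMT", "https://example.com/posts/hello", "lux,blog")
let feedXml = toRss(info, [item])
```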
**Depends on:** `xml`
**Estimated size:** ~100-150 lines
---
### Package 4: `frontmatter` (Priority: HIGH)
**Why:** blu-site has ~50 lines of fragile frontmatter parsing. This is a common need for any content-driven Lux project. The current parser uses `String.indexOf(line, ": ")`, which breaks on values containing `: `.
**Scope:**
```
frontmatter/
lux.toml
lib.lux # Public API: parse
```
**Public API:**
```lux
type FrontmatterResult =
| FrontmatterResult(List<(String, String)>, String)
// key-value pairs, remaining body
// Parse frontmatter from a string (--- delimited YAML-like header)
pub fn parse(content: String): FrontmatterResult
// Get a value by key from parsed frontmatter
pub fn get(pairs: List<(String, String)>, key: String): Option<String>
// Get a value or default
pub fn getOrDefault(pairs: List<(String, String)>, key: String, default: String): String
// Parse a space-separated tag string into a list
pub fn parseTags(tagString: String): List<String>
```
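A usage sketch (the inline document and the `frontmatter` module name are illustrative):
```lux
// A value containing ": " must survive, since only the first ": " splits key from value
let doc = "---\ntitle: Hello: World\ntags: lux blog\n---\nBody text"
let parsed = frontmatter.parse(doc)
let tags = frontmatter.parseTags("lux blog ssg")  // three tags
```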
**Improvements over current blu-site code:**
- Handle values with `: ` in them (only split on first `: `)
- Handle multi-line values (indented continuation)
- Handle quoted values with embedded newlines
- Strip quotes from values consistently
**Depends on:** Nothing
**Estimated size:** ~100-150 lines
---
### Package 5: `path` (Priority: MEDIUM)
**Why:** blu-site manually implements `basename` and `dirname`. Any file-processing Lux program needs these. Tiny but universally useful.
**Scope:**
```
path/
lux.toml
lib.lux
```
**Public API:**
```lux
// Get filename from path: "/foo/bar.txt" -> "bar.txt"
pub fn basename(p: String): String
// Get directory from path: "/foo/bar.txt" -> "/foo"
pub fn dirname(p: String): String
// Get file extension: "file.txt" -> "txt", "file" -> ""
pub fn extension(p: String): String
// Remove file extension: "file.txt" -> "file"
pub fn stem(p: String): String
// Join path segments: join("foo", "bar") -> "foo/bar"
pub fn join(a: String, b: String): String
// Normalize path: "foo//bar/../baz" -> "foo/baz"
pub fn normalize(p: String): String
// Check if path is absolute
pub fn isAbsolute(p: String): Bool
```
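A usage sketch; the expected values in the comments follow directly from the doc comments in the API listing above:
```lux
let name = basename("/content/posts/hello.md")  // "hello.md"
let dir = dirname("/content/posts/hello.md")    // "/content/posts"
let out = join("dist", stem("hello.md"))        // "dist/hello"
```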
**Depends on:** Nothing
**Estimated size:** ~80-120 lines
---
### Package 6: `sitemap` (Priority: MEDIUM)
**Why:** Directly needed for blu-site's #9 priority fix. Simple package that generates sitemap.xml.
**Scope:**
```
sitemap/
lux.toml # depends on xml
lib.lux
```
**Public API:**
```lux
type SitemapEntry =
| SitemapEntry(String, String, String, String)
// url, lastmod (ISO date), changefreq, priority
// Generate sitemap.xml string
pub fn generate(entries: List<SitemapEntry>): String
// Generate a simple robots.txt pointing to the sitemap
pub fn robotsTxt(sitemapUrl: String): String
```
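A usage sketch (URL and field values are invented; field order follows the comment above):
```lux
// Illustrative only
let entries = [SitemapEntry("https://example.com/", "2026-02-22", "weekly", "1.0")]
let sitemapXml = generate(entries)
let robots = robotsTxt("https://example.com/sitemap.xml")
```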
**Depends on:** `xml`
**Estimated size:** ~50-70 lines
---
### Package 7: `ssg` (Priority: LOW - future)
**Why:** Once markdown, frontmatter, rss, sitemap, and path packages exist, the remaining logic in blu-site's main.lux is generic SSG framework code: read content dirs, parse posts, sort by date, generate section indexes, generate tag pages, copy static assets. This could be extracted into a framework package that other Lux users could use to build their own static sites.
**This should wait** until the foundation packages above are stable and battle-tested through blu-site usage.
---
## Non-Package Stdlib Improvements Needed
These gaps are too fundamental to be packages and should be added to the Lux language itself:
### HashMap (Critical)
Every package above that needs key-value lookups (frontmatter, xml attributes, etc.) is working around the lack of HashMap with `List<(String, String)>`. This is O(n) per lookup and makes code verbose. A stdlib `Map` module would transform the ecosystem.
### List.sort / List.sortBy (High)
blu-site implements insertion sort manually. Every content-driven app needs sorting. This should be a stdlib function.
### Time.format / Time.parse (High)
blu-site manually parses "2025-01-15" by substring extraction and maps month numbers to names. A proper date/time library (even just ISO 8601 parsing and basic formatting) would help every package above.
---
## Implementation Order
```
Phase 1 (unblock blu-site fixes):
1. markdown - extract from blu-site, fix bugs, publish
2. frontmatter - extract from blu-site, improve robustness
3. path - tiny, universally useful
4. xml - needed by rss and sitemap
Phase 2 (complete blu-site features):
5. rss - depends on xml
6. sitemap - depends on xml
Phase 3 (ecosystem growth):
7. template - string templating (mustache-like)
8. csv - data processing
9. cli - argument parsing
10. ssg - framework extraction from blu-site
```
Each package should be developed in its own directory under `~/src/`, published to the git.qrty.ink registry, and tested by integrating it into blu-site.


@@ -2,15 +2,22 @@
A functional programming language with first-class effects, schema evolution, and behavioral types.
## Vision
## Philosophy
Most programming languages treat three critical concerns as afterthoughts:
**Make the important things visible.**
1. **Effects** — What can this code do? (Hidden, untraceable, untestable)
2. **Data Evolution** — Types change, data persists. (Manual migrations, runtime failures)
3. **Behavioral Properties** — Is this idempotent? Does it terminate? (Comments and hope)
Most languages hide what matters most: what code can do (effects), how data changes over time (schema evolution), and what guarantees functions provide (behavioral properties). Lux makes all three first-class, compiler-checked language features.
Lux makes these first-class language features. The compiler knows what your code does, how your data evolves, and what properties your functions guarantee.
| Principle | What it means |
|-----------|--------------|
| **Explicit over implicit** | Effects in types — see what code does |
| **Composition over configuration** | No DI frameworks — effects compose naturally |
| **Safety without ceremony** | Type inference + explicit signatures where they matter |
| **Practical over academic** | Familiar syntax, ML semantics, no monads |
| **One right way** | Opinionated formatter, integrated tooling, built-in test framework |
| **Tools are the language** | `lux fmt/lint/check/test/compile` — one binary, not seven tools |
See [docs/PHILOSOPHY.md](./docs/PHILOSOPHY.md) for the full philosophy with language comparisons and design rationale.
## Core Principles

build.rs Normal file

@@ -0,0 +1,38 @@
use std::path::PathBuf;
fn main() {
// Capture the absolute C compiler path at build time so the binary is self-contained.
// This is critical for Nix builds where cc/gcc live in /nix/store paths.
let cc_path = std::env::var("CC").ok()
.filter(|s| !s.is_empty())
.and_then(|s| resolve_absolute(&s))
.or_else(|| find_in_path("cc"))
.or_else(|| find_in_path("gcc"))
.or_else(|| find_in_path("clang"))
.unwrap_or_default();
println!("cargo:rustc-env=LUX_CC_PATH={}", cc_path);
println!("cargo:rerun-if-env-changed=CC");
println!("cargo:rerun-if-env-changed=PATH");
}
/// Resolve a command name to its absolute path by searching PATH.
fn find_in_path(cmd: &str) -> Option<String> {
let path_var = std::env::var("PATH").ok()?;
for dir in path_var.split(':') {
let candidate = PathBuf::from(dir).join(cmd);
if candidate.is_file() {
return Some(candidate.to_string_lossy().into_owned());
}
}
None
}
/// If the path is already absolute and exists, return it. Otherwise search PATH.
fn resolve_absolute(cmd: &str) -> Option<String> {
let p = PathBuf::from(cmd);
if p.is_absolute() && p.is_file() {
return Some(cmd.to_string());
}
find_in_path(cmd)
}

docs/PHILOSOPHY.md Normal file

@@ -0,0 +1,449 @@
# The Lux Philosophy
## In One Sentence
**Make the important things visible.**
## The Three Pillars
Most programming languages hide the things that matter most in production:
1. **What can this code do?** — Side effects are invisible in function signatures
2. **How does data change over time?** — Schema evolution is a deployment problem, not a language one
3. **What guarantees does this code provide?** — Properties like idempotency live in comments and hope
Lux makes all three first-class, compiler-checked language features.
---
## Core Principles
### 1. Explicit Over Implicit
Every function signature tells you what it does:
```lux
fn processOrder(order: Order): Receipt with {Database, Email, Logger}
```
You don't need to read the body, trace call chains, or check documentation. The signature *is* the documentation. Code review becomes: "should this function really send emails?"
**What this means in practice:**
- Effects are declared in types, not hidden behind interfaces
- No dependency injection frameworks — just swap handlers
- No mocking libraries — test with different effect implementations
- No "spooky action at a distance" — if a function can fail, its type says so
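Concretely, testing the `processOrder` function above means running it under test handlers instead of production ones (the handler values here are hypothetical):

```lux
// Hypothetical test handlers; the run/with form matches the handler
// composition syntax used elsewhere in this document.
run processOrder(order) with {
  Database = inMemoryDb,
  Email = recordingEmail,
  Logger = silentLogger
}
```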
**How this compares:**
| Language | Side effects | Lux equivalent |
|----------|-------------|----------------|
| JavaScript | Anything, anywhere, silently | `with {Console, Http, File}` |
| Python | Implicit, discovered by reading code | Effect declarations in signature |
| Java | Checked exceptions (partial), DI frameworks | Effects + handlers |
| Go | Return error values (partial) | `with {Fail}` or `Result` |
| Rust | `unsafe` blocks, `Result`/`Option` | Effects for I/O, Result for values |
| Haskell | Monad transformers (explicit but heavy) | Effects (explicit and lightweight) |
| Koka | Algebraic effects (similar) | Same family, more familiar syntax |
### 2. Composition Over Configuration
Things combine naturally without glue code:
```lux
// Multiple effects compose by listing them
fn sync(id: UserId): User with {Database, Http, Logger} = ...
// Handlers compose by providing them
run sync(id) with {
Database = postgres(conn),
Http = realHttp,
Logger = consoleLogger
}
```
No monad transformers. No middleware stacks. No factory factories. Effects are sets; they union naturally.
**What this means in practice:**
- Functions compose with `|>` (pipes)
- Effects compose by set union
- Types compose via generics and ADTs
- Tests compose by handler substitution
### 3. Safety Without Ceremony
The type system catches errors at compile time, but doesn't make you fight it:
```lux
// Type inference keeps code clean
let x = 42 // Int, inferred
let names = ["Alice", "Bob"] // List<String>, inferred
// But function signatures are always explicit
fn greet(name: String): String = "Hello, {name}"
```
**The balance:**
- Function signatures: always annotated (documentation + API contract)
- Local bindings: inferred (reduces noise in implementation)
- Effects: declared or inferred (explicit at boundaries, lightweight inside)
- Behavioral properties: opt-in (`is pure`, `is total` — add when valuable)
### 4. Practical Over Academic
Lux borrows from the best of programming language research, but wraps it in familiar syntax:
```lux
// This is algebraic effects. But it reads like normal code.
fn main(): Unit with {Console} = {
Console.print("What's your name?")
let name = Console.readLine()
Console.print("Hello, {name}!")
}
```
Compare with Haskell's equivalent:
```haskell
main :: IO ()
main = do
putStrLn "What's your name?"
name <- getLine
putStrLn ("Hello, " ++ name ++ "!")
```
Both are explicit about effects. Lux chooses syntax that reads like imperative code while maintaining the same guarantees.
**What this means in practice:**
- ML-family semantics, C-family appearance
- No monads to learn (effects replace them)
- No category theory prerequisites
- The learning curve is: functions → types → effects (days, not months)
### 5. One Right Way
Like Go and Python, Lux favors having one obvious way to do things:
- **One formatter** (`lux fmt`) — opinionated, not configurable, ends all style debates
- **One test framework** (built-in `Test` effect) — no framework shopping
- **One way to handle effects** — declare, handle, compose
- **One package manager** (`lux pkg`) — integrated, not bolted on
This is a deliberate rejection of the JavaScript/Ruby approach where every project assembles its own stack from dozens of competing libraries.
### 6. Tools Are Part of the Language
The compiler, linter, formatter, LSP, package manager, and test runner are one thing, not seven:
```bash
lux fmt # Format
lux lint # Lint (with --explain for education)
lux check # Type check + lint
lux test # Run tests
lux compile # Build a binary
lux serve # Serve files
lux --lsp # Editor integration
```
This follows Go's philosophy: a language is its toolchain. The formatter knows the AST. The linter knows the type system. The LSP knows the effects. They're not afterthoughts.
---
## Design Decisions and Their Reasons
### Why algebraic effects instead of monads?
Monads are powerful but have poor ergonomics for composition. Combining `IO`, `State`, and `Error` in Haskell requires monad transformers — a notoriously difficult concept. Effects compose naturally:
```lux
// Just list the effects you need. No transformers.
fn app(): Unit with {Console, File, Http, Time} = ...
```
### Why not just `async/await`?
`async/await` solves one effect (concurrency). Effects solve all of them: I/O, state, randomness, failure, concurrency, logging, databases. One mechanism, universally applicable.
### Why require function type annotations?
Three reasons:
1. **Documentation**: Every function signature is self-documenting
2. **Error messages**: Inference failures produce confusing errors; annotations localize them
3. **API stability**: Changing a function body shouldn't silently change its type
### Why an opinionated formatter?
Style debates waste engineering time. `gofmt` proved that an opinionated, non-configurable formatter eliminates an entire category of bikeshedding. `lux fmt` does the same.
### Why immutable by default?
Mutable state is the root of most concurrency bugs and many logic bugs. Immutability makes code easier to reason about. When you need state, the `State` effect makes it explicit and trackable.
### Why behavioral types?
Properties like "this function is idempotent" or "this function always terminates" are critical for correctness but typically live in comments. Making them part of the type system means:
- The compiler can verify them (or generate property tests)
- Callers can require them (`where F is idempotent`)
- They serve as machine-readable documentation
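
An illustrative shape for such declarations, based on the `is pure` / `is total` and `where F is idempotent` forms mentioned above (the exact syntax here is a sketch, not a specification):

```lux
// Declaring a property on a function
fn square(x: Int): Int is pure = x * x
// Requiring a property from a caller-supplied function
fn retryTwice<F>(f: F): Unit with {Fail} where F is idempotent = ...
```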
---
## Comparison with Popular Languages
### JavaScript / TypeScript (SO #1 / #6 by usage)
| Aspect | JavaScript/TypeScript | Lux |
|--------|----------------------|-----|
| **Type system** | Optional/gradual (TS) | Required, Hindley-Milner |
| **Side effects** | Anywhere, implicit | Declared in types |
| **Testing** | Mock libraries (Jest, etc.) | Swap effect handlers |
| **Formatting** | Prettier (configurable) | `lux fmt` (opinionated) |
| **Package management** | npm (massive ecosystem) | `lux pkg` (small ecosystem) |
| **Paradigm** | Multi-paradigm | Functional-first |
| **Null safety** | Optional chaining (partial) | `Option<T>`, no null |
| **Error handling** | try/catch (unchecked) | `Result<T, E>` + `Fail` effect |
| **Shared** | Familiar syntax, first-class functions, closures, string interpolation |
**What Lux learns from JS/TS:** Familiar syntax matters. String interpolation, arrow functions, and readable code lower the barrier to entry.
**What Lux rejects:** Implicit `any`, unchecked exceptions, the "pick your own adventure" toolchain.
### Python (SO #4 by usage, #1 most desired)
| Aspect | Python | Lux |
|--------|--------|-----|
| **Type system** | Optional (type hints) | Required, static |
| **Side effects** | Implicit | Explicit |
| **Performance** | Slow (interpreted) | Faster (compiled to C) |
| **Syntax** | Whitespace-significant | Braces/keywords |
| **Immutability** | Mutable by default | Immutable by default |
| **Tooling** | Fragmented (black, ruff, mypy, pytest...) | Unified (`lux` binary) |
| **Shared** | Clean syntax philosophy, "one way to do it", readability focus |
**What Lux learns from Python:** Readability counts. The Zen of Python's emphasis on one obvious way to do things resonates with Lux's design.
**What Lux rejects:** Dynamic typing, mutable-by-default, fragmented tooling.
### Rust (SO #1 most admired)
| Aspect | Rust | Lux |
|--------|------|-----|
| **Memory** | Ownership/borrowing (manual) | Reference counting (automatic) |
| **Type system** | Traits, generics, lifetimes | ADTs, effects, generics |
| **Side effects** | Implicit (except `unsafe`) | Explicit (effect system) |
| **Error handling** | `Result<T, E>` + `?` | `Result<T, E>` + `Fail` effect |
| **Performance** | Zero-cost, systems-level | Good, not systems-level |
| **Learning curve** | Steep (ownership) | Moderate (effects) |
| **Pattern matching** | Excellent, exhaustive | Excellent, exhaustive |
| **Shared** | ADTs, pattern matching, `Option`/`Result`, no null, immutable by default, strong type system |
**What Lux learns from Rust:** ADTs with exhaustive matching, `Option`/`Result` instead of null/exceptions, excellent error messages, integrated tooling (cargo model).
**What Lux rejects:** Ownership complexity (Lux uses GC/RC instead), lifetimes, `unsafe`.
### Go (SO #13 by usage, #11 most admired)
| Aspect | Go | Lux |
|--------|-----|-----|
| **Type system** | Structural, simple | HM inference, ADTs |
| **Side effects** | Implicit | Explicit |
| **Error handling** | Multiple returns (`val, err`) | `Result<T, E>` + effects |
| **Formatting** | `gofmt` (opinionated) | `lux fmt` (opinionated) |
| **Tooling** | All-in-one (`go` binary) | All-in-one (`lux` binary) |
| **Concurrency** | Goroutines + channels | `Concurrent` + `Channel` effects |
| **Generics** | Added late, limited | First-class from day one |
| **Shared** | Opinionated formatter, unified tooling, practical philosophy |

**What Lux learns from Go:** Unified toolchain, opinionated formatting, simplicity as a feature, fast compilation.

**What Lux rejects:** Verbose error handling (`if err != nil`), no ADTs, no generics (historically), nil.
### Java / C# (SO #7 / #8 by usage)
| Aspect | Java/C# | Lux |
|--------|---------|-----|
| **Paradigm** | OOP-first | FP-first |
| **Effects** | DI frameworks (Spring, etc.) | Language-level effects |
| **Testing** | Mock frameworks (Mockito, etc.) | Handler swapping |
| **Null safety** | Nullable (Java), nullable ref types (C#) | `Option<T>` |
| **Boilerplate** | High (getters, setters, factories) | Low (records, inference) |
| **Shared** | Static typing, generics, pattern matching (recent), established ecosystems |

**What Lux learns from Java/C#:** Enterprise needs (database effects, HTTP, serialization) matter. Testability is a first-class concern.

**What Lux rejects:** OOP ceremony, DI frameworks, null, boilerplate.
### Haskell / OCaml / Elm (FP family)
| Aspect | Haskell | Elm | Lux |
|--------|---------|-----|-----|
| **Effects** | Monads + transformers | Cmd/Sub (Elm Architecture) | Algebraic effects |
| **Learning curve** | Steep | Moderate | Moderate |
| **Error messages** | Improving | Excellent | Good (aspiring to Elm-quality) |
| **Practical focus** | Academic-leaning | Web-focused | General-purpose |
| **Syntax** | Unique | Unique | Familiar (C-family feel) |
| **Shared** | Immutability, ADTs, pattern matching, type inference, no null |

**What Lux learns from Haskell:** Effects must be explicit. Types must be powerful. Purity matters.

**What Lux learns from Elm:** Error messages should teach. Tooling should be integrated. Simplicity beats power.

**What Lux rejects (from Haskell):** Monad transformers, academic syntax, steep learning curve.
### Gleam / Elixir (SO #2 / #3 most admired, 2025)
| Aspect | Gleam | Elixir | Lux |
|--------|-------|--------|-----|
| **Type system** | Static, HM | Dynamic | Static, HM |
| **Effects** | No special tracking | Implicit | First-class |
| **Concurrency** | BEAM (built-in) | BEAM (built-in) | Effect-based |
| **Error handling** | `Result` | Pattern matching on tuples | `Result` + `Fail` effect |
| **Shared** | Friendly errors, pipe operator, functional style, immutability |

**What Lux learns from Gleam:** Friendly developer experience, clear error messages, and pragmatic FP resonate with developers.
---
## Tooling Philosophy Audit
### Does the linter follow the philosophy?
**Yes, strongly.** The linter embodies "make the important things visible":
- `could-be-pure`: Nudges users toward declaring purity — making guarantees visible
- `could-be-total`: Same for termination
- `unnecessary-effect-decl`: Keeps effect signatures honest — don't claim effects you don't use
- `unused-variable/import/function`: Keeps code focused — everything visible should be meaningful
- `single-arm-match` / `manual-map-option`: Teaches idiomatic patterns

The category system (correctness > suspicious > idiom > style > pedantic) reflects the philosophy of being practical, not academic: real bugs are errors, style preferences are opt-in.
### Does the formatter follow the philosophy?
**Yes, with one gap.** The formatter is opinionated and non-configurable, matching the "one right way" principle. It enforces consistent style across all Lux code.

**Gap:** `max_width` and `trailing_commas` are declared in `FormatConfig` but never used. This is harmless but inconsistent — either remove the unused config or implement line wrapping.
### Does the type checker follow the philosophy?
**Yes.** The type checker embodies every core principle:
- Effects are tracked and verified in function types
- Behavioral properties are checked where possible
- Error messages include context and suggestions
- Type inference reduces ceremony while maintaining safety
---
## What Could Be Improved
### High-value additions (improve experience significantly, low verbosity cost)
1. **Pipe-friendly standard library**
- Currently: `List.map(myList, fn(x: Int): Int => x * 2)`
- Better: Allow `myList |> List.map(fn(x: Int): Int => x * 2)`
- Many languages (Elixir, F#, Gleam) make the pipe operator the primary composition tool. If the first argument of stdlib functions is always the data, pipes become natural. This is a **library convention**, not a language change.
- **LLM impact:** Pipe chains are easier for LLMs to generate and read — linear data flow with no nesting.
- **Human impact:** Reduces cognitive load. Reading left-to-right matches how humans think about data transformation.
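A sketch of the intended convention, assuming data-first stdlib signatures (`List.filter` and its signature are illustrative here, not confirmed stdlib API):

```lux
// Nested style: must be read inside-out
let resultNested = List.filter(List.map(myList, fn(x: Int): Int => x * 2), fn(x: Int): Bool => x > 4)

// Pipe style: read top-to-bottom, data flows left to right
let resultPiped = myList
  |> List.map(fn(x: Int): Int => x * 2)
  |> List.filter(fn(x: Int): Bool => x > 4)
```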
2. **Exhaustive `match` warnings for non-enum types**
- The linter warns about `wildcard-on-small-enum`, but could also warn when a match on `Option` or `Result` uses a wildcard instead of handling both cases explicitly.
- **Both audiences:** Prevents subtle bugs where new variants are silently caught by `_`.
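For instance, on `Option` such a lint could flag the wildcard arm and suggest the explicit form (a sketch; the `match` surface syntax here is illustrative):

```lux
// Flagged: the wildcard silently absorbs the None case
match maybeUser {
  Some(u) => greet(u)
  _ => defaultGreeting()
}

// Preferred: both variants handled explicitly
match maybeUser {
  Some(u) => greet(u)
  None => defaultGreeting()
}
```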
3. **Error message improvements toward Elm quality**
- Current errors show the right information but could be more conversational and suggest fixes more consistently.
- Example improvement: When a function is called with wrong argument count, show the expected signature and highlight which argument is wrong.
- **LLM impact:** Structured error messages with clear "expected X, got Y" patterns are easier for LLMs to parse and fix.
- **Human impact:** Friendly errors reduce frustration, especially for beginners.
4. **`let ... else` for fallible destructuring**
- Rust's `let ... else` pattern handles the "unwrap or bail" case elegantly:
```lux
let Some(value) = maybeValue else return defaultValue
```
- Currently requires a full `match` expression for this common pattern.
- **Both audiences:** Reduces boilerplate for the most common Option/Result handling pattern.
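For comparison, a sketch of the full `match` this sugar would replace (illustrative surface syntax):

```lux
let value = match maybeValue {
  Some(v) => v
  None => return defaultValue
}
```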
5. **Trait/typeclass system for overloading**
- Currently `toString`, `==`, and similar operations are built-in. A trait system would let users define their own:
```lux
trait Show<T> { fn show(value: T): String }
impl Show<User> { fn show(u: User): String = "User({u.name})" }
```
- **Note:** This exists partially. Expanding it would enable more generic programming without losing explicitness.
- **LLM impact:** Traits provide clear, greppable contracts. LLMs can generate trait impls from examples.
### Medium-value additions (good improvements, some verbosity cost)
6. **Named arguments or builder pattern for records**
- When functions take many parameters, the linter already warns at 5+. Named arguments or record-punning would help:
```lux
fn createUser({ name, email, age }: UserConfig): User = ...
createUser({ name: "Alice", email: "alice@ex.com", age: 30 })
```
- **Trade-off:** Adds syntax, but the linter already pushes users toward records for many params.
7. **Async/concurrent effect sugar**
- The `Concurrent` effect exists but could benefit from syntactic sugar:
```lux
let (a, b) = concurrent {
fetch("/api/users"),
fetch("/api/posts")
}
```
- **Trade-off:** Adds syntax, but concurrent code is important enough to warrant it.
8. **Module-level documentation with `///` doc comments**
- The `missing-doc-comment` lint exists, but the doc generation system could be enhanced with richer doc comments that include examples, parameter descriptions, and effect documentation.
- **LLM impact:** Structured documentation is the single highest-value feature for LLM code understanding.
### Lower-value or risky additions (consider carefully)
9. **Type inference for function return types**
- Would reduce ceremony: `fn double(x: Int) = x * 2` instead of `fn double(x: Int): Int = x * 2`
- **Risk:** Violates the "function signatures are documentation" principle. A body change could silently change the API. Current approach is the right trade-off.
10. **Operator overloading**
- Tempting for numeric types, but quickly leads to the C++ problem where `+` could mean anything.
- **Risk:** Violates "make the important things visible" — you can't tell what `a + b` does.
- **Better:** Keep operators for built-in numeric types. Use named functions for everything else.
11. **Macros**
- Powerful but drastically complicate tooling, error messages, and readability.
- **Risk:** Rust's macro system is powerful but produces some of the worst error messages in the language.
- **Better:** Solve specific problems with language features (effects, generics) rather than a general metaprogramming escape hatch.
---
## The LLM Perspective
Lux has several properties that make it unusually well-suited for LLM-assisted programming:
1. **Effect signatures are machine-readable contracts.** An LLM reading `fn f(): T with {Database, Logger}` knows exactly what capabilities to provide when generating handler code.
2. **Behavioral properties are verifiable assertions.** `is pure`, `is idempotent` give LLMs clear constraints to check their own output against.
3. **The opinionated formatter eliminates style ambiguity.** LLMs don't need to guess indentation, brace style, or naming conventions — `lux fmt` handles it.
4. **Exhaustive pattern matching forces completeness.** LLMs that generate `match` expressions are reminded by the compiler when they miss cases.
5. **Small, consistent standard library.** `List.map`, `String.split`, `Option.map` — uniform `Module.function` convention is easy to learn from few examples.
6. **Effect-based testing needs no framework knowledge.** An LLM doesn't need to know Jest, pytest, or JUnit — just swap handlers.
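A sketch of what handler swapping could look like. The `run ... with { ... }` surface syntax is hypothetical (modeled loosely on the `ExRun` node in the AST), and `postgresHandler` / `inMemoryHandler` are invented names:

```lux
// Production: interpret the Database effect against the real backend
run fetchUser(42) with { Database: postgresHandler }

// Test: same code, in-memory handler, no mocking framework required
run fetchUser(42) with { Database: inMemoryHandler([testUser]) }
```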
**What would help LLMs more:**
- Structured error output (JSON mode) for programmatic error fixing
- Example-rich documentation that LLMs can learn patterns from
- A canonical set of "Lux patterns" (like Go's proverbs) that encode best practices in memorable form
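One hypothetical shape for structured error output (every field name here is invented for illustration, not an existing `lux` flag or format):

```json
{
  "code": "E0042",
  "severity": "error",
  "message": "wrong number of arguments: expected 1, got 2",
  "span": { "file": "main.lux", "line": 12, "column": 5 },
  "expected": "fn greet(name: String): String",
  "suggestion": "remove the extra argument"
}
```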
---
## Summary
Lux's philosophy can be compressed to five words: **Make the important things visible.**

This manifests as:
- **Effects in types** — see what code does
- **Properties in types** — see what code guarantees
- **Versions in types** — see how data evolves
- **One tool for everything** — see how to build
- **One format for all** — see consistent style

The language is in the sweet spot between Haskell's rigor and Python's practicality, with Go's tooling philosophy and Elm's developer experience aspirations. It doesn't try to be everything — it tries to make the things that matter most in real software visible, composable, and verifiable.


@@ -14,6 +14,7 @@
pkgs = import nixpkgs { inherit system overlays; };
rustToolchain = pkgs.rust-bin.stable.latest.default.override {
extensions = [ "rust-src" "rust-analyzer" ];
targets = [ "x86_64-unknown-linux-musl" ];
};
in
{
@@ -22,8 +23,8 @@
rustToolchain
cargo-watch
cargo-edit
pkg-config
openssl
# Static builds
pkgsStatic.stdenv.cc
# Benchmark tools
hyperfine
poop
@@ -43,7 +44,7 @@
printf "\n"
printf " \033[1;35m \033[0m\n"
printf " \033[1;35m \033[0m\n"
printf " \033[1;35m \033[0m v0.1.0\n"
printf " \033[1;35m \033[0m v0.1.13\n"
printf "\n"
printf " Functional language with first-class effects\n"
printf "\n"
@@ -61,18 +62,47 @@
packages.default = pkgs.rustPlatform.buildRustPackage {
pname = "lux";
version = "0.1.0";
version = "0.1.13";
src = ./.;
cargoLock.lockFile = ./Cargo.lock;
nativeBuildInputs = [ pkgs.pkg-config ];
buildInputs = [ pkgs.openssl ];
doCheck = false;
};
# Benchmark scripts
packages.static = let
muslPkgs = import nixpkgs {
inherit system;
crossSystem = {
config = "x86_64-unknown-linux-musl";
isStatic = true;
};
};
in muslPkgs.rustPlatform.buildRustPackage {
pname = "lux";
version = "0.1.13";
src = ./.;
cargoLock.lockFile = ./Cargo.lock;
CARGO_BUILD_TARGET = "x86_64-unknown-linux-musl";
CARGO_BUILD_RUSTFLAGS = "-C target-feature=+crt-static";
doCheck = false;
postInstall = ''
$STRIP $out/bin/lux 2>/dev/null || true
'';
};
apps = {
# Release automation
release = {
type = "app";
program = toString (pkgs.writeShellScript "lux-release" ''
exec ${self}/scripts/release.sh "$@"
'');
};
# Benchmark scripts
# Run hyperfine benchmark comparison
bench = {
type = "app";


@@ -0,0 +1,225 @@
// Lux AST — Self-hosted Abstract Syntax Tree definitions
//
// Direct translation of src/ast.rs into Lux ADTs.
// These types represent the parsed structure of a Lux program.
//
// Naming conventions to avoid collisions:
// Ex = Expr variant, Pat = Pattern, Te = TypeExpr
// Td = TypeDef, Vf = VariantFields, Op = Operator
// Decl = Declaration, St = Statement
// === Source Location ===
type Span = | Span(Int, Int)
// === Identifiers ===
type Ident = | Ident(String, Span)
// === Visibility ===
type Visibility = | Public | Private
// === Schema Evolution ===
type Version = | Version(Int, Span)
type VersionConstraint =
| VcExact(Version)
| VcAtLeast(Version)
| VcLatest(Span)
// === Behavioral Types ===
type BehavioralProperty =
| BpPure
| BpTotal
| BpIdempotent
| BpDeterministic
| BpCommutative
// === Trait Bound (needed before WhereClause) ===
type TraitBound = | TraitBound(Ident, List<TypeExpr>, Span)
// === Trait Constraint (needed before WhereClause) ===
type TraitConstraint = | TraitConstraint(Ident, List<TraitBound>, Span)
// === Where Clauses ===
type WhereClause =
| WcProperty(Ident, BehavioralProperty, Span)
| WcResult(Expr, Span)
| WcTrait(TraitConstraint)
// === Module Path ===
type ModulePath = | ModulePath(List<Ident>, Span)
// === Import ===
// path, alias, items, wildcard, span
type ImportDecl = | ImportDecl(ModulePath, Option<Ident>, Option<List<Ident>>, Bool, Span)
// === Program ===
type Program = | Program(List<ImportDecl>, List<Declaration>)
// === Declarations ===
type Declaration =
| DeclFunction(FunctionDecl)
| DeclEffect(EffectDecl)
| DeclType(TypeDecl)
| DeclHandler(HandlerDecl)
| DeclLet(LetDecl)
| DeclTrait(TraitDecl)
| DeclImpl(ImplDecl)
// === Parameter ===
type Parameter = | Parameter(Ident, TypeExpr, Span)
// === Effect Operation ===
type EffectOp = | EffectOp(Ident, List<Parameter>, TypeExpr, Span)
// === Record Field ===
type RecordField = | RecordField(Ident, TypeExpr, Span)
// === Variant Fields ===
type VariantFields =
| VfUnit
| VfTuple(List<TypeExpr>)
| VfRecord(List<RecordField>)
// === Variant ===
type Variant = | Variant(Ident, VariantFields, Span)
// === Migration ===
type Migration = | Migration(Version, Expr, Span)
// === Handler Impl ===
// op_name, params, resume, body, span
type HandlerImpl = | HandlerImpl(Ident, List<Ident>, Option<Ident>, Expr, Span)
// === Impl Method ===
// name, params, return_type, body, span
type ImplMethod = | ImplMethod(Ident, List<Parameter>, Option<TypeExpr>, Expr, Span)
// === Trait Method ===
// name, type_params, params, return_type, default_impl, span
type TraitMethod = | TraitMethod(Ident, List<Ident>, List<Parameter>, TypeExpr, Option<Expr>, Span)
// === Type Expressions ===
type TypeExpr =
| TeNamed(Ident)
| TeApp(TypeExpr, List<TypeExpr>)
| TeFunction(List<TypeExpr>, TypeExpr, List<Ident>)
| TeTuple(List<TypeExpr>)
| TeRecord(List<RecordField>)
| TeUnit
| TeVersioned(TypeExpr, VersionConstraint)
// === Literal ===
type LiteralKind =
| LitInt(Int)
| LitFloat(String)
| LitString(String)
| LitChar(Char)
| LitBool(Bool)
| LitUnit
type Literal = | Literal(LiteralKind, Span)
// === Binary Operators ===
type BinaryOp =
| OpAdd | OpSub | OpMul | OpDiv | OpMod
| OpEq | OpNe | OpLt | OpLe | OpGt | OpGe
| OpAnd | OpOr
| OpPipe | OpConcat
// === Unary Operators ===
type UnaryOp = | OpNeg | OpNot
// === Statements ===
type Statement =
| StExpr(Expr)
| StLet(Ident, Option<TypeExpr>, Expr, Span)
// === Match Arms ===
type MatchArm = | MatchArm(Pattern, Option<Expr>, Expr, Span)
// === Patterns ===
type Pattern =
| PatWildcard(Span)
| PatVar(Ident)
| PatLiteral(Literal)
| PatConstructor(Ident, List<Pattern>, Span)
| PatRecord(List<(Ident, Pattern)>, Span)
| PatTuple(List<Pattern>, Span)
// === Function Declaration ===
// visibility, doc, name, type_params, params, return_type, effects, properties, where_clauses, body, span
type FunctionDecl = | FunctionDecl(Visibility, Option<String>, Ident, List<Ident>, List<Parameter>, TypeExpr, List<Ident>, List<BehavioralProperty>, List<WhereClause>, Expr, Span)
// === Effect Declaration ===
// doc, name, type_params, operations, span
type EffectDecl = | EffectDecl(Option<String>, Ident, List<Ident>, List<EffectOp>, Span)
// === Type Declaration ===
// visibility, doc, name, type_params, version, definition, migrations, span
type TypeDecl = | TypeDecl(Visibility, Option<String>, Ident, List<Ident>, Option<Version>, TypeDef, List<Migration>, Span)
// === Handler Declaration ===
// name, params, effect, implementations, span
type HandlerDecl = | HandlerDecl(Ident, List<Parameter>, Ident, List<HandlerImpl>, Span)
// === Let Declaration ===
// visibility, doc, name, typ, value, span
type LetDecl = | LetDecl(Visibility, Option<String>, Ident, Option<TypeExpr>, Expr, Span)
// === Trait Declaration ===
// visibility, doc, name, type_params, super_traits, methods, span
type TraitDecl = | TraitDecl(Visibility, Option<String>, Ident, List<Ident>, List<TraitBound>, List<TraitMethod>, Span)
// === Impl Declaration ===
// type_params, constraints, trait_name, trait_args, target_type, methods, span
type ImplDecl = | ImplDecl(List<Ident>, List<TraitConstraint>, Ident, List<TypeExpr>, TypeExpr, List<ImplMethod>, Span)
// === Expressions ===
type Expr =
| ExLiteral(Literal)
| ExVar(Ident)
| ExBinaryOp(BinaryOp, Expr, Expr, Span)
| ExUnaryOp(UnaryOp, Expr, Span)
| ExCall(Expr, List<Expr>, Span)
| ExEffectOp(Ident, Ident, List<Expr>, Span)
| ExField(Expr, Ident, Span)
| ExTupleIndex(Expr, Int, Span)
| ExLambda(List<Parameter>, Option<TypeExpr>, List<Ident>, Expr, Span)
| ExLet(Ident, Option<TypeExpr>, Expr, Expr, Span)
| ExIf(Expr, Expr, Expr, Span)
| ExMatch(Expr, List<MatchArm>, Span)
| ExBlock(List<Statement>, Expr, Span)
| ExRecord(Option<Expr>, List<(Ident, Expr)>, Span)
| ExTuple(List<Expr>, Span)
| ExList(List<Expr>, Span)
| ExRun(Expr, List<(Ident, Expr)>, Span)
| ExResume(Expr, Span)

scripts/release.sh Executable file

@@ -0,0 +1,213 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Release Script
# Builds a static binary, generates changelog, and creates a Gitea release.
#
# Usage:
# ./scripts/release.sh # auto-bump patch (0.2.0 → 0.2.1)
# ./scripts/release.sh patch # same as above
# ./scripts/release.sh minor # bump minor (0.2.0 → 0.3.0)
# ./scripts/release.sh major # bump major (0.2.0 → 1.0.0)
# ./scripts/release.sh v1.2.3 # explicit version
#
# Environment:
# GITEA_TOKEN - API token for git.qrty.ink (prompted if not set)
# GITEA_URL - Gitea instance URL (default: https://git.qrty.ink)
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
GITEA_URL="${GITEA_URL:-https://git.qrty.ink}"
REPO_OWNER="blu"
REPO_NAME="lux"
API_BASE="$GITEA_URL/api/v1"
# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
info() { printf "${CYAN}::${NC} %s\n" "$1"; }
ok() { printf "${GREEN}ok${NC} %s\n" "$1"; }
warn() { printf "${YELLOW}!!${NC} %s\n" "$1"; }
err() { printf "${RED}error:${NC} %s\n" "$1" >&2; exit 1; }
# --- Determine version ---
CURRENT=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
BUMP="${1:-patch}"
bump_version() {
local ver="$1" part="$2"
IFS='.' read -r major minor patch <<< "$ver"
case "$part" in
major) echo "$((major + 1)).0.0" ;;
minor) echo "$major.$((minor + 1)).0" ;;
patch) echo "$major.$minor.$((patch + 1))" ;;
*) echo "$part" ;; # treat as explicit version
esac
}
case "$BUMP" in
major|minor|patch)
VERSION=$(bump_version "$CURRENT" "$BUMP")
info "Bumping $BUMP: $CURRENT → $VERSION"
;;
*)
# Explicit version — strip v prefix if present
VERSION="${BUMP#v}"
info "Explicit version: $VERSION"
;;
esac
TAG="v$VERSION"
# --- Check for clean working tree ---
if [ -n "$(git status --porcelain)" ]; then
warn "Working tree has uncommitted changes:"
git status --short
printf "\n"
read -rp "Continue anyway? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || exit 1
fi
# --- Check if tag already exists ---
if git rev-parse "$TAG" >/dev/null 2>&1; then
err "Tag $TAG already exists. Choose a different version."
fi
# --- Update version in source files ---
if [ "$VERSION" != "$CURRENT" ]; then
info "Updating version in Cargo.toml and flake.nix..."
sed -i "0,/^version = \"$CURRENT\"/s//version = \"$VERSION\"/" Cargo.toml
sed -i "s/version = \"$CURRENT\";/version = \"$VERSION\";/g" flake.nix
sed -i "s/v$CURRENT/v$VERSION/g" flake.nix
git add Cargo.toml flake.nix
git commit --no-gpg-sign -m "chore: bump version to $VERSION"
ok "Version updated and committed"
fi
# --- Generate changelog ---
info "Generating changelog..."
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "")
if [ -n "$LAST_TAG" ]; then
RANGE="$LAST_TAG..HEAD"
info "Changes since $LAST_TAG:"
else
RANGE="HEAD"
info "First release — summarizing recent commits:"
fi
CHANGELOG=$(git log "$RANGE" --pretty=format:"- %s" --no-merges 2>/dev/null | head -50 || true)
if [ -z "$CHANGELOG" ]; then
CHANGELOG="- Initial release"
fi
# --- Build static binary ---
info "Building static binary (nix build .#static)..."
nix build .#static
BINARY="result/bin/lux"
if [ ! -f "$BINARY" ]; then
err "Static binary not found at $BINARY"
fi
BINARY_SIZE=$(ls -lh "$BINARY" | awk '{print $5}')
BINARY_TYPE=$(file "$BINARY" | sed 's/.*: //')
ok "Binary: $BINARY_SIZE, $BINARY_TYPE"
# --- Prepare release artifact ---
ARTIFACT="/tmp/lux-${TAG}-linux-x86_64"
cp "$BINARY" "$ARTIFACT"
chmod +x "$ARTIFACT"
# --- Show release summary ---
printf "\n"
printf "${BOLD}═══ Release Summary ═══${NC}\n"
printf "\n"
printf " ${BOLD}Tag:${NC} %s\n" "$TAG"
printf " ${BOLD}Binary:${NC} %s (%s)\n" "lux-${TAG}-linux-x86_64" "$BINARY_SIZE"
printf " ${BOLD}Commit:${NC} %s\n" "$(git rev-parse --short HEAD)"
printf "\n"
printf "${BOLD}Changelog:${NC}\n"
printf "%s\n" "$CHANGELOG"
printf "\n"
# --- Confirm ---
read -rp "Create release $TAG? [y/N] " confirm
[[ "$confirm" =~ ^[Yy]$ ]] || { info "Aborted."; exit 0; }
# --- Get Gitea token ---
if [ -z "${GITEA_TOKEN:-}" ]; then
printf "\n"
info "Gitea API token required (create at $GITEA_URL/user/settings/applications)"
read -rsp "Token: " GITEA_TOKEN
printf "\n"
fi
if [ -z "$GITEA_TOKEN" ]; then
err "No token provided"
fi
# --- Create and push tag ---
info "Creating tag $TAG..."
git tag -a "$TAG" -m "Release $TAG" --no-sign
ok "Tag created"
info "Pushing tag to origin..."
git push origin "$TAG"
ok "Tag pushed"
# --- Create Gitea release ---
info "Creating release on Gitea..."
RELEASE_BODY=$(printf "## Lux %s\n\n### Changes\n\n%s\n\n### Installation\n\n\`\`\`bash\ncurl -Lo lux %s/%s/%s/releases/download/%s/lux-linux-x86_64\nchmod +x lux\n./lux --version\n\`\`\`" \
"$TAG" "$CHANGELOG" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG")
RELEASE_JSON=$(jq -n \
--arg tag "$TAG" \
--arg name "Lux $TAG" \
--arg body "$RELEASE_BODY" \
'{tag_name: $tag, name: $name, body: $body, draft: false, prerelease: false}')
RELEASE_RESPONSE=$(curl -s -X POST \
"$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases" \
-H "Authorization: token $GITEA_TOKEN" \
-H "Content-Type: application/json" \
-d "$RELEASE_JSON")
RELEASE_ID=$(echo "$RELEASE_RESPONSE" | jq -r '.id // empty')
if [ -z "$RELEASE_ID" ]; then
echo "$RELEASE_RESPONSE" | jq . 2>/dev/null || echo "$RELEASE_RESPONSE"
err "Failed to create release"
fi
ok "Release created (id: $RELEASE_ID)"
# --- Upload binary ---
info "Uploading binary..."
UPLOAD_RESPONSE=$(curl -s -X POST \
"$API_BASE/repos/$REPO_OWNER/$REPO_NAME/releases/$RELEASE_ID/assets?name=lux-linux-x86_64" \
-H "Authorization: token $GITEA_TOKEN" \
-H "Content-Type: application/octet-stream" \
--data-binary "@$ARTIFACT")
ASSET_NAME=$(echo "$UPLOAD_RESPONSE" | jq -r '.name // empty')
if [ -z "$ASSET_NAME" ]; then
echo "$UPLOAD_RESPONSE" | jq . 2>/dev/null || echo "$UPLOAD_RESPONSE"
err "Failed to upload binary"
fi
ok "Binary uploaded: $ASSET_NAME"
# --- Done ---
printf "\n"
printf "${GREEN}${BOLD}Release $TAG published!${NC}\n"
printf "\n"
printf " ${BOLD}URL:${NC} %s/%s/%s/releases/tag/%s\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf " ${BOLD}Download:${NC} %s/%s/%s/releases/download/%s/lux-linux-x86_64\n" "$GITEA_URL" "$REPO_OWNER" "$REPO_NAME" "$TAG"
printf "\n"
# Cleanup
rm -f "$ARTIFACT"

scripts/validate.sh Executable file

@@ -0,0 +1,211 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Full Validation Script
# Runs all checks: Rust tests, package tests, type checking, example compilation.
# Run after every committable change to ensure no regressions.
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
LUX="$(pwd)/target/release/lux"
PACKAGES_DIR="$(pwd)/../packages"
PROJECTS_DIR="$(pwd)/projects"
EXAMPLES_DIR="$(pwd)/examples"
RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
FAILED=0
TOTAL=0
step() {
TOTAL=$((TOTAL + 1))
printf "${CYAN}[%d]${NC} %s... " "$TOTAL" "$1"
}
ok() { printf "${GREEN}ok${NC} %s\n" "${1:-}"; }
fail() { printf "${RED}FAIL${NC} %s\n" "${1:-}"; FAILED=$((FAILED + 1)); }
# --- Rust checks ---
step "cargo check"
if nix develop --command cargo check 2>/dev/null; then ok; else fail; fi
step "cargo test"
OUTPUT=$(nix develop --command cargo test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "test result:" || echo "no result")
if echo "$RESULT" | grep -q "0 failed"; then ok "$RESULT"; else fail "$RESULT"; fi
# --- Build release binary ---
step "cargo build --release"
if nix develop --command cargo build --release 2>/dev/null; then ok; else fail; fi
# --- Package tests ---
for pkg in path frontmatter xml rss markdown; do
PKG_DIR="$PACKAGES_DIR/$pkg"
if [ -d "$PKG_DIR" ]; then
step "lux test ($pkg)"
OUTPUT=$(cd "$PKG_DIR" && "$LUX" test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
if echo "$RESULT" | grep -q "passed"; then ok "$RESULT"; else fail "$RESULT"; fi
fi
done
# --- Lux check on packages ---
for pkg in path frontmatter xml rss markdown; do
PKG_DIR="$PACKAGES_DIR/$pkg"
if [ -d "$PKG_DIR" ]; then
step "lux check ($pkg)"
OUTPUT=$(cd "$PKG_DIR" && "$LUX" check 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
if echo "$RESULT" | grep -q "passed"; then ok; else fail "$RESULT"; fi
fi
done
# --- Project checks ---
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
if [ -f "$proj_dir/main.lux" ]; then
step "lux check (project: $proj)"
OUTPUT=$("$LUX" check "$proj_dir/main.lux" 2>&1 || true)
if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
fi
# Check any standalone .lux files in the project
for lux_file in "$proj_dir"/*.lux; do
[ -f "$lux_file" ] || continue
fname=$(basename "$lux_file")
[ "$fname" = "main.lux" ] && continue
step "lux check (project: $proj/$fname)"
OUTPUT=$("$LUX" check "$lux_file" 2>&1 || true)
if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
done
done
# === Compilation & Interpreter Checks ===
# --- Interpreter: examples ---
# Skip: http_api, http, http_router, http_server (network), postgres_demo (db),
# random, property_testing (Random effect), shell (Process), json (File I/O),
# file_io (File I/O), test_math, test_lists (Test effect), stress_shared_rc,
# test_rc_comparison (internal tests), modules/* (need cwd)
INTERP_SKIP="http_api http http_router http_server postgres_demo random property_testing shell json file_io test_math test_lists stress_shared_rc test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
name=$(basename "$f" .lux)
skip=false
for s in $INTERP_SKIP; do [ "$name" = "$s" ] && skip=true; done
$skip && continue
step "interpreter (examples/$name)"
if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: examples/standard ---
# Skip: guessing_game (reads stdin)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "guessing_game" ] && continue
step "interpreter (standard/$name)"
if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: examples/showcase ---
# Skip: task_manager (parse error in current version)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "task_manager" ] && continue
step "interpreter (showcase/$name)"
if timeout 10 "$LUX" "$f" >/dev/null 2>&1; then ok; else fail; fi
done
# --- Interpreter: projects ---
# Skip: guessing-game (Random), rest-api (HttpServer)
PROJ_INTERP_SKIP="guessing-game rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
[ -f "$proj_dir/main.lux" ] || continue
skip=false
for s in $PROJ_INTERP_SKIP; do [ "$proj" = "$s" ] && skip=true; done
$skip && continue
step "interpreter (project: $proj)"
if timeout 10 "$LUX" "$proj_dir/main.lux" >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples ---
# Skip files that fail JS compilation (unsupported features)
JS_SKIP="http_api http http_router postgres_demo property_testing json test_lists test_rc_comparison"
for f in "$EXAMPLES_DIR"/*.lux; do
name=$(basename "$f" .lux)
skip=false
for s in $JS_SKIP; do [ "$name" = "$s" ] && skip=true; done
$skip && continue
step "compile JS (examples/$name)"
if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples/standard ---
# Skip: stdlib_demo (uses String.toUpper not in JS backend)
for f in "$EXAMPLES_DIR"/standard/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "stdlib_demo" ] && continue
step "compile JS (standard/$name)"
if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: examples/showcase ---
# Skip: task_manager (unsupported features)
for f in "$EXAMPLES_DIR"/showcase/*.lux; do
[ -f "$f" ] || continue
name=$(basename "$f" .lux)
[ "$name" = "task_manager" ] && continue
step "compile JS (showcase/$name)"
if "$LUX" compile "$f" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- JS compilation: projects ---
# Skip: json-parser, rest-api (unsupported features)
JS_PROJ_SKIP="json-parser rest-api"
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
[ -f "$proj_dir/main.lux" ] || continue
skip=false
for s in $JS_PROJ_SKIP; do [ "$proj" = "$s" ] && skip=true; done
$skip && continue
step "compile JS (project: $proj)"
if "$LUX" compile "$proj_dir/main.lux" --target js -o /tmp/lux_validate.js >/dev/null 2>&1; then ok; else fail; fi
done
# --- C compilation: examples ---
# Only compile examples known to work with C backend
C_EXAMPLES="hello factorial pipelines tailcall jit_test"
for name in $C_EXAMPLES; do
f="$EXAMPLES_DIR/$name.lux"
[ -f "$f" ] || continue
step "compile C (examples/$name)"
if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done
# --- C compilation: examples/standard ---
C_STD_EXAMPLES="hello_world factorial fizzbuzz primes guessing_game"
for name in $C_STD_EXAMPLES; do
f="$EXAMPLES_DIR/standard/$name.lux"
[ -f "$f" ] || continue
step "compile C (standard/$name)"
if "$LUX" compile "$f" -o /tmp/lux_validate_bin >/dev/null 2>&1; then ok; else fail; fi
done
# --- Cleanup ---
rm -f /tmp/lux_validate.js /tmp/lux_validate_bin
# --- Summary ---
printf "\n${BOLD}═══ Validation Summary ═══${NC}\n"
if [ $FAILED -eq 0 ]; then
printf "${GREEN}All %d checks passed.${NC}\n" "$TOTAL"
else
printf "${RED}%d/%d checks failed.${NC}\n" "$FAILED" "$TOTAL"
exit 1
fi


@@ -221,6 +221,10 @@ pub enum Declaration {
Trait(TraitDecl),
/// Trait implementation: impl Trait for Type { ... }
Impl(ImplDecl),
/// Extern function declaration (FFI): extern fn name(params): ReturnType
ExternFn(ExternFnDecl),
/// Extern let declaration (FFI): extern let name: Type
ExternLet(ExternLetDecl),
}
/// Function declaration
@@ -428,6 +432,34 @@ pub struct ImplMethod {
pub span: Span,
}
/// Extern function declaration (FFI)
#[derive(Debug, Clone)]
pub struct ExternFnDecl {
pub visibility: Visibility,
/// Documentation comment
pub doc: Option<String>,
pub name: Ident,
pub type_params: Vec<Ident>,
pub params: Vec<Parameter>,
pub return_type: TypeExpr,
/// Optional JS name override: extern fn foo(...): T = "jsFoo"
pub js_name: Option<String>,
pub span: Span,
}
/// Extern let declaration (FFI)
#[derive(Debug, Clone)]
pub struct ExternLetDecl {
pub visibility: Visibility,
/// Documentation comment
pub doc: Option<String>,
pub name: Ident,
pub typ: TypeExpr,
/// Optional JS name override: extern let foo: T = "window.foo"
pub js_name: Option<String>,
pub span: Span,
}
/// Type expressions
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum TypeExpr {
@@ -499,6 +531,12 @@ pub enum Expr {
field: Ident,
span: Span,
},
/// Tuple index access: tuple.0, tuple.1
TupleIndex {
object: Box<Expr>,
index: usize,
span: Span,
},
/// Lambda: fn(x, y) => x + y or fn(x: Int): Int => x + 1
Lambda {
params: Vec<Parameter>,
@@ -535,7 +573,9 @@ pub enum Expr {
span: Span,
},
/// Record literal: { name: "Alice", age: 30 }
/// With optional spread: { ...base, name: "Bob" }
Record {
spread: Option<Box<Expr>>,
fields: Vec<(Ident, Expr)>,
span: Span,
},
@@ -563,6 +603,7 @@ impl Expr {
Expr::Call { span, .. } => *span,
Expr::EffectOp { span, .. } => *span,
Expr::Field { span, .. } => *span,
Expr::TupleIndex { span, .. } => *span,
Expr::Lambda { span, .. } => *span,
Expr::Let { span, .. } => *span,
Expr::If { span, .. } => *span,
@@ -614,7 +655,8 @@ pub enum BinaryOp {
And,
Or,
// Other
Pipe, // |>
Concat, // ++
}
impl fmt::Display for BinaryOp {
@@ -634,6 +676,7 @@ impl fmt::Display for BinaryOp {
BinaryOp::And => write!(f, "&&"),
BinaryOp::Or => write!(f, "||"),
BinaryOp::Pipe => write!(f, "|>"),
BinaryOp::Concat => write!(f, "++"),
}
}
}
@@ -686,8 +729,9 @@ pub enum Pattern {
Var(Ident),
/// Literal: 42, "hello", true
Literal(Literal),
/// Constructor: Some(x), None, Ok(v), module.Constructor(x)
Constructor {
module: Option<Ident>,
name: Ident,
fields: Vec<Pattern>,
span: Span,


@@ -333,11 +333,13 @@ mod tests {
fn test_option_exhaustive() {
let patterns = vec![
Pattern::Constructor {
module: None,
name: make_ident("None"),
fields: vec![],
span: span(),
},
Pattern::Constructor {
module: None,
name: make_ident("Some"),
fields: vec![Pattern::Wildcard(span())],
span: span(),
@@ -352,6 +354,7 @@ mod tests {
#[test]
fn test_option_missing_none() {
let patterns = vec![Pattern::Constructor {
module: None,
name: make_ident("Some"),
fields: vec![Pattern::Wildcard(span())],
span: span(),
@@ -391,11 +394,13 @@ mod tests {
fn test_result_exhaustive() {
let patterns = vec![
Pattern::Constructor {
module: None,
name: make_ident("Ok"),
fields: vec![Pattern::Wildcard(span())],
span: span(),
},
Pattern::Constructor {
module: None,
name: make_ident("Err"),
fields: vec![Pattern::Wildcard(span())],
span: span(),


@@ -3,9 +3,9 @@
//! Formats Lux source code according to standard style guidelines.
use crate::ast::{
BehavioralProperty, BinaryOp, Declaration, EffectDecl, ExternFnDecl, ExternLetDecl, Expr, FunctionDecl,
HandlerDecl, ImplDecl, ImplMethod, LetDecl, Literal, LiteralKind, Pattern, Program, Statement,
TraitDecl, TypeDecl, TypeDef, TypeExpr, UnaryOp, VariantFields, Visibility,
};
use crate::lexer::Lexer;
use crate::parser::Parser;
@@ -103,9 +103,77 @@ impl Formatter {
Declaration::Handler(h) => self.format_handler(h),
Declaration::Trait(t) => self.format_trait(t),
Declaration::Impl(i) => self.format_impl(i),
Declaration::ExternFn(e) => self.format_extern_fn(e),
Declaration::ExternLet(e) => self.format_extern_let(e),
}
}
fn format_extern_fn(&mut self, ext: &ExternFnDecl) {
let indent = self.indent();
self.write(&indent);
if ext.visibility == Visibility::Public {
self.write("pub ");
}
self.write("extern fn ");
self.write(&ext.name.name);
// Type parameters
if !ext.type_params.is_empty() {
self.write("<");
self.write(
&ext.type_params
.iter()
.map(|p| p.name.clone())
.collect::<Vec<_>>()
.join(", "),
);
self.write(">");
}
// Parameters
self.write("(");
let params: Vec<String> = ext
.params
.iter()
.map(|p| format!("{}: {}", p.name.name, self.format_type_expr(&p.typ)))
.collect();
self.write(&params.join(", "));
self.write("): ");
// Return type
self.write(&self.format_type_expr(&ext.return_type));
// Optional JS name
if let Some(js_name) = &ext.js_name {
self.write(&format!(" = \"{}\"", js_name));
}
self.newline();
}
fn format_extern_let(&mut self, ext: &ExternLetDecl) {
let indent = self.indent();
self.write(&indent);
if ext.visibility == Visibility::Public {
self.write("pub ");
}
self.write("extern let ");
self.write(&ext.name.name);
self.write(": ");
self.write(&self.format_type_expr(&ext.typ));
// Optional JS name
if let Some(js_name) = &ext.js_name {
self.write(&format!(" = \"{}\"", js_name));
}
self.newline();
}
fn format_function(&mut self, func: &FunctionDecl) {
let indent = self.indent();
self.write(&indent);
@@ -598,6 +666,9 @@ impl Formatter {
Expr::Field { object, field, .. } => {
format!("{}.{}", self.format_expr(object), field.name)
}
Expr::TupleIndex { object, index, .. } => {
format!("{}.{}", self.format_expr(object), index)
}
Expr::If { condition, then_branch, else_branch, .. } => {
format!(
"if {} then {} else {}",
@@ -685,15 +756,17 @@ impl Formatter {
.join(", ")
)
}
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
parts.push(format!("...{}", self.format_expr(spread_expr)));
}
for (name, val) in fields {
parts.push(format!("{}: {}", name.name, self.format_expr(val)));
}
format!("{{ {} }}", parts.join(", "))
}
Expr::EffectOp { effect, operation, args, .. } => {
format!(
@@ -728,7 +801,30 @@ impl Formatter {
match &lit.kind {
LiteralKind::Int(n) => n.to_string(),
LiteralKind::Float(f) => format!("{}", f),
LiteralKind::String(s) => {
if s.contains('\n') {
// Use triple-quoted multiline string
let tab = " ".repeat(self.config.indent_size);
let base_indent = tab.repeat(self.indent_level);
let content_indent = tab.repeat(self.indent_level + 1);
let lines: Vec<&str> = s.split('\n').collect();
let mut result = String::from("\"\"\"\n");
for line in &lines {
if line.is_empty() {
result.push('\n');
} else {
result.push_str(&content_indent);
result.push_str(&line.replace('{', "\\{").replace('}', "\\}"));
result.push('\n');
}
}
result.push_str(&base_indent);
result.push_str("\"\"\"");
result
} else {
format!("\"{}\"", s.replace('\\', "\\\\").replace('"', "\\\"").replace('{', "\\{").replace('}', "\\}"))
}
},
LiteralKind::Char(c) => format!("'{}'", c),
LiteralKind::Bool(b) => b.to_string(),
LiteralKind::Unit => "()".to_string(),
@@ -750,6 +846,7 @@ impl Formatter {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => "++",
BinaryOp::Pipe => "|>",
}
}
@@ -766,12 +863,22 @@ impl Formatter {
Pattern::Wildcard(_) => "_".to_string(),
Pattern::Var(ident) => ident.name.clone(),
Pattern::Literal(lit) => self.format_literal(lit),
Pattern::Constructor {
module,
name,
fields,
..
} => {
let prefix = match module {
Some(m) => format!("{}.", m.name),
None => String::new(),
};
if fields.is_empty() {
format!("{}{}", prefix, name.name)
} else {
format!(
"{}{}({})",
prefix,
name.name,
fields
.iter()


@@ -28,6 +28,8 @@ pub enum BuiltinFn {
ListGet,
ListRange,
ListForEach,
ListSort,
ListSortBy,
// String operations
StringSplit,
@@ -74,14 +76,21 @@ pub enum BuiltinFn {
MathFloor,
MathCeil,
MathRound,
MathSin,
MathCos,
MathAtan2,
// Additional List operations
ListIsEmpty,
ListFind,
ListFindIndex,
ListAny,
ListAll,
ListTake,
ListDrop,
ListZip,
ListFlatten,
ListContains,
// Additional String operations
StringStartsWith,
@@ -95,6 +104,12 @@ pub enum BuiltinFn {
StringLastIndexOf,
StringRepeat,
// Int/Float operations
IntToString,
IntToFloat,
FloatToString,
FloatToInt,
// JSON operations
JsonParse,
JsonStringify,
@@ -115,6 +130,26 @@ pub enum BuiltinFn {
JsonString,
JsonArray,
JsonObject,
// Map operations
MapNew,
MapSet,
MapGet,
MapContains,
MapRemove,
MapKeys,
MapValues,
MapSize,
MapIsEmpty,
MapFromList,
MapToList,
MapMerge,
// Ref operations
RefNew,
RefGet,
RefSet,
RefUpdate,
}
/// Runtime value
@@ -129,6 +164,7 @@ pub enum Value {
List(Vec<Value>),
Tuple(Vec<Value>),
Record(HashMap<String, Value>),
Map(HashMap<String, Value>),
Function(Rc<Closure>),
Handler(Rc<HandlerValue>),
/// Built-in function
@@ -146,6 +182,13 @@ pub enum Value {
},
/// JSON value (for JSON parsing/manipulation)
Json(serde_json::Value),
/// Extern function (FFI — only callable from JS backend)
ExternFn {
name: String,
arity: usize,
},
/// Mutable reference cell
Ref(Rc<RefCell<Value>>),
}
impl Value {
@@ -160,12 +203,15 @@ impl Value {
Value::List(_) => "List",
Value::Tuple(_) => "Tuple",
Value::Record(_) => "Record",
Value::Map(_) => "Map",
Value::Function(_) => "Function",
Value::Handler(_) => "Handler",
Value::Builtin(_) => "Function",
Value::Constructor { .. } => "Constructor",
Value::Versioned { .. } => "Versioned",
Value::Json(_) => "Json",
Value::ExternFn { .. } => "ExternFn",
Value::Ref(_) => "Ref",
}
}
@@ -208,6 +254,11 @@ impl Value {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Map(xs), Value::Map(ys)) => {
xs.len() == ys.len() && xs.iter().all(|(k, v)| {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Constructor { name: n1, fields: f1 }, Value::Constructor { name: n2, fields: f2 }) => {
n1 == n2 && f1.len() == f2.len() && f1.iter().zip(f2.iter()).all(|(x, y)| Value::values_equal(x, y))
}
@@ -216,6 +267,7 @@ impl Value {
t1 == t2 && v1 == v2 && Value::values_equal(val1, val2)
}
(Value::Json(j1), Value::Json(j2)) => j1 == j2,
(Value::Ref(r1), Value::Ref(r2)) => Rc::ptr_eq(r1, r2),
// Functions and handlers cannot be compared for equality
_ => false,
}
@@ -278,6 +330,16 @@ impl TryFromValue for Vec<Value> {
}
}
impl TryFromValue for HashMap<String, Value> {
const TYPE_NAME: &'static str = "Map";
fn try_from_value(value: &Value) -> Option<Self> {
match value {
Value::Map(m) => Some(m.clone()),
_ => None,
}
}
}
impl TryFromValue for Value {
const TYPE_NAME: &'static str = "any";
fn try_from_value(value: &Value) -> Option<Self> {
@@ -324,6 +386,18 @@ impl fmt::Display for Value {
}
write!(f, " }}")
}
Value::Map(entries) => {
write!(f, "Map {{")?;
let mut sorted: Vec<_> = entries.iter().collect();
sorted.sort_by_key(|(k, _)| (*k).clone());
for (i, (key, value)) in sorted.iter().enumerate() {
if i > 0 {
write!(f, ", ")?;
}
write!(f, "\"{}\": {}", key, value)?;
}
write!(f, "}}")
}
Value::Function(_) => write!(f, "<function>"),
Value::Builtin(b) => write!(f, "<builtin:{:?}>", b),
Value::Handler(_) => write!(f, "<handler>"),
@@ -349,6 +423,8 @@ impl fmt::Display for Value {
write!(f, "{} @v{}", value, version)
}
Value::Json(json) => write!(f, "{}", json),
Value::ExternFn { name, .. } => write!(f, "<extern fn {}>", name),
Value::Ref(cell) => write!(f, "<ref: {}>", cell.borrow()),
}
}
}
@@ -920,14 +996,23 @@ impl Interpreter {
Value::Builtin(BuiltinFn::ListIsEmpty),
),
("find".to_string(), Value::Builtin(BuiltinFn::ListFind)),
("findIndex".to_string(), Value::Builtin(BuiltinFn::ListFindIndex)),
("any".to_string(), Value::Builtin(BuiltinFn::ListAny)),
("all".to_string(), Value::Builtin(BuiltinFn::ListAll)),
("take".to_string(), Value::Builtin(BuiltinFn::ListTake)),
("drop".to_string(), Value::Builtin(BuiltinFn::ListDrop)),
("zip".to_string(), Value::Builtin(BuiltinFn::ListZip)),
("flatten".to_string(), Value::Builtin(BuiltinFn::ListFlatten)),
("contains".to_string(), Value::Builtin(BuiltinFn::ListContains)),
(
"forEach".to_string(),
Value::Builtin(BuiltinFn::ListForEach),
),
("sort".to_string(), Value::Builtin(BuiltinFn::ListSort)),
(
"sortBy".to_string(),
Value::Builtin(BuiltinFn::ListSortBy),
),
]));
env.define("List", list_module);
@@ -1068,9 +1153,26 @@ impl Interpreter {
("floor".to_string(), Value::Builtin(BuiltinFn::MathFloor)),
("ceil".to_string(), Value::Builtin(BuiltinFn::MathCeil)),
("round".to_string(), Value::Builtin(BuiltinFn::MathRound)),
("sin".to_string(), Value::Builtin(BuiltinFn::MathSin)),
("cos".to_string(), Value::Builtin(BuiltinFn::MathCos)),
("atan2".to_string(), Value::Builtin(BuiltinFn::MathAtan2)),
]));
env.define("Math", math_module);
// Int module
let int_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::IntToString)),
("toFloat".to_string(), Value::Builtin(BuiltinFn::IntToFloat)),
]));
env.define("Int", int_module);
// Float module
let float_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::FloatToString)),
("toInt".to_string(), Value::Builtin(BuiltinFn::FloatToInt)),
]));
env.define("Float", float_module);
// JSON module
let json_module = Value::Record(HashMap::from([
("parse".to_string(), Value::Builtin(BuiltinFn::JsonParse)),
@@ -1094,16 +1196,81 @@ impl Interpreter {
("object".to_string(), Value::Builtin(BuiltinFn::JsonObject)),
]));
env.define("Json", json_module);
// Map module
let map_module = Value::Record(HashMap::from([
("new".to_string(), Value::Builtin(BuiltinFn::MapNew)),
("set".to_string(), Value::Builtin(BuiltinFn::MapSet)),
("get".to_string(), Value::Builtin(BuiltinFn::MapGet)),
("contains".to_string(), Value::Builtin(BuiltinFn::MapContains)),
("remove".to_string(), Value::Builtin(BuiltinFn::MapRemove)),
("keys".to_string(), Value::Builtin(BuiltinFn::MapKeys)),
("values".to_string(), Value::Builtin(BuiltinFn::MapValues)),
("size".to_string(), Value::Builtin(BuiltinFn::MapSize)),
("isEmpty".to_string(), Value::Builtin(BuiltinFn::MapIsEmpty)),
("fromList".to_string(), Value::Builtin(BuiltinFn::MapFromList)),
("toList".to_string(), Value::Builtin(BuiltinFn::MapToList)),
("merge".to_string(), Value::Builtin(BuiltinFn::MapMerge)),
]));
env.define("Map", map_module);
// Ref module
let ref_module = Value::Record(HashMap::from([
("new".to_string(), Value::Builtin(BuiltinFn::RefNew)),
("get".to_string(), Value::Builtin(BuiltinFn::RefGet)),
("set".to_string(), Value::Builtin(BuiltinFn::RefSet)),
("update".to_string(), Value::Builtin(BuiltinFn::RefUpdate)),
]));
env.define("Ref", ref_module);
}
/// Execute a program
pub fn run(&mut self, program: &Program) -> Result<Value, RuntimeError> {
let mut last_value = Value::Unit;
let mut has_main_let = false;
for decl in &program.declarations {
// Track if there's a top-level `let main = ...`
if let Declaration::Let(let_decl) = decl {
if let_decl.name.name == "main" {
has_main_let = true;
}
}
last_value = self.eval_declaration(decl)?;
}
// Auto-invoke main if it was defined as a let binding with a function value
if has_main_let {
if let Some(main_val) = self.global_env.get("main") {
if let Value::Function(ref closure) = main_val {
if closure.params.is_empty() {
let span = Span { start: 0, end: 0 };
let mut result = self.eval_call(main_val.clone(), vec![], span)?;
// Trampoline loop
loop {
match result {
EvalResult::Value(v) => {
last_value = v;
break;
}
EvalResult::Effect(req) => {
last_value = self.handle_effect(req)?;
break;
}
EvalResult::TailCall { func, args, span } => {
result = self.eval_call(func, args, span)?;
}
EvalResult::Resume(v) => {
last_value = v;
break;
}
}
}
}
}
}
}
Ok(last_value)
}
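The auto-invoke path above avoids growing the native stack: `eval_call` can return `EvalResult::TailCall`, which the loop re-dispatches instead of recursing. A self-contained sketch of the same trampoline pattern (names here are illustrative, not the interpreter's):

```rust
// A step either yields a final value or asks to be re-entered with new
// arguments, so arbitrarily deep tail recursion runs in a flat loop.
enum Step {
    Done(i64),
    TailCall { n: i64, acc: i64 },
}

// One evaluation step of factorial in accumulator-passing style.
fn fact_step(n: i64, acc: i64) -> Step {
    if n <= 1 {
        Step::Done(acc)
    } else {
        Step::TailCall { n: n - 1, acc: acc * n }
    }
}

fn run_trampoline(mut n: i64) -> i64 {
    let mut acc = 1;
    loop {
        match fact_step(n, acc) {
            Step::Done(v) => return v,
            Step::TailCall { n: n2, acc: a2 } => {
                // Re-dispatch instead of recursing: stack depth stays constant.
                n = n2;
                acc = a2;
            }
        }
    }
}

fn main() {
    println!("{}", run_trampoline(20)); // 2432902008176640000
}
```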
@@ -1265,6 +1432,33 @@ impl Interpreter {
Ok(Value::Unit)
}
Declaration::ExternFn(ext) => {
// Extern functions exist only in the JS backend; register a marker
// value that raises a clear error if called from the interpreter.
let name = ext.name.name.clone();
let arity = ext.params.len();
self.global_env
.define(&name, Value::ExternFn { name: name.clone(), arity });
Ok(Value::Unit)
}
Declaration::ExternLet(ext) => {
// Register a placeholder that errors at runtime (extern lets only work in JS)
let name = ext.name.name.clone();
self.global_env
.define(&name, Value::ExternFn { name: name.clone(), arity: 0 });
Ok(Value::Unit)
}
Declaration::Effect(_) | Declaration::Trait(_) | Declaration::Impl(_) => {
// These are compile-time only
Ok(Value::Unit)
@@ -1415,6 +1609,34 @@ impl Interpreter {
}
}
Expr::TupleIndex {
object,
index,
span,
} => {
let obj_val = self.eval_expr(object, env)?;
match obj_val {
Value::Tuple(elements) => {
if *index < elements.len() {
Ok(EvalResult::Value(elements[*index].clone()))
} else {
Err(RuntimeError {
message: format!(
"Tuple index {} out of bounds for tuple with {} elements",
index,
elements.len()
),
span: Some(*span),
})
}
}
_ => Err(RuntimeError {
message: format!("Cannot use tuple index on {}", obj_val.type_name()),
span: Some(*span),
}),
}
}
Expr::Lambda { params, body, .. } => {
let closure = Closure {
params: params.iter().map(|p| p.name.name.clone()).collect(),
@@ -1481,8 +1703,28 @@ impl Interpreter {
self.eval_expr_tail(result, &block_env, tail)
}
Expr::Record { fields, .. } => {
Expr::Record {
spread, fields, ..
} => {
let mut record = HashMap::new();
// If there's a spread, evaluate it and start with its fields
if let Some(spread_expr) = spread {
let spread_val = self.eval_expr(spread_expr, env)?;
if let Value::Record(spread_fields) = spread_val {
record = spread_fields;
} else {
return Err(RuntimeError {
message: format!(
"Spread expression must evaluate to a record, got {}",
spread_val.type_name()
),
span: Some(expr.span()),
});
}
}
// Override with explicit fields
for (name, expr) in fields {
let val = self.eval_expr(expr, env)?;
record.insert(name.name.clone(), val);
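Spread-then-override order matters here: the spread record's fields populate the map first, then explicit fields replace them on key collision. The same behavior over a plain `HashMap` (a sketch, not the interpreter's types):

```rust
use std::collections::HashMap;

// { ...base, name: "Bob" }: copy the spread's fields, then let
// explicit fields win on any key collision.
fn spread_record<'a>(
    base: &HashMap<&'a str, &'a str>,
    explicit: &[(&'a str, &'a str)],
) -> HashMap<&'a str, &'a str> {
    let mut record = base.clone();
    for &(k, v) in explicit {
        record.insert(k, v); // explicit field overrides the spread value
    }
    record
}

fn main() {
    let base = HashMap::from([("name", "Alice"), ("age", "30")]);
    let record = spread_record(&base, &[("name", "Bob")]);
    assert_eq!(record["name"], "Bob"); // overridden by explicit field
    assert_eq!(record["age"], "30");   // inherited from the spread
    println!("ok");
}
```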
@@ -1555,6 +1797,18 @@ impl Interpreter {
span: Some(span),
}),
},
BinaryOp::Concat => match (left, right) {
(Value::String(a), Value::String(b)) => Ok(Value::String(a + &b)),
(Value::List(a), Value::List(b)) => {
let mut result = a;
result.extend(b);
Ok(Value::List(result))
}
(l, r) => Err(RuntimeError {
message: format!("Cannot concatenate {} and {}", l.type_name(), r.type_name()),
span: Some(span),
}),
},
BinaryOp::Sub => match (left, right) {
(Value::Int(a), Value::Int(b)) => Ok(Value::Int(a - b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Float(a - b)),
@@ -1724,6 +1978,13 @@ impl Interpreter {
}))
}
Value::Builtin(builtin) => self.eval_builtin(builtin, args, span),
Value::ExternFn { name, .. } => Err(RuntimeError {
message: format!(
"Extern function '{}' can only be called when compiled to JavaScript (use `lux compile --target js`)",
name
),
span: Some(span),
}),
v => Err(RuntimeError {
message: format!("Cannot call {}", v.type_name()),
span: Some(span),
@@ -2223,6 +2484,46 @@ impl Interpreter {
Ok(EvalResult::Value(Value::String(result)))
}
BuiltinFn::IntToString => {
if args.len() != 1 {
return Err(err("Int.toString requires 1 argument"));
}
match &args[0] {
Value::Int(n) => Ok(EvalResult::Value(Value::String(format!("{}", n)))),
v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
}
}
BuiltinFn::FloatToString => {
if args.len() != 1 {
return Err(err("Float.toString requires 1 argument"));
}
match &args[0] {
Value::Float(f) => Ok(EvalResult::Value(Value::String(format!("{}", f)))),
v => Ok(EvalResult::Value(Value::String(format!("{}", v)))),
}
}
BuiltinFn::IntToFloat => {
if args.len() != 1 {
return Err(err("Int.toFloat requires 1 argument"));
}
match &args[0] {
Value::Int(n) => Ok(EvalResult::Value(Value::Float(*n as f64))),
v => Err(err(&format!("Int.toFloat expects Int, got {}", v.type_name()))),
}
}
BuiltinFn::FloatToInt => {
if args.len() != 1 {
return Err(err("Float.toInt requires 1 argument"));
}
match &args[0] {
Value::Float(f) => Ok(EvalResult::Value(Value::Int(*f as i64))),
v => Err(err(&format!("Float.toInt expects Float, got {}", v.type_name()))),
}
}
BuiltinFn::TypeOf => {
if args.len() != 1 {
return Err(err("typeOf requires 1 argument"));
@@ -2399,6 +2700,45 @@ impl Interpreter {
}
}
BuiltinFn::MathSin => {
if args.len() != 1 {
return Err(err("Math.sin requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.sin()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).sin()))),
v => Err(err(&format!("Math.sin expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathCos => {
if args.len() != 1 {
return Err(err("Math.cos requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.cos()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).cos()))),
v => Err(err(&format!("Math.cos expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathAtan2 => {
if args.len() != 2 {
return Err(err("Math.atan2 requires 2 arguments: y, x"));
}
let y = match &args[0] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
let x = match &args[1] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
Ok(EvalResult::Value(Value::Float(y.atan2(x))))
}
// Additional List operations
BuiltinFn::ListIsEmpty => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.isEmpty", span)?;
@@ -2452,6 +2792,55 @@ impl Interpreter {
Ok(EvalResult::Value(Value::Bool(true)))
}
BuiltinFn::ListFindIndex => {
let (list, func) = Self::expect_args_2::<Vec<Value>, Value>(&args, "List.findIndex", span)?;
for (i, item) in list.iter().enumerate() {
let v = self.eval_call_to_value(func.clone(), vec![item.clone()], span)?;
match v {
Value::Bool(true) => {
return Ok(EvalResult::Value(Value::Constructor {
name: "Some".to_string(),
fields: vec![Value::Int(i as i64)],
}));
}
Value::Bool(false) => {}
_ => return Err(err("List.findIndex predicate must return Bool")),
}
}
Ok(EvalResult::Value(Value::Constructor {
name: "None".to_string(),
fields: vec![],
}))
}
BuiltinFn::ListZip => {
let (list1, list2) = Self::expect_args_2::<Vec<Value>, Vec<Value>>(&args, "List.zip", span)?;
let result: Vec<Value> = list1
.into_iter()
.zip(list2.into_iter())
.map(|(a, b)| Value::Tuple(vec![a, b]))
.collect();
Ok(EvalResult::Value(Value::List(result)))
}
BuiltinFn::ListFlatten => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.flatten", span)?;
let mut result = Vec::new();
for item in list {
match item {
Value::List(inner) => result.extend(inner),
other => result.push(other),
}
}
Ok(EvalResult::Value(Value::List(result)))
}
BuiltinFn::ListContains => {
let (list, target) = Self::expect_args_2::<Vec<Value>, Value>(&args, "List.contains", span)?;
let found = list.iter().any(|item| Value::values_equal(item, &target));
Ok(EvalResult::Value(Value::Bool(found)))
}
BuiltinFn::ListTake => {
let (list, n) = Self::expect_args_2::<Vec<Value>, i64>(&args, "List.take", span)?;
let n = n.max(0) as usize;
@@ -2478,6 +2867,67 @@ impl Interpreter {
Ok(EvalResult::Value(Value::Unit))
}
BuiltinFn::ListSort => {
// List.sort(list) - sort using natural ordering (Int, Float, String, Bool)
let mut list =
Self::expect_arg_1::<Vec<Value>>(&args, "List.sort", span)?;
list.sort_by(|a, b| Self::compare_values(a, b));
Ok(EvalResult::Value(Value::List(list)))
}
BuiltinFn::ListSortBy => {
// List.sortBy(list, fn(a, b) => Int) - sort with custom comparator
// Comparator returns negative (a < b), 0 (a == b), or positive (a > b)
let (list, func) =
Self::expect_args_2::<Vec<Value>, Value>(&args, "List.sortBy", span)?;
let mut indexed: Vec<(usize, Value)> =
list.into_iter().enumerate().collect();
let mut err: Option<RuntimeError> = None;
let func_ref = &func;
let self_ptr = self as *mut Self;
indexed.sort_by(|a, b| {
if err.is_some() {
return std::cmp::Ordering::Equal;
}
// Safety: we're in a single-threaded context and the closure
// needs mutable access to call eval_call_to_value
let interp = unsafe { &mut *self_ptr };
match interp.eval_call_to_value(
func_ref.clone(),
vec![a.1.clone(), b.1.clone()],
span,
) {
Ok(Value::Int(n)) => {
if n < 0 {
std::cmp::Ordering::Less
} else if n > 0 {
std::cmp::Ordering::Greater
} else {
std::cmp::Ordering::Equal
}
}
Ok(_) => {
err = Some(RuntimeError {
message: "List.sortBy comparator must return Int"
.to_string(),
span: Some(span),
});
std::cmp::Ordering::Equal
}
Err(e) => {
err = Some(e);
std::cmp::Ordering::Equal
}
}
});
if let Some(e) = err {
return Err(e);
}
let result: Vec<Value> =
indexed.into_iter().map(|(_, v)| v).collect();
Ok(EvalResult::Value(Value::List(result)))
}
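The comparator convention above is the C `qsort` one: any negative `Int` orders `a` before `b`, zero ties, any positive orders `a` after `b`. Its mapping onto `std::cmp::Ordering` can be sketched in isolation:

```rust
use std::cmp::Ordering;

// List.sortBy comparator contract: any Int result maps to an Ordering.
fn ordering_from_int(n: i64) -> Ordering {
    match n {
        n if n < 0 => Ordering::Less,
        0 => Ordering::Equal,
        _ => Ordering::Greater,
    }
}

fn main() {
    let mut xs = vec![3, 1, 2];
    // Descending sort: the comparator returns b - a.
    xs.sort_by(|a, b| ordering_from_int(b - a));
    assert_eq!(xs, vec![3, 2, 1]);
    println!("ok");
}
```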
// Additional String operations
BuiltinFn::StringStartsWith => {
let (s, prefix) = Self::expect_args_2::<String, String>(&args, "String.startsWith", span)?;
@@ -2888,6 +3338,178 @@ impl Interpreter {
}
Ok(EvalResult::Value(Value::Json(serde_json::Value::Object(map))))
}
// Map operations
BuiltinFn::MapNew => {
Ok(EvalResult::Value(Value::Map(HashMap::new())))
}
BuiltinFn::MapSet => {
if args.len() != 3 {
return Err(err("Map.set requires 3 arguments: map, key, value"));
}
let mut map = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.set expects Map as first argument, got {}", v.type_name()))),
};
let key = match &args[1] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.set expects String key, got {}", v.type_name()))),
};
map.insert(key, args[2].clone());
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapGet => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.get", span)?;
match map.get(&key) {
Some(v) => Ok(EvalResult::Value(Value::Constructor {
name: "Some".to_string(),
fields: vec![v.clone()],
})),
None => Ok(EvalResult::Value(Value::Constructor {
name: "None".to_string(),
fields: vec![],
})),
}
}
BuiltinFn::MapContains => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.contains", span)?;
Ok(EvalResult::Value(Value::Bool(map.contains_key(&key))))
}
BuiltinFn::MapRemove => {
let (mut map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.remove", span)?;
map.remove(&key);
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapKeys => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.keys", span)?;
let mut keys: Vec<String> = map.keys().cloned().collect();
keys.sort();
Ok(EvalResult::Value(Value::List(
keys.into_iter().map(Value::String).collect(),
)))
}
BuiltinFn::MapValues => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.values", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries.into_iter().map(|(_, v)| v).collect(),
)))
}
BuiltinFn::MapSize => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.size", span)?;
Ok(EvalResult::Value(Value::Int(map.len() as i64)))
}
BuiltinFn::MapIsEmpty => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.isEmpty", span)?;
Ok(EvalResult::Value(Value::Bool(map.is_empty())))
}
BuiltinFn::MapFromList => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "Map.fromList", span)?;
let mut map = HashMap::new();
for item in list {
match item {
Value::Tuple(fields) if fields.len() == 2 => {
let key = match &fields[0] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.fromList expects (String, V) tuples, got {} key", v.type_name()))),
};
map.insert(key, fields[1].clone());
}
_ => return Err(err("Map.fromList expects List<(String, V)>")),
}
}
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapToList => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.toList", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries
.into_iter()
.map(|(k, v)| Value::Tuple(vec![Value::String(k), v]))
.collect(),
)))
}
BuiltinFn::MapMerge => {
if args.len() != 2 {
return Err(err("Map.merge requires 2 arguments: map1, map2"));
}
let mut map1 = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as first argument, got {}", v.type_name()))),
};
let map2 = match &args[1] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as second argument, got {}", v.type_name()))),
};
for (k, v) in map2 {
map1.insert(k, v);
}
Ok(EvalResult::Value(Value::Map(map1)))
}
BuiltinFn::RefNew => {
if args.len() != 1 {
return Err(err("Ref.new requires 1 argument"));
}
Ok(EvalResult::Value(Value::Ref(Rc::new(RefCell::new(args.into_iter().next().unwrap())))))
}
BuiltinFn::RefGet => {
if args.len() != 1 {
return Err(err("Ref.get requires 1 argument"));
}
match &args[0] {
Value::Ref(cell) => Ok(EvalResult::Value(cell.borrow().clone())),
v => Err(err(&format!("Ref.get expects Ref, got {}", v.type_name()))),
}
}
BuiltinFn::RefSet => {
if args.len() != 2 {
return Err(err("Ref.set requires 2 arguments: ref, value"));
}
match &args[0] {
Value::Ref(cell) => {
*cell.borrow_mut() = args[1].clone();
Ok(EvalResult::Value(Value::Unit))
}
v => Err(err(&format!("Ref.set expects Ref as first argument, got {}", v.type_name()))),
}
}
BuiltinFn::RefUpdate => {
if args.len() != 2 {
return Err(err("Ref.update requires 2 arguments: ref, fn"));
}
match &args[0] {
Value::Ref(cell) => {
let old = cell.borrow().clone();
let result = self.eval_call(args[1].clone(), vec![old], span)?;
match result {
EvalResult::Value(new_val) => {
*cell.borrow_mut() = new_val;
}
_ => return Err(err("Ref.update callback must return a value")),
}
Ok(EvalResult::Value(Value::Unit))
}
v => Err(err(&format!("Ref.update expects Ref as first argument, got {}", v.type_name()))),
}
}
}
}
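`Ref` values wrap `Rc<RefCell<Value>>`, so binding a ref to a new name copies the handle, not the cell: a `Ref.set` through one handle is observable through every alias, and equality (per `values_equal` above) is `Rc::ptr_eq` pointer identity. A trimmed-down sketch with `i64` standing in for `Value`:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Ref.new: allocate a shared, mutable cell.
fn ref_new(v: i64) -> Rc<RefCell<i64>> {
    Rc::new(RefCell::new(v))
}

// Ref.set: mutation is visible through every alias of the cell.
fn ref_set(cell: &Rc<RefCell<i64>>, v: i64) {
    *cell.borrow_mut() = v;
}

fn main() {
    let r1 = ref_new(1);
    let r2 = Rc::clone(&r1); // copies the handle, not the cell
    ref_set(&r2, 42);
    assert_eq!(*r1.borrow(), 42); // mutation visible through the alias
    assert!(Rc::ptr_eq(&r1, &r2)); // Ref equality is pointer identity
    println!("ok");
}
```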
@@ -2971,6 +3593,18 @@ impl Interpreter {
})
}
/// Compare two values for natural ordering (used by List.sort)
fn compare_values(a: &Value, b: &Value) -> std::cmp::Ordering {
match (a, b) {
(Value::Int(x), Value::Int(y)) => x.cmp(y),
(Value::Float(x), Value::Float(y)) => x.partial_cmp(y).unwrap_or(std::cmp::Ordering::Equal),
(Value::String(x), Value::String(y)) => x.cmp(y),
(Value::Bool(x), Value::Bool(y)) => x.cmp(y),
(Value::Char(x), Value::Char(y)) => x.cmp(y),
_ => std::cmp::Ordering::Equal,
}
}
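Note that `compare_values` only defines an order within a single type; any mixed-type pair ties with `Ordering::Equal` rather than erroring. A trimmed-down version of the same dispatch, with a two-variant stand-in for `Value`:

```rust
use std::cmp::Ordering;

// Two-variant stand-in for the interpreter's Value enum.
#[derive(Debug, PartialEq)]
enum Val {
    Int(i64),
    Str(&'static str),
}

// Natural ordering within a type; incomparable pairs tie,
// mirroring compare_values above.
fn compare(a: &Val, b: &Val) -> Ordering {
    match (a, b) {
        (Val::Int(x), Val::Int(y)) => x.cmp(y),
        (Val::Str(x), Val::Str(y)) => x.cmp(y),
        _ => Ordering::Equal,
    }
}

fn main() {
    let mut xs = vec![Val::Int(3), Val::Int(1), Val::Int(2)];
    xs.sort_by(compare);
    assert_eq!(xs, vec![Val::Int(1), Val::Int(2), Val::Int(3)]);
    println!("ok");
}
```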
fn match_pattern(&self, pattern: &Pattern, value: &Value) -> Option<Vec<(String, Value)>> {
match pattern {
Pattern::Wildcard(_) => Some(Vec::new()),
@@ -3053,6 +3687,11 @@ impl Interpreter {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(Value::Map(a), Value::Map(b)) => {
a.len() == b.len() && a.iter().all(|(k, v)| {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(
Value::Constructor {
name: n1,
@@ -3473,6 +4112,119 @@ impl Interpreter {
}
}
("File", "copy") => {
let source = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.copy requires a string source path".to_string(),
span: None,
}),
};
let dest = match request.args.get(1) {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.copy requires a string destination path".to_string(),
span: None,
}),
};
match std::fs::copy(&source, &dest) {
Ok(_) => Ok(Value::Unit),
Err(e) => Err(RuntimeError {
message: format!("Failed to copy '{}' to '{}': {}", source, dest, e),
span: None,
}),
}
}
("File", "glob") => {
let pattern = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.glob requires a string pattern".to_string(),
span: None,
}),
};
match glob::glob(&pattern) {
Ok(paths) => {
let entries: Vec<Value> = paths
.filter_map(|entry| entry.ok())
.map(|path| Value::String(path.to_string_lossy().to_string()))
.collect();
Ok(Value::List(entries))
}
Err(e) => Err(RuntimeError {
message: format!("Invalid glob pattern '{}': {}", pattern, e),
span: None,
}),
}
}
// ===== File Effect (safe Result-returning variants) =====
("File", "tryRead") => {
let path = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.tryRead requires a string path".to_string(),
span: None,
}),
};
match std::fs::read_to_string(&path) {
Ok(content) => Ok(Value::Constructor {
name: "Ok".to_string(),
fields: vec![Value::String(content)],
}),
Err(e) => Ok(Value::Constructor {
name: "Err".to_string(),
fields: vec![Value::String(format!("Failed to read file '{}': {}", path, e))],
}),
}
}
("File", "tryWrite") => {
let path = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.tryWrite requires a string path".to_string(),
span: None,
}),
};
let content = match request.args.get(1) {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.tryWrite requires string content".to_string(),
span: None,
}),
};
match std::fs::write(&path, &content) {
Ok(()) => Ok(Value::Constructor {
name: "Ok".to_string(),
fields: vec![Value::Unit],
}),
Err(e) => Ok(Value::Constructor {
name: "Err".to_string(),
fields: vec![Value::String(format!("Failed to write file '{}': {}", path, e))],
}),
}
}
("File", "tryDelete") => {
let path = match request.args.first() {
Some(Value::String(s)) => s.clone(),
_ => return Err(RuntimeError {
message: "File.tryDelete requires a string path".to_string(),
span: None,
}),
};
match std::fs::remove_file(&path) {
Ok(()) => Ok(Value::Constructor {
name: "Ok".to_string(),
fields: vec![Value::Unit],
}),
Err(e) => Ok(Value::Constructor {
name: "Err".to_string(),
fields: vec![Value::String(format!("Failed to delete file '{}': {}", path, e))],
}),
}
}
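The `try*` variants above convert I/O failures into an `Err` constructor value rather than a runtime error, so Lux code can pattern-match on the outcome. A standalone sketch of that wrapping pattern (the `Value` enum here is a hypothetical stand-in):

```rust
// Hypothetical stand-in for the interpreter's Value type.
#[derive(Debug)]
enum Value {
    String(String),
    Constructor { name: String, fields: Vec<Value> },
}

// I/O failure becomes data: an Err constructor carrying the message.
fn try_read(path: &str) -> Value {
    match std::fs::read_to_string(path) {
        Ok(content) => Value::Constructor {
            name: "Ok".to_string(),
            fields: vec![Value::String(content)],
        },
        Err(e) => Value::Constructor {
            name: "Err".to_string(),
            fields: vec![Value::String(format!("Failed to read file '{}': {}", path, e))],
        },
    }
}

fn main() {
    // A path that should not exist: expect the Err constructor, not a panic.
    if let Value::Constructor { name, .. } = try_read("/nonexistent/definitely-missing.lux") {
        assert_eq!(name, "Err");
    }
}
```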
// ===== Process Effect =====
("Process", "exec") => {
use std::process::Command;
@@ -3828,6 +4580,26 @@ impl Interpreter {
}
Ok(Value::Unit)
}
("Test", "assertEqualMsg") => {
let expected = request.args.first().cloned().unwrap_or(Value::Unit);
let actual = request.args.get(1).cloned().unwrap_or(Value::Unit);
let label = match request.args.get(2) {
Some(Value::String(s)) => s.clone(),
_ => "Values not equal".to_string(),
};
if Value::values_equal(&expected, &actual) {
self.test_results.borrow_mut().passed += 1;
} else {
self.test_results.borrow_mut().failed += 1;
self.test_results.borrow_mut().failures.push(TestFailure {
message: label,
expected: Some(format!("{}", expected)),
actual: Some(format!("{}", actual)),
});
}
Ok(Value::Unit)
}
("Test", "assertNotEqual") => {
let a = request.args.first().cloned().unwrap_or(Value::Unit);
let b = request.args.get(1).cloned().unwrap_or(Value::Unit);
@@ -4960,6 +5732,7 @@ mod tests {
// Create a simple migration that adds a field
// Migration: old.name -> { name: old.name, email: "unknown" }
let migration_body = Expr::Record {
spread: None,
fields: vec![
(
Ident::new("name", Span::default()),


@@ -42,6 +42,7 @@ pub enum TokenKind {
Effect,
Handler,
Run,
Handle,
Resume,
Type,
True,
@@ -54,6 +55,7 @@ pub enum TokenKind {
Trait, // trait (for type classes)
Impl, // impl (for trait implementations)
For, // for (in impl Trait for Type)
Extern, // extern (for FFI declarations)
// Documentation
DocComment(String), // /// doc comment
@@ -70,6 +72,7 @@ pub enum TokenKind {
// Operators
Plus, // +
PlusPlus, // ++
Minus, // -
Star, // *
Slash, // /
@@ -89,6 +92,7 @@ pub enum TokenKind {
Arrow, // =>
ThinArrow, // ->
Dot, // .
DotDotDot, // ...
Colon, // :
ColonColon, // ::
Comma, // ,
@@ -138,6 +142,7 @@ impl fmt::Display for TokenKind {
TokenKind::Effect => write!(f, "effect"),
TokenKind::Handler => write!(f, "handler"),
TokenKind::Run => write!(f, "run"),
TokenKind::Handle => write!(f, "handle"),
TokenKind::Resume => write!(f, "resume"),
TokenKind::Type => write!(f, "type"),
TokenKind::Import => write!(f, "import"),
@@ -148,6 +153,7 @@ impl fmt::Display for TokenKind {
TokenKind::Trait => write!(f, "trait"),
TokenKind::Impl => write!(f, "impl"),
TokenKind::For => write!(f, "for"),
TokenKind::Extern => write!(f, "extern"),
TokenKind::DocComment(s) => write!(f, "/// {}", s),
TokenKind::Is => write!(f, "is"),
TokenKind::Pure => write!(f, "pure"),
@@ -160,6 +166,7 @@ impl fmt::Display for TokenKind {
TokenKind::True => write!(f, "true"),
TokenKind::False => write!(f, "false"),
TokenKind::Plus => write!(f, "+"),
TokenKind::PlusPlus => write!(f, "++"),
TokenKind::Minus => write!(f, "-"),
TokenKind::Star => write!(f, "*"),
TokenKind::Slash => write!(f, "/"),
@@ -179,6 +186,7 @@ impl fmt::Display for TokenKind {
TokenKind::Arrow => write!(f, "=>"),
TokenKind::ThinArrow => write!(f, "->"),
TokenKind::Dot => write!(f, "."),
TokenKind::DotDotDot => write!(f, "..."),
TokenKind::Colon => write!(f, ":"),
TokenKind::ColonColon => write!(f, "::"),
TokenKind::Comma => write!(f, ","),
@@ -268,7 +276,14 @@ impl<'a> Lexer<'a> {
let kind = match c {
// Single-character tokens
'+' => {
if self.peek() == Some('+') {
self.advance();
TokenKind::PlusPlus
} else {
TokenKind::Plus
}
}
'*' => TokenKind::Star,
'%' => TokenKind::Percent,
'(' => TokenKind::LParen,
@@ -364,7 +379,22 @@ impl<'a> Lexer<'a> {
TokenKind::Pipe
}
}
'.' => {
if self.peek() == Some('.') {
// Check for ... (need to peek past second dot)
// We look at source directly since we can only peek one ahead
let next_next = self.source[self.pos..].chars().nth(1);
if next_next == Some('.') {
self.advance(); // consume second '.'
self.advance(); // consume third '.'
TokenKind::DotDotDot
} else {
TokenKind::Dot
}
} else {
TokenKind::Dot
}
}
':' => {
if self.peek() == Some(':') {
self.advance();
@@ -383,7 +413,26 @@ impl<'a> Lexer<'a> {
}
// String literals
'"' => {
// Check for triple-quote multiline string """
if self.peek() == Some('"') {
// Clone to peek at the second char
let mut lookahead = self.chars.clone();
lookahead.next(); // consume first peeked "
if lookahead.peek() == Some(&'"') {
// It's a triple-quote: consume both remaining quotes
self.advance(); // second "
self.advance(); // third "
self.scan_multiline_string(start)?
} else {
// It's an empty string ""
self.advance(); // consume closing "
TokenKind::String(String::new())
}
} else {
self.scan_string(start)?
}
}
// Char literals
'\'' => self.scan_char(start)?,
@@ -493,6 +542,8 @@ impl<'a> Lexer<'a> {
Some('"') => '"',
Some('0') => '\0',
Some('\'') => '\'',
Some('{') => '{',
Some('}') => '}',
Some('x') => {
// Hex escape \xNN
let h1 = self.advance().and_then(|c| c.to_digit(16));
@@ -639,6 +690,211 @@ impl<'a> Lexer<'a> {
Ok(TokenKind::InterpolatedString(parts))
}
fn scan_multiline_string(&mut self, _start: usize) -> Result<TokenKind, LexError> {
let mut parts: Vec<StringPart> = Vec::new();
let mut current_literal = String::new();
// Skip the first newline after opening """ if present
if self.peek() == Some('\n') {
self.advance();
} else if self.peek() == Some('\r') {
self.advance();
if self.peek() == Some('\n') {
self.advance();
}
}
loop {
match self.advance() {
Some('"') => {
// Check for closing """
if self.peek() == Some('"') {
let mut lookahead = self.chars.clone();
lookahead.next(); // consume first peeked "
if lookahead.peek() == Some(&'"') {
// Closing """ found
self.advance(); // second "
self.advance(); // third "
break;
}
}
// Not closing triple-quote, just a regular " in the string
current_literal.push('"');
}
Some('\\') => {
// Handle escape sequences (same as regular strings)
match self.peek() {
Some('{') => {
self.advance();
current_literal.push('{');
}
Some('}') => {
self.advance();
current_literal.push('}');
}
_ => {
let escape_start = self.pos;
let escaped = match self.advance() {
Some('n') => '\n',
Some('r') => '\r',
Some('t') => '\t',
Some('\\') => '\\',
Some('"') => '"',
Some('0') => '\0',
Some('\'') => '\'',
Some(c) => {
return Err(LexError {
message: format!("Invalid escape sequence: \\{}", c),
span: Span::new(escape_start - 1, self.pos),
});
}
None => {
return Err(LexError {
message: "Unterminated multiline string".into(),
span: Span::new(_start, self.pos),
});
}
};
current_literal.push(escaped);
}
}
}
Some('{') => {
// Interpolation (same as regular strings)
if !current_literal.is_empty() {
parts.push(StringPart::Literal(std::mem::take(&mut current_literal)));
}
let mut expr_text = String::new();
let mut brace_depth = 1;
loop {
match self.advance() {
Some('{') => {
brace_depth += 1;
expr_text.push('{');
}
Some('}') => {
brace_depth -= 1;
if brace_depth == 0 {
break;
}
expr_text.push('}');
}
Some(c) => expr_text.push(c),
None => {
return Err(LexError {
message: "Unterminated interpolation in multiline string"
.into(),
span: Span::new(_start, self.pos),
});
}
}
}
parts.push(StringPart::Expr(expr_text));
}
Some(c) => current_literal.push(c),
None => {
return Err(LexError {
message: "Unterminated multiline string".into(),
span: Span::new(_start, self.pos),
});
}
}
}
// Strip common leading whitespace from all lines
let strip_indent = |s: &str| -> String {
if s.is_empty() {
return String::new();
}
let lines: Vec<&str> = s.split('\n').collect();
// Find minimum indentation of non-empty lines
let min_indent = lines
.iter()
.filter(|line| !line.trim().is_empty())
.map(|line| line.len() - line.trim_start().len())
.min()
.unwrap_or(0);
// Strip that indentation from each line
lines
.iter()
.map(|line| {
if line.len() >= min_indent {
&line[min_indent..]
} else {
line.trim_start()
}
})
.collect::<Vec<_>>()
.join("\n")
};
// Strip trailing whitespace-only line before closing """
let trim_trailing = |s: &mut String| {
// Remove trailing spaces/tabs (indent before closing """)
while s.ends_with(' ') || s.ends_with('\t') {
s.pop();
}
// Remove the trailing newline
if s.ends_with('\n') {
s.pop();
if s.ends_with('\r') {
s.pop();
}
}
};
if parts.is_empty() {
trim_trailing(&mut current_literal);
let result = strip_indent(&current_literal);
return Ok(TokenKind::String(result));
}
// Add remaining literal
if !current_literal.is_empty() {
trim_trailing(&mut current_literal);
parts.push(StringPart::Literal(current_literal));
}
// For interpolated multiline strings, strip indent from literal parts
// First, collect all literal content to find min indent
let mut all_text = String::new();
for part in &parts {
if let StringPart::Literal(lit) = part {
all_text.push_str(lit);
}
}
let lines: Vec<&str> = all_text.split('\n').collect();
let min_indent = lines
.iter()
.filter(|line| !line.trim().is_empty())
.map(|line| line.len() - line.trim_start().len())
.min()
.unwrap_or(0);
if min_indent > 0 {
for part in &mut parts {
if let StringPart::Literal(lit) = part {
let stripped_lines: Vec<&str> = lit
.split('\n')
.map(|line| {
if line.len() >= min_indent {
&line[min_indent..]
} else {
line.trim_start()
}
})
.collect();
*lit = stripped_lines.join("\n");
}
}
}
Ok(TokenKind::InterpolatedString(parts))
}
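The dedent rule used by `scan_multiline_string` can be sketched on its own: find the smallest indentation among non-blank lines, then strip that many leading characters from every line. A minimal standalone version (ASCII indentation assumed, since the widths are byte-based):

```rust
// Strip the common leading whitespace shared by all non-blank lines.
fn strip_common_indent(s: &str) -> String {
    let lines: Vec<&str> = s.split('\n').collect();
    // Minimum indentation over lines that contain non-whitespace.
    let min_indent = lines
        .iter()
        .filter(|line| !line.trim().is_empty())
        .map(|line| line.len() - line.trim_start().len())
        .min()
        .unwrap_or(0);
    lines
        .iter()
        .map(|line| {
            if line.len() >= min_indent {
                &line[min_indent..]
            } else {
                line.trim_start() // blank line shorter than the indent
            }
        })
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let raw = "    hello\n      world\n    done";
    assert_eq!(strip_common_indent(raw), "hello\n  world\ndone");
}
```

Blank lines are excluded from the minimum so an empty line inside the string cannot force the indent to zero.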
fn scan_char(&mut self, start: usize) -> Result<TokenKind, LexError> {
let c = match self.advance() {
Some('\\') => match self.advance() {
@@ -743,6 +999,7 @@ impl<'a> Lexer<'a> {
"effect" => TokenKind::Effect,
"handler" => TokenKind::Handler,
"run" => TokenKind::Run,
"handle" => TokenKind::Handle,
"resume" => TokenKind::Resume,
"type" => TokenKind::Type,
"import" => TokenKind::Import,
@@ -753,6 +1010,7 @@ impl<'a> Lexer<'a> {
"trait" => TokenKind::Trait,
"impl" => TokenKind::Impl,
"for" => TokenKind::For,
"extern" => TokenKind::Extern,
"is" => TokenKind::Is,
"pure" => TokenKind::Pure,
"total" => TokenKind::Total,
@@ -761,6 +1019,8 @@ impl<'a> Lexer<'a> {
"commutative" => TokenKind::Commutative,
"where" => TokenKind::Where,
"assume" => TokenKind::Assume,
"and" => TokenKind::And,
"or" => TokenKind::Or,
"true" => TokenKind::Bool(true),
"false" => TokenKind::Bool(false),
_ => TokenKind::Ident(ident.to_string()),


@@ -403,6 +403,12 @@ impl Linter {
Declaration::Function(f) => {
self.defined_functions.insert(f.name.name.clone());
}
Declaration::ExternFn(e) => {
self.defined_functions.insert(e.name.name.clone());
}
Declaration::ExternLet(e) => {
self.define_var(&e.name.name);
}
Declaration::Let(l) => {
self.define_var(&l.name.name);
}
@@ -510,10 +516,13 @@ impl Linter {
self.collect_refs_expr(&arm.body);
}
}
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
self.collect_refs_expr(object);
}
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.collect_refs_expr(spread_expr);
}
for (_, val) in fields {
self.collect_refs_expr(val);
}


@@ -317,66 +317,227 @@ impl LspServer {
let doc = self.documents.get(&uri)?;
let source = &doc.text;
// Try to get info from symbol table first (position-based lookup)
if let Some(ref table) = doc.symbol_table {
let offset = self.position_to_offset(source, position);
if let Some(symbol) = table.definition_at_position(offset) {
return Some(self.format_symbol_hover(symbol));
}
}
// Get the word under cursor
let word = self.get_word_at_position(source, position)?;
// When hovering on a keyword like 'fn', 'type', 'effect', 'let', 'trait',
// look ahead to find the declaration name and show that symbol's info
if let Some(ref table) = doc.symbol_table {
if matches!(word.as_str(), "fn" | "type" | "effect" | "let" | "trait" | "handler" | "impl") {
let offset = self.position_to_offset(source, position);
if let Some(name) = self.find_next_ident(source, offset + word.len()) {
for sym in table.global_symbols() {
if sym.name == name {
return Some(self.format_symbol_hover(sym));
}
}
}
}
// Try name-based lookup in symbol table (for usage sites)
for sym in table.global_symbols() {
if sym.name == word {
return Some(self.format_symbol_hover(sym));
}
}
}
// Check for module names (Console, List, String, etc.)
if let Some(hover) = self.get_module_hover(&word) {
return Some(hover);
}
// Rich documentation for behavioral property keywords
if let Some((signature, doc_text)) = self.get_rich_symbol_info(&word) {
return Some(Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
}),
range: None,
});
}
// Builtin keyword/function info
if let Some((signature, doc_text)) = self.get_symbol_info(&word) {
return Some(Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!("```lux\n{}\n```\n\n{}", signature, doc_text),
}),
range: None,
});
}
None
}
/// Format a symbol into a hover response
fn format_symbol_hover(&self, symbol: &crate::symbol_table::Symbol) -> Hover {
let signature = symbol.type_signature.as_ref()
.map(|s| s.as_str())
.unwrap_or(&symbol.name);
let kind_str = match symbol.kind {
SymbolKind::Function => "function",
SymbolKind::Variable => "variable",
SymbolKind::Parameter => "parameter",
SymbolKind::Type => "type",
SymbolKind::TypeParameter => "type parameter",
SymbolKind::Variant => "variant",
SymbolKind::Effect => "effect",
SymbolKind::EffectOperation => "effect operation",
SymbolKind::Field => "field",
SymbolKind::Module => "module",
};
let doc_str = symbol.documentation.as_ref()
.map(|d| format!("\n\n{}", d))
.unwrap_or_default();
let formatted_sig = format_signature_for_hover(signature);
let property_docs = extract_property_docs(signature);
Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!(
"```lux\n{}\n```\n*{}*{}{}",
formatted_sig, kind_str, property_docs, doc_str
),
}),
range: None,
}
}
/// Get hover info for built-in module names
fn get_module_hover(&self, name: &str) -> Option<Hover> {
let (sig, doc) = match name {
"Console" => (
"effect Console",
"**Console I/O**\n\n\
- `Console.print(msg: String): Unit` — print to stdout\n\
- `Console.readLine(): String` — read a line from stdin\n\
- `Console.readInt(): Int` — read an integer from stdin",
),
"File" => (
"effect File",
"**File System**\n\n\
- `File.read(path: String): String` — read file contents\n\
- `File.write(path: String, content: String): Unit` — write to file\n\
- `File.append(path: String, content: String): Unit` — append to file\n\
- `File.exists(path: String): Bool` — check if file exists\n\
- `File.delete(path: String): Unit` — delete a file\n\
- `File.list(path: String): List<String>` — list directory",
),
"Http" => (
"effect Http",
"**HTTP Client**\n\n\
- `Http.get(url: String): String` — GET request\n\
- `Http.post(url: String, body: String): String` — POST request\n\
- `Http.put(url: String, body: String): String` — PUT request\n\
- `Http.delete(url: String): String` — DELETE request",
),
"Sql" => (
"effect Sql",
"**SQL Database**\n\n\
- `Sql.open(path: String): Connection` — open database\n\
- `Sql.execute(conn: Connection, sql: String): Unit` — execute SQL\n\
- `Sql.query(conn: Connection, sql: String): List<Row>` — query rows\n\
- `Sql.close(conn: Connection): Unit` — close connection",
),
"Random" => (
"effect Random",
"**Random Number Generation**\n\n\
- `Random.int(min: Int, max: Int): Int` — random integer\n\
- `Random.float(): Float` — random float in 0.0..1.0\n\
- `Random.bool(): Bool` — random boolean",
),
"Time" => (
"effect Time",
"**Time**\n\n\
- `Time.now(): Int` — current Unix timestamp (ms)\n\
- `Time.sleep(ms: Int): Unit` — sleep for milliseconds",
),
"Process" => (
"effect Process",
"**Process / System**\n\n\
- `Process.exec(cmd: String): String` — run shell command\n\
- `Process.env(name: String): String` — get env variable\n\
- `Process.args(): List<String>` — command-line arguments\n\
- `Process.exit(code: Int): Unit` — exit with code",
),
"Math" => (
"module Math",
"**Math Functions**\n\n\
- `Math.abs(n: Int): Int` — absolute value\n\
- `Math.min(a: Int, b: Int): Int` — minimum\n\
- `Math.max(a: Int, b: Int): Int` — maximum\n\
- `Math.sqrt(n: Float): Float` — square root\n\
- `Math.pow(base: Float, exp: Float): Float` — power\n\
- `Math.floor(n: Float): Int` — round down\n\
- `Math.ceil(n: Float): Int` — round up",
),
"List" => (
"module List",
"**List Operations**\n\n\
- `List.map(list, f)` — transform each element\n\
- `List.filter(list, p)` — keep matching elements\n\
- `List.fold(list, init, f)` — reduce to single value\n\
- `List.head(list)` — first element (Option)\n\
- `List.tail(list)` — all except first (Option)\n\
- `List.length(list)` — number of elements\n\
- `List.concat(a, b)` — concatenate lists\n\
- `List.range(start, end)` — integer range\n\
- `List.reverse(list)` — reverse order\n\
- `List.get(list, i)` — element at index (Option)",
),
"String" => (
"module String",
"**String Operations**\n\n\
- `String.length(s)` — string length\n\
- `String.split(s, delim)` — split by delimiter\n\
- `String.join(list, delim)` — join with delimiter\n\
- `String.trim(s)` — trim whitespace\n\
- `String.contains(s, sub)` — check substring\n\
- `String.replace(s, from, to)` — replace occurrences\n\
- `String.startsWith(s, prefix)` — check prefix\n\
- `String.endsWith(s, suffix)` — check suffix\n\
- `String.substring(s, start, end)` — extract range\n\
- `String.chars(s)` — list of characters",
),
"Option" => (
"type Option<A> = Some(A) | None",
"**Optional Value**\n\n\
- `Option.isSome(opt)` — has a value?\n\
- `Option.isNone(opt)` — is empty?\n\
- `Option.getOrElse(opt, default)` — unwrap or default\n\
- `Option.map(opt, f)` — transform if present\n\
- `Option.flatMap(opt, f)` — chain operations",
),
"Result" => (
"type Result<A, E> = Ok(A) | Err(E)",
"**Result of Fallible Operation**\n\n\
- `Result.isOk(r)` — succeeded?\n\
- `Result.isErr(r)` — failed?\n\
- `Result.map(r, f)` — transform success value\n\
- `Result.mapErr(r, f)` — transform error value",
),
_ => return None,
};
Some(Hover {
contents: HoverContents::Markup(MarkupContent {
kind: MarkupKind::Markdown,
value: format!("```lux\n{}\n```\n{}", sig, doc),
}),
range: None,
})
}
fn get_word_at_position(&self, source: &str, position: Position) -> Option<String> {
@@ -402,6 +563,26 @@ impl LspServer {
}
}
/// Find the next identifier in source after the given offset (skipping whitespace)
fn find_next_ident(&self, source: &str, start: usize) -> Option<String> {
let chars: Vec<char> = source.chars().collect();
let mut pos = start;
// Skip whitespace
while pos < chars.len() && (chars[pos] == ' ' || chars[pos] == '\t' || chars[pos] == '\n' || chars[pos] == '\r') {
pos += 1;
}
// Collect identifier
let ident_start = pos;
while pos < chars.len() && (chars[pos].is_alphanumeric() || chars[pos] == '_') {
pos += 1;
}
if pos > ident_start {
Some(chars[ident_start..pos].iter().collect())
} else {
None
}
}
fn get_symbol_info(&self, word: &str) -> Option<(&'static str, &'static str)> {
match word {
// Keywords
@@ -607,17 +788,11 @@ impl LspServer {
fn position_to_offset(&self, source: &str, position: Position) -> usize {
let mut offset = 0;
for (line_idx, line) in source.lines().enumerate() {
if line_idx == position.line as usize {
return offset + (position.character as usize).min(line.len());
}
offset += line.len() + 1; // +1 for newline
}
source.len()
}
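The rewritten conversion above walks lines and accumulates each line's length plus one for the separating newline. A self-contained sketch, assuming LF line endings (with CRLF sources, `lines()` strips the `\r`, so offsets would drift by one per line):

```rust
// Minimal stand-in for the LSP Position type.
struct Position {
    line: u32,
    character: u32,
}

// Convert a (line, character) position to a byte offset into source.
// Assumes LF line endings; the column is clamped to the line length.
fn position_to_offset(source: &str, position: &Position) -> usize {
    let mut offset = 0;
    for (line_idx, line) in source.lines().enumerate() {
        if line_idx == position.line as usize {
            return offset + (position.character as usize).min(line.len());
        }
        offset += line.len() + 1; // +1 for the newline separator
    }
    source.len() // position past the last line
}

fn main() {
    let src = "let x = 1\nlet y = 2\n";
    // line 1, column 4 lands on the 'y' at byte offset 14
    assert_eq!(position_to_offset(src, &Position { line: 1, character: 4 }), 14);
}
```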
@@ -1396,12 +1571,15 @@ fn collect_call_site_hints(
collect_call_site_hints(source, e, param_names, hints);
}
}
Expr::Record { fields, .. } => {
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
collect_call_site_hints(source, spread_expr, param_names, hints);
}
for (_, e) in fields {
collect_call_site_hints(source, e, param_names, hints);
}
}
Expr::Field { object, .. } => {
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
collect_call_site_hints(source, object, param_names, hints);
}
Expr::Run { expr, handlers, .. } => {


@@ -1,4 +1,7 @@
//! Lux — Make the important things visible.
//!
//! A functional programming language with first-class effects, schema evolution,
//! and behavioral types. See `lux philosophy` or docs/PHILOSOPHY.md.
mod analysis;
mod ast;
@@ -34,7 +37,7 @@ use std::borrow::Cow;
use std::collections::HashSet;
use typechecker::TypeChecker;
const VERSION: &str = env!("CARGO_PKG_VERSION");
const HELP: &str = r#"
Lux - A functional language with first-class effects
@@ -171,9 +174,14 @@ fn main() {
.and_then(|s| s.parse::<u16>().ok())
.unwrap_or(8080);
let port_value_idx = args.iter()
.position(|a| a == "--port" || a == "-p")
.map(|i| i + 1);
let dir = args.iter().enumerate()
.skip(2)
.filter(|(i, a)| !a.starts_with('-') && Some(*i) != port_value_idx)
.map(|(_, a)| a.as_str())
.next()
.unwrap_or(".");
serve_static_files(dir, port);
@@ -185,10 +193,12 @@ fn main() {
eprintln!(" lux compile <file.lux> --run");
eprintln!(" lux compile <file.lux> --emit-c [-o file.c]");
eprintln!(" lux compile <file.lux> --target js [-o file.js]");
eprintln!(" lux compile <file.lux> --watch");
std::process::exit(1);
}
let run_after = args.iter().any(|a| a == "--run");
let emit_c = args.iter().any(|a| a == "--emit-c");
let watch = args.iter().any(|a| a == "--watch");
let target_js = args.iter()
.position(|a| a == "--target")
.and_then(|i| args.get(i + 1))
@@ -204,17 +214,34 @@ fn main() {
} else {
compile_to_c(&args[2], output_path, run_after, emit_c);
}
if watch {
// Build the args to replay for each recompilation (without --watch)
let compile_args: Vec<String> = args.iter()
.skip(1)
.filter(|a| a.as_str() != "--watch")
.cloned()
.collect();
watch_and_rerun(&args[2], &compile_args);
}
}
"repl" => {
// Start REPL
run_repl();
}
"doc" => {
// Generate API documentation
generate_docs(&args[2..]);
}
"philosophy" => {
print_philosophy();
}
cmd => {
// Check if it looks like a command typo
if !std::path::Path::new(cmd).exists() && !cmd.starts_with('-') && !cmd.contains('.') && !cmd.contains('/') {
let known_commands = vec![
"fmt", "lint", "test", "watch", "init", "check", "debug",
"pkg", "registry", "serve", "compile", "doc", "repl", "philosophy",
];
let suggestions = diagnostics::find_similar_names(cmd, known_commands.into_iter(), 2);
if !suggestions.is_empty() {
@@ -229,18 +256,24 @@ fn main() {
}
}
} else {
// No arguments — show help
print_help();
}
}
fn print_help() {
println!("{}", bc(colors::GREEN, &format!("Lux {}", VERSION)));
println!("{}", c(colors::DIM, "Make the important things visible."));
println!();
println!(" {} Effects in types — see what code does", c(colors::DIM, "·"));
println!(" {} Composition over configuration — no DI frameworks", c(colors::DIM, "·"));
println!(" {} Safety without ceremony — inference where it helps", c(colors::DIM, "·"));
println!(" {} One right way — opinionated formatter, integrated tools", c(colors::DIM, "·"));
println!();
println!("{}", bc("", "Usage:"));
println!();
println!(" {} Show this help", bc(colors::CYAN, "lux"));
println!(" {} Start the REPL", bc(colors::CYAN, "lux repl"));
println!(" {} {} Run a file (interpreter)", bc(colors::CYAN, "lux"), c(colors::YELLOW, "<file.lux>"));
println!(" {} {} {} Compile to native binary", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<file.lux>"));
println!(" {} {} {} {} Compile with output name", bc(colors::CYAN, "lux"), bc(colors::CYAN, "compile"), c(colors::YELLOW, "<f>"), c(colors::YELLOW, "-o app"));
@@ -275,6 +308,8 @@ fn print_help() {
c(colors::DIM, "(alias: s)"));
println!(" {} {} {} Generate API documentation",
bc(colors::CYAN, "lux"), bc(colors::CYAN, "doc"), c(colors::YELLOW, "[file] [-o dir]"));
println!(" {} {} Show language philosophy",
bc(colors::CYAN, "lux"), bc(colors::CYAN, "philosophy"));
println!(" {} {} Start LSP server",
bc(colors::CYAN, "lux"), c(colors::YELLOW, "--lsp"));
println!(" {} {} Show this help",
@@ -283,6 +318,36 @@ fn print_help() {
bc(colors::CYAN, "lux"), c(colors::YELLOW, "--version"));
}
fn print_philosophy() {
println!("{}", bc(colors::GREEN, "The Lux Philosophy"));
println!();
println!(" {}", bc("", "Make the important things visible."));
println!();
println!(" Most languages hide what matters most in production: what code");
println!(" can do, how data changes over time, and what guarantees functions");
println!(" provide. Lux makes all three first-class, compiler-checked features.");
println!();
println!(" {} {}", bc(colors::CYAN, "1. Explicit over implicit"), c(colors::DIM, "— effects in types, not hidden behind interfaces"));
println!(" fn processOrder(order: Order): Receipt {} {}", c(colors::YELLOW, "with {Database, Email}"), c(colors::DIM, "// signature IS documentation"));
println!();
println!(" {} {}", bc(colors::CYAN, "2. Composition over configuration"), c(colors::DIM, "— no DI frameworks, no monad transformers"));
println!(" run app() {} {}", c(colors::YELLOW, "with { Database = mock, Http = mock }"), c(colors::DIM, "// swap handlers, not libraries"));
println!();
println!(" {} {}", bc(colors::CYAN, "3. Safety without ceremony"), c(colors::DIM, "— type inference where it helps, annotations where they document"));
println!(" let x = 42 {}", c(colors::DIM, "// inferred"));
println!(" fn f(x: Int): Int = x * 2 {}", c(colors::DIM, "// annotated: API contract"));
println!();
println!(" {} {}", bc(colors::CYAN, "4. Practical over academic"), c(colors::DIM, "— ML semantics in C-family syntax, no monads to learn"));
println!(" {} {} {}", c(colors::DIM, "fn main(): Unit"), c(colors::YELLOW, "with {Console}"), c(colors::DIM, "= Console.print(\"Hello!\")"));
println!();
println!(" {} {}", bc(colors::CYAN, "5. One right way"), c(colors::DIM, "— opinionated formatter, integrated tooling, built-in testing"));
println!(" lux fmt | lux lint | lux check | lux test | lux compile");
println!();
println!(" {} {}", bc(colors::CYAN, "6. Tools are the language"), c(colors::DIM, "— formatter knows the AST, linter knows the types, LSP knows the effects"));
println!();
println!(" See {} for the full philosophy with language comparisons.", c(colors::CYAN, "docs/PHILOSOPHY.md"));
}
fn format_files(args: &[String]) {
use formatter::{format, FormatConfig};
use std::path::Path;
@@ -721,6 +786,36 @@ fn collect_lux_files_nonrecursive(dir: &str, pattern: Option<&str>, files: &mut
}
}
/// Find a C compiler. Priority: $CC env var, build-time embedded path, PATH search.
fn find_c_compiler() -> String {
// 1. Explicit env var
if let Ok(cc) = std::env::var("CC") {
if !cc.is_empty() {
return cc;
}
}
// 2. Path captured at build time (e.g. absolute nix store path)
let built_in = env!("LUX_CC_PATH");
if !built_in.is_empty() && std::path::Path::new(built_in).exists() {
return built_in.to_string();
}
// 3. Search PATH
for name in &["cc", "gcc", "clang"] {
if let Ok(output) = std::process::Command::new("which").arg(name).output() {
if output.status.success() {
if let Ok(p) = String::from_utf8(output.stdout) {
let p = p.trim();
if !p.is_empty() {
return p.to_string();
}
}
}
}
}
// 4. Last resort
"cc".to_string()
}
fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c: bool) {
use codegen::c_backend::CBackend;
use modules::ModuleLoader;
@@ -764,7 +859,7 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
// Generate C code
let mut backend = CBackend::new();
let c_code = match backend.generate(&program, loader.module_cache()) {
Ok(code) => code,
Err(e) => {
eprintln!("{} C codegen: {}", c(colors::RED, "error:"), e);
@@ -812,13 +907,14 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
std::process::exit(1);
}
// Find C compiler
// Find C compiler: $CC env var > embedded build-time path > PATH search
let cc = find_c_compiler();
let compile_result = Command::new(&cc)
.args(["-O2", "-o"])
.arg(&output_bin)
.arg(&temp_c)
.arg("-lm")
.output();
match compile_result {
@@ -901,7 +997,7 @@ fn compile_to_js(path: &str, output_path: Option<&str>, run_after: bool) {
// Generate JavaScript code
let mut backend = JsBackend::new();
let js_code = match backend.generate(&program, loader.module_cache()) {
Ok(code) => code,
Err(e) => {
eprintln!("{} JS codegen: {}", c(colors::RED, "error:"), e);
@@ -1002,7 +1098,7 @@ fn run_tests(args: &[String]) {
for test_file in &test_files {
let path_str = test_file.to_string_lossy().to_string();
// Read and parse the file
// Read and parse the file (with module loading)
let source = match fs::read_to_string(test_file) {
Ok(s) => s,
Err(e) => {
@@ -1012,7 +1108,13 @@ fn run_tests(args: &[String]) {
}
};
let program = match Parser::parse_source(&source) {
use modules::ModuleLoader;
let mut loader = ModuleLoader::new();
if let Some(parent) = test_file.parent() {
loader.add_search_path(parent.to_path_buf());
}
let program = match loader.load_source(&source, Some(test_file.as_path())) {
Ok(p) => p,
Err(e) => {
println!(" {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, &format!("parse error: {}", e)));
@@ -1021,9 +1123,9 @@ fn run_tests(args: &[String]) {
}
};
// Type check
// Type check with module support
let mut checker = typechecker::TypeChecker::new();
if let Err(errors) = checker.check_program(&program) {
if let Err(errors) = checker.check_program_with_modules(&program, &loader) {
println!(" {} {} {}", c(colors::RED, "\u{2717}"), path_str, c(colors::RED, "type error"));
for err in errors {
eprintln!(" {}", err);
@@ -1051,7 +1153,7 @@ fn run_tests(args: &[String]) {
interp.register_auto_migrations(&auto_migrations);
interp.reset_test_results();
match interp.run(&program) {
match interp.run_with_modules(&program, &loader) {
Ok(_) => {
let results = interp.get_test_results();
if results.failed == 0 && results.passed == 0 {
@@ -1085,8 +1187,8 @@ fn run_tests(args: &[String]) {
interp.register_auto_migrations(&auto_migrations);
interp.reset_test_results();
// First run the file to define all functions
if let Err(e) = interp.run(&program) {
// First run the file to define all functions and load imports
if let Err(e) = interp.run_with_modules(&program, &loader) {
println!(" {} {} {}", c(colors::RED, "\u{2717}"), test_name, c(colors::RED, &e.to_string()));
total_failed += 1;
continue;
@@ -1261,6 +1363,64 @@ fn watch_file(path: &str) {
}
}
fn watch_and_rerun(path: &str, compile_args: &[String]) {
use std::time::{Duration, SystemTime};
use std::path::Path;
let file_path = Path::new(path);
if !file_path.exists() {
eprintln!("File not found: {}", path);
std::process::exit(1);
}
println!();
println!("Watching {} for changes (Ctrl+C to stop)...", path);
let mut last_modified = std::fs::metadata(file_path)
.and_then(|m| m.modified())
.unwrap_or(SystemTime::UNIX_EPOCH);
loop {
std::thread::sleep(Duration::from_millis(500));
let modified = match std::fs::metadata(file_path).and_then(|m| m.modified()) {
Ok(m) => m,
Err(_) => continue,
};
if modified > last_modified {
last_modified = modified;
// Clear screen
print!("\x1B[2J\x1B[H");
println!("=== Compiling {} ===", path);
println!();
let result = std::process::Command::new(std::env::current_exe().unwrap())
.args(compile_args)
.status();
match result {
Ok(status) if status.success() => {
println!();
println!("=== Success ===");
}
Ok(_) => {
println!();
println!("=== Failed ===");
}
Err(e) => {
eprintln!("Error running compiler: {}", e);
}
}
println!();
println!("Watching for changes...");
}
}
}
fn serve_static_files(dir: &str, port: u16) {
use std::io::{Write, BufRead, BufReader};
use std::net::TcpListener;
@@ -2128,6 +2288,48 @@ fn extract_module_doc(source: &str, path: &str) -> Result<ModuleDoc, String> {
is_public: matches!(t.visibility, ast::Visibility::Public),
});
}
ast::Declaration::ExternFn(ext) => {
let params: Vec<String> = ext.params.iter()
.map(|p| format!("{}: {}", p.name.name, format_type(&p.typ)))
.collect();
let js_note = ext.js_name.as_ref()
.map(|n| format!(" = \"{}\"", n))
.unwrap_or_default();
let signature = format!(
"extern fn {}({}): {}{}",
ext.name.name,
params.join(", "),
format_type(&ext.return_type),
js_note
);
let doc = extract_doc_comment(source, ext.span.start);
functions.push(FunctionDoc {
name: ext.name.name.clone(),
signature,
description: doc,
is_public: matches!(ext.visibility, ast::Visibility::Public),
properties: vec![],
});
}
ast::Declaration::ExternLet(ext) => {
let js_note = ext.js_name.as_ref()
.map(|n| format!(" = \"{}\"", n))
.unwrap_or_default();
let signature = format!(
"extern let {}: {}{}",
ext.name.name,
format_type(&ext.typ),
js_note
);
let doc = extract_doc_comment(source, ext.span.start);
functions.push(FunctionDoc {
name: ext.name.name.clone(),
signature,
description: doc,
is_public: matches!(ext.visibility, ast::Visibility::Public),
properties: vec![],
});
}
ast::Declaration::Effect(e) => {
let doc = extract_doc_comment(source, e.span.start);
let ops: Vec<String> = e.operations.iter()
@@ -3765,6 +3967,49 @@ c")"#;
assert_eq!(eval(source).unwrap(), r#""literal {braces}""#);
}
#[test]
fn test_multiline_string() {
let source = r#"
let s = """
hello
world
"""
let result = String.length(s)
"#;
// "hello\nworld" = 11 chars
assert_eq!(eval(source).unwrap(), "11");
}
#[test]
fn test_multiline_string_with_quotes() {
// Quotes are fine in the middle of triple-quoted strings
let source = "let s = \"\"\"\n She said \"hello\" to him.\n\"\"\"";
assert_eq!(eval(source).unwrap(), r#""She said "hello" to him.""#);
}
#[test]
fn test_multiline_string_interpolation() {
let source = r#"
let name = "Lux"
let s = """
Hello, {name}!
"""
"#;
assert_eq!(eval(source).unwrap(), r#""Hello, Lux!""#);
}
#[test]
fn test_multiline_string_empty() {
let source = r#"let s = """""""#;
assert_eq!(eval(source).unwrap(), r#""""#);
}
#[test]
fn test_multiline_string_inline() {
let source = r#"let s = """hello world""""#;
assert_eq!(eval(source).unwrap(), r#""hello world""#);
}
// Option tests
#[test]
fn test_option_constructors() {
@@ -3878,6 +4123,232 @@ c")"#;
assert_eq!(eval("let x = { a: 1, b: 2 } == { a: 1, b: 3 }").unwrap(), "false");
}
#[test]
fn test_record_spread() {
let source = r#"
let base = { x: 1, y: 2, z: 3 }
let updated = { ...base, y: 20 }
let result = updated.y
"#;
assert_eq!(eval(source).unwrap(), "20");
}
#[test]
fn test_deep_path_record_update() {
// Basic deep path: { ...base, pos.x: val } desugars to { ...base, pos: { ...base.pos, x: val } }
let source = r#"
let npc = { name: "Goblin", pos: { x: 10, y: 20 } }
let moved = { ...npc, pos.x: 50, pos.y: 60 }
let result = moved.pos.x
"#;
assert_eq!(eval(source).unwrap(), "50");
// Verify other fields are preserved through spread
let source2 = r#"
let npc = { name: "Goblin", pos: { x: 10, y: 20 } }
let moved = { ...npc, pos.x: 50 }
let result = moved.pos.y
"#;
assert_eq!(eval(source2).unwrap(), "20");
// Verify top-level spread fields preserved
let source3 = r#"
let npc = { name: "Goblin", pos: { x: 10, y: 20 } }
let moved = { ...npc, pos.x: 50 }
let result = moved.name
"#;
assert_eq!(eval(source3).unwrap(), "\"Goblin\"");
// Mix of flat and deep path fields
let source4 = r#"
let npc = { name: "Goblin", pos: { x: 10, y: 20 }, hp: 100 }
let updated = { ...npc, pos.x: 50, hp: 80 }
let result = (updated.pos.x, updated.hp, updated.name)
"#;
assert_eq!(eval(source4).unwrap(), "(50, 80, \"Goblin\")");
}
#[test]
fn test_deep_path_record_multilevel() {
// Multi-level deep path: world.physics.gravity
let source = r#"
let world = { name: "Earth", physics: { gravity: { x: 0, y: -10 }, drag: 1 } }
let updated = { ...world, physics.gravity.y: -20 }
let result = (updated.physics.gravity.y, updated.physics.drag, updated.name)
"#;
assert_eq!(eval(source).unwrap(), "(-20, 1, \"Earth\")");
}
#[test]
fn test_deep_path_conflict_error() {
// Field appears as both flat and deep path — should error
let result = eval(r#"
let base = { pos: { x: 1, y: 2 } }
let bad = { ...base, pos: { x: 10, y: 20 }, pos.x: 30 }
"#);
assert!(result.is_err());
}
#[test]
fn test_extern_fn_parse() {
// Extern fn should parse successfully
let source = r#"
extern fn getElementById(id: String): String
let x = 42
"#;
assert_eq!(eval(source).unwrap(), "42");
}
#[test]
fn test_extern_fn_with_js_name() {
// Extern fn with JS name override
let source = r#"
extern fn getCtx(el: String, kind: String): String = "getContext"
let x = 42
"#;
assert_eq!(eval(source).unwrap(), "42");
}
#[test]
fn test_extern_fn_call_errors_in_interpreter() {
// Calling an extern fn in the interpreter should produce a clear error
let source = r#"
extern fn alert(msg: String): Unit
let x = alert("hello")
"#;
let result = eval(source);
assert!(result.is_err());
let err = result.unwrap_err();
assert!(err.contains("extern") || err.contains("Extern") || err.contains("JavaScript"),
"Error should mention extern/JavaScript: {}", err);
}
#[test]
fn test_pub_extern_fn() {
// pub extern fn should parse
let source = r#"
pub extern fn requestAnimationFrame(callback: fn(): Unit): Int
let x = 42
"#;
assert_eq!(eval(source).unwrap(), "42");
}
#[test]
fn test_extern_fn_js_codegen() {
// Verify JS backend emits extern fn calls without _lux suffix
use crate::codegen::js_backend::JsBackend;
use crate::parser::Parser;
use crate::lexer::Lexer;
let source = r#"
extern fn getElementById(id: String): String
extern fn getContext(el: String, kind: String): String = "getContext"
fn main(): Unit = {
let el = getElementById("canvas")
let ctx = getContext(el, "2d")
()
}
"#;
let tokens = Lexer::new(source).tokenize().unwrap();
let program = Parser::new(tokens).parse_program().unwrap();
let mut backend = JsBackend::new();
let js = backend.generate(&program, &std::collections::HashMap::new()).unwrap();
// getElementById should appear as-is (no _lux suffix)
assert!(js.contains("getElementById("), "JS should call getElementById directly: {}", js);
// getContext should use the JS name override
assert!(js.contains("getContext("), "JS should call getContext directly: {}", js);
// main should still be mangled
assert!(js.contains("main_lux"), "main should be mangled: {}", js);
}
#[test]
fn test_list_get_js_codegen() {
use crate::codegen::js_backend::JsBackend;
use crate::parser::Parser;
use crate::lexer::Lexer;
let source = r#"
fn main(): Unit = {
let xs = [10, 20, 30]
let result = List.get(xs, 1)
()
}
"#;
let tokens = Lexer::new(source).tokenize().unwrap();
let program = Parser::new(tokens).parse_program().unwrap();
let mut backend = JsBackend::new();
let js = backend.generate(&program, &std::collections::HashMap::new()).unwrap();
assert!(js.contains("Lux.Some"), "JS should contain Lux.Some for List.get: {}", js);
assert!(js.contains("Lux.None"), "JS should contain Lux.None for List.get: {}", js);
}
#[test]
fn test_let_main_js_codegen() {
use crate::codegen::js_backend::JsBackend;
use crate::parser::Parser;
use crate::lexer::Lexer;
let source = r#"
let main = fn() => {
print("hello from let main")
}
"#;
let tokens = Lexer::new(source).tokenize().unwrap();
let program = Parser::new(tokens).parse_program().unwrap();
let mut backend = JsBackend::new();
let js = backend.generate(&program, &std::collections::HashMap::new()).unwrap();
// Should contain the let binding
assert!(js.contains("const main"), "JS should contain 'const main': {}", js);
// Should auto-invoke main()
assert!(js.contains("main();"), "JS should auto-invoke main(): {}", js);
// Should NOT contain main_lux (let bindings aren't mangled)
assert!(!js.contains("main_lux"), "let main should not be mangled: {}", js);
}
#[test]
fn test_handler_js_codegen() {
use crate::codegen::js_backend::JsBackend;
use crate::parser::Parser;
use crate::lexer::Lexer;
let source = r#"
effect Log {
fn info(msg: String): Unit
fn debug(msg: String): Unit
}
handler consoleLogger: Log {
fn info(msg) = {
Console.print("[INFO] " + msg)
resume(())
}
fn debug(msg) = {
Console.print("[DEBUG] " + msg)
resume(())
}
}
"#;
let tokens = Lexer::new(source).tokenize().unwrap();
let program = Parser::new(tokens).parse_program().unwrap();
let mut backend = JsBackend::new();
let js = backend.generate(&program, &std::collections::HashMap::new()).unwrap();
// Handler should be emitted as a const object
assert!(js.contains("const consoleLogger_lux"), "JS should contain handler const: {}", js);
// Should have operation methods
assert!(js.contains("info: function(msg)"), "JS should contain info operation: {}", js);
assert!(js.contains("debug: function(msg)"), "JS should contain debug operation: {}", js);
// Should define resume locally
assert!(js.contains("const resume = (x) => x"), "JS should define resume: {}", js);
}
#[test]
fn test_invalid_escape_sequence() {
let result = eval(r#"let x = "\z""#);
@@ -4831,6 +5302,71 @@ c")"#;
}
}
// ============ Multi-line Arguments Tests ============
#[test]
fn test_multiline_function_args() {
let source = r#"
fn add(a: Int, b: Int): Int = a + b
let result = add(
1,
2
)
"#;
assert_eq!(eval(source).unwrap(), "3");
}
#[test]
fn test_multiline_function_args_with_lambda() {
let source = r#"
let xs = List.map(
[1, 2, 3],
fn(x) => x * 2
)
"#;
assert_eq!(eval(source).unwrap(), "[2, 4, 6]");
}
// ============ Tuple Index Tests ============
#[test]
fn test_tuple_index_access() {
let source = r#"
let pair = (42, "hello")
let first = pair.0
"#;
assert_eq!(eval(source).unwrap(), "42");
}
#[test]
fn test_tuple_index_access_second() {
let source = r#"
let pair = (42, "hello")
let second = pair.1
"#;
assert_eq!(eval(source).unwrap(), "\"hello\"");
}
#[test]
fn test_tuple_index_triple() {
let source = r#"
let triple = (1, 2, 3)
let sum = triple.0 + triple.1 + triple.2
"#;
assert_eq!(eval(source).unwrap(), "6");
}
#[test]
fn test_tuple_index_in_function() {
let source = r#"
fn first(pair: (Int, String)): Int = pair.0
fn second(pair: (Int, String)): String = pair.1
let p = (42, "hello")
let result = first(p)
"#;
assert_eq!(eval(source).unwrap(), "42");
}
// Exhaustiveness checking tests
mod exhaustiveness_tests {
use super::*;
@@ -5286,4 +5822,225 @@ c")"#;
check_file("projects/rest-api/main.lux").unwrap();
}
}
// === Map type tests ===
#[test]
fn test_map_new_and_size() {
let source = r#"
let m = Map.new()
let result = Map.size(m)
"#;
assert_eq!(eval(source).unwrap(), "0");
}
#[test]
fn test_map_set_and_get() {
let source = r#"
let m = Map.new()
let m2 = Map.set(m, "name", "Alice")
let result = Map.get(m2, "name")
"#;
assert_eq!(eval(source).unwrap(), "Some(\"Alice\")");
}
#[test]
fn test_map_get_missing() {
let source = r#"
let m = Map.new()
let result = Map.get(m, "missing")
"#;
assert_eq!(eval(source).unwrap(), "None");
}
#[test]
fn test_map_contains() {
let source = r#"
let m = Map.set(Map.new(), "x", 1)
let result = (Map.contains(m, "x"), Map.contains(m, "y"))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_remove() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "a", 1), "b", 2)
let m2 = Map.remove(m, "a")
let result = (Map.size(m2), Map.contains(m2, "a"), Map.contains(m2, "b"))
"#;
assert_eq!(eval(source).unwrap(), "(1, false, true)");
}
#[test]
fn test_map_keys_and_values() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.keys(m)
"#;
assert_eq!(eval(source).unwrap(), "[\"a\", \"b\"]");
}
#[test]
fn test_map_from_list() {
let source = r#"
let m = Map.fromList([("x", 10), ("y", 20)])
let result = (Map.get(m, "x"), Map.size(m))
"#;
assert_eq!(eval(source).unwrap(), "(Some(10), 2)");
}
#[test]
fn test_map_to_list() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.toList(m)
"#;
assert_eq!(eval(source).unwrap(), "[(\"a\", 1), (\"b\", 2)]");
}
#[test]
fn test_map_merge() {
let source = r#"
let m1 = Map.fromList([("a", 1), ("b", 2)])
let m2 = Map.fromList([("b", 3), ("c", 4)])
let merged = Map.merge(m1, m2)
let result = (Map.get(merged, "a"), Map.get(merged, "b"), Map.get(merged, "c"))
"#;
assert_eq!(eval(source).unwrap(), "(Some(1), Some(3), Some(4))");
}
#[test]
fn test_map_immutability() {
let source = r#"
let m1 = Map.fromList([("a", 1)])
let m2 = Map.set(m1, "b", 2)
let result = (Map.size(m1), Map.size(m2))
"#;
assert_eq!(eval(source).unwrap(), "(1, 2)");
}
#[test]
fn test_map_is_empty() {
let source = r#"
let m1 = Map.new()
let m2 = Map.set(m1, "x", 1)
let result = (Map.isEmpty(m1), Map.isEmpty(m2))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_type_annotation() {
let source = r#"
fn lookup(m: Map<String, Int>, key: String): Option<Int> =
Map.get(m, key)
let m = Map.fromList([("age", 30)])
let result = lookup(m, "age")
"#;
assert_eq!(eval(source).unwrap(), "Some(30)");
}
// Ref cell tests
#[test]
fn test_ref_new_and_get() {
let source = r#"
let r = Ref.new(42)
let result = Ref.get(r)
"#;
assert_eq!(eval(source).unwrap(), "42");
}
#[test]
fn test_ref_set() {
let source = r#"
let r = Ref.new(0)
let _ = Ref.set(r, 10)
let result = Ref.get(r)
"#;
assert_eq!(eval(source).unwrap(), "10");
}
#[test]
fn test_ref_update() {
let source = r#"
let r = Ref.new(5)
let _ = Ref.update(r, fn(n) => n + 1)
let result = Ref.get(r)
"#;
assert_eq!(eval(source).unwrap(), "6");
}
#[test]
fn test_ref_multiple_updates() {
let source = r#"
let counter = Ref.new(0)
let _ = Ref.set(counter, 1)
let _ = Ref.update(counter, fn(n) => n * 10)
let _ = Ref.set(counter, Ref.get(counter) + 5)
let result = Ref.get(counter)
"#;
assert_eq!(eval(source).unwrap(), "15");
}
#[test]
fn test_ref_with_string() {
let source = r#"
let r = Ref.new("hello")
let _ = Ref.set(r, "world")
let result = Ref.get(r)
"#;
assert_eq!(eval(source).unwrap(), "\"world\"");
}
#[test]
fn test_file_copy() {
use std::io::Write;
// Create a temp file, copy it, verify contents
let dir = std::env::temp_dir().join("lux_test_file_copy");
let _ = std::fs::create_dir_all(&dir);
let src = dir.join("src.txt");
let dst = dir.join("dst.txt");
std::fs::File::create(&src).unwrap().write_all(b"hello copy").unwrap();
let _ = std::fs::remove_file(&dst);
let source = format!(r#"
fn main(): Unit with {{File}} =
File.copy("{}", "{}")
let _ = run main() with {{}}
let result = "done"
"#, src.display(), dst.display());
let result = eval(&source);
assert!(result.is_ok(), "File.copy failed: {:?}", result);
let contents = std::fs::read_to_string(&dst).unwrap();
assert_eq!(contents, "hello copy");
// Cleanup
let _ = std::fs::remove_dir_all(&dir);
}
#[test]
fn test_effectful_callback_propagation() {
// WISH-7: effectful callbacks in List.forEach should propagate effects
// This should type-check successfully because Console effect is inferred
let source = r#"
fn printAll(items: List<String>): Unit =
List.forEach(items, fn(x: String): Unit => Console.print(x))
let result = "ok"
"#;
let result = eval(source);
assert!(result.is_ok(), "Effectful callback should type-check: {:?}", result);
}
#[test]
fn test_effectful_callback_in_map() {
// Effectful callback in List.map should propagate effects
let source = r#"
fn readAll(paths: List<String>): List<String> =
List.map(paths, fn(p: String): String => File.read(p))
let result = "ok"
"#;
let result = eval(source);
assert!(result.is_ok(), "Effectful callback in map should type-check: {:?}", result);
}
}


@@ -52,6 +52,8 @@ impl Module {
Declaration::Let(l) => l.visibility == Visibility::Public,
Declaration::Type(t) => t.visibility == Visibility::Public,
Declaration::Trait(t) => t.visibility == Visibility::Public,
Declaration::ExternFn(e) => e.visibility == Visibility::Public,
Declaration::ExternLet(e) => e.visibility == Visibility::Public,
// Effects, handlers, and impls are always public for now
Declaration::Effect(_) | Declaration::Handler(_) | Declaration::Impl(_) => true,
}
@@ -279,6 +281,12 @@ impl ModuleLoader {
}
Declaration::Type(t) if t.visibility == Visibility::Public => {
exports.insert(t.name.name.clone());
// Also export constructors for ADT types
if let crate::ast::TypeDef::Enum(variants) = &t.definition {
for variant in variants {
exports.insert(variant.name.name.clone());
}
}
}
Declaration::Effect(e) => {
// Effects are always exported
@@ -288,6 +296,12 @@ impl ModuleLoader {
// Handlers are always exported
exports.insert(h.name.name.clone());
}
Declaration::ExternFn(e) if e.visibility == Visibility::Public => {
exports.insert(e.name.name.clone());
}
Declaration::ExternLet(e) if e.visibility == Visibility::Public => {
exports.insert(e.name.name.clone());
}
_ => {}
}
}
@@ -305,6 +319,11 @@ impl ModuleLoader {
self.cache.iter()
}
/// Get the module cache (for passing to C backend)
pub fn module_cache(&self) -> &HashMap<String, Module> {
&self.cache
}
/// Clear the module cache
pub fn clear_cache(&mut self) {
self.cache.clear();


@@ -238,6 +238,7 @@ impl Parser {
match self.peek_kind() {
TokenKind::Fn => Ok(Declaration::Function(self.parse_function_decl(visibility, doc)?)),
TokenKind::Extern => self.parse_extern_decl(visibility, doc),
TokenKind::Effect => Ok(Declaration::Effect(self.parse_effect_decl(doc)?)),
TokenKind::Handler => Ok(Declaration::Handler(self.parse_handler_decl()?)),
TokenKind::Type => Ok(Declaration::Type(self.parse_type_decl(visibility, doc)?)),
@@ -245,7 +246,8 @@ impl Parser {
TokenKind::Trait => Ok(Declaration::Trait(self.parse_trait_decl(visibility, doc)?)),
TokenKind::Impl => Ok(Declaration::Impl(self.parse_impl_decl()?)),
TokenKind::Run => Err(self.error("Bare 'run' expressions are not allowed at top level. Use 'let _ = run ...' or 'let result = run ...'")),
_ => Err(self.error("Expected declaration (fn, effect, handler, type, trait, impl, or let)")),
TokenKind::Handle => Err(self.error("Bare 'handle' expressions are not allowed at top level. Use 'let _ = handle ...' or 'let result = handle ...'")),
_ => Err(self.error("Expected declaration (fn, extern, effect, handler, type, trait, impl, or let)")),
}
}
@@ -322,6 +324,109 @@ impl Parser {
})
}
/// Parse extern declaration: dispatch to extern fn or extern let
fn parse_extern_decl(&mut self, visibility: Visibility, doc: Option<String>) -> Result<Declaration, ParseError> {
// Peek past 'extern' to see if it's 'fn' or 'let'
if self.pos + 1 < self.tokens.len() {
match &self.tokens[self.pos + 1].kind {
TokenKind::Fn => Ok(Declaration::ExternFn(self.parse_extern_fn_decl(visibility, doc)?)),
TokenKind::Let => Ok(Declaration::ExternLet(self.parse_extern_let_decl(visibility, doc)?)),
_ => Err(self.error("Expected 'fn' or 'let' after 'extern'")),
}
} else {
Err(self.error("Expected 'fn' or 'let' after 'extern'"))
}
}
/// Parse extern let declaration: extern let name: Type = "jsName"
fn parse_extern_let_decl(&mut self, visibility: Visibility, doc: Option<String>) -> Result<ExternLetDecl, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Extern)?;
self.expect(TokenKind::Let)?;
let name = self.parse_ident()?;
// Type annotation
self.expect(TokenKind::Colon)?;
let typ = self.parse_type()?;
// Optional JS name override: = "jsName"
let js_name = if self.check(TokenKind::Eq) {
self.advance();
match self.peek_kind() {
TokenKind::String(s) => {
let name = s.clone();
self.advance();
Some(name)
}
_ => return Err(self.error("Expected string literal for JS name in extern let")),
}
} else {
None
};
let span = start.merge(self.previous_span());
Ok(ExternLetDecl {
visibility,
doc,
name,
typ,
js_name,
span,
})
}
/// Parse extern function declaration: extern fn name<T>(params): ReturnType = "jsName"
fn parse_extern_fn_decl(&mut self, visibility: Visibility, doc: Option<String>) -> Result<ExternFnDecl, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Extern)?;
self.expect(TokenKind::Fn)?;
let name = self.parse_ident()?;
// Optional type parameters
let type_params = if self.check(TokenKind::Lt) {
self.parse_type_params()?
} else {
Vec::new()
};
self.expect(TokenKind::LParen)?;
let params = self.parse_params()?;
self.expect(TokenKind::RParen)?;
// Return type
self.expect(TokenKind::Colon)?;
let return_type = self.parse_type()?;
// Optional JS name override: = "jsName"
let js_name = if self.check(TokenKind::Eq) {
self.advance();
match self.peek_kind() {
TokenKind::String(s) => {
let name = s.clone();
self.advance();
Some(name)
}
_ => return Err(self.error("Expected string literal for JS name in extern fn")),
}
} else {
None
};
let span = start.merge(self.previous_span());
Ok(ExternFnDecl {
visibility,
doc,
name,
type_params,
params,
return_type,
js_name,
span,
})
}
/// Parse effect declaration
fn parse_effect_decl(&mut self, doc: Option<String>) -> Result<EffectDecl, ParseError> {
let start = self.current_span();
@@ -845,6 +950,7 @@ impl Parser {
/// Parse function parameters
fn parse_params(&mut self) -> Result<Vec<Parameter>, ParseError> {
let mut params = Vec::new();
self.skip_newlines();
while !self.check(TokenKind::RParen) {
let start = self.current_span();
@@ -854,9 +960,11 @@ impl Parser {
let span = start.merge(self.previous_span());
params.push(Parameter { name, typ, span });
self.skip_newlines();
if !self.check(TokenKind::RParen) {
self.expect(TokenKind::Comma)?;
self.skip_newlines();
}
}
@@ -1558,6 +1666,7 @@ impl Parser {
loop {
let op = match self.peek_kind() {
TokenKind::Plus => BinaryOp::Add,
TokenKind::PlusPlus => BinaryOp::Concat,
TokenKind::Minus => BinaryOp::Sub,
_ => break,
};
@@ -1646,6 +1755,20 @@ impl Parser {
} else if self.check(TokenKind::Dot) {
let start = expr.span();
self.advance();
// Check for tuple index access: expr.0, expr.1, etc.
if let TokenKind::Int(n) = self.peek_kind() {
let index = n as usize;
self.advance();
let span = start.merge(self.previous_span());
expr = Expr::TupleIndex {
object: Box::new(expr),
index,
span,
};
continue;
}
let field = self.parse_ident()?;
// Check if this is an effect operation: Effect.operation(args)
@@ -1681,11 +1804,14 @@ impl Parser {
fn parse_args(&mut self) -> Result<Vec<Expr>, ParseError> {
let mut args = Vec::new();
self.skip_newlines();
while !self.check(TokenKind::RParen) {
args.push(self.parse_expr()?);
self.skip_newlines();
if !self.check(TokenKind::RParen) {
self.expect(TokenKind::Comma)?;
self.skip_newlines();
}
}
@@ -1757,6 +1883,7 @@ impl Parser {
TokenKind::Let => self.parse_let_expr(),
TokenKind::Fn => self.parse_lambda_expr(),
TokenKind::Run => self.parse_run_expr(),
TokenKind::Handle => self.parse_handle_expr(),
TokenKind::Resume => self.parse_resume_expr(),
// Delimiters
@@ -1774,6 +1901,7 @@ impl Parser {
let condition = Box::new(self.parse_expr()?);
self.skip_newlines();
self.expect(TokenKind::Then)?;
self.skip_newlines();
let then_branch = Box::new(self.parse_expr()?);
@@ -1898,9 +2026,27 @@ impl Parser {
TokenKind::Ident(name) => {
// Check if it starts with uppercase (constructor) or lowercase (variable)
if name.chars().next().map_or(false, |c| c.is_uppercase()) {
self.parse_constructor_pattern()
self.parse_constructor_pattern_with_module(None)
} else {
let ident = self.parse_ident()?;
// Check for module-qualified constructor: module.Constructor
if self.check(TokenKind::Dot) {
// Peek ahead to see if next is an uppercase identifier
let dot_pos = self.pos;
self.advance(); // skip dot
if let TokenKind::Ident(next_name) = self.peek_kind() {
if next_name
.chars()
.next()
.map_or(false, |c| c.is_uppercase())
{
return self
.parse_constructor_pattern_with_module(Some(ident));
}
}
// Not a module-qualified constructor, backtrack
self.pos = dot_pos;
}
Ok(Pattern::Var(ident))
}
}
@@ -1910,25 +2056,40 @@ impl Parser {
}
}
fn parse_constructor_pattern(&mut self) -> Result<Pattern, ParseError> {
let start = self.current_span();
fn parse_constructor_pattern_with_module(
&mut self,
module: Option<Ident>,
) -> Result<Pattern, ParseError> {
let start = module
.as_ref()
.map(|m| m.span)
.unwrap_or_else(|| self.current_span());
let name = self.parse_ident()?;
if self.check(TokenKind::LParen) {
self.advance();
self.skip_newlines();
let mut fields = Vec::new();
while !self.check(TokenKind::RParen) {
fields.push(self.parse_pattern()?);
self.skip_newlines();
if !self.check(TokenKind::RParen) {
self.expect(TokenKind::Comma)?;
self.skip_newlines();
}
}
self.expect(TokenKind::RParen)?;
let span = start.merge(self.previous_span());
Ok(Pattern::Constructor { name, fields, span })
} else {
let span = name.span;
Ok(Pattern::Constructor {
module,
name,
fields,
span,
})
} else {
let span = start.merge(name.span);
Ok(Pattern::Constructor {
module,
name,
fields: Vec::new(),
span,
@@ -1939,12 +2100,15 @@ impl Parser {
fn parse_tuple_pattern(&mut self) -> Result<Pattern, ParseError> {
let start = self.current_span();
self.expect(TokenKind::LParen)?;
self.skip_newlines();
let mut elements = Vec::new();
while !self.check(TokenKind::RParen) {
elements.push(self.parse_pattern()?);
self.skip_newlines();
if !self.check(TokenKind::RParen) {
self.expect(TokenKind::Comma)?;
self.skip_newlines();
}
}
@@ -2074,6 +2238,7 @@ impl Parser {
fn parse_lambda_params(&mut self) -> Result<Vec<Parameter>, ParseError> {
let mut params = Vec::new();
self.skip_newlines();
while !self.check(TokenKind::RParen) {
let start = self.current_span();
@@ -2089,9 +2254,11 @@ impl Parser {
let span = start.merge(self.previous_span());
params.push(Parameter { name, typ, span });
self.skip_newlines();
if !self.check(TokenKind::RParen) {
self.expect(TokenKind::Comma)?;
self.skip_newlines();
}
}
@@ -2132,6 +2299,40 @@ impl Parser {
})
}
fn parse_handle_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Handle)?;
let expr = Box::new(self.parse_call_expr()?);
self.expect(TokenKind::With)?;
self.expect(TokenKind::LBrace)?;
self.skip_newlines();
let mut handlers = Vec::new();
while !self.check(TokenKind::RBrace) {
let effect = self.parse_ident()?;
self.expect(TokenKind::Eq)?;
let handler = self.parse_expr()?;
handlers.push((effect, handler));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
let end = self.current_span();
self.expect(TokenKind::RBrace)?;
Ok(Expr::Run {
expr,
handlers,
span: start.merge(end),
})
}
fn parse_resume_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::Resume)?;
@@ -2145,6 +2346,7 @@ impl Parser {
fn parse_tuple_or_paren_expr(&mut self) -> Result<Expr, ParseError> {
let start = self.current_span();
self.expect(TokenKind::LParen)?;
self.skip_newlines();
if self.check(TokenKind::RParen) {
self.advance();
@@ -2155,16 +2357,19 @@ impl Parser {
}
let first = self.parse_expr()?;
self.skip_newlines();
if self.check(TokenKind::Comma) {
// Tuple
let mut elements = vec![first];
while self.check(TokenKind::Comma) {
self.advance();
self.skip_newlines();
if self.check(TokenKind::RParen) {
break;
}
elements.push(self.parse_expr()?);
self.skip_newlines();
}
self.expect(TokenKind::RParen)?;
let span = start.merge(self.previous_span());
@@ -2190,12 +2395,39 @@ impl Parser {
}));
}
// Check if it's a record (ident: expr) or block
// Check for record spread: { ...expr, field: val }
if matches!(self.peek_kind(), TokenKind::DotDotDot) {
return self.parse_record_expr_rest(start);
}
// Check if it's a record (ident: expr or ident.path: expr) or block
if matches!(self.peek_kind(), TokenKind::Ident(_)) {
let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
if matches!(lookahead, Some(TokenKind::Colon)) {
return self.parse_record_expr_rest(start);
}
// Check for deep path record: { ident.ident...: expr }
if matches!(lookahead, Some(TokenKind::Dot)) {
let mut look = self.pos + 2;
loop {
match self.tokens.get(look).map(|t| &t.kind) {
Some(TokenKind::Ident(_)) => {
look += 1;
match self.tokens.get(look).map(|t| &t.kind) {
Some(TokenKind::Colon) => {
return self.parse_record_expr_rest(start);
}
Some(TokenKind::Dot) => {
look += 1;
continue;
}
_ => break,
}
}
_ => break,
}
}
}
}
// It's a block
@@ -2203,13 +2435,40 @@ impl Parser {
}
fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
let mut fields = Vec::new();
let mut raw_fields: Vec<(Vec<Ident>, Expr)> = Vec::new();
let mut spread = None;
let mut has_deep_paths = false;
// Check for spread: { ...expr, ... }
if self.check(TokenKind::DotDotDot) {
self.advance(); // consume ...
let spread_expr = self.parse_expr()?;
spread = Some(Box::new(spread_expr));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
while !self.check(TokenKind::RBrace) {
let name = self.parse_ident()?;
// Check for dotted path: pos.x, pos.x.y, etc.
let mut path = vec![name];
while self.check(TokenKind::Dot) {
self.advance(); // consume .
let segment = self.parse_ident()?;
path.push(segment);
}
if path.len() > 1 {
has_deep_paths = true;
}
self.expect(TokenKind::Colon)?;
let value = self.parse_expr()?;
fields.push((name, value));
raw_fields.push((path, value));
self.skip_newlines();
if self.check(TokenKind::Comma) {
@@ -2220,7 +2479,120 @@ impl Parser {
self.expect(TokenKind::RBrace)?;
let span = start.merge(self.previous_span());
Ok(Expr::Record { fields, span })
if has_deep_paths {
Self::desugar_deep_fields(spread, raw_fields, span)
} else {
// No deep paths — use flat fields directly (common case, no allocation overhead)
let fields = raw_fields
.into_iter()
.map(|(mut path, value)| (path.remove(0), value))
.collect();
Ok(Expr::Record {
spread,
fields,
span,
})
}
}
/// Desugar deep path record fields into nested record spread expressions.
/// `{ ...base, pos.x: vx, pos.y: vy }` becomes `{ ...base, pos: { ...base.pos, x: vx, y: vy } }`
fn desugar_deep_fields(
spread: Option<Box<Expr>>,
raw_fields: Vec<(Vec<Ident>, Expr)>,
outer_span: Span,
) -> Result<Expr, ParseError> {
use std::collections::HashMap;
// Group fields by first path segment, preserving order
let mut groups: Vec<(String, Vec<(Vec<Ident>, Expr)>)> = Vec::new();
let mut group_map: HashMap<String, usize> = HashMap::new();
for (path, value) in raw_fields {
let key = path[0].name.clone();
if let Some(&idx) = group_map.get(&key) {
groups[idx].1.push((path, value));
} else {
group_map.insert(key.clone(), groups.len());
groups.push((key, vec![(path, value)]));
}
}
let mut fields = Vec::new();
for (_, group) in groups {
let first_ident = group[0].0[0].clone();
let has_flat = group.iter().any(|(p, _)| p.len() == 1);
let has_deep = group.iter().any(|(p, _)| p.len() > 1);
if has_flat && has_deep {
return Err(ParseError {
message: format!(
"Field '{}' appears as both a direct field and a deep path prefix",
first_ident.name
),
span: first_ident.span,
});
}
if has_flat {
if group.len() > 1 {
return Err(ParseError {
message: format!("Duplicate field '{}'", first_ident.name),
span: group[1].0[0].span,
});
}
let (_, value) = group.into_iter().next().unwrap();
fields.push((first_ident, value));
} else {
// Deep paths — create nested record with spread from parent
let sub_spread = spread.as_ref().map(|s| {
Box::new(Expr::Field {
object: s.clone(),
field: first_ident.clone(),
span: first_ident.span,
})
});
// Strip first segment from all paths
let sub_fields: Vec<(Vec<Ident>, Expr)> = group
.into_iter()
.map(|(mut path, value)| {
path.remove(0);
(path, value)
})
.collect();
let has_nested_deep = sub_fields.iter().any(|(p, _)| p.len() > 1);
if has_nested_deep {
// Recursively desugar deeper paths
let nested =
Self::desugar_deep_fields(sub_spread, sub_fields, first_ident.span)?;
fields.push((first_ident, nested));
} else {
// All sub-paths are single-segment — build Record directly
let flat_fields: Vec<(Ident, Expr)> = sub_fields
.into_iter()
.map(|(mut path, value)| (path.remove(0), value))
.collect();
fields.push((
first_ident.clone(),
Expr::Record {
spread: sub_spread,
fields: flat_fields,
span: first_ident.span,
},
));
}
}
}
Ok(Expr::Record {
spread,
fields,
span: outer_span,
})
}
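The grouping step in `desugar_deep_fields` can be sketched standalone. This is a hedged sketch, not the compiler's exact code: `group_by_head` is a hypothetical name, and the `Ident`/`Expr` types are replaced with plain strings and integers, but the Vec-plus-HashMap-index scheme that preserves first-seen order is the same.

```rust
use std::collections::HashMap;

// Group (path, value) pairs by the first path segment while preserving
// first-seen insertion order, mirroring the desugaring's grouping pass.
fn group_by_head(
    paths: Vec<(Vec<&'static str>, i32)>,
) -> Vec<(String, Vec<(Vec<&'static str>, i32)>)> {
    let mut groups: Vec<(String, Vec<(Vec<&'static str>, i32)>)> = Vec::new();
    let mut index: HashMap<String, usize> = HashMap::new();
    for (path, value) in paths {
        let key = path[0].to_string();
        if let Some(&i) = index.get(&key) {
            groups[i].1.push((path, value));
        } else {
            index.insert(key.clone(), groups.len());
            groups.push((key, vec![(path, value)]));
        }
    }
    groups
}

fn main() {
    let got = group_by_head(vec![
        (vec!["pos", "x"], 1),
        (vec!["vel", "x"], 2),
        (vec!["pos", "y"], 3),
    ]);
    // "pos" keeps its first-seen position even though "vel" appeared in between.
    assert_eq!(got[0].0, "pos");
    assert_eq!(got[0].1.len(), 2);
    assert_eq!(got[1].0, "vel");
    println!("ok");
}
```

Keeping groups in a `Vec` (with the `HashMap` only as an index) is what makes the emitted nested records retain source order rather than hash order.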
fn parse_block_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
@@ -228,13 +228,14 @@ impl SymbolTable {
Declaration::Let(let_decl) => {
let is_public = matches!(let_decl.visibility, Visibility::Public);
let type_sig = let_decl.typ.as_ref().map(|t| self.type_expr_to_string(t));
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
let_decl.name.name.clone(),
SymbolKind::Variable,
let_decl.span,
type_sig,
is_public,
);
symbol.documentation = let_decl.doc.clone();
let id = self.add_symbol(scope_idx, symbol);
self.add_reference(id, let_decl.name.span, true, true);
@@ -244,6 +245,48 @@ impl SymbolTable {
Declaration::Handler(h) => self.visit_handler(h, scope_idx),
Declaration::Trait(t) => self.visit_trait(t, scope_idx),
Declaration::Impl(i) => self.visit_impl(i, scope_idx),
Declaration::ExternFn(ext) => {
let is_public = matches!(ext.visibility, Visibility::Public);
let params: Vec<String> = ext
.params
.iter()
.map(|p| format!("{}: {}", p.name.name, self.type_expr_to_string(&p.typ)))
.collect();
let sig = format!(
"extern fn {}({}): {}",
ext.name.name,
params.join(", "),
self.type_expr_to_string(&ext.return_type)
);
let mut symbol = self.new_symbol(
ext.name.name.clone(),
SymbolKind::Function,
ext.span,
Some(sig),
is_public,
);
symbol.documentation = ext.doc.clone();
let id = self.add_symbol(scope_idx, symbol);
self.add_reference(id, ext.name.span, true, true);
}
Declaration::ExternLet(ext) => {
let is_public = matches!(ext.visibility, Visibility::Public);
let sig = format!(
"extern let {}: {}",
ext.name.name,
self.type_expr_to_string(&ext.typ)
);
let mut symbol = self.new_symbol(
ext.name.name.clone(),
SymbolKind::Variable,
ext.span,
Some(sig),
is_public,
);
symbol.documentation = ext.doc.clone();
let id = self.add_symbol(scope_idx, symbol);
self.add_reference(id, ext.name.span, true, true);
}
}
}
@@ -279,13 +322,14 @@ impl SymbolTable {
};
let type_sig = format!("fn {}({}): {}{}{}", f.name.name, param_types.join(", "), return_type, properties, effects);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
f.name.name.clone(),
SymbolKind::Function,
f.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = f.doc.clone();
let fn_id = self.add_symbol(scope_idx, symbol);
self.add_reference(fn_id, f.name.span, true, false);
@@ -326,13 +370,14 @@ impl SymbolTable {
let is_public = matches!(t.visibility, Visibility::Public);
let type_sig = format!("type {}", t.name.name);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
t.name.name.clone(),
SymbolKind::Type,
t.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = t.doc.clone();
let type_id = self.add_symbol(scope_idx, symbol);
self.add_reference(type_id, t.name.span, true, false);
@@ -372,13 +417,14 @@ impl SymbolTable {
let is_public = true; // Effects are typically public
let type_sig = format!("effect {}", e.name.name);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
e.name.name.clone(),
SymbolKind::Effect,
e.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = e.doc.clone();
let effect_id = self.add_symbol(scope_idx, symbol);
// Add operations
@@ -409,13 +455,14 @@ impl SymbolTable {
let is_public = matches!(t.visibility, Visibility::Public);
let type_sig = format!("trait {}", t.name.name);
let symbol = self.new_symbol(
let mut symbol = self.new_symbol(
t.name.name.clone(),
SymbolKind::Type, // Traits are like types
t.name.span,
Some(type_sig),
is_public,
);
symbol.documentation = t.doc.clone();
self.add_symbol(scope_idx, symbol);
}
@@ -479,7 +526,7 @@ impl SymbolTable {
self.visit_expr(arg, scope_idx);
}
}
Expr::Field { object, .. } => {
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
self.visit_expr(object, scope_idx);
}
Expr::If { condition, then_branch, else_branch, .. } => {
@@ -522,7 +569,10 @@ impl SymbolTable {
self.visit_expr(e, scope_idx);
}
}
Expr::Record { fields, .. } => {
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.visit_expr(spread_expr, scope_idx);
}
for (_, e) in fields {
self.visit_expr(e, scope_idx);
}
@@ -5,9 +5,9 @@
use std::collections::HashMap;
use crate::ast::{
self, BinaryOp, Declaration, EffectDecl, Expr, FunctionDecl, HandlerDecl, Ident, ImplDecl,
ImportDecl, LetDecl, Literal, LiteralKind, MatchArm, Parameter, Pattern, Program, Span,
Statement, TraitDecl, TypeDecl, TypeExpr, UnaryOp, VariantFields,
self, BinaryOp, Declaration, EffectDecl, ExternFnDecl, Expr, FunctionDecl, HandlerDecl, Ident,
ImplDecl, ImportDecl, LetDecl, Literal, LiteralKind, MatchArm, Parameter, Pattern, Program,
Span, Statement, TraitDecl, TypeDecl, TypeExpr, UnaryOp, VariantFields,
};
use crate::diagnostics::{find_similar_names, format_did_you_mean, Diagnostic, ErrorCode, Severity};
use crate::exhaustiveness::{check_exhaustiveness, missing_patterns_hint};
@@ -335,11 +335,14 @@ fn references_params(expr: &Expr, params: &[&str]) -> bool {
Statement::Expr(e) => references_params(e, params),
}) || references_params(result, params)
}
Expr::Field { object, .. } => references_params(object, params),
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => references_params(object, params),
Expr::Lambda { body, .. } => references_params(body, params),
Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::Record { fields, .. } => fields.iter().any(|(_, e)| references_params(e, params)),
Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| references_params(s, params))
|| fields.iter().any(|(_, e)| references_params(e, params))
}
Expr::Match { scrutinee, arms, .. } => {
references_params(scrutinee, params)
|| arms.iter().any(|a| references_params(&a.body, params))
@@ -516,10 +519,11 @@ fn has_recursive_calls(func_name: &str, body: &Expr) -> bool {
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
elements.iter().any(|e| has_recursive_calls(func_name, e))
}
Expr::Record { fields, .. } => {
fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| has_recursive_calls(func_name, s))
|| fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
}
Expr::Field { object, .. } => has_recursive_calls(func_name, object),
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => has_recursive_calls(func_name, object),
Expr::Let { value, body, .. } => {
has_recursive_calls(func_name, value) || has_recursive_calls(func_name, body)
}
@@ -672,6 +676,7 @@ fn generate_auto_migration_expr(
// Build the record expression
Some(Expr::Record {
spread: None,
fields: field_exprs,
span,
})
@@ -976,6 +981,13 @@ impl TypeChecker {
if !fields.is_empty() {
self.env.bind(&name, TypeScheme::mono(Type::Record(fields)));
}
// Also copy type definitions so imported types are usable
for (type_name, type_def) in &module_checker.env.types {
if !self.env.types.contains_key(type_name) {
self.env.types.insert(type_name.clone(), type_def.clone());
}
}
}
ImportKind::Direct => {
// Import a specific name directly
@@ -1215,6 +1227,22 @@ impl TypeChecker {
let trait_impl = self.collect_impl(impl_decl);
self.env.trait_impls.push(trait_impl);
}
Declaration::ExternFn(ext) => {
// Register extern fn type signature (like a regular function but no body)
let param_types: Vec<Type> = ext
.params
.iter()
.map(|p| self.resolve_type(&p.typ))
.collect();
let return_type = self.resolve_type(&ext.return_type);
let fn_type = Type::function(param_types, return_type);
self.env.bind(&ext.name.name, TypeScheme::mono(fn_type));
}
Declaration::ExternLet(ext) => {
// Register extern let with its declared type
let typ = self.resolve_type(&ext.typ);
self.env.bind(&ext.name.name, TypeScheme::mono(typ));
}
}
}
@@ -1536,7 +1564,7 @@ impl TypeChecker {
// Use the declared type if present, otherwise use inferred
let final_type = if let Some(ref type_expr) = let_decl.typ {
let declared = self.resolve_type(type_expr);
if let Err(e) = unify(&inferred, &declared) {
if let Err(e) = unify_with_env(&inferred, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -1673,6 +1701,42 @@ impl TypeChecker {
span,
} => self.infer_field(object, field, *span),
Expr::TupleIndex {
object,
index,
span,
} => {
let object_type = self.infer_expr(object);
match &object_type {
Type::Tuple(types) => {
if *index < types.len() {
types[*index].clone()
} else {
self.errors.push(TypeError {
message: format!(
"Tuple index {} out of bounds for tuple with {} elements",
index,
types.len()
),
span: *span,
});
Type::Error
}
}
Type::Var(_) => Type::var(),
_ => {
self.errors.push(TypeError {
message: format!(
"Cannot use tuple index on non-tuple type {}",
object_type
),
span: *span,
});
Type::Error
}
}
}
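The `TupleIndex` typing rule above reduces to a bounds-checked projection. A minimal sketch, with the hypothetical helper `tuple_index_type` standing in for the checker's match arm and strings standing in for `Type`:

```rust
// Tuple-index typing rule: an in-bounds index projects the element type;
// an out-of-bounds index is a type error with a descriptive message.
fn tuple_index_type(elems: &[&'static str], index: usize) -> Result<&'static str, String> {
    elems.get(index).copied().ok_or_else(|| {
        format!(
            "Tuple index {} out of bounds for tuple with {} elements",
            index,
            elems.len()
        )
    })
}

fn main() {
    assert_eq!(tuple_index_type(&["Int", "String"], 1), Ok("String"));
    assert!(tuple_index_type(&["Int", "String"], 2).is_err());
    println!("ok");
}
```

The real checker additionally returns a fresh type variable when the object's type is still unknown (`Type::Var`), deferring the check rather than failing eagerly.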
Expr::Lambda {
params,
return_type,
@@ -1708,7 +1772,11 @@ impl TypeChecker {
span,
} => self.infer_block(statements, result, *span),
Expr::Record { fields, span } => self.infer_record(fields, *span),
Expr::Record {
spread,
fields,
span,
} => self.infer_record(spread.as_deref(), fields, *span),
Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
@@ -1747,7 +1815,7 @@ impl TypeChecker {
match op {
BinaryOp::Add => {
// Add supports both numeric types and string concatenation
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1768,9 +1836,32 @@ impl TypeChecker {
}
}
BinaryOp::Concat => {
// Concat (++) supports strings and lists
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '++' must have same type: {}", e),
span,
});
}
match &left_type {
Type::String | Type::List(_) | Type::Var(_) => left_type,
_ => {
self.errors.push(TypeError {
message: format!(
"Operator '++' requires String or List operands, got {}",
left_type
),
span,
});
Type::Error
}
}
}
BinaryOp::Sub | BinaryOp::Mul | BinaryOp::Div | BinaryOp::Mod => {
// Arithmetic: both operands must be same numeric type
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1794,7 +1885,7 @@ impl TypeChecker {
BinaryOp::Eq | BinaryOp::Ne => {
// Equality: operands must have same type
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1805,7 +1896,7 @@ impl TypeChecker {
BinaryOp::Lt | BinaryOp::Le | BinaryOp::Gt | BinaryOp::Ge => {
// Comparison: operands must be same orderable type
if let Err(e) = unify(&left_type, &right_type) {
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1816,13 +1907,13 @@ impl TypeChecker {
BinaryOp::And | BinaryOp::Or => {
// Logical: both must be Bool
if let Err(e) = unify(&left_type, &Type::Bool) {
if let Err(e) = unify_with_env(&left_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Left operand of '{}' must be Bool: {}", op, e),
span: left.span(),
});
}
if let Err(e) = unify(&right_type, &Type::Bool) {
if let Err(e) = unify_with_env(&right_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Right operand of '{}' must be Bool: {}", op, e),
span: right.span(),
@@ -1836,7 +1927,7 @@ impl TypeChecker {
// right must be a function that accepts left's type
let result_type = Type::var();
let expected_fn = Type::function(vec![left_type.clone()], result_type.clone());
if let Err(e) = unify(&right_type, &expected_fn) {
if let Err(e) = unify_with_env(&right_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Pipe target must be a function accepting {}: {}",
@@ -1868,7 +1959,7 @@ impl TypeChecker {
}
},
UnaryOp::Not => {
if let Err(e) = unify(&operand_type, &Type::Bool) {
if let Err(e) = unify_with_env(&operand_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Operator '!' requires Bool operand: {}", e),
span,
@@ -1883,6 +1974,17 @@ impl TypeChecker {
let func_type = self.infer_expr(func);
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
// Check property constraints from where clauses
if let Expr::Var(func_id) = func {
if let Some(constraints) = self.property_constraints.get(&func_id.name).cloned() {
@@ -1919,7 +2021,7 @@ impl TypeChecker {
self.current_effects.clone(),
);
match unify(&func_type, &expected_fn) {
match unify_with_env(&func_type, &expected_fn, &self.env) {
Ok(subst) => result_type.apply(&subst),
Err(e) => {
// Provide more detailed error message based on the type of mismatch
@@ -1993,10 +2095,22 @@ impl TypeChecker {
if let Some((_, field_type)) = fields.iter().find(|(n, _)| n == &operation.name) {
// It's a function call on a module field
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
let result_type = Type::var();
let expected_fn = Type::function(arg_types, result_type.clone());
if let Err(e) = unify(field_type, &expected_fn) {
if let Err(e) = unify_with_env(field_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Type mismatch in {}.{} call: {}",
@@ -2052,6 +2166,17 @@ impl TypeChecker {
// Check argument types
let arg_types: Vec<Type> = args.iter().map(|a| self.infer_expr(a)).collect();
// Propagate effects from callback arguments to enclosing scope
for arg_type in &arg_types {
if let Type::Function { effects, .. } = arg_type {
for effect in &effects.effects {
if self.inferring_effects {
self.inferred_effects.insert(effect.clone());
}
}
}
}
if arg_types.len() != op.params.len() {
self.errors.push(TypeError {
message: format!(
@@ -2068,7 +2193,7 @@ impl TypeChecker {
for (i, (arg_type, (_, param_type))) in
arg_types.iter().zip(op.params.iter()).enumerate()
{
if let Err(e) = unify(arg_type, param_type) {
if let Err(e) = unify_with_env(arg_type, param_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Argument {} of '{}.{}' has type {}, expected {}: {}",
@@ -2101,6 +2226,7 @@ impl TypeChecker {
fn infer_field(&mut self, object: &Expr, field: &Ident, span: Span) -> Type {
let object_type = self.infer_expr(object);
let object_type = self.env.expand_type_alias(&object_type);
match &object_type {
Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) {
@@ -2181,7 +2307,7 @@ impl TypeChecker {
// Check return type if specified
let ret_type = if let Some(rt) = return_type {
let declared = self.resolve_type(rt);
if let Err(e) = unify(&body_type, &declared) {
if let Err(e) = unify_with_env(&body_type, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Lambda body type {} doesn't match declared {}: {}",
@@ -2247,7 +2373,7 @@ impl TypeChecker {
span: Span,
) -> Type {
let cond_type = self.infer_expr(condition);
if let Err(e) = unify(&cond_type, &Type::Bool) {
if let Err(e) = unify_with_env(&cond_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("If condition must be Bool, got {}: {}", cond_type, e),
span: condition.span(),
@@ -2257,7 +2383,7 @@ impl TypeChecker {
let then_type = self.infer_expr(then_branch);
let else_type = self.infer_expr(else_branch);
match unify(&then_type, &else_type) {
match unify_with_env(&then_type, &else_type, &self.env) {
Ok(subst) => then_type.apply(&subst),
Err(e) => {
self.errors.push(TypeError {
@@ -2298,7 +2424,7 @@ impl TypeChecker {
// Check guard if present
if let Some(ref guard) = arm.guard {
let guard_type = self.infer_expr(guard);
if let Err(e) = unify(&guard_type, &Type::Bool) {
if let Err(e) = unify_with_env(&guard_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Match guard must be Bool: {}", e),
span: guard.span(),
@@ -2314,7 +2440,7 @@ impl TypeChecker {
match &result_type {
None => result_type = Some(body_type),
Some(prev) => {
if let Err(e) = unify(prev, &body_type) {
if let Err(e) = unify_with_env(prev, &body_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Match arm has incompatible type: expected {}, got {}: {}",
@@ -2364,7 +2490,7 @@ impl TypeChecker {
Pattern::Literal(lit) => {
let lit_type = self.infer_literal(lit);
if let Err(e) = unify(&lit_type, expected) {
if let Err(e) = unify_with_env(&lit_type, expected, &self.env) {
self.errors.push(TypeError {
message: format!("Pattern literal type mismatch: {}", e),
span: lit.span,
@@ -2373,12 +2499,12 @@ impl TypeChecker {
Vec::new()
}
Pattern::Constructor { name, fields, span } => {
Pattern::Constructor { name, fields, span, .. } => {
// Look up constructor
// For now, handle Option specially
match name.name.as_str() {
"None" => {
if let Err(e) = unify(expected, &Type::Option(Box::new(Type::var()))) {
if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(Type::var())), &self.env) {
self.errors.push(TypeError {
message: format!(
"None pattern doesn't match type {}: {}",
@@ -2391,7 +2517,7 @@ impl TypeChecker {
}
"Some" => {
let inner_type = Type::var();
if let Err(e) = unify(expected, &Type::Option(Box::new(inner_type.clone())))
if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(inner_type.clone())), &self.env)
{
self.errors.push(TypeError {
message: format!(
@@ -2420,7 +2546,7 @@ impl TypeChecker {
Pattern::Tuple { elements, span } => {
let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect();
if let Err(e) = unify(expected, &Type::Tuple(element_types.clone())) {
if let Err(e) = unify_with_env(expected, &Type::Tuple(element_types.clone()), &self.env) {
self.errors.push(TypeError {
message: format!("Tuple pattern doesn't match type {}: {}", expected, e),
span: *span,
@@ -2470,7 +2596,7 @@ impl TypeChecker {
if let Some(type_expr) = typ {
let declared = self.resolve_type(type_expr);
if let Err(e) = unify(&value_type, &declared) {
if let Err(e) = unify_with_env(&value_type, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -2491,12 +2617,47 @@ impl TypeChecker {
self.infer_expr(result)
}
fn infer_record(&mut self, fields: &[(Ident, Expr)], _span: Span) -> Type {
let field_types: Vec<(String, Type)> = fields
fn infer_record(
&mut self,
spread: Option<&Expr>,
fields: &[(Ident, Expr)],
span: Span,
) -> Type {
// Start with spread fields if present
let mut field_types: Vec<(String, Type)> = if let Some(spread_expr) = spread {
let spread_type = self.infer_expr(spread_expr);
let spread_type = self.env.expand_type_alias(&spread_type);
match spread_type {
Type::Record(spread_fields) => spread_fields,
_ => {
self.errors.push(TypeError {
message: format!(
"Spread expression must be a record type, got {}",
spread_type
),
span,
});
Vec::new()
}
}
} else {
Vec::new()
};
// Apply explicit field overrides
let explicit_types: Vec<(String, Type)> = fields
.iter()
.map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
.collect();
for (name, typ) in explicit_types {
if let Some(existing) = field_types.iter_mut().find(|(n, _)| n == &name) {
existing.1 = typ;
} else {
field_types.push((name, typ));
}
}
Type::Record(field_types)
}
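The spread-then-override merge in `infer_record` can be illustrated in isolation. A hedged sketch with a hypothetical `merge_fields` helper and field types as strings: explicit fields replace same-named spread fields in place, and new names append at the end.

```rust
// Merge spread-derived fields with explicit overrides, as infer_record does:
// an explicit field updates the matching spread field's type in place;
// a field absent from the spread is appended.
fn merge_fields(
    mut base: Vec<(String, String)>,
    explicit: Vec<(String, String)>,
) -> Vec<(String, String)> {
    for (name, typ) in explicit {
        if let Some(slot) = base.iter_mut().find(|(n, _)| n == &name) {
            slot.1 = typ;
        } else {
            base.push((name, typ));
        }
    }
    base
}

fn main() {
    let base = vec![("x".to_string(), "Int".to_string()), ("y".to_string(), "Int".to_string())];
    let merged = merge_fields(
        base,
        vec![("y".to_string(), "Float".to_string()), ("z".to_string(), "Bool".to_string())],
    );
    assert_eq!(
        merged,
        vec![
            ("x".to_string(), "Int".to_string()),
            ("y".to_string(), "Float".to_string()),
            ("z".to_string(), "Bool".to_string()),
        ]
    );
    println!("ok");
}
```

Updating in place rather than appending keeps the overridden field at its original position, so `{ ...base, y: 1.0 }` has the same field order as `base`.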
@@ -2513,7 +2674,7 @@ impl TypeChecker {
let first_type = self.infer_expr(&elements[0]);
for elem in &elements[1..] {
let elem_type = self.infer_expr(elem);
if let Err(e) = unify(&first_type, &elem_type) {
if let Err(e) = unify_with_env(&first_type, &elem_type, &self.env) {
self.errors.push(TypeError {
message: format!("List elements must have same type: {}", e),
span,
@@ -2819,7 +2980,7 @@ impl TypeChecker {
// Check return type matches if specified
if let Some(ref return_type_expr) = impl_method.return_type {
let return_type = self.resolve_type(return_type_expr);
if let Err(e) = unify(&body_type, &return_type) {
if let Err(e) = unify_with_env(&body_type, &return_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Method '{}' body has type {}, but declared return type is {}: {}",
@@ -2862,6 +3023,12 @@ impl TypeChecker {
"Option" if resolved_args.len() == 1 => {
return Type::Option(Box::new(resolved_args[0].clone()));
}
"Map" if resolved_args.len() == 2 => {
return Type::Map(Box::new(resolved_args[0].clone()), Box::new(resolved_args[1].clone()));
}
"Ref" if resolved_args.len() == 1 => {
return Type::Ref(Box::new(resolved_args[0].clone()));
}
_ => {}
}
}
@@ -47,6 +47,10 @@ pub enum Type {
List(Box<Type>),
/// Option type (sugar for App(Option, [T]))
Option(Box<Type>),
/// Map type (sugar for App(Map, [K, V]))
Map(Box<Type>, Box<Type>),
/// Ref type — mutable reference cell holding a value of type T
Ref(Box<Type>),
/// Versioned type (e.g., User @v2)
Versioned {
base: Box<Type>,
@@ -118,7 +122,8 @@ impl Type {
}
Type::Tuple(elements) => elements.iter().any(|e| e.contains_var(var)),
Type::Record(fields) => fields.iter().any(|(_, t)| t.contains_var(var)),
Type::List(inner) | Type::Option(inner) => inner.contains_var(var),
Type::List(inner) | Type::Option(inner) | Type::Ref(inner) => inner.contains_var(var),
Type::Map(k, v) => k.contains_var(var) || v.contains_var(var),
Type::Versioned { base, .. } => base.contains_var(var),
_ => false,
}
@@ -158,6 +163,8 @@ impl Type {
),
Type::List(inner) => Type::List(Box::new(inner.apply(subst))),
Type::Option(inner) => Type::Option(Box::new(inner.apply(subst))),
Type::Ref(inner) => Type::Ref(Box::new(inner.apply(subst))),
Type::Map(k, v) => Type::Map(Box::new(k.apply(subst)), Box::new(v.apply(subst))),
Type::Versioned { base, version } => Type::Versioned {
base: Box::new(base.apply(subst)),
version: version.clone(),
@@ -207,7 +214,12 @@ impl Type {
}
vars
}
Type::List(inner) | Type::Option(inner) => inner.free_vars(),
Type::List(inner) | Type::Option(inner) | Type::Ref(inner) => inner.free_vars(),
Type::Map(k, v) => {
let mut vars = k.free_vars();
vars.extend(v.free_vars());
vars
}
Type::Versioned { base, .. } => base.free_vars(),
_ => HashSet::new(),
}
@@ -279,6 +291,8 @@ impl fmt::Display for Type {
}
Type::List(inner) => write!(f, "List<{}>", inner),
Type::Option(inner) => write!(f, "Option<{}>", inner),
Type::Ref(inner) => write!(f, "Ref<{}>", inner),
Type::Map(k, v) => write!(f, "Map<{}, {}>", k, v),
Type::Versioned { base, version } => {
write!(f, "{} {}", base, version)
}
@@ -946,6 +960,46 @@ impl TypeEnv {
params: vec![("path".to_string(), Type::String)],
return_type: Type::Unit,
},
EffectOpDef {
name: "copy".to_string(),
params: vec![
("source".to_string(), Type::String),
("dest".to_string(), Type::String),
],
return_type: Type::Unit,
},
EffectOpDef {
name: "glob".to_string(),
params: vec![("pattern".to_string(), Type::String)],
return_type: Type::List(Box::new(Type::String)),
},
EffectOpDef {
name: "tryRead".to_string(),
params: vec![("path".to_string(), Type::String)],
return_type: Type::App {
constructor: Box::new(Type::Named("Result".to_string())),
args: vec![Type::String, Type::String],
},
},
EffectOpDef {
name: "tryWrite".to_string(),
params: vec![
("path".to_string(), Type::String),
("content".to_string(), Type::String),
],
return_type: Type::App {
constructor: Box::new(Type::Named("Result".to_string())),
args: vec![Type::Unit, Type::String],
},
},
EffectOpDef {
name: "tryDelete".to_string(),
params: vec![("path".to_string(), Type::String)],
return_type: Type::App {
constructor: Box::new(Type::Named("Result".to_string())),
args: vec![Type::Unit, Type::String],
},
},
],
},
);
@@ -1146,6 +1200,15 @@ impl TypeEnv {
],
return_type: Type::Unit,
},
EffectOpDef {
name: "assertEqualMsg".to_string(),
params: vec![
("expected".to_string(), Type::Var(0)),
("actual".to_string(), Type::Var(0)),
("label".to_string(), Type::String),
],
return_type: Type::Unit,
},
EffectOpDef {
name: "assertNotEqual".to_string(),
params: vec![
@@ -1480,6 +1543,16 @@ impl TypeEnv {
Type::Option(Box::new(Type::var())),
),
),
(
"findIndex".to_string(),
Type::function(
vec![
Type::List(Box::new(Type::var())),
Type::function(vec![Type::var()], Type::Bool),
],
Type::Option(Box::new(Type::Int)),
),
),
(
"any".to_string(),
Type::function(
@@ -1524,6 +1597,50 @@ impl TypeEnv {
Type::Unit,
),
),
(
"sort".to_string(),
Type::function(
vec![Type::List(Box::new(Type::var()))],
Type::List(Box::new(Type::var())),
),
),
(
"sortBy".to_string(),
{
let elem = Type::var();
Type::function(
vec![
Type::List(Box::new(elem.clone())),
Type::function(vec![elem.clone(), elem], Type::Int),
],
Type::List(Box::new(Type::var())),
)
},
),
(
"zip".to_string(),
Type::function(
vec![
Type::List(Box::new(Type::var())),
Type::List(Box::new(Type::var())),
],
Type::List(Box::new(Type::Tuple(vec![Type::var(), Type::var()]))),
),
),
(
"flatten".to_string(),
Type::function(
vec![Type::List(Box::new(Type::List(Box::new(Type::var()))))],
Type::List(Box::new(Type::var())),
),
),
(
"contains".to_string(),
Type::function(
vec![Type::List(Box::new(Type::var())), Type::var()],
Type::Bool,
),
),
]);
env.bind("List", TypeScheme::mono(list_module_type));
@@ -1599,6 +1716,14 @@ impl TypeEnv {
"parseFloat".to_string(),
Type::function(vec![Type::String], Type::Option(Box::new(Type::Float))),
),
(
"indexOf".to_string(),
Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
),
(
"lastIndexOf".to_string(),
Type::function(vec![Type::String, Type::String], Type::Option(Box::new(Type::Int))),
),
]);
env.bind("String", TypeScheme::mono(string_module_type));
@@ -1758,6 +1883,99 @@ impl TypeEnv {
]);
env.bind("Option", TypeScheme::mono(option_module_type));
// Map module
let map_v = || Type::var();
let map_type = || Type::Map(Box::new(Type::String), Box::new(Type::var()));
let map_module_type = Type::Record(vec![
(
"new".to_string(),
Type::function(vec![], map_type()),
),
(
"set".to_string(),
Type::function(
vec![map_type(), Type::String, map_v()],
map_type(),
),
),
(
"get".to_string(),
Type::function(
vec![map_type(), Type::String],
Type::Option(Box::new(map_v())),
),
),
(
"contains".to_string(),
Type::function(vec![map_type(), Type::String], Type::Bool),
),
(
"remove".to_string(),
Type::function(vec![map_type(), Type::String], map_type()),
),
(
"keys".to_string(),
Type::function(vec![map_type()], Type::List(Box::new(Type::String))),
),
(
"values".to_string(),
Type::function(vec![map_type()], Type::List(Box::new(map_v()))),
),
(
"size".to_string(),
Type::function(vec![map_type()], Type::Int),
),
(
"isEmpty".to_string(),
Type::function(vec![map_type()], Type::Bool),
),
(
"fromList".to_string(),
Type::function(
vec![Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()])))],
map_type(),
),
),
(
"toList".to_string(),
Type::function(
vec![map_type()],
Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()]))),
),
),
(
"merge".to_string(),
Type::function(vec![map_type(), map_type()], map_type()),
),
]);
env.bind("Map", TypeScheme::mono(map_module_type));
// Ref module
let ref_inner = || Type::var();
let ref_type = || Type::Ref(Box::new(Type::var()));
let ref_module_type = Type::Record(vec![
(
"new".to_string(),
Type::function(vec![ref_inner()], ref_type()),
),
(
"get".to_string(),
Type::function(vec![ref_type()], ref_inner()),
),
(
"set".to_string(),
Type::function(vec![ref_type(), ref_inner()], Type::Unit),
),
(
"update".to_string(),
Type::function(
vec![ref_type(), Type::function(vec![ref_inner()], ref_inner())],
Type::Unit,
),
),
]);
env.bind("Ref", TypeScheme::mono(ref_module_type));
// Result module
let result_type = Type::App {
constructor: Box::new(Type::Named("Result".to_string())),
@@ -1870,9 +2088,47 @@ impl TypeEnv {
"round".to_string(),
Type::function(vec![Type::var()], Type::Int),
),
(
"sin".to_string(),
Type::function(vec![Type::Float], Type::Float),
),
(
"cos".to_string(),
Type::function(vec![Type::Float], Type::Float),
),
(
"atan2".to_string(),
Type::function(vec![Type::Float, Type::Float], Type::Float),
),
]);
env.bind("Math", TypeScheme::mono(math_module_type));
// Int module
let int_module_type = Type::Record(vec![
(
"toString".to_string(),
Type::function(vec![Type::Int], Type::String),
),
(
"toFloat".to_string(),
Type::function(vec![Type::Int], Type::Float),
),
]);
env.bind("Int", TypeScheme::mono(int_module_type));
// Float module
let float_module_type = Type::Record(vec![
(
"toString".to_string(),
Type::function(vec![Type::Float], Type::String),
),
(
"toInt".to_string(),
Type::function(vec![Type::Float], Type::Int),
),
]);
env.bind("Float", TypeScheme::mono(float_module_type));
env
}
@@ -1956,6 +2212,12 @@ impl TypeEnv {
Type::Option(inner) => {
Type::Option(Box::new(self.expand_type_alias(inner)))
}
Type::Map(k, v) => {
Type::Map(Box::new(self.expand_type_alias(k)), Box::new(self.expand_type_alias(v)))
}
Type::Ref(inner) => {
Type::Ref(Box::new(self.expand_type_alias(inner)))
}
Type::Versioned { base, version } => {
Type::Versioned {
base: Box::new(self.expand_type_alias(base)),
@@ -2032,7 +2294,9 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
// Function's required effects (e1) must be a subset of available effects (e2)
// A pure function (empty effects) can be called anywhere
// A function requiring {Logger} can be called in context with {Logger} or {Logger, Console}
if !e1.is_subset(&e2) {
// When expected effects (e2) are empty, it means "no constraint" (e.g., callback parameter)
// so we allow any actual effects through
if !e2.is_empty() && !e1.is_subset(&e2) {
return Err(format!(
"Effect mismatch: expected {{{}}}, got {{{}}}",
e1, e2
@@ -2114,6 +2378,16 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
// Option
(Type::Option(a), Type::Option(b)) => unify(a, b),
// Ref
(Type::Ref(a), Type::Ref(b)) => unify(a, b),
// Map
(Type::Map(k1, v1), Type::Map(k2, v2)) => {
let s1 = unify(k1, k2)?;
let s2 = unify(&v1.apply(&s1), &v2.apply(&s1))?;
Ok(s1.compose(&s2))
}
// Versioned types
(
Type::Versioned {


@@ -14,6 +14,7 @@
pub type Html<M> =
| Element(String, List<Attr<M>>, List<Html<M>>)
| Text(String)
| RawHtml(String)
| Empty
// Attributes that can be applied to elements
@@ -41,6 +42,7 @@ pub type Attr<M> =
| OnKeyDown(fn(String): M)
| OnKeyUp(fn(String): M)
| DataAttr(String, String)
| Attribute(String, String)
// ============================================================================
// Element builders - Container elements
@@ -180,6 +182,28 @@ pub fn video<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
pub fn audio<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
Element("audio", attrs, children)
// ============================================================================
// Element builders - Document / Head elements
// ============================================================================
pub fn meta<M>(attrs: List<Attr<M>>): Html<M> =
Element("meta", attrs, [])
pub fn link<M>(attrs: List<Attr<M>>): Html<M> =
Element("link", attrs, [])
pub fn script<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
Element("script", attrs, children)
pub fn iframe<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
Element("iframe", attrs, children)
pub fn figure<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
Element("figure", attrs, children)
pub fn figcaption<M>(attrs: List<Attr<M>>, children: List<Html<M>>): Html<M> =
Element("figcaption", attrs, children)
// ============================================================================
// Element builders - Tables
// ============================================================================
@@ -285,6 +309,12 @@ pub fn onKeyUp<M>(h: fn(String): M): Attr<M> =
pub fn data<M>(name: String, value: String): Attr<M> =
DataAttr(name, value)
pub fn attr<M>(name: String, value: String): Attr<M> =
Attribute(name, value)
pub fn rawHtml<M>(content: String): Html<M> =
RawHtml(content)
// ============================================================================
// Utility functions
// ============================================================================
@@ -319,6 +349,7 @@ pub fn renderAttr<M>(attr: Attr<M>): String =
Checked(false) => "",
Name(n) => " name=\"" + n + "\"",
DataAttr(name, value) => " data-" + name + "=\"" + value + "\"",
Attribute(name, value) => " " + name + "=\"" + value + "\"",
// Event handlers are ignored in static rendering
OnClick(_) => "",
OnInput(_) => "",
@@ -355,6 +386,7 @@ pub fn render<M>(html: Html<M>): String =
}
},
Text(content) => escapeHtml(content),
RawHtml(content) => content,
Empty => ""
}
@@ -368,15 +400,47 @@ pub fn escapeHtml(s: String): String = {
s4
}
// Render a full HTML document (basic)
pub fn document(title: String, headExtra: List<Html<M>>, bodyContent: List<Html<M>>): String = {
let headElements = List.concat([
[Element("meta", [Attribute("charset", "UTF-8")], [])],
[Element("meta", [Name("viewport"), Attribute("content", "width=device-width, initial-scale=1.0")], [])],
[Element("title", [], [Text(title)])],
headExtra
])
let doc = Element("html", [Attribute("lang", "en")], [
Element("head", [], headElements),
Element("body", [], bodyContent)
])
"<!DOCTYPE html>\n" + render(doc)
}
// Render a full HTML document with SEO meta tags
pub fn seoDocument(
title: String,
description: String,
url: String,
ogImage: String,
headExtra: List<Html<M>>,
bodyContent: List<Html<M>>
): String = {
let headElements = List.concat([
[Element("meta", [Attribute("charset", "UTF-8")], [])],
[Element("meta", [Name("viewport"), Attribute("content", "width=device-width, initial-scale=1.0")], [])],
[Element("title", [], [Text(title)])],
[Element("meta", [Name("description"), Attribute("content", description)], [])],
[Element("meta", [Attribute("property", "og:title"), Attribute("content", title)], [])],
[Element("meta", [Attribute("property", "og:description"), Attribute("content", description)], [])],
[Element("meta", [Attribute("property", "og:type"), Attribute("content", "website")], [])],
[Element("meta", [Attribute("property", "og:url"), Attribute("content", url)], [])],
[Element("meta", [Attribute("property", "og:image"), Attribute("content", ogImage)], [])],
[Element("meta", [Name("twitter:card"), Attribute("content", "summary_large_image")], [])],
[Element("meta", [Name("twitter:title"), Attribute("content", title)], [])],
[Element("meta", [Name("twitter:description"), Attribute("content", description)], [])],
[Element("link", [Attribute("rel", "canonical"), Href(url)], [])],
headExtra
])
let doc = Element("html", [Attribute("lang", "en")], [
Element("head", [], headElements),
Element("body", [], bodyContent)
])


@@ -625,6 +625,41 @@ pub fn router(routes: List<Route>, notFound: fn(Request): Response): Handler =
}
}
// ============================================================
// Static File Serving
// ============================================================
// Serve a static file from disk
pub fn serveStaticFile(basePath: String, requestPath: String): Response with {File} = {
let filePath = basePath + requestPath
if File.exists(filePath) then {
let content = File.read(filePath)
let mime = getMimeType(filePath)
{ status: 200, headers: [("Content-Type", mime)], body: content }
} else
{ status: 404, headers: textHeaders(), body: "Not Found" }
}
// ============================================================
// Form Body Parsing
// ============================================================
// Parse URL-encoded form body (same format as query strings)
pub fn parseFormBody(body: String): List<(String, String)> =
parseQueryParams(body)
// Get a form field value by name
pub fn getFormField(fields: List<(String, String)>, name: String): Option<String> =
getParam(fields, name)
// ============================================================
// Response Helpers
// ============================================================
// Send a Response using HttpServer effect (convenience wrapper)
pub fn sendResponse(resp: Response): Unit with {HttpServer} =
HttpServer.respondWithHeaders(resp.status, resp.body, resp.headers)
// ============================================================
// Example Usage
// ============================================================