18 Commits

3a2376cd49 feat: port AST definitions to Lux (self-hosting)
Translate all 30+ type definitions from src/ast.rs (727 lines of Rust)
into Lux ADTs in projects/lux-compiler/ast.lux.

Types ported: Span, Ident, Visibility, Version, VersionConstraint,
BehavioralProperty, WhereClause, ModulePath, ImportDecl, Program,
Declaration, FunctionDecl, Parameter, EffectDecl, EffectOp, TypeDecl,
TypeDef, RecordField, Variant, VariantFields, Migration, HandlerDecl,
HandlerImpl, LetDecl, TraitDecl, TraitMethod, TraitBound, ImplDecl,
TraitConstraint, ImplMethod, TypeExpr, Expr (19 variants), Literal,
LiteralKind, BinaryOp, UnaryOp, Statement, MatchArm, Pattern.

Passes `lux check` and `lux run`.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:07:30 -05:00
4dfb04a1b6 chore: sync Cargo.lock with version 0.1.3
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:51 -05:00
3cdde02eb2 feat: add Int.toFloat/Float.toInt JS backend support and fix Map C codegen
- JS backend: Add Int/Float module dispatch in both Call and EffectOp paths
  for toFloat, toInt, and toString operations
- C backend: Fix lux_strdup → lux_string_dup in Map module codegen
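A minimal sketch of what this enables, assuming surface syntax along the lines used elsewhere in the repo (binding names are hypothetical):

```lux
// Hypothetical example: Int/Float conversions now work on the JS backend too
let half = fn (n: Int) => Int.toFloat(n) / 2.0
let trunc = fn (x: Float) => Float.toInt(x)  // C backend truncates via cast; JS assumed to match
```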

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 02:05:40 -05:00
a5762d0397 feat: add built-in Map type with String keys
Add Map<String, V> as a first-class built-in type for key-value storage,
needed for self-hosting the compiler (parser/typechecker/interpreter all
rely heavily on hashmaps).

- types.rs: Type::Map(K,V) variant, all match arms (unify, apply, etc.)
- interpreter.rs: Value::Map, 12 BuiltinFn variants (new/set/get/contains/
  remove/keys/values/size/isEmpty/fromList/toList/merge), immutable semantics
- typechecker.rs: Map<K,V> resolution in resolve_type
- js_backend.rs: Map as JS Map with emit_map_operation()
- c_backend.rs: LuxMap struct (linear-scan), runtime fns, emit_map_operation()
- main.rs: 12 tests covering all Map operations
- validate.sh: now checks all projects/ directories too
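A hedged usage sketch of the new API (operation names from the list above; exact surface syntax is an assumption):

```lux
// Hypothetical example exercising the immutable Map semantics
let scores = Map.set(Map.set(Map.new(), "alice", 1), "bob", 2)
let pruned = Map.remove(scores, "bob")  // scores itself is unchanged
let n = Map.size(pruned)
```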

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:45:13 -05:00
1132c621c6 fix: allow newlines before then in if/then/else expressions
The parser now skips newlines between the condition and `then` keyword,
enabling multiline if expressions like:
  if long_condition
    then expr1
    else expr2

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:38:05 -05:00
a0fff1814e fix: JS backend scoping for let/match/if inside closures
Three related bugs fixed:
- BUG-009: let bindings inside lambdas hoisted to top-level
- BUG-011: match expressions inside lambdas hoisted to top-level
- BUG-012: variable name deduplication leaked across function scopes

Root cause: emit_expr() uses writeln() for statements, but lambdas
captured only the return value, not the emitted statements. Also,
var_substitutions from emit_function() leaked to subsequent code.

Fix: Lambda handler now captures all output emitted during body
evaluation and places it inside the function body. Both emit_function
and Lambda save/restore var_substitutions to prevent cross-scope leaks.
Lambda params are registered as identity substitutions to override any
outer bindings with the same name.
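A hypothetical Lux program of the shape that triggered BUG-009 (block syntax assumed):

```lux
// Before the fix, `let doubled = ...` was emitted at JS top level,
// outside the closure, and its name leaked into later function scopes.
let makeCounter = fn (start) => {
  let doubled = start * 2
  fn () => doubled + 1
}
```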

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 01:10:55 -05:00
4e9e823246 fix: record spread works with named type aliases
Resolve type aliases (e.g. Player -> { pos: Vec2, speed: Float })
before checking if spread expression is a record type. Previously
{ ...p, field: val } failed with "must be a record type, got Player"
when the variable had a named type annotation.
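A sketch of the now-working pattern (type and field names are hypothetical; Vec2 assumed defined elsewhere):

```lux
type Player = { pos: Vec2, speed: Float }
// Previously failed with: "must be a record type, got Player"
let boost = fn (p: Player) => { ...p, speed: p.speed * 2.0 }
```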

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-19 00:01:20 -05:00
6a2e4a7ac1 chore: bump version to 0.1.3 2026-02-18 23:06:10 -05:00
3d706cb32b feat: add record spread syntax { ...base, field: val }
Adds spread operator for records, allowing concise record updates:
  let p2 = { ...p, x: 5.0 }

Changes across the full pipeline:
- Lexer: new DotDotDot (...) token
- AST: optional spread field on Record variant
- Parser: detect ... at start of record expression
- Typechecker: merge spread record fields with explicit overrides
- Interpreter: evaluate spread, overlay explicit fields
- JS backend: emit native JS spread syntax
- C backend: copy spread into temp, assign overrides
- Formatter, linter, LSP, symbol table: propagate spread

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:27 -05:00
7c3bfa9301 feat: add Math.sin, Math.cos, Math.atan2 trig functions
Adds trigonometric functions to the Math module across interpreter,
type system, and C backend. JS backend already supported them.
Also adds #include <math.h> to C preamble and handles Math module
calls through both Call and EffectOp paths in C backend.
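A small hedged example combining the new functions with the existing Math.sqrt (names and tuple syntax are assumptions):

```lux
// Convert cartesian coordinates to polar (radius, angle)
let toPolar = fn (x: Float, y: Float) =>
  (Math.sqrt(x * x + y * y), Math.atan2(y, x))
```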

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:12 -05:00
b56c5461f1 fix: JS const _ duplication and hardcoded version string
- JS backend now emits wildcard let bindings as side-effect statements
  instead of const _ declarations, fixing SyntaxError on multiple let _ = ...
- Version string now uses env!("CARGO_PKG_VERSION") to auto-sync with Cargo.toml
- Add -lm linker flag for math library support

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 23:05:03 -05:00
61e1469845 feat: add ++ concat operator and auto-invoke main
BUG-004: Add ++ operator for string and list concatenation across all
backends (interpreter, C, JS) with type checking and formatting support.

BUG-001: Auto-invoke top-level `let main = fn () => ...` when main is
a zero-parameter function, instead of just printing the function value.
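Both fixes combined, as a hedged sketch:

```lux
// main is now invoked automatically instead of printing the function value
let main = fn () =>
  print("Hello, " ++ "world")  // ++ concatenates strings (and lists)
```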

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 22:01:41 -05:00
bb0a288210 chore: bump version to 0.1.2 2026-02-18 21:16:44 -05:00
5d7f4633e1 docs: add explicit commit instructions to CLAUDE.md
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:27 -05:00
d05b13d840 fix: JS backend compiles print() to console.log()
Bare `print()` calls in Lux now emit `console.log()` in JS output
instead of undefined `print()`. Fixes BUG-006.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 21:09:07 -05:00
0ee3050704 chore: bump version to 0.1.1 2026-02-18 20:41:43 -05:00
80b1276f9f fix: release script auto-bumps patch by default
Release script now supports: patch (default), minor, major, or explicit
version. Auto-updates Cargo.toml and flake.nix before building.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:41:29 -05:00
bd843d2219 fix: record type aliases now work for unification and field access
Expand type aliases via unify_with_env() everywhere in the type checker,
not just in a few places. This fixes named record types like
`type Vec2 = { x: Float, y: Float }` — they now properly unify with
anonymous records and support field access (v.x, v.y).

Also adds scripts/validate.sh for automated full-suite regression
testing (Rust tests + all 5 package test suites + type checking).
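A sketch of what now type-checks (example names are hypothetical):

```lux
type Vec2 = { x: Float, y: Float }
// The named alias unifies with the anonymous record literal,
// and field access resolves through the alias.
let add = fn (a: Vec2, b: Vec2) => { x: a.x + b.x, y: a.y + b.y }
```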

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-18 20:21:29 -05:00
20 changed files with 1713 additions and 111 deletions


@@ -42,17 +42,45 @@ When making changes:
7. **Fix language limitations**: If you encounter parser/type system limitations, fix them (without regressions on guarantees or speed)
8. **Git commits**: Always use `--no-gpg-sign` flag
### Post-work checklist (run after each major piece of work)
### Post-work checklist (run after each committable change)
**MANDATORY: Run the full validation script after every committable change:**
```bash
nix develop --command cargo check # No Rust errors
nix develop --command cargo test # All tests pass (currently 381)
./target/release/lux check # Type check + lint all .lux files
./target/release/lux fmt # Format all .lux files
./target/release/lux lint # Standalone lint pass
./scripts/validate.sh
```
This script runs ALL of the following checks and will fail if any regress:
1. `cargo check` — no Rust compilation errors
2. `cargo test` — all Rust tests pass (currently 387)
3. `cargo build --release` — release binary builds
4. `lux test` on every package (path, frontmatter, xml, rss, markdown) — all 286 package tests pass
5. `lux check` on every package — type checking + lint passes
If `validate.sh` is not available or you need to run manually:
```bash
nix develop --command cargo check # No Rust errors
nix develop --command cargo test # All Rust tests pass
nix develop --command cargo build --release # Build release binary
cd ../packages/path && ../../lang/target/release/lux test # Package tests
cd ../packages/frontmatter && ../../lang/target/release/lux test
cd ../packages/xml && ../../lang/target/release/lux test
cd ../packages/rss && ../../lang/target/release/lux test
cd ../packages/markdown && ../../lang/target/release/lux test
```
**Do NOT commit if any check fails.** Fix the issue first.
### Commit after every piece of work
**After completing each logical unit of work, commit immediately.** Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
**After completing each logical unit of work, commit immediately.** This is NOT optional — every fix, feature, or change MUST be committed right away. Do not let changes accumulate uncommitted across multiple features. Each commit should be a single logical change (one feature, one bugfix, etc.). Use `--no-gpg-sign` flag for all commits.
**Commit workflow:**
1. Make the change
2. Run `./scripts/validate.sh` (all 13 checks must pass)
3. `git add` the relevant files
4. `git commit --no-gpg-sign -m "type: description"` (use conventional commits: fix/feat/chore/docs)
5. Move on to the next task
**Never skip committing.** If you fixed a bug, commit it. If you added a feature, commit it. If you updated docs, commit it. Do not batch unrelated changes into one commit.
**IMPORTANT: Always verify Lux code you write:**
- Run with interpreter: `./target/release/lux file.lux`
@@ -109,7 +137,7 @@ When working on any major task that involves writing Lux code, **document every
## Code Quality
- Fix all compiler warnings before committing
- Ensure all tests pass (currently 381 tests)
- Ensure all tests pass (currently 387 tests)
- Add new tests when adding features
- Keep examples and documentation in sync

Cargo.lock (generated)

@@ -770,7 +770,7 @@ dependencies = [
[[package]]
name = "lux"
version = "0.1.0"
version = "0.1.3"
dependencies = [
"lsp-server",
"lsp-types",


@@ -1,6 +1,6 @@
[package]
name = "lux"
version = "0.1.0"
version = "0.1.3"
edition = "2021"
description = "A functional programming language with first-class effects, schema evolution, and behavioral types"
license = "MIT"


@@ -44,7 +44,7 @@
printf "\n"
printf " \033[1;35m \033[0m\n"
printf " \033[1;35m \033[0m\n"
printf " \033[1;35m \033[0m v0.1.0\n"
printf " \033[1;35m \033[0m v0.1.3\n"
printf "\n"
printf " Functional language with first-class effects\n"
printf "\n"
@@ -62,7 +62,7 @@
packages.default = pkgs.rustPlatform.buildRustPackage {
pname = "lux";
version = "0.1.0";
version = "0.1.3";
src = ./.;
cargoLock.lockFile = ./Cargo.lock;
@@ -79,7 +79,7 @@
};
in muslPkgs.rustPlatform.buildRustPackage {
pname = "lux";
version = "0.1.0";
version = "0.1.3";
src = ./.;
cargoLock.lockFile = ./Cargo.lock;


@@ -0,0 +1,225 @@
// Lux AST — Self-hosted Abstract Syntax Tree definitions
//
// Direct translation of src/ast.rs into Lux ADTs.
// These types represent the parsed structure of a Lux program.
//
// Naming conventions to avoid collisions:
// Ex = Expr variant, Pat = Pattern, Te = TypeExpr
// Td = TypeDef, Vf = VariantFields, Op = Operator
// Decl = Declaration, St = Statement
// === Source Location ===
type Span = | Span(Int, Int)
// === Identifiers ===
type Ident = | Ident(String, Span)
// === Visibility ===
type Visibility = | Public | Private
// === Schema Evolution ===
type Version = | Version(Int, Span)
type VersionConstraint =
| VcExact(Version)
| VcAtLeast(Version)
| VcLatest(Span)
// === Behavioral Types ===
type BehavioralProperty =
| BpPure
| BpTotal
| BpIdempotent
| BpDeterministic
| BpCommutative
// === Trait Bound (needed before WhereClause) ===
type TraitBound = | TraitBound(Ident, List<TypeExpr>, Span)
// === Trait Constraint (needed before WhereClause) ===
type TraitConstraint = | TraitConstraint(Ident, List<TraitBound>, Span)
// === Where Clauses ===
type WhereClause =
| WcProperty(Ident, BehavioralProperty, Span)
| WcResult(Expr, Span)
| WcTrait(TraitConstraint)
// === Module Path ===
type ModulePath = | ModulePath(List<Ident>, Span)
// === Import ===
// path, alias, items, wildcard, span
type ImportDecl = | ImportDecl(ModulePath, Option<Ident>, Option<List<Ident>>, Bool, Span)
// === Program ===
type Program = | Program(List<ImportDecl>, List<Declaration>)
// === Declarations ===
type Declaration =
| DeclFunction(FunctionDecl)
| DeclEffect(EffectDecl)
| DeclType(TypeDecl)
| DeclHandler(HandlerDecl)
| DeclLet(LetDecl)
| DeclTrait(TraitDecl)
| DeclImpl(ImplDecl)
// === Parameter ===
type Parameter = | Parameter(Ident, TypeExpr, Span)
// === Effect Operation ===
type EffectOp = | EffectOp(Ident, List<Parameter>, TypeExpr, Span)
// === Record Field ===
type RecordField = | RecordField(Ident, TypeExpr, Span)
// === Variant Fields ===
type VariantFields =
| VfUnit
| VfTuple(List<TypeExpr>)
| VfRecord(List<RecordField>)
// === Variant ===
type Variant = | Variant(Ident, VariantFields, Span)
// === Migration ===
type Migration = | Migration(Version, Expr, Span)
// === Handler Impl ===
// op_name, params, resume, body, span
type HandlerImpl = | HandlerImpl(Ident, List<Ident>, Option<Ident>, Expr, Span)
// === Impl Method ===
// name, params, return_type, body, span
type ImplMethod = | ImplMethod(Ident, List<Parameter>, Option<TypeExpr>, Expr, Span)
// === Trait Method ===
// name, type_params, params, return_type, default_impl, span
type TraitMethod = | TraitMethod(Ident, List<Ident>, List<Parameter>, TypeExpr, Option<Expr>, Span)
// === Type Expressions ===
type TypeExpr =
| TeNamed(Ident)
| TeApp(TypeExpr, List<TypeExpr>)
| TeFunction(List<TypeExpr>, TypeExpr, List<Ident>)
| TeTuple(List<TypeExpr>)
| TeRecord(List<RecordField>)
| TeUnit
| TeVersioned(TypeExpr, VersionConstraint)
// === Literal ===
type LiteralKind =
| LitInt(Int)
| LitFloat(String)
| LitString(String)
| LitChar(Char)
| LitBool(Bool)
| LitUnit
type Literal = | Literal(LiteralKind, Span)
// === Binary Operators ===
type BinaryOp =
| OpAdd | OpSub | OpMul | OpDiv | OpMod
| OpEq | OpNe | OpLt | OpLe | OpGt | OpGe
| OpAnd | OpOr
| OpPipe | OpConcat
// === Unary Operators ===
type UnaryOp = | OpNeg | OpNot
// === Statements ===
type Statement =
| StExpr(Expr)
| StLet(Ident, Option<TypeExpr>, Expr, Span)
// === Match Arms ===
type MatchArm = | MatchArm(Pattern, Option<Expr>, Expr, Span)
// === Patterns ===
type Pattern =
| PatWildcard(Span)
| PatVar(Ident)
| PatLiteral(Literal)
| PatConstructor(Ident, List<Pattern>, Span)
| PatRecord(List<(Ident, Pattern)>, Span)
| PatTuple(List<Pattern>, Span)
// === Function Declaration ===
// visibility, doc, name, type_params, params, return_type, effects, properties, where_clauses, body, span
type FunctionDecl = | FunctionDecl(Visibility, Option<String>, Ident, List<Ident>, List<Parameter>, TypeExpr, List<Ident>, List<BehavioralProperty>, List<WhereClause>, Expr, Span)
// === Effect Declaration ===
// doc, name, type_params, operations, span
type EffectDecl = | EffectDecl(Option<String>, Ident, List<Ident>, List<EffectOp>, Span)
// === Type Declaration ===
// visibility, doc, name, type_params, version, definition, migrations, span
type TypeDecl = | TypeDecl(Visibility, Option<String>, Ident, List<Ident>, Option<Version>, TypeDef, List<Migration>, Span)
// === Handler Declaration ===
// name, params, effect, implementations, span
type HandlerDecl = | HandlerDecl(Ident, List<Parameter>, Ident, List<HandlerImpl>, Span)
// === Let Declaration ===
// visibility, doc, name, typ, value, span
type LetDecl = | LetDecl(Visibility, Option<String>, Ident, Option<TypeExpr>, Expr, Span)
// === Trait Declaration ===
// visibility, doc, name, type_params, super_traits, methods, span
type TraitDecl = | TraitDecl(Visibility, Option<String>, Ident, List<Ident>, List<TraitBound>, List<TraitMethod>, Span)
// === Impl Declaration ===
// type_params, constraints, trait_name, trait_args, target_type, methods, span
type ImplDecl = | ImplDecl(List<Ident>, List<TraitConstraint>, Ident, List<TypeExpr>, TypeExpr, List<ImplMethod>, Span)
// === Expressions ===
type Expr =
| ExLiteral(Literal)
| ExVar(Ident)
| ExBinaryOp(BinaryOp, Expr, Expr, Span)
| ExUnaryOp(UnaryOp, Expr, Span)
| ExCall(Expr, List<Expr>, Span)
| ExEffectOp(Ident, Ident, List<Expr>, Span)
| ExField(Expr, Ident, Span)
| ExTupleIndex(Expr, Int, Span)
| ExLambda(List<Parameter>, Option<TypeExpr>, List<Ident>, Expr, Span)
| ExLet(Ident, Option<TypeExpr>, Expr, Expr, Span)
| ExIf(Expr, Expr, Expr, Span)
| ExMatch(Expr, List<MatchArm>, Span)
| ExBlock(List<Statement>, Expr, Span)
| ExRecord(Option<Expr>, List<(Ident, Expr)>, Span)
| ExTuple(List<Expr>, Span)
| ExList(List<Expr>, Span)
| ExRun(Expr, List<(Ident, Expr)>, Span)
| ExResume(Expr, Span)


@@ -5,12 +5,20 @@ set -euo pipefail
# Builds a static binary, generates changelog, and creates a Gitea release.
#
# Usage:
# ./scripts/release.sh [version]
# ./scripts/release.sh # auto-bump patch (0.2.0 → 0.2.1)
# ./scripts/release.sh patch # same as above
# ./scripts/release.sh minor # bump minor (0.2.0 → 0.3.0)
# ./scripts/release.sh major # bump major (0.2.0 → 1.0.0)
# ./scripts/release.sh v1.2.3 # explicit version
#
# Environment:
# GITEA_TOKEN - API token for git.qrty.ink (prompted if not set)
# GITEA_URL - Gitea instance URL (default: https://git.qrty.ink)
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
GITEA_URL="${GITEA_URL:-https://git.qrty.ink}"
REPO_OWNER="blu"
REPO_NAME="lux"
@@ -30,14 +38,33 @@ warn() { printf "${YELLOW}!!${NC} %s\n" "$1"; }
err() { printf "${RED}error:${NC} %s\n" "$1" >&2; exit 1; }
# --- Determine version ---
VERSION="${1:-}"
if [ -z "$VERSION" ]; then
VERSION=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
info "Version from Cargo.toml: v$VERSION"
fi
# Ensure v prefix
[[ "$VERSION" == v* ]] || VERSION="v$VERSION"
TAG="$VERSION"
CURRENT=$(grep '^version' Cargo.toml | head -1 | sed 's/.*"\(.*\)".*/\1/')
BUMP="${1:-patch}"
bump_version() {
local ver="$1" part="$2"
IFS='.' read -r major minor patch <<< "$ver"
case "$part" in
major) echo "$((major + 1)).0.0" ;;
minor) echo "$major.$((minor + 1)).0" ;;
patch) echo "$major.$minor.$((patch + 1))" ;;
*) echo "$part" ;; # treat as explicit version
esac
}
case "$BUMP" in
major|minor|patch)
VERSION=$(bump_version "$CURRENT" "$BUMP")
info "Bumping $BUMP: $CURRENT → $VERSION"
;;
*)
# Explicit version — strip v prefix if present
VERSION="${BUMP#v}"
info "Explicit version: $VERSION"
;;
esac
TAG="v$VERSION"
# --- Check for clean working tree ---
if [ -n "$(git status --porcelain)" ]; then
@@ -50,7 +77,18 @@ fi
# --- Check if tag already exists ---
if git rev-parse "$TAG" >/dev/null 2>&1; then
err "Tag $TAG already exists. Bump version in Cargo.toml or choose a different version."
err "Tag $TAG already exists. Choose a different version."
fi
# --- Update version in source files ---
if [ "$VERSION" != "$CURRENT" ]; then
info "Updating version in Cargo.toml and flake.nix..."
sed -i "0,/^version = \"$CURRENT\"/s//version = \"$VERSION\"/" Cargo.toml
sed -i "s/version = \"$CURRENT\";/version = \"$VERSION\";/g" flake.nix
sed -i "s/v$CURRENT/v$VERSION/g" flake.nix
git add Cargo.toml flake.nix
git commit --no-gpg-sign -m "chore: bump version to $VERSION"
ok "Version updated and committed"
fi
# --- Generate changelog ---

scripts/validate.sh (new executable file, 93 lines)

@@ -0,0 +1,93 @@
#!/usr/bin/env bash
set -euo pipefail
# Lux Full Validation Script
# Runs all checks: Rust tests, package tests, type checking, formatting, linting.
# Run after every committable change to ensure no regressions.
# cd to repo root (the parent of this script's directory)
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/.."
LUX="$(pwd)/target/release/lux"
PACKAGES_DIR="$(pwd)/../packages"
RED='\033[0;31m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
BOLD='\033[1m'
NC='\033[0m'
FAILED=0
TOTAL=0
step() {
TOTAL=$((TOTAL + 1))
printf "${CYAN}[%d]${NC} %s... " "$TOTAL" "$1"
}
ok() { printf "${GREEN}ok${NC} %s\n" "${1:-}"; }
fail() { printf "${RED}FAIL${NC} %s\n" "${1:-}"; FAILED=$((FAILED + 1)); }
# --- Rust checks ---
step "cargo check"
if nix develop --command cargo check 2>&1 | grep -q "Finished"; then ok; else fail; fi
step "cargo test"
OUTPUT=$(nix develop --command cargo test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "test result:" || echo "no result")
if echo "$RESULT" | grep -q "0 failed"; then ok "$RESULT"; else fail "$RESULT"; fi
# --- Build release binary ---
step "cargo build --release"
if nix develop --command cargo build --release 2>&1 | grep -q "Finished"; then ok; else fail; fi
# --- Package tests ---
for pkg in path frontmatter xml rss markdown; do
PKG_DIR="$PACKAGES_DIR/$pkg"
if [ -d "$PKG_DIR" ]; then
step "lux test ($pkg)"
OUTPUT=$(cd "$PKG_DIR" && "$LUX" test 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
if echo "$RESULT" | grep -q "passed"; then ok "$RESULT"; else fail "$RESULT"; fi
fi
done
# --- Lux check on packages ---
for pkg in path frontmatter xml rss markdown; do
PKG_DIR="$PACKAGES_DIR/$pkg"
if [ -d "$PKG_DIR" ]; then
step "lux check ($pkg)"
OUTPUT=$(cd "$PKG_DIR" && "$LUX" check 2>&1 || true)
RESULT=$(echo "$OUTPUT" | grep "passed" | tail -1 || echo "no result")
if echo "$RESULT" | grep -q "passed"; then ok; else fail "$RESULT"; fi
fi
done
# --- Project checks ---
PROJECTS_DIR="$(pwd)/projects"
for proj_dir in "$PROJECTS_DIR"/*/; do
proj=$(basename "$proj_dir")
if [ -f "$proj_dir/main.lux" ]; then
step "lux check (project: $proj)"
OUTPUT=$("$LUX" check "$proj_dir/main.lux" 2>&1 || true)
if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
fi
# Check any standalone .lux files in the project
for lux_file in "$proj_dir"/*.lux; do
[ -f "$lux_file" ] || continue
fname=$(basename "$lux_file")
[ "$fname" = "main.lux" ] && continue
step "lux check (project: $proj/$fname)"
OUTPUT=$("$LUX" check "$lux_file" 2>&1 || true)
if echo "$OUTPUT" | grep -qi "error"; then fail; else ok; fi
done
done
# --- Summary ---
printf "\n${BOLD}═══ Validation Summary ═══${NC}\n"
if [ $FAILED -eq 0 ]; then
printf "${GREEN}All %d checks passed.${NC}\n" "$TOTAL"
else
printf "${RED}%d/%d checks failed.${NC}\n" "$FAILED" "$TOTAL"
exit 1
fi


@@ -541,7 +541,9 @@ pub enum Expr {
span: Span,
},
/// Record literal: { name: "Alice", age: 30 }
/// With optional spread: { ...base, name: "Bob" }
Record {
spread: Option<Box<Expr>>,
fields: Vec<(Ident, Expr)>,
span: Span,
},
@@ -622,6 +624,7 @@ pub enum BinaryOp {
Or,
// Other
Pipe, // |>
Concat, // ++
}
impl fmt::Display for BinaryOp {
@@ -641,6 +644,7 @@ impl fmt::Display for BinaryOp {
BinaryOp::And => write!(f, "&&"),
BinaryOp::Or => write!(f, "||"),
BinaryOp::Pipe => write!(f, "|>"),
BinaryOp::Concat => write!(f, "++"),
}
}
}


@@ -730,10 +730,10 @@ impl CBackend {
}
// Check for string concatenation - use lux_string_concat instead of +
if matches!(op, BinaryOp::Add) {
if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
let left_is_string = self.infer_expr_type(left).as_deref() == Some("LuxString");
let right_is_string = self.infer_expr_type(right).as_deref() == Some("LuxString");
if left_is_string || right_is_string {
if left_is_string || right_is_string || matches!(op, BinaryOp::Concat) {
return Ok(format!("lux_string_concat({}, {})", l, r));
}
}
@@ -858,6 +858,7 @@ impl CBackend {
self.writeln("#include <stdio.h>");
self.writeln("#include <stdlib.h>");
self.writeln("#include <string.h>");
self.writeln("#include <math.h>");
self.writeln("");
self.writeln("// === Lux Runtime Types ===");
self.writeln("");
@@ -881,6 +882,14 @@ impl CBackend {
self.writeln(" int64_t capacity;");
self.writeln("};");
self.writeln("");
self.writeln("// Map struct (linear-scan key-value table, string keys)");
self.writeln("typedef struct {");
self.writeln(" LuxString* keys;");
self.writeln(" void** values;");
self.writeln(" int64_t length;");
self.writeln(" int64_t capacity;");
self.writeln("} LuxMap;");
self.writeln("");
self.writeln("// === Reference Counting Infrastructure ===");
self.writeln("// Perceus-inspired RC system for automatic memory management.");
self.writeln("// See docs/REFERENCE_COUNTING.md for details.");
@@ -2042,6 +2051,76 @@ impl CBackend {
self.writeln(" return result;");
self.writeln("}");
self.writeln("");
// === Map Runtime Functions ===
self.writeln("static LuxMap* lux_map_new(int64_t capacity) {");
self.writeln(" LuxMap* map = (LuxMap*)malloc(sizeof(LuxMap));");
self.writeln(" map->capacity = capacity > 0 ? capacity : 8;");
self.writeln(" map->keys = (LuxString*)calloc(map->capacity, sizeof(LuxString));");
self.writeln(" map->values = (void**)calloc(map->capacity, sizeof(void*));");
self.writeln(" map->length = 0;");
self.writeln(" return map;");
self.writeln("}");
self.writeln("");
self.writeln("static int64_t lux_map_find(LuxMap* map, LuxString key) {");
self.writeln(" for (int64_t i = 0; i < map->length; i++) {");
self.writeln(" if (map->keys[i] && strcmp(map->keys[i], key) == 0) return i;");
self.writeln(" }");
self.writeln(" return -1;");
self.writeln("}");
self.writeln("");
self.writeln("static LuxMap* lux_map_clone(LuxMap* map) {");
self.writeln(" LuxMap* result = lux_map_new(map->capacity);");
self.writeln(" result->length = map->length;");
self.writeln(" for (int64_t i = 0; i < map->length; i++) {");
self.writeln(" result->keys[i] = lux_string_dup(map->keys[i]);");
self.writeln(" result->values[i] = map->values[i];");
self.writeln(" lux_incref(map->values[i]);");
self.writeln(" }");
self.writeln(" return result;");
self.writeln("}");
self.writeln("");
self.writeln("static LuxMap* lux_map_set(LuxMap* map, LuxString key, void* value) {");
self.writeln(" LuxMap* result = lux_map_clone(map);");
self.writeln(" int64_t idx = lux_map_find(result, key);");
self.writeln(" if (idx >= 0) {");
self.writeln(" lux_decref(result->values[idx]);");
self.writeln(" result->values[idx] = value;");
self.writeln(" lux_incref(value);");
self.writeln(" } else {");
self.writeln(" if (result->length >= result->capacity) {");
self.writeln(" result->capacity *= 2;");
self.writeln(" result->keys = (LuxString*)realloc(result->keys, sizeof(LuxString) * result->capacity);");
self.writeln(" result->values = (void**)realloc(result->values, sizeof(void*) * result->capacity);");
self.writeln(" }");
self.writeln(" result->keys[result->length] = lux_string_dup(key);");
self.writeln(" result->values[result->length] = value;");
self.writeln(" lux_incref(value);");
self.writeln(" result->length++;");
self.writeln(" }");
self.writeln(" return result;");
self.writeln("}");
self.writeln("");
self.writeln("static int64_t lux_map_size(LuxMap* map) { return map->length; }");
self.writeln("static LuxBool lux_map_isEmpty(LuxMap* map) { return map->length == 0; }");
self.writeln("");
self.writeln("static LuxBool lux_map_contains(LuxMap* map, LuxString key) {");
self.writeln(" return lux_map_find(map, key) >= 0;");
self.writeln("}");
self.writeln("");
self.writeln("static LuxMap* lux_map_remove(LuxMap* map, LuxString key) {");
self.writeln(" LuxMap* result = lux_map_new(map->capacity);");
self.writeln(" for (int64_t i = 0; i < map->length; i++) {");
self.writeln(" if (strcmp(map->keys[i], key) != 0) {");
self.writeln(" result->keys[result->length] = lux_string_dup(map->keys[i]);");
self.writeln(" result->values[result->length] = map->values[i];");
self.writeln(" lux_incref(map->values[i]);");
self.writeln(" result->length++;");
self.writeln(" }");
self.writeln(" }");
self.writeln(" return result;");
self.writeln("}");
self.writeln("");
self.writeln("static Option lux_option_none(void) { return (Option){Option_TAG_NONE}; }");
self.writeln("static Option lux_option_some(void* value) { return (Option){Option_TAG_SOME, .data.some = {value}}; }");
self.writeln("");
@@ -2839,8 +2918,18 @@ impl CBackend {
}
}
// String concatenation for ++ and +
if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
let left_is_string = self.infer_expr_type(left).as_deref() == Some("LuxString");
let right_is_string = self.infer_expr_type(right).as_deref() == Some("LuxString");
if left_is_string || right_is_string || matches!(op, BinaryOp::Concat) {
return Ok(format!("lux_string_concat({}, {})", l, r));
}
}
let op_str = match op {
BinaryOp::Add => "+",
BinaryOp::Concat => unreachable!("handled above"),
BinaryOp::Sub => "-",
BinaryOp::Mul => "*",
BinaryOp::Div => "/",
@@ -3003,6 +3092,9 @@ impl CBackend {
if module_name.name == "List" {
return self.emit_list_operation(&field.name, args);
}
if module_name.name == "Map" {
return self.emit_map_operation(&field.name, args);
}
// Int module
if module_name.name == "Int" && field.name == "toString" {
let arg = self.emit_expr(&args[0])?;
@@ -3011,6 +3103,10 @@ impl CBackend {
self.register_rc_var(&temp, "LuxString");
return Ok(temp);
}
if module_name.name == "Int" && field.name == "toFloat" {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("((LuxFloat){})", arg));
}
// Float module
if module_name.name == "Float" && field.name == "toString" {
let arg = self.emit_expr(&args[0])?;
@@ -3019,6 +3115,14 @@ impl CBackend {
self.register_rc_var(&temp, "LuxString");
return Ok(temp);
}
if module_name.name == "Float" && field.name == "toInt" {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("((LuxInt){})", arg));
}
// Math module
if module_name.name == "Math" {
return self.emit_math_operation(&field.name, args);
}
// Check for user-defined module function
let key = (module_name.name.clone(), field.name.clone());
if let Some(c_name) = self.module_functions.get(&key).cloned() {
@@ -3364,6 +3468,10 @@ impl CBackend {
self.register_rc_var(&temp, "LuxString");
return Ok(temp);
}
"toFloat" => {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("((LuxFloat){})", arg));
}
_ => {}
}
}
@@ -3378,10 +3486,24 @@ impl CBackend {
self.register_rc_var(&temp, "LuxString");
return Ok(temp);
}
"toInt" => {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("((LuxInt){})", arg));
}
_ => {}
}
}
// Math module (treated as effect by parser but handled as direct C calls)
if effect.name == "Math" {
return self.emit_math_operation(&operation.name, args);
}
// Map module
if effect.name == "Map" {
return self.emit_map_operation(&operation.name, args);
}
// Built-in Console effect
if effect.name == "Console" {
if operation.name == "print" {
@@ -3844,13 +3966,35 @@ impl CBackend {
}
}
Expr::Record { fields, .. } => {
let field_strs: Result<Vec<_>, _> = fields.iter().map(|(name, val)| {
Expr::Record {
spread, fields, ..
} => {
if let Some(spread_expr) = spread {
// Evaluate spread source, then override fields
let base = self.emit_expr(spread_expr)?;
if fields.is_empty() {
Ok(base)
} else {
// Copy spread into a temp, then override fields
let temp = format!("_spread_{}", self.fresh_name());
self.writeln(&format!("__auto_type {} = {};", temp, base));
for (name, val) in fields {
let v = self.emit_expr(val)?;
self.writeln(&format!("{}.{} = {};", temp, name.name, v));
}
Ok(temp)
}
} else {
let field_strs: Result<Vec<_>, _> = fields
.iter()
.map(|(name, val)| {
let v = self.emit_expr(val)?;
Ok(format!(".{} = {}", name.name, v))
}).collect();
})
.collect();
Ok(format!("{{ {} }}", field_strs?.join(", ")))
}
}
Expr::Field { object, field, .. } => {
let obj = self.emit_expr(object)?;
@@ -3919,6 +4063,64 @@ impl CBackend {
}
}
/// Emit code for Math module operations (Math.sin, Math.cos, etc.)
fn emit_math_operation(&mut self, op: &str, args: &[Expr]) -> Result<String, CGenError> {
match op {
"abs" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("fabs({})", x))
}
"min" => {
let a = self.emit_expr(&args[0])?;
let b = self.emit_expr(&args[1])?;
Ok(format!("fmin({}, {})", a, b))
}
"max" => {
let a = self.emit_expr(&args[0])?;
let b = self.emit_expr(&args[1])?;
Ok(format!("fmax({}, {})", a, b))
}
"sqrt" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("sqrt({})", x))
}
"pow" => {
let base = self.emit_expr(&args[0])?;
let exp = self.emit_expr(&args[1])?;
Ok(format!("pow({}, {})", base, exp))
}
"floor" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("(int64_t)floor({})", x))
}
"ceil" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("(int64_t)ceil({})", x))
}
"round" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("(int64_t)round({})", x))
}
"sin" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("sin({})", x))
}
"cos" => {
let x = self.emit_expr(&args[0])?;
Ok(format!("cos({})", x))
}
"atan2" => {
let y = self.emit_expr(&args[0])?;
let x = self.emit_expr(&args[1])?;
Ok(format!("atan2({}, {})", y, x))
}
_ => Err(CGenError {
message: format!("Math.{} not supported in C backend", op),
span: None,
}),
}
}
/// Emit code for List module operations (List.map, List.filter, etc.)
fn emit_list_operation(&mut self, op: &str, args: &[Expr]) -> Result<String, CGenError> {
match op {
@@ -4420,6 +4622,140 @@ impl CBackend {
}
}
/// Emit code for Map module operations
fn emit_map_operation(&mut self, op: &str, args: &[Expr]) -> Result<String, CGenError> {
match op {
"new" => {
let temp = format!("_map_new_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_new(8);", temp));
Ok(temp)
}
"set" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let val = self.emit_expr(&args[2])?;
let boxed_val = self.box_value(&val, None);
let temp = format!("_map_set_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_set({}, {}, {});", temp, map, key, boxed_val));
Ok(temp)
}
"get" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let idx_temp = format!("_map_idx_{}", self.fresh_name());
let result_temp = format!("_map_get_{}", self.fresh_name());
self.writeln(&format!("int64_t {} = lux_map_find({}, {});", idx_temp, map, key));
self.writeln(&format!("Option {};", result_temp));
self.writeln(&format!("if ({} >= 0) {{", idx_temp));
self.indent += 1;
self.writeln(&format!("lux_incref({}->values[{}]);", map, idx_temp));
self.writeln(&format!("{} = lux_option_some({}->values[{}]);", result_temp, map, idx_temp));
self.indent -= 1;
self.writeln("} else {");
self.indent += 1;
self.writeln(&format!("{} = lux_option_none();", result_temp));
self.indent -= 1;
self.writeln("}");
Ok(result_temp)
}
"contains" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!("lux_map_contains({}, {})", map, key))
}
"remove" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let temp = format!("_map_rm_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_remove({}, {});", temp, map, key));
Ok(temp)
}
"keys" => {
let map = self.emit_expr(&args[0])?;
let temp = format!("_map_keys_{}", self.fresh_name());
self.writeln(&format!("LuxList* {} = lux_list_new({}->length);", temp, map));
// Copy keys into the result list
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", map));
self.indent += 1;
self.writeln(&format!("LuxString _ks = lux_string_dup({}->keys[_i]);", map));
self.writeln(&format!("lux_list_push({}, _ks);", temp));
self.indent -= 1;
self.writeln("}");
// Sort keys in place with a simple exchange sort (fine for small N)
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++)", temp));
self.writeln(&format!(" for (int64_t _j = _i+1; _j < {}->length; _j++)", temp));
self.writeln(&format!(" if (strcmp({}->elements[_i], {}->elements[_j]) > 0) {{", temp, temp));
self.writeln(&format!(" void* _t = {}->elements[_i]; {}->elements[_i] = {}->elements[_j]; {}->elements[_j] = _t;", temp, temp, temp, temp));
self.writeln(" }");
Ok(temp)
}
"values" => {
let map = self.emit_expr(&args[0])?;
let temp = format!("_map_vals_{}", self.fresh_name());
self.writeln(&format!("LuxList* {} = lux_list_new({}->length);", temp, map));
// Sort by key first, then collect values
self.writeln(&format!("int64_t* _idx = (int64_t*)malloc(sizeof(int64_t) * {}->length);", map));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) _idx[_i] = _i;", map));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++)", map));
self.writeln(&format!(" for (int64_t _j = _i+1; _j < {}->length; _j++)", map));
self.writeln(&format!(" if (strcmp({}->keys[_idx[_i]], {}->keys[_idx[_j]]) > 0) {{ int64_t _t = _idx[_i]; _idx[_i] = _idx[_j]; _idx[_j] = _t; }}", map, map));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", map));
self.indent += 1;
self.writeln(&format!("lux_incref({}->values[_idx[_i]]);", map));
self.writeln(&format!("lux_list_push({}, {}->values[_idx[_i]]);", temp, map));
self.indent -= 1;
self.writeln("}");
self.writeln("free(_idx);");
Ok(temp)
}
"size" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("lux_map_size({})", map))
}
"isEmpty" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("lux_map_isEmpty({})", map))
}
"fromList" => {
let list = self.emit_expr(&args[0])?;
let temp = format!("_map_fl_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_new({}->length);", temp, list));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", list));
self.indent += 1;
// Elements are (String, V) tuples boxed as void*; tuple unboxing is not implemented yet
self.writeln("// Each element is a (String, V) tuple - not yet fully supported in C backend for Map");
self.indent -= 1;
self.writeln("}");
Ok(temp)
}
"toList" => {
let map = self.emit_expr(&args[0])?;
let temp = format!("_map_tl_{}", self.fresh_name());
self.writeln(&format!("LuxList* {} = lux_list_new({}->length);", temp, map));
self.writeln("// Map.toList not fully supported in C backend yet");
Ok(temp)
}
"merge" => {
let m1 = self.emit_expr(&args[0])?;
let m2 = self.emit_expr(&args[1])?;
let temp = format!("_map_merge_{}", self.fresh_name());
self.writeln(&format!("LuxMap* {} = lux_map_clone({});", temp, m1));
self.writeln(&format!("for (int64_t _i = 0; _i < {}->length; _i++) {{", m2));
self.indent += 1;
self.writeln(&format!("LuxMap* _next = lux_map_set({}, {}->keys[_i], {}->values[_i]);", temp, m2, m2));
self.writeln(&format!("free({}->keys); free({}->values); free({});", temp, temp, temp));
self.writeln(&format!("{} = _next;", temp));
self.indent -= 1;
self.writeln("}");
Ok(temp)
}
_ => Err(CGenError {
message: format!("Unsupported Map operation: {}", op),
span: None,
}),
}
}
fn emit_expr_with_substitution(&mut self, expr: &Expr, from: &str, to: &str) -> Result<String, CGenError> {
// Simple substitution - in a real implementation, this would be more sophisticated
match expr {
@@ -4732,11 +5068,13 @@ impl CBackend {
"toString" => return Some("LuxString".to_string()),
"parse" => return Some("Option".to_string()),
"abs" | "min" | "max" => return Some("LuxInt".to_string()),
"toFloat" => return Some("LuxFloat".to_string()),
_ => {}
},
"Float" => match field.name.as_str() {
"toString" => return Some("LuxString".to_string()),
"parse" => return Some("Option".to_string()),
"toInt" => return Some("LuxInt".to_string()),
_ => return Some("LuxFloat".to_string()),
},
_ => {
@@ -4788,6 +5126,7 @@ impl CBackend {
if effect.name == "Int" {
match operation.name.as_str() {
"toString" => return Some("LuxString".to_string()),
"toFloat" => return Some("LuxFloat".to_string()),
_ => return None,
}
}
@@ -4795,6 +5134,7 @@ impl CBackend {
if effect.name == "Float" {
match operation.name.as_str() {
"toString" => return Some("LuxString".to_string()),
"toInt" => return Some("LuxInt".to_string()),
_ => return None,
}
}
@@ -5821,7 +6161,10 @@ impl CBackend {
}
self.collect_free_vars(body, &inner_bound, free);
}
Expr::Record { fields, .. } => {
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.collect_free_vars(spread_expr, bound, free);
}
for (_, val) in fields {
self.collect_free_vars(val, bound, free);
}


@@ -888,7 +888,8 @@ impl JsBackend {
let prev_has_handlers = self.has_handlers;
self.has_handlers = is_effectful;
// Clear var substitutions for this function
// Save and clear var substitutions for this function scope
let saved_substitutions = self.var_substitutions.clone();
self.var_substitutions.clear();
// Emit function body
@@ -896,6 +897,7 @@ impl JsBackend {
self.writeln(&format!("return {};", body_code));
self.has_handlers = prev_has_handlers;
self.var_substitutions = saved_substitutions;
self.indent -= 1;
self.writeln("}");
@@ -909,13 +911,16 @@ impl JsBackend {
let val = self.emit_expr(&let_decl.value)?;
let var_name = &let_decl.name.name;
// Check if this is a run expression (often results in undefined)
// We still want to execute it for its side effects
if var_name == "_" {
// Wildcard binding: just execute for side effects
self.writeln(&format!("{};", val));
} else {
self.writeln(&format!("const {} = {};", var_name, val));
// Register the variable for future use
self.var_substitutions
.insert(var_name.clone(), var_name.clone());
}
Ok(())
}
@@ -954,12 +959,17 @@ impl JsBackend {
let r = self.emit_expr(right)?;
// Check for string concatenation
if matches!(op, BinaryOp::Add) {
if matches!(op, BinaryOp::Add | BinaryOp::Concat) {
if self.is_string_expr(left) || self.is_string_expr(right) {
return Ok(format!("({} + {})", l, r));
}
}
// ++ on lists: use .concat()
if matches!(op, BinaryOp::Concat) {
return Ok(format!("{}.concat({})", l, r));
}
let op_str = match op {
BinaryOp::Add => "+",
BinaryOp::Sub => "-",
@@ -974,6 +984,7 @@ impl JsBackend {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => unreachable!("handled above"),
BinaryOp::Pipe => {
// Pipe operator: x |> f becomes f(x)
return Ok(format!("{}({})", r, l));
@@ -1034,6 +1045,11 @@ impl JsBackend {
name, value, body, ..
} => {
let val = self.emit_expr(value)?;
if name.name == "_" {
// Wildcard binding: just execute for side effects
self.writeln(&format!("{};", val));
} else {
let var_name = format!("{}_{}", name.name, self.fresh_name());
self.writeln(&format!("const {} = {};", var_name, val));
@@ -1041,11 +1057,14 @@ impl JsBackend {
// Add substitution
self.var_substitutions
.insert(name.name.clone(), var_name.clone());
}
let body_result = self.emit_expr(body)?;
// Remove substitution
if name.name != "_" {
self.var_substitutions.remove(&name.name);
}
Ok(body_result)
}
@@ -1057,6 +1076,31 @@ impl JsBackend {
if module_name.name == "List" {
return self.emit_list_operation(&field.name, args);
}
if module_name.name == "Map" {
return self.emit_map_operation(&field.name, args);
}
}
}
// Int/Float module operations
if let Expr::Field { object, field, .. } = func.as_ref() {
if let Expr::Var(module_name) = object.as_ref() {
if module_name.name == "Int" {
let arg = self.emit_expr(&args[0])?;
match field.name.as_str() {
"toFloat" => return Ok(arg),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
if module_name.name == "Float" {
let arg = self.emit_expr(&args[0])?;
match field.name.as_str() {
"toInt" => return Ok(format!("Math.trunc({})", arg)),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
}
}
@@ -1066,6 +1110,10 @@ impl JsBackend {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("String({})", arg));
}
if ident.name == "print" {
let arg = self.emit_expr(&args[0])?;
return Ok(format!("console.log({})", arg));
}
}
let arg_strs: Result<Vec<_>, _> = args.iter().map(|a| self.emit_expr(a)).collect();
@@ -1142,6 +1190,26 @@ impl JsBackend {
return self.emit_math_operation(&operation.name, args);
}
// Special case: Int module operations
if effect.name == "Int" {
let arg = self.emit_expr(&args[0])?;
match operation.name.as_str() {
"toFloat" => return Ok(arg), // JS numbers are already floats
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
// Special case: Float module operations
if effect.name == "Float" {
let arg = self.emit_expr(&args[0])?;
match operation.name.as_str() {
"toInt" => return Ok(format!("Math.trunc({})", arg)),
"toString" => return Ok(format!("String({})", arg)),
_ => {}
}
}
// Special case: Result module operations (not an effect)
if effect.name == "Result" {
return self.emit_result_operation(&operation.name, args);
@@ -1152,6 +1220,11 @@ impl JsBackend {
return self.emit_json_operation(&operation.name, args);
}
// Special case: Map module operations (not an effect)
if effect.name == "Map" {
return self.emit_map_operation(&operation.name, args);
}
// Special case: Html module operations (not an effect)
if effect.name == "Html" {
return self.emit_html_operation(&operation.name, args);
@@ -1197,18 +1270,39 @@ impl JsBackend {
param_names
};
// Save handler state
// Save state
let prev_has_handlers = self.has_handlers;
let saved_substitutions = self.var_substitutions.clone();
self.has_handlers = !effects.is_empty();
// Register lambda params as themselves (override any outer substitutions)
for p in &all_params {
self.var_substitutions.insert(p.clone(), p.clone());
}
// Capture any statements emitted during body evaluation
let output_start = self.output.len();
let prev_indent = self.indent;
self.indent += 1;
let body_code = self.emit_expr(body)?;
self.writeln(&format!("return {};", body_code));
// Extract body statements and restore output
let body_statements = self.output[output_start..].to_string();
self.output.truncate(output_start);
self.indent = prev_indent;
// Restore state
self.has_handlers = prev_has_handlers;
self.var_substitutions = saved_substitutions;
let indent_str = " ".repeat(self.indent);
Ok(format!(
"(function({}) {{ return {}; }})",
"(function({}) {{\n{}{}}})",
all_params.join(", "),
body_code
body_statements,
indent_str,
))
}
@@ -1228,27 +1322,36 @@ impl JsBackend {
}
Statement::Let { name, value, .. } => {
let val = self.emit_expr(value)?;
let var_name = format!("{}_{}", name.name, self.fresh_name());
if name.name == "_" {
self.writeln(&format!("{};", val));
} else {
let var_name =
format!("{}_{}", name.name, self.fresh_name());
self.writeln(&format!("const {} = {};", var_name, val));
self.var_substitutions
.insert(name.name.clone(), var_name.clone());
}
}
}
}
// Emit result
self.emit_expr(result)
}
Expr::Record { fields, .. } => {
let field_strs: Result<Vec<_>, _> = fields
.iter()
.map(|(name, expr)| {
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
let spread_code = self.emit_expr(spread_expr)?;
parts.push(format!("...{}", spread_code));
}
for (name, expr) in fields {
let val = self.emit_expr(expr)?;
Ok(format!("{}: {}", name.name, val))
})
.collect();
Ok(format!("{{ {} }}", field_strs?.join(", ")))
parts.push(format!("{}: {}", name.name, val));
}
Ok(format!("{{ {} }}", parts.join(", ")))
}
Expr::Tuple { elements, .. } => {
@@ -2067,6 +2170,86 @@ impl JsBackend {
}
}
/// Emit Map module operations using JS Map
fn emit_map_operation(
&mut self,
operation: &str,
args: &[Expr],
) -> Result<String, JsGenError> {
match operation {
"new" => Ok("new Map()".to_string()),
"set" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
let val = self.emit_expr(&args[2])?;
Ok(format!(
"(function() {{ var m = new Map({}); m.set({}, {}); return m; }})()",
map, key, val
))
}
"get" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!(
"({0}.has({1}) ? Lux.Some({0}.get({1})) : Lux.None())",
map, key
))
}
"contains" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!("{}.has({})", map, key))
}
"remove" => {
let map = self.emit_expr(&args[0])?;
let key = self.emit_expr(&args[1])?;
Ok(format!(
"(function() {{ var m = new Map({}); m.delete({}); return m; }})()",
map, key
))
}
"keys" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("Array.from({}.keys()).sort()", map))
}
"values" => {
let map = self.emit_expr(&args[0])?;
Ok(format!(
"Array.from({0}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }}).map(function(e) {{ return e[1]; }})",
map
))
}
"size" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("{}.size", map))
}
"isEmpty" => {
let map = self.emit_expr(&args[0])?;
Ok(format!("({}.size === 0)", map))
}
"fromList" => {
let list = self.emit_expr(&args[0])?;
Ok(format!("new Map({}.map(function(t) {{ return [t[0], t[1]]; }}))", list))
}
"toList" => {
let map = self.emit_expr(&args[0])?;
Ok(format!(
"Array.from({}.entries()).sort(function(a,b) {{ return a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0; }})",
map
))
}
"merge" => {
let m1 = self.emit_expr(&args[0])?;
let m2 = self.emit_expr(&args[1])?;
Ok(format!("new Map([...{}, ...{}])", m1, m2))
}
_ => Err(JsGenError {
message: format!("Unknown Map operation: {}", operation),
span: None,
}),
}
}
/// Emit Html module operations for type-safe HTML construction
fn emit_html_operation(
&mut self,
@@ -2338,7 +2521,7 @@ impl JsBackend {
}
}
Expr::BinaryOp { op, left, right, .. } => {
matches!(op, BinaryOp::Add)
matches!(op, BinaryOp::Add | BinaryOp::Concat)
&& (self.is_string_expr(left) || self.is_string_expr(right))
}
_ => false,


@@ -688,15 +688,17 @@ impl Formatter {
.join(", ")
)
}
Expr::Record { fields, .. } => {
format!(
"{{ {} }}",
fields
.iter()
.map(|(name, val)| format!("{}: {}", name.name, self.format_expr(val)))
.collect::<Vec<_>>()
.join(", ")
)
Expr::Record {
spread, fields, ..
} => {
let mut parts = Vec::new();
if let Some(spread_expr) = spread {
parts.push(format!("...{}", self.format_expr(spread_expr)));
}
for (name, val) in fields {
parts.push(format!("{}: {}", name.name, self.format_expr(val)));
}
format!("{{ {} }}", parts.join(", "))
}
Expr::EffectOp { effect, operation, args, .. } => {
format!(
@@ -753,6 +755,7 @@ impl Formatter {
BinaryOp::Ge => ">=",
BinaryOp::And => "&&",
BinaryOp::Or => "||",
BinaryOp::Concat => "++",
BinaryOp::Pipe => "|>",
}
}


@@ -74,6 +74,9 @@ pub enum BuiltinFn {
MathFloor,
MathCeil,
MathRound,
MathSin,
MathCos,
MathAtan2,
// Additional List operations
ListIsEmpty,
@@ -97,7 +100,9 @@ pub enum BuiltinFn {
// Int/Float operations
IntToString,
IntToFloat,
FloatToString,
FloatToInt,
// JSON operations
JsonParse,
@@ -119,6 +124,20 @@ pub enum BuiltinFn {
JsonString,
JsonArray,
JsonObject,
// Map operations
MapNew,
MapSet,
MapGet,
MapContains,
MapRemove,
MapKeys,
MapValues,
MapSize,
MapIsEmpty,
MapFromList,
MapToList,
MapMerge,
}
/// Runtime value
@@ -133,6 +152,7 @@ pub enum Value {
List(Vec<Value>),
Tuple(Vec<Value>),
Record(HashMap<String, Value>),
Map(HashMap<String, Value>),
Function(Rc<Closure>),
Handler(Rc<HandlerValue>),
/// Built-in function
@@ -164,6 +184,7 @@ impl Value {
Value::List(_) => "List",
Value::Tuple(_) => "Tuple",
Value::Record(_) => "Record",
Value::Map(_) => "Map",
Value::Function(_) => "Function",
Value::Handler(_) => "Handler",
Value::Builtin(_) => "Function",
@@ -212,6 +233,11 @@ impl Value {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Map(xs), Value::Map(ys)) => {
xs.len() == ys.len() && xs.iter().all(|(k, v)| {
ys.get(k).map(|yv| Value::values_equal(v, yv)).unwrap_or(false)
})
}
(Value::Constructor { name: n1, fields: f1 }, Value::Constructor { name: n2, fields: f2 }) => {
n1 == n2 && f1.len() == f2.len() && f1.iter().zip(f2.iter()).all(|(x, y)| Value::values_equal(x, y))
}
@@ -282,6 +308,16 @@ impl TryFromValue for Vec<Value> {
}
}
impl TryFromValue for HashMap<String, Value> {
const TYPE_NAME: &'static str = "Map";
fn try_from_value(value: &Value) -> Option<Self> {
match value {
Value::Map(m) => Some(m.clone()),
_ => None,
}
}
}
impl TryFromValue for Value {
const TYPE_NAME: &'static str = "any";
fn try_from_value(value: &Value) -> Option<Self> {
@@ -328,6 +364,18 @@ impl fmt::Display for Value {
}
write!(f, " }}")
}
Value::Map(entries) => {
write!(f, "Map {{")?;
let mut sorted: Vec<_> = entries.iter().collect();
sorted.sort_by_key(|(k, _)| (*k).clone());
for (i, (key, value)) in sorted.iter().enumerate() {
if i > 0 {
write!(f, ", ")?;
}
write!(f, "\"{}\": {}", key, value)?;
}
write!(f, "}}")
}
Value::Function(_) => write!(f, "<function>"),
Value::Builtin(b) => write!(f, "<builtin:{:?}>", b),
Value::Handler(_) => write!(f, "<handler>"),
@@ -1072,18 +1120,23 @@ impl Interpreter {
("floor".to_string(), Value::Builtin(BuiltinFn::MathFloor)),
("ceil".to_string(), Value::Builtin(BuiltinFn::MathCeil)),
("round".to_string(), Value::Builtin(BuiltinFn::MathRound)),
("sin".to_string(), Value::Builtin(BuiltinFn::MathSin)),
("cos".to_string(), Value::Builtin(BuiltinFn::MathCos)),
("atan2".to_string(), Value::Builtin(BuiltinFn::MathAtan2)),
]));
env.define("Math", math_module);
// Int module
let int_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::IntToString)),
("toFloat".to_string(), Value::Builtin(BuiltinFn::IntToFloat)),
]));
env.define("Int", int_module);
// Float module
let float_module = Value::Record(HashMap::from([
("toString".to_string(), Value::Builtin(BuiltinFn::FloatToString)),
("toInt".to_string(), Value::Builtin(BuiltinFn::FloatToInt)),
]));
env.define("Float", float_module);
@@ -1110,16 +1163,72 @@ impl Interpreter {
("object".to_string(), Value::Builtin(BuiltinFn::JsonObject)),
]));
env.define("Json", json_module);
// Map module
let map_module = Value::Record(HashMap::from([
("new".to_string(), Value::Builtin(BuiltinFn::MapNew)),
("set".to_string(), Value::Builtin(BuiltinFn::MapSet)),
("get".to_string(), Value::Builtin(BuiltinFn::MapGet)),
("contains".to_string(), Value::Builtin(BuiltinFn::MapContains)),
("remove".to_string(), Value::Builtin(BuiltinFn::MapRemove)),
("keys".to_string(), Value::Builtin(BuiltinFn::MapKeys)),
("values".to_string(), Value::Builtin(BuiltinFn::MapValues)),
("size".to_string(), Value::Builtin(BuiltinFn::MapSize)),
("isEmpty".to_string(), Value::Builtin(BuiltinFn::MapIsEmpty)),
("fromList".to_string(), Value::Builtin(BuiltinFn::MapFromList)),
("toList".to_string(), Value::Builtin(BuiltinFn::MapToList)),
("merge".to_string(), Value::Builtin(BuiltinFn::MapMerge)),
]));
env.define("Map", map_module);
}
/// Execute a program
pub fn run(&mut self, program: &Program) -> Result<Value, RuntimeError> {
let mut last_value = Value::Unit;
let mut has_main_let = false;
for decl in &program.declarations {
// Track if there's a top-level `let main = ...`
if let Declaration::Let(let_decl) = decl {
if let_decl.name.name == "main" {
has_main_let = true;
}
}
last_value = self.eval_declaration(decl)?;
}
// Auto-invoke main if it was defined as a let binding with a function value
if has_main_let {
if let Some(main_val) = self.global_env.get("main") {
if let Value::Function(ref closure) = main_val {
if closure.params.is_empty() {
let span = Span { start: 0, end: 0 };
let mut result = self.eval_call(main_val.clone(), vec![], span)?;
// Trampoline loop
loop {
match result {
EvalResult::Value(v) => {
last_value = v;
break;
}
EvalResult::Effect(req) => {
last_value = self.handle_effect(req)?;
break;
}
EvalResult::TailCall { func, args, span } => {
result = self.eval_call(func, args, span)?;
}
EvalResult::Resume(v) => {
last_value = v;
break;
}
}
}
}
}
}
}
Ok(last_value)
}
@@ -1525,8 +1634,28 @@ impl Interpreter {
self.eval_expr_tail(result, &block_env, tail)
}
Expr::Record { fields, .. } => {
Expr::Record {
spread, fields, ..
} => {
let mut record = HashMap::new();
// If there's a spread, evaluate it and start with its fields
if let Some(spread_expr) = spread {
let spread_val = self.eval_expr(spread_expr, env)?;
if let Value::Record(spread_fields) = spread_val {
record = spread_fields;
} else {
return Err(RuntimeError {
message: format!(
"Spread expression must evaluate to a record, got {}",
spread_val.type_name()
),
span: Some(expr.span()),
});
}
}
// Override with explicit fields
for (name, expr) in fields {
let val = self.eval_expr(expr, env)?;
record.insert(name.name.clone(), val);
@@ -1599,6 +1728,18 @@ impl Interpreter {
span: Some(span),
}),
},
BinaryOp::Concat => match (left, right) {
(Value::String(a), Value::String(b)) => Ok(Value::String(a + &b)),
(Value::List(a), Value::List(b)) => {
let mut result = a;
result.extend(b);
Ok(Value::List(result))
}
(l, r) => Err(RuntimeError {
message: format!("Cannot concatenate {} and {}", l.type_name(), r.type_name()),
span: Some(span),
}),
},
BinaryOp::Sub => match (left, right) {
(Value::Int(a), Value::Int(b)) => Ok(Value::Int(a - b)),
(Value::Float(a), Value::Float(b)) => Ok(Value::Float(a - b)),
@@ -2287,6 +2428,26 @@ impl Interpreter {
}
}
BuiltinFn::IntToFloat => {
if args.len() != 1 {
return Err(err("Int.toFloat requires 1 argument"));
}
match &args[0] {
Value::Int(n) => Ok(EvalResult::Value(Value::Float(*n as f64))),
v => Err(err(&format!("Int.toFloat expects Int, got {}", v.type_name()))),
}
}
BuiltinFn::FloatToInt => {
if args.len() != 1 {
return Err(err("Float.toInt requires 1 argument"));
}
match &args[0] {
Value::Float(f) => Ok(EvalResult::Value(Value::Int(*f as i64))),
v => Err(err(&format!("Float.toInt expects Float, got {}", v.type_name()))),
}
}
BuiltinFn::TypeOf => {
if args.len() != 1 {
return Err(err("typeOf requires 1 argument"));
@@ -2463,6 +2624,45 @@ impl Interpreter {
}
}
BuiltinFn::MathSin => {
if args.len() != 1 {
return Err(err("Math.sin requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.sin()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).sin()))),
v => Err(err(&format!("Math.sin expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathCos => {
if args.len() != 1 {
return Err(err("Math.cos requires 1 argument"));
}
match &args[0] {
Value::Float(n) => Ok(EvalResult::Value(Value::Float(n.cos()))),
Value::Int(n) => Ok(EvalResult::Value(Value::Float((*n as f64).cos()))),
v => Err(err(&format!("Math.cos expects number, got {}", v.type_name()))),
}
}
BuiltinFn::MathAtan2 => {
if args.len() != 2 {
return Err(err("Math.atan2 requires 2 arguments: y, x"));
}
let y = match &args[0] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
let x = match &args[1] {
Value::Float(n) => *n,
Value::Int(n) => *n as f64,
v => return Err(err(&format!("Math.atan2 expects number, got {}", v.type_name()))),
};
Ok(EvalResult::Value(Value::Float(y.atan2(x))))
}
// Additional List operations
BuiltinFn::ListIsEmpty => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "List.isEmpty", span)?;
@@ -2952,6 +3152,128 @@ impl Interpreter {
}
Ok(EvalResult::Value(Value::Json(serde_json::Value::Object(map))))
}
// Map operations
BuiltinFn::MapNew => {
Ok(EvalResult::Value(Value::Map(HashMap::new())))
}
BuiltinFn::MapSet => {
if args.len() != 3 {
return Err(err("Map.set requires 3 arguments: map, key, value"));
}
let mut map = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.set expects Map as first argument, got {}", v.type_name()))),
};
let key = match &args[1] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.set expects String key, got {}", v.type_name()))),
};
map.insert(key, args[2].clone());
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapGet => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.get", span)?;
match map.get(&key) {
Some(v) => Ok(EvalResult::Value(Value::Constructor {
name: "Some".to_string(),
fields: vec![v.clone()],
})),
None => Ok(EvalResult::Value(Value::Constructor {
name: "None".to_string(),
fields: vec![],
})),
}
}
BuiltinFn::MapContains => {
let (map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.contains", span)?;
Ok(EvalResult::Value(Value::Bool(map.contains_key(&key))))
}
BuiltinFn::MapRemove => {
let (mut map, key) = Self::expect_args_2::<HashMap<String, Value>, String>(&args, "Map.remove", span)?;
map.remove(&key);
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapKeys => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.keys", span)?;
let mut keys: Vec<String> = map.keys().cloned().collect();
keys.sort();
Ok(EvalResult::Value(Value::List(
keys.into_iter().map(Value::String).collect(),
)))
}
BuiltinFn::MapValues => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.values", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries.into_iter().map(|(_, v)| v).collect(),
)))
}
BuiltinFn::MapSize => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.size", span)?;
Ok(EvalResult::Value(Value::Int(map.len() as i64)))
}
BuiltinFn::MapIsEmpty => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.isEmpty", span)?;
Ok(EvalResult::Value(Value::Bool(map.is_empty())))
}
BuiltinFn::MapFromList => {
let list = Self::expect_arg_1::<Vec<Value>>(&args, "Map.fromList", span)?;
let mut map = HashMap::new();
for item in list {
match item {
Value::Tuple(fields) if fields.len() == 2 => {
let key = match &fields[0] {
Value::String(s) => s.clone(),
v => return Err(err(&format!("Map.fromList expects (String, V) tuples, got {} key", v.type_name()))),
};
map.insert(key, fields[1].clone());
}
_ => return Err(err("Map.fromList expects List<(String, V)>")),
}
}
Ok(EvalResult::Value(Value::Map(map)))
}
BuiltinFn::MapToList => {
let map = Self::expect_arg_1::<HashMap<String, Value>>(&args, "Map.toList", span)?;
let mut entries: Vec<(String, Value)> = map.into_iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
Ok(EvalResult::Value(Value::List(
entries
.into_iter()
.map(|(k, v)| Value::Tuple(vec![Value::String(k), v]))
.collect(),
)))
}
BuiltinFn::MapMerge => {
if args.len() != 2 {
return Err(err("Map.merge requires 2 arguments: map1, map2"));
}
let mut map1 = match &args[0] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as first argument, got {}", v.type_name()))),
};
let map2 = match &args[1] {
Value::Map(m) => m.clone(),
v => return Err(err(&format!("Map.merge expects Map as second argument, got {}", v.type_name()))),
};
for (k, v) in map2 {
map1.insert(k, v);
}
Ok(EvalResult::Value(Value::Map(map1)))
}
}
}
@@ -3117,6 +3439,11 @@ impl Interpreter {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(Value::Map(a), Value::Map(b)) => {
a.len() == b.len() && a.iter().all(|(k, v)| {
b.get(k).map(|bv| self.values_equal(v, bv)).unwrap_or(false)
})
}
(
Value::Constructor {
name: n1,
@@ -5044,6 +5371,7 @@ mod tests {
// Create a simple migration that adds a field
// Migration: old.name -> { name: old.name, email: "unknown" }
let migration_body = Expr::Record {
spread: None,
fields: vec![
(
Ident::new("name", Span::default()),


@@ -70,6 +70,7 @@ pub enum TokenKind {
// Operators
Plus, // +
PlusPlus, // ++
Minus, // -
Star, // *
Slash, // /
@@ -89,6 +90,7 @@ pub enum TokenKind {
Arrow, // =>
ThinArrow, // ->
Dot, // .
DotDotDot, // ...
Colon, // :
ColonColon, // ::
Comma, // ,
@@ -160,6 +162,7 @@ impl fmt::Display for TokenKind {
TokenKind::True => write!(f, "true"),
TokenKind::False => write!(f, "false"),
TokenKind::Plus => write!(f, "+"),
TokenKind::PlusPlus => write!(f, "++"),
TokenKind::Minus => write!(f, "-"),
TokenKind::Star => write!(f, "*"),
TokenKind::Slash => write!(f, "/"),
@@ -179,6 +182,7 @@ impl fmt::Display for TokenKind {
TokenKind::Arrow => write!(f, "=>"),
TokenKind::ThinArrow => write!(f, "->"),
TokenKind::Dot => write!(f, "."),
TokenKind::DotDotDot => write!(f, "..."),
TokenKind::Colon => write!(f, ":"),
TokenKind::ColonColon => write!(f, "::"),
TokenKind::Comma => write!(f, ","),
@@ -268,7 +272,14 @@ impl<'a> Lexer<'a> {
let kind = match c {
// Single-character tokens
'+' => {
if self.peek() == Some('+') {
self.advance();
TokenKind::PlusPlus
} else {
TokenKind::Plus
}
}
'*' => TokenKind::Star,
'%' => TokenKind::Percent,
'(' => TokenKind::LParen,
@@ -364,7 +375,22 @@ impl<'a> Lexer<'a> {
TokenKind::Pipe
}
}
'.' => {
if self.peek() == Some('.') {
// Check for `...`: the lexer can only peek one char ahead,
// so look at the source directly for the third dot.
let next_next = self.source[self.pos..].chars().nth(1);
if next_next == Some('.') {
self.advance(); // consume second '.'
self.advance(); // consume third '.'
TokenKind::DotDotDot
} else {
TokenKind::Dot
}
} else {
TokenKind::Dot
}
}
':' => {
if self.peek() == Some(':') {
self.advance();


@@ -513,7 +513,10 @@ impl Linter {
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => {
self.collect_refs_expr(object);
}
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.collect_refs_expr(spread_expr);
}
for (_, val) in fields {
self.collect_refs_expr(val);
}


@@ -1571,7 +1571,10 @@ fn collect_call_site_hints(
collect_call_site_hints(source, e, param_names, hints);
}
}
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
collect_call_site_hints(source, spread_expr, param_names, hints);
}
for (_, e) in fields {
collect_call_site_hints(source, e, param_names, hints);
}


@@ -37,7 +37,7 @@ use std::borrow::Cow;
use std::collections::HashSet;
use typechecker::TypeChecker;
const VERSION: &str = env!("CARGO_PKG_VERSION");
const HELP: &str = r#"
Lux - A functional language with first-class effects
@@ -902,6 +902,7 @@ fn compile_to_c(path: &str, output_path: Option<&str>, run_after: bool, emit_c:
.args(["-O2", "-o"])
.arg(&output_bin)
.arg(&temp_c)
.arg("-lm")
.output();
match compile_result {
@@ -5440,4 +5441,122 @@ c")"#;
check_file("projects/rest-api/main.lux").unwrap();
}
}
// === Map type tests ===
#[test]
fn test_map_new_and_size() {
let source = r#"
let m = Map.new()
let result = Map.size(m)
"#;
assert_eq!(eval(source).unwrap(), "0");
}
#[test]
fn test_map_set_and_get() {
let source = r#"
let m = Map.new()
let m2 = Map.set(m, "name", "Alice")
let result = Map.get(m2, "name")
"#;
assert_eq!(eval(source).unwrap(), "Some(\"Alice\")");
}
#[test]
fn test_map_get_missing() {
let source = r#"
let m = Map.new()
let result = Map.get(m, "missing")
"#;
assert_eq!(eval(source).unwrap(), "None");
}
#[test]
fn test_map_contains() {
let source = r#"
let m = Map.set(Map.new(), "x", 1)
let result = (Map.contains(m, "x"), Map.contains(m, "y"))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_remove() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "a", 1), "b", 2)
let m2 = Map.remove(m, "a")
let result = (Map.size(m2), Map.contains(m2, "a"), Map.contains(m2, "b"))
"#;
assert_eq!(eval(source).unwrap(), "(1, false, true)");
}
#[test]
fn test_map_keys_and_values() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.keys(m)
"#;
assert_eq!(eval(source).unwrap(), "[\"a\", \"b\"]");
}
#[test]
fn test_map_from_list() {
let source = r#"
let m = Map.fromList([("x", 10), ("y", 20)])
let result = (Map.get(m, "x"), Map.size(m))
"#;
assert_eq!(eval(source).unwrap(), "(Some(10), 2)");
}
#[test]
fn test_map_to_list() {
let source = r#"
let m = Map.set(Map.set(Map.new(), "b", 2), "a", 1)
let result = Map.toList(m)
"#;
assert_eq!(eval(source).unwrap(), "[(\"a\", 1), (\"b\", 2)]");
}
#[test]
fn test_map_merge() {
let source = r#"
let m1 = Map.fromList([("a", 1), ("b", 2)])
let m2 = Map.fromList([("b", 3), ("c", 4)])
let merged = Map.merge(m1, m2)
let result = (Map.get(merged, "a"), Map.get(merged, "b"), Map.get(merged, "c"))
"#;
assert_eq!(eval(source).unwrap(), "(Some(1), Some(3), Some(4))");
}
#[test]
fn test_map_immutability() {
let source = r#"
let m1 = Map.fromList([("a", 1)])
let m2 = Map.set(m1, "b", 2)
let result = (Map.size(m1), Map.size(m2))
"#;
assert_eq!(eval(source).unwrap(), "(1, 2)");
}
#[test]
fn test_map_is_empty() {
let source = r#"
let m1 = Map.new()
let m2 = Map.set(m1, "x", 1)
let result = (Map.isEmpty(m1), Map.isEmpty(m2))
"#;
assert_eq!(eval(source).unwrap(), "(true, false)");
}
#[test]
fn test_map_type_annotation() {
let source = r#"
fn lookup(m: Map<String, Int>, key: String): Option<Int> =
Map.get(m, key)
let m = Map.fromList([("age", 30)])
let result = lookup(m, "age")
"#;
assert_eq!(eval(source).unwrap(), "Some(30)");
}
}


@@ -1558,6 +1558,7 @@ impl Parser {
loop {
let op = match self.peek_kind() {
TokenKind::Plus => BinaryOp::Add,
TokenKind::PlusPlus => BinaryOp::Concat,
TokenKind::Minus => BinaryOp::Sub,
_ => break,
};
@@ -1791,6 +1792,7 @@ impl Parser {
let condition = Box::new(self.parse_expr()?);
self.skip_newlines();
self.expect(TokenKind::Then)?;
self.skip_newlines();
let then_branch = Box::new(self.parse_expr()?);
@@ -2207,6 +2209,11 @@ impl Parser {
}));
}
// Check for record spread: { ...expr, field: val }
if matches!(self.peek_kind(), TokenKind::DotDotDot) {
return self.parse_record_expr_rest(start);
}
// Check if it's a record (ident: expr) or block
if matches!(self.peek_kind(), TokenKind::Ident(_)) {
let lookahead = self.tokens.get(self.pos + 1).map(|t| &t.kind);
@@ -2221,6 +2228,20 @@ impl Parser {
fn parse_record_expr_rest(&mut self, start: Span) -> Result<Expr, ParseError> {
let mut fields = Vec::new();
let mut spread = None;
// Check for spread: { ...expr, ... }
if self.check(TokenKind::DotDotDot) {
self.advance(); // consume ...
let spread_expr = self.parse_expr()?;
spread = Some(Box::new(spread_expr));
self.skip_newlines();
if self.check(TokenKind::Comma) {
self.advance();
}
self.skip_newlines();
}
while !self.check(TokenKind::RBrace) {
let name = self.parse_ident()?;
@@ -2237,7 +2258,11 @@ impl Parser {
self.expect(TokenKind::RBrace)?;
let span = start.merge(self.previous_span());
Ok(Expr::Record {
spread,
fields,
span,
})
}
fn parse_block_rest(&mut self, start: Span) -> Result<Expr, ParseError> {


@@ -527,7 +527,10 @@ impl SymbolTable {
self.visit_expr(e, scope_idx);
}
}
Expr::Record { spread, fields, .. } => {
if let Some(spread_expr) = spread {
self.visit_expr(spread_expr, scope_idx);
}
for (_, e) in fields {
self.visit_expr(e, scope_idx);
}


@@ -339,7 +339,10 @@ fn references_params(expr: &Expr, params: &[&str]) -> bool {
Expr::Lambda { body, .. } => references_params(body, params),
Expr::Tuple { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::List { elements, .. } => elements.iter().any(|e| references_params(e, params)),
Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| references_params(s, params))
|| fields.iter().any(|(_, e)| references_params(e, params))
}
Expr::Match { scrutinee, arms, .. } => {
references_params(scrutinee, params)
|| arms.iter().any(|a| references_params(&a.body, params))
@@ -516,8 +519,9 @@ fn has_recursive_calls(func_name: &str, body: &Expr) -> bool {
Expr::Tuple { elements, .. } | Expr::List { elements, .. } => {
elements.iter().any(|e| has_recursive_calls(func_name, e))
}
Expr::Record { spread, fields, .. } => {
spread.as_ref().is_some_and(|s| has_recursive_calls(func_name, s))
|| fields.iter().any(|(_, e)| has_recursive_calls(func_name, e))
}
Expr::Field { object, .. } | Expr::TupleIndex { object, .. } => has_recursive_calls(func_name, object),
Expr::Let { value, body, .. } => {
@@ -672,6 +676,7 @@ fn generate_auto_migration_expr(
// Build the record expression
Some(Expr::Record {
spread: None,
fields: field_exprs,
span,
})
@@ -1536,7 +1541,7 @@ impl TypeChecker {
// Use the declared type if present, otherwise use inferred
let final_type = if let Some(ref type_expr) = let_decl.typ {
let declared = self.resolve_type(type_expr);
if let Err(e) = unify_with_env(&inferred, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -1744,7 +1749,11 @@ impl TypeChecker {
span,
} => self.infer_block(statements, result, *span),
Expr::Record {
spread,
fields,
span,
} => self.infer_record(spread.as_deref(), fields, *span),
Expr::Tuple { elements, span } => self.infer_tuple(elements, *span),
@@ -1783,7 +1792,7 @@ impl TypeChecker {
match op {
BinaryOp::Add => {
// Add supports both numeric types and string concatenation
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1804,9 +1813,32 @@ impl TypeChecker {
}
}
BinaryOp::Concat => {
// Concat (++) supports strings and lists
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '++' must have same type: {}", e),
span,
});
}
match &left_type {
Type::String | Type::List(_) | Type::Var(_) => left_type,
_ => {
self.errors.push(TypeError {
message: format!(
"Operator '++' requires String or List operands, got {}",
left_type
),
span,
});
Type::Error
}
}
}
BinaryOp::Sub | BinaryOp::Mul | BinaryOp::Div | BinaryOp::Mod => {
// Arithmetic: both operands must be same numeric type
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1830,7 +1862,7 @@ impl TypeChecker {
BinaryOp::Eq | BinaryOp::Ne => {
// Equality: operands must have same type
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1841,7 +1873,7 @@ impl TypeChecker {
BinaryOp::Lt | BinaryOp::Le | BinaryOp::Gt | BinaryOp::Ge => {
// Comparison: operands must be same orderable type
if let Err(e) = unify_with_env(&left_type, &right_type, &self.env) {
self.errors.push(TypeError {
message: format!("Operands of '{}' must have same type: {}", op, e),
span,
@@ -1852,13 +1884,13 @@ impl TypeChecker {
BinaryOp::And | BinaryOp::Or => {
// Logical: both must be Bool
if let Err(e) = unify_with_env(&left_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Left operand of '{}' must be Bool: {}", op, e),
span: left.span(),
});
}
if let Err(e) = unify_with_env(&right_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Right operand of '{}' must be Bool: {}", op, e),
span: right.span(),
@@ -1872,7 +1904,7 @@ impl TypeChecker {
// right must be a function that accepts left's type
let result_type = Type::var();
let expected_fn = Type::function(vec![left_type.clone()], result_type.clone());
if let Err(e) = unify_with_env(&right_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Pipe target must be a function accepting {}: {}",
@@ -1904,7 +1936,7 @@ impl TypeChecker {
}
},
UnaryOp::Not => {
if let Err(e) = unify_with_env(&operand_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Operator '!' requires Bool operand: {}", e),
span,
@@ -1955,7 +1987,7 @@ impl TypeChecker {
self.current_effects.clone(),
);
match unify_with_env(&func_type, &expected_fn, &self.env) {
Ok(subst) => result_type.apply(&subst),
Err(e) => {
// Provide more detailed error message based on the type of mismatch
@@ -2032,7 +2064,7 @@ impl TypeChecker {
let result_type = Type::var();
let expected_fn = Type::function(arg_types, result_type.clone());
if let Err(e) = unify_with_env(field_type, &expected_fn, &self.env) {
self.errors.push(TypeError {
message: format!(
"Type mismatch in {}.{} call: {}",
@@ -2104,7 +2136,7 @@ impl TypeChecker {
for (i, (arg_type, (_, param_type))) in
arg_types.iter().zip(op.params.iter()).enumerate()
{
if let Err(e) = unify_with_env(arg_type, param_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Argument {} of '{}.{}' has type {}, expected {}: {}",
@@ -2137,6 +2169,7 @@ impl TypeChecker {
fn infer_field(&mut self, object: &Expr, field: &Ident, span: Span) -> Type {
let object_type = self.infer_expr(object);
let object_type = self.env.expand_type_alias(&object_type);
match &object_type {
Type::Record(fields) => match fields.iter().find(|(n, _)| n == &field.name) {
@@ -2217,7 +2250,7 @@ impl TypeChecker {
// Check return type if specified
let ret_type = if let Some(rt) = return_type {
let declared = self.resolve_type(rt);
if let Err(e) = unify_with_env(&body_type, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Lambda body type {} doesn't match declared {}: {}",
@@ -2283,7 +2316,7 @@ impl TypeChecker {
span: Span,
) -> Type {
let cond_type = self.infer_expr(condition);
if let Err(e) = unify_with_env(&cond_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("If condition must be Bool, got {}: {}", cond_type, e),
span: condition.span(),
@@ -2293,7 +2326,7 @@ impl TypeChecker {
let then_type = self.infer_expr(then_branch);
let else_type = self.infer_expr(else_branch);
match unify_with_env(&then_type, &else_type, &self.env) {
Ok(subst) => then_type.apply(&subst),
Err(e) => {
self.errors.push(TypeError {
@@ -2334,7 +2367,7 @@ impl TypeChecker {
// Check guard if present
if let Some(ref guard) = arm.guard {
let guard_type = self.infer_expr(guard);
if let Err(e) = unify_with_env(&guard_type, &Type::Bool, &self.env) {
self.errors.push(TypeError {
message: format!("Match guard must be Bool: {}", e),
span: guard.span(),
@@ -2350,7 +2383,7 @@ impl TypeChecker {
match &result_type {
None => result_type = Some(body_type),
Some(prev) => {
if let Err(e) = unify_with_env(prev, &body_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Match arm has incompatible type: expected {}, got {}: {}",
@@ -2400,7 +2433,7 @@ impl TypeChecker {
Pattern::Literal(lit) => {
let lit_type = self.infer_literal(lit);
if let Err(e) = unify_with_env(&lit_type, expected, &self.env) {
self.errors.push(TypeError {
message: format!("Pattern literal type mismatch: {}", e),
span: lit.span,
@@ -2414,7 +2447,7 @@ impl TypeChecker {
// For now, handle Option specially
match name.name.as_str() {
"None" => {
if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(Type::var())), &self.env) {
self.errors.push(TypeError {
message: format!(
"None pattern doesn't match type {}: {}",
@@ -2427,7 +2460,7 @@ impl TypeChecker {
}
"Some" => {
let inner_type = Type::var();
if let Err(e) = unify_with_env(expected, &Type::Option(Box::new(inner_type.clone())), &self.env)
{
self.errors.push(TypeError {
message: format!(
@@ -2456,7 +2489,7 @@ impl TypeChecker {
Pattern::Tuple { elements, span } => {
let element_types: Vec<Type> = elements.iter().map(|_| Type::var()).collect();
if let Err(e) = unify_with_env(expected, &Type::Tuple(element_types.clone()), &self.env) {
self.errors.push(TypeError {
message: format!("Tuple pattern doesn't match type {}: {}", expected, e),
span: *span,
@@ -2506,7 +2539,7 @@ impl TypeChecker {
if let Some(type_expr) = typ {
let declared = self.resolve_type(type_expr);
if let Err(e) = unify_with_env(&value_type, &declared, &self.env) {
self.errors.push(TypeError {
message: format!(
"Variable '{}' has type {}, but declared type is {}: {}",
@@ -2527,12 +2560,47 @@ impl TypeChecker {
self.infer_expr(result)
}
fn infer_record(
&mut self,
spread: Option<&Expr>,
fields: &[(Ident, Expr)],
span: Span,
) -> Type {
// Start with spread fields if present
let mut field_types: Vec<(String, Type)> = if let Some(spread_expr) = spread {
let spread_type = self.infer_expr(spread_expr);
let spread_type = self.env.expand_type_alias(&spread_type);
match spread_type {
Type::Record(spread_fields) => spread_fields,
_ => {
self.errors.push(TypeError {
message: format!(
"Spread expression must be a record type, got {}",
spread_type
),
span,
});
Vec::new()
}
}
} else {
Vec::new()
};
// Apply explicit field overrides
let explicit_types: Vec<(String, Type)> = fields
.iter()
.map(|(name, expr)| (name.name.clone(), self.infer_expr(expr)))
.collect();
for (name, typ) in explicit_types {
if let Some(existing) = field_types.iter_mut().find(|(n, _)| n == &name) {
existing.1 = typ;
} else {
field_types.push((name, typ));
}
}
Type::Record(field_types)
}
@@ -2549,7 +2617,7 @@ impl TypeChecker {
let first_type = self.infer_expr(&elements[0]);
for elem in &elements[1..] {
let elem_type = self.infer_expr(elem);
if let Err(e) = unify_with_env(&first_type, &elem_type, &self.env) {
self.errors.push(TypeError {
message: format!("List elements must have same type: {}", e),
span,
@@ -2855,7 +2923,7 @@ impl TypeChecker {
// Check return type matches if specified
if let Some(ref return_type_expr) = impl_method.return_type {
let return_type = self.resolve_type(return_type_expr);
if let Err(e) = unify_with_env(&body_type, &return_type, &self.env) {
self.errors.push(TypeError {
message: format!(
"Method '{}' body has type {}, but declared return type is {}: {}",
@@ -2898,6 +2966,9 @@ impl TypeChecker {
"Option" if resolved_args.len() == 1 => {
return Type::Option(Box::new(resolved_args[0].clone()));
}
"Map" if resolved_args.len() == 2 => {
return Type::Map(Box::new(resolved_args[0].clone()), Box::new(resolved_args[1].clone()));
}
_ => {}
}
}


@@ -47,6 +47,8 @@ pub enum Type {
List(Box<Type>),
/// Option type (sugar for App(Option, [T]))
Option(Box<Type>),
/// Map type (sugar for App(Map, [K, V]))
Map(Box<Type>, Box<Type>),
/// Versioned type (e.g., User @v2)
Versioned {
base: Box<Type>,
@@ -119,6 +121,7 @@ impl Type {
Type::Tuple(elements) => elements.iter().any(|e| e.contains_var(var)),
Type::Record(fields) => fields.iter().any(|(_, t)| t.contains_var(var)),
Type::List(inner) | Type::Option(inner) => inner.contains_var(var),
Type::Map(k, v) => k.contains_var(var) || v.contains_var(var),
Type::Versioned { base, .. } => base.contains_var(var),
_ => false,
}
@@ -158,6 +161,7 @@ impl Type {
),
Type::List(inner) => Type::List(Box::new(inner.apply(subst))),
Type::Option(inner) => Type::Option(Box::new(inner.apply(subst))),
Type::Map(k, v) => Type::Map(Box::new(k.apply(subst)), Box::new(v.apply(subst))),
Type::Versioned { base, version } => Type::Versioned {
base: Box::new(base.apply(subst)),
version: version.clone(),
@@ -208,6 +212,11 @@ impl Type {
vars
}
Type::List(inner) | Type::Option(inner) => inner.free_vars(),
Type::Map(k, v) => {
let mut vars = k.free_vars();
vars.extend(v.free_vars());
vars
}
Type::Versioned { base, .. } => base.free_vars(),
_ => HashSet::new(),
}
@@ -279,6 +288,7 @@ impl fmt::Display for Type {
}
Type::List(inner) => write!(f, "List<{}>", inner),
Type::Option(inner) => write!(f, "Option<{}>", inner),
Type::Map(k, v) => write!(f, "Map<{}, {}>", k, v),
Type::Versioned { base, version } => {
write!(f, "{} {}", base, version)
}
@@ -1775,6 +1785,73 @@ impl TypeEnv {
]);
env.bind("Option", TypeScheme::mono(option_module_type));
// Map module
let map_v = || Type::var();
let map_type = || Type::Map(Box::new(Type::String), Box::new(Type::var()));
let map_module_type = Type::Record(vec![
(
"new".to_string(),
Type::function(vec![], map_type()),
),
(
"set".to_string(),
Type::function(
vec![map_type(), Type::String, map_v()],
map_type(),
),
),
(
"get".to_string(),
Type::function(
vec![map_type(), Type::String],
Type::Option(Box::new(map_v())),
),
),
(
"contains".to_string(),
Type::function(vec![map_type(), Type::String], Type::Bool),
),
(
"remove".to_string(),
Type::function(vec![map_type(), Type::String], map_type()),
),
(
"keys".to_string(),
Type::function(vec![map_type()], Type::List(Box::new(Type::String))),
),
(
"values".to_string(),
Type::function(vec![map_type()], Type::List(Box::new(map_v()))),
),
(
"size".to_string(),
Type::function(vec![map_type()], Type::Int),
),
(
"isEmpty".to_string(),
Type::function(vec![map_type()], Type::Bool),
),
(
"fromList".to_string(),
Type::function(
vec![Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()])))],
map_type(),
),
),
(
"toList".to_string(),
Type::function(
vec![map_type()],
Type::List(Box::new(Type::Tuple(vec![Type::String, map_v()]))),
),
),
(
"merge".to_string(),
Type::function(vec![map_type(), map_type()], map_type()),
),
]);
env.bind("Map", TypeScheme::mono(map_module_type));
// Result module
let result_type = Type::App {
constructor: Box::new(Type::Named("Result".to_string())),
@@ -1887,6 +1964,18 @@ impl TypeEnv {
"round".to_string(),
Type::function(vec![Type::var()], Type::Int),
),
(
"sin".to_string(),
Type::function(vec![Type::Float], Type::Float),
),
(
"cos".to_string(),
Type::function(vec![Type::Float], Type::Float),
),
(
"atan2".to_string(),
Type::function(vec![Type::Float, Type::Float], Type::Float),
),
]);
env.bind("Math", TypeScheme::mono(math_module_type));
@@ -1896,6 +1985,10 @@ impl TypeEnv {
"toString".to_string(),
Type::function(vec![Type::Int], Type::String),
),
(
"toFloat".to_string(),
Type::function(vec![Type::Int], Type::Float),
),
]);
env.bind("Int", TypeScheme::mono(int_module_type));
@@ -1905,6 +1998,10 @@ impl TypeEnv {
"toString".to_string(),
Type::function(vec![Type::Float], Type::String),
),
(
"toInt".to_string(),
Type::function(vec![Type::Float], Type::Int),
),
]);
env.bind("Float", TypeScheme::mono(float_module_type));
@@ -1991,6 +2088,9 @@ impl TypeEnv {
Type::Option(inner) => {
Type::Option(Box::new(self.expand_type_alias(inner)))
}
Type::Map(k, v) => {
Type::Map(Box::new(self.expand_type_alias(k)), Box::new(self.expand_type_alias(v)))
}
Type::Versioned { base, version } => {
Type::Versioned {
base: Box::new(self.expand_type_alias(base)),
@@ -2151,6 +2251,13 @@ pub fn unify(t1: &Type, t2: &Type) -> Result<Substitution, String> {
// Option
(Type::Option(a), Type::Option(b)) => unify(a, b),
// Map
(Type::Map(k1, v1), Type::Map(k2, v2)) => {
let s1 = unify(k1, k2)?;
let s2 = unify(&v1.apply(&s1), &v2.apply(&s1))?;
Ok(s1.compose(&s2))
}
// Versioned types
(
Type::Versioned {