Inspired by the recent discussions around splitting JavaScript into a minimal js0 and an extended jssugar layer, a specific architectural approach came to mind. I want to share it purely as an unpolished conceptual thought experiment: how might we achieve perfect semantic consistency between build time and run time through a "dual-path" compiler architecture?
Concept
- js0 (Underlying Layer): A minimalist engine target that can shed historical baggage (such as implicit type coercion) partially or even entirely, significantly reducing engine complexity.
- jssugar (Upper Layer): Fully compatible with existing JS, realized through a unified compiler that supports dual-path deployment (see the sketch after this list):
  - Pre-compilation: Trades build time for ultimate runtime performance.
  - Built-in compilation: Browsers uniformly embed a Wasm-based compiler to dynamically compile, cache (e.g., via engine-level caches or Service Workers), and execute code at load time. Writing this compiler in Rust, for instance, lets it serve offline pre-compilation natively while also being compiled to Wasm for built-in browser use, guaranteeing perfectly consistent behavior across both paths.
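To make the "one unified compiler, two deployment paths" idea concrete, here is a minimal Rust sketch. It is an illustration under assumptions, not an existing API: the `compile` function, the `js0c` binary name, and the exported `compile_to_js0` are all invented for this post; only wasm-bindgen and the build commands are real tooling.

```rust
// Shared core: jssugar source in, js0 source out. Both deployment paths
// invoke exactly this function, which is what guarantees behavioral parity.
pub fn compile(source: &str) -> Result<String, String> {
    // ... parse jssugar, lower sugar constructs, emit js0 ...
    Ok(source.to_owned()) // placeholder body for the sketch
}

// Path 1: native binary, run offline as a build step (pre-compilation).
#[cfg(not(target_arch = "wasm32"))]
fn main() {
    let path = std::env::args().nth(1).expect("usage: js0c <file.js>");
    let source = std::fs::read_to_string(&path).expect("cannot read input");
    match compile(&source) {
        Ok(js0) => println!("{js0}"),
        Err(e) => eprintln!("compile error: {e}"),
    }
}

// Path 2: the same core compiled to Wasm and embedded in the browser.
#[cfg(target_arch = "wasm32")]
use wasm_bindgen::prelude::*;

#[cfg(target_arch = "wasm32")]
#[wasm_bindgen]
pub fn compile_to_js0(source: &str) -> Result<String, JsValue> {
    // On the Err branch, wasm-bindgen surfaces this as a thrown JS exception.
    compile(source).map_err(|e| JsValue::from_str(&e))
}
```

The same crate would simply be built twice: `cargo build --release` for the native CLI, and `wasm-pack build` (or `cargo build --target wasm32-unknown-unknown`) for the browser embedding, so there is no second implementation that could drift.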
Compatibility
- Old browsers + Old code: Run directly on traditional engines.
- New browsers + Old code: The built-in compiler transparently transpiles historical syntax (e.g., ==) into js0; a toy lowering pass is sketched after this list.
- New browsers + New code: Pre-compiled js0 is passed through directly; code that was not pre-compiled goes through the built-in compiler.
- Old browsers + New code: Published code is pre-compiled and downgraded to traditional ES during the build process.
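As a concrete illustration of the "Old code -> js0" row above, here is a toy lowering pass in the same hypothetical compiler. The AST shape and the `__js0_looseEq` runtime helper are invented for the sketch; the point is that implicit coercion stops being engine semantics and becomes an ordinary library call that js0 can execute without special support.

```rust
// A deliberately tiny jssugar AST. A real compiler has far more variants.
#[derive(Debug)]
enum Expr {
    Ident(String),
    LooseEq(Box<Expr>, Box<Expr>),  // jssugar-only: `a == b` with coercion
    StrictEq(Box<Expr>, Box<Expr>), // shared with js0: `a === b`
    Call(String, Vec<Expr>),        // plain function call
}

/// Rewrite jssugar-only constructs into js0-expressible ones.
fn lower_to_js0(expr: Expr) -> Expr {
    match expr {
        // `a == b`  =>  `__js0_looseEq(a, b)`: the coercion table moves
        // from the engine into a runtime helper shipped with the output.
        Expr::LooseEq(a, b) => Expr::Call(
            "__js0_looseEq".into(),
            vec![lower_to_js0(*a), lower_to_js0(*b)],
        ),
        Expr::StrictEq(a, b) => Expr::StrictEq(
            Box::new(lower_to_js0(*a)),
            Box::new(lower_to_js0(*b)),
        ),
        Expr::Call(name, args) => {
            Expr::Call(name, args.into_iter().map(lower_to_js0).collect())
        }
        leaf => leaf,
    }
}

fn main() {
    // `x == y` in jssugar ...
    let sugar = Expr::LooseEq(
        Box::new(Expr::Ident("x".into())),
        Box::new(Expr::Ident("y".into())),
    );
    // ... lowers to `Call("__js0_looseEq", [x, y])`, i.e. `__js0_looseEq(x, y)`.
    println!("{:?}", lower_to_js0(sugar));
}
```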
Win-Win
- Engines: V8, JSC, SpiderMonkey, QuickJS, etc., can drastically reduce their development and maintenance costs.
- Developers: js0 is permanently frozen; jssugar painlessly introduces features like operator overloading and structs, addressing long-standing pain points in fields such as AI, gaming, and scientific computing.
- Standards: New syntax can be adopted in the jssugar layer and used immediately, ending the long wait for underlying engines to ship implementations.
(Self-correction: I realize that forcing "New browsers + Old code" through a runtime Wasm compiler would, in practice, hurt performance rather than help it. In a realistic iteration, new browsers would likely keep running legacy code on their existing fast paths, reserving the built-in compiler for explicitly tagged jssugar. I left the row in the matrix above mostly to illustrate the theoretical "ultimate cleanup" capability of the dual-path design.)
In reality, the industry will likely lean entirely on offline tooling for the jssugar -> js0 transition. Still, as a theoretical extreme that guarantees 100% behavioral parity between offline builds and runtime execution, I find the logical closure of this architecture interesting.
I'm curious to hear the community's thoughts: setting aside the immediate performance constraints, does the concept of a "unified Rust compiler outputting both native binaries and Wasm for browser embedding" hold any merit for the future of JS tooling? Or is the performance cost of runtime transpilation an absolute dealbreaker that makes this not worth exploring?