I expect the first example above to match, but it fails in node (v8) and firefox (spidermonkey). It works in bun (jscore). The second example works in all three. It seems that including an explicit supplementary code point in the CharSet flips v8 and spidermonkey into matching code points overall.
If flags contains "u", let fullUnicode be true; else let fullUnicode be false.
⋮
If fullUnicode is true, let input be StringToCodePoints(S). Otherwise, let input be a List whose elements are the code units that are the elements of S.
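For context, here's a quick sketch (not the examples above) of the two input forms those steps describe, using an astral character:

```js
const s = "\u{1F600}";        // 😀 (U+1F600), a supplementary code point

// Code-unit view (no "u" flag): the string is a surrogate pair.
console.log(s.length);        // 2

// Code-point view (StringToCodePoints, i.e. with "u"): one element.
console.log([...s].length);   // 1
console.log([...s][0].codePointAt(0).toString(16));  // "1f600"
```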
AFAICT, fullUnicode-ness should not depend on whether a CharSet contains a supplementary code point, only on whether the "u" flag is present.
A CharSet is a mathematical set of characters. In the context of a Unicode pattern, “all characters” means the CharSet containing all code point values; otherwise “all characters” means the CharSet containing all code unit values.
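As a minimal sketch of what that reading implies (again not the original examples above, just a negated class tested against an astral character), the "u" flag alone should decide whether the class matches code points or code units:

```js
const astral = "\u{1F600}";              // one code point, two code units

// With "u": "all characters" is the set of code points, so the negated
// class consumes the astral character as a single unit.
console.log(/^[^a]$/u.test(astral));     // true

// Without "u": the class operates on code units, so the anchored
// single-character pattern cannot span the surrogate pair.
console.log(/^[^a]$/.test(astral));      // false
```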