This is more of an alert: I'm going to add an unauthorized addition to the language in the form of two functions, Object.deepFreeze and Object.isDeepFrozen.
Normally I'd never remotely consider creating library functionality by modifying builtins, but I need the WeakSet that knows which objects are deeply frozen to be a global singleton.
If I put the WeakSet into a module, having two conflicting versions of the module would create two copies of the set. That quickly leads to double validation and doubled memory costs!
Exposing the set directly through the global object also isn't OK, as it would lose encapsulation. So the only workable approach available to me is to extend Object with static deepFreeze and isDeepFrozen methods which close over the set of deep-frozen objects.
Here's what it looks like:
let { freeze, getPrototypeOf, getOwnPropertyNames, getOwnPropertySymbols } = Object;

let deepFrozen = new WeakSet();

if (!Object.deepFreeze) {
  let isDeepFrozen = Object.isDeepFrozen = (value) => {
    // Primitives (including null) are trivially deep-frozen.
    return deepFrozen.has(value) || typeof value !== 'object' || value === null;
  };

  let deepFreeze = Object.deepFreeze = (value) => {
    if (!isDeepFrozen(value)) {
      let proto = getPrototypeOf(value);
      if (proto && proto !== Object.prototype && proto !== Array.prototype)
        throw new Error('deepFreeze only supports plain objects and arrays');

      for (let name of getOwnPropertyNames(value)) deepFreeze(value[name]);
      for (let symbol of getOwnPropertySymbols(value)) deepFreeze(value[symbol]);

      deepFrozen.add(freeze(value));
    }
    return value;
  };
}
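For concreteness, here's a self-contained copy of that sketch with a small usage check appended. This reflects my reading of the code above and is not authoritative; the sample `config` object is mine:

```javascript
// Self-contained copy of the sketch above, plus a small usage check.
let { freeze, getPrototypeOf, getOwnPropertyNames, getOwnPropertySymbols } = Object;
let deepFrozen = new WeakSet();

let isDeepFrozen = (value) =>
  deepFrozen.has(value) || typeof value !== 'object' || value === null;

let deepFreeze = (value) => {
  if (!isDeepFrozen(value)) {
    let proto = getPrototypeOf(value);
    if (proto && proto !== Object.prototype && proto !== Array.prototype)
      throw new Error('deepFreeze only supports plain objects and arrays');
    for (let name of getOwnPropertyNames(value)) deepFreeze(value[name]);
    for (let symbol of getOwnPropertySymbols(value)) deepFreeze(value[symbol]);
    deepFrozen.add(freeze(value));
  }
  return value;
};

let config = deepFreeze({ retries: 3, endpoints: ['a.example', 'b.example'] });

console.log(Object.isFrozen(config)); // true
console.log(Object.isFrozen(config.endpoints)); // true: nested values freeze too
console.log(isDeepFrozen(config)); // true
console.log(isDeepFrozen({ retries: 3 })); // false: a fresh object was never frozen
```

Note that the recursion freezes children before parents, so the WeakSet only ever contains objects whose entire reachable graph is frozen.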
I bring this to the attention of TC39 because this isn't going into some low-rank site but into BABLR, the successor to Babel. Even TC39 will end up depending on this, so it will probably have to be standardized sooner or later.
The real question is how the implementation should behave if TC39 ever does add an official method with that name.
If I defer to any existing official implementation, as I do right now, my unauthorized polyfill could actually block TC39 from using that method name for a different implementation, because adding a new method with that name would change the meaning of existing code.
If I instead use an extra field to detect and overwrite only official implementations (but not my own), then TC39 is free to standardize Object.deepFreeze with a different implementation; but if they do, my overwriting of that name would go from harmless to breaking other standards-compliant code, which I obviously don't want.
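A minimal sketch of that second option, the "extra field" strategy. The brand symbol and the install helper are hypothetical names of mine, not part of any library or proposal:

```javascript
// Hypothetical sketch of the "extra field" strategy described above.
// A brand marks implementations installed by this library, so a later
// copy of the library overwrites only unbranded (e.g. official) versions.
const BRAND = Symbol.for('conartist6/deepfreeze:brand');

function installDeepFreeze(impl) {
  const existing = Object.deepFreeze;
  if (existing && existing[BRAND]) {
    // Another copy of this library already installed its implementation.
    return existing;
  }
  // Either nothing is installed, or an official implementation exists;
  // in both cases this library takes over — the hazard the text describes.
  impl[BRAND] = true;
  Object.deepFreeze = impl;
  return impl;
}

const mine = installDeepFreeze((value) => value); // placeholder implementation
console.log(Object.deepFreeze === mine); // true
console.log(installDeepFreeze((v) => v) === mine); // true: second install is ignored
```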
> I'm going to add an unauthorized addition to the language in the form of two functions, Object.deepFreeze and Object.isDeepFrozen.
> Normally I'd never remotely consider creating library functionality by modifying builtins, but I need the WeakSet that knows which objects are deeply frozen to be a global singleton.
> If I put the WeakSet into a module, having two conflicting versions of the module would create two copies of the set. That quickly leads to double validation and doubled memory costs!
> Exposing the set directly through the global object also isn't OK, as it would lose encapsulation. So the only workable approach available to me is to extend Object with static deepFreeze and isDeepFrozen methods which close over the set of deep-frozen objects.
I don't see why you'd need to extend Object to achieve this.
Yes, you need a global that's shared between those "two conflicting versions of the module", but nothing says it needs to be a method on Object. And you don't have to expose the set itself either.
A simple global boolean flag would be sufficient to produce a warning:
if (globalThis.__conrad_bucks_heavily_optimised_deepfreeze_lib) {
  console.warn(
    'You appear to have multiple instances of conartist6/deepfreeze loaded, get your dependencies in order!'
  );
} else {
  globalThis.__conrad_bucks_heavily_optimised_deepfreeze_lib = true;
}
but of course you could also use the same approach to store the function pair and gloss over the conflict, while simply also exporting them from the module:
export let deepFreeze, isDeepFrozen;

if (globalThis.__conrad_bucks_heavily_optimised_deepfreeze) {
  console.debug(
    'Ignoring that you have multiple instances of conartist6/deepfreeze loaded, just using first'
  );
  deepFreeze = globalThis.__conrad_bucks_heavily_optimised_deepfreeze;
  isDeepFrozen = globalThis.__conrad_bucks_heavily_optimised_isdeepfrozen;
} else {
  const knownFrozen = new WeakSet();
  globalThis.__conrad_bucks_heavily_optimised_deepfreeze =
    deepFreeze = (…) => { … };
  globalThis.__conrad_bucks_heavily_optimised_isdeepfrozen =
    isDeepFrozen = (…) => { … };
}
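To check that the stash really does survive a second load, here's a minimal runnable simulation of the pattern. The load function and the placeholder implementation are mine; only the global key comes from the example above:

```javascript
// Simulates loading two copies of a module that stash one shared
// implementation on globalThis instead of patching Object.
const KEY = '__conrad_bucks_heavily_optimised_deepfreeze';

function loadModuleCopy() {
  if (globalThis[KEY]) {
    // A copy already ran; reuse its implementation (and its WeakSet).
    return globalThis[KEY];
  }
  const knownFrozen = new WeakSet();
  const deepFreeze = (value) => {
    knownFrozen.add(Object.freeze(value)); // placeholder: shallow freeze only
    return value;
  };
  globalThis[KEY] = deepFreeze;
  return deepFreeze;
}

const first = loadModuleCopy();
const second = loadModuleCopy(); // a "conflicting version" loading later
console.log(first === second); // true: both copies share one implementation
```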
Your first example doesn't solve the problem, because there's no way for the second copy of the library to access the cache defined in the first one.
In the second example you solve that by introducing a new global function, just like I did!
The only difference is that I tried to choose the simplest name that accurately described the implementation.
Look, I'm happy to work with the committee if there's a path forward, but my faith in the process is very limited. I've designed critically important future standards like the stream abstraction and the macro processing system, and nobody even wants to talk about them, much less review the work.