Object.deepFreeze

This is more of an alert: I'm going to add an unauthorized addition to the language in the form of two functions, Object.deepFreeze, and Object.isDeepFrozen.

Normally I'd never remotely consider creating library functionality by modifying built-in objects, but I need the WeakSet that knows which objects are deeply frozen to be a global singleton.

If I put the WeakSet into a module, having two conflicting versions of the module would create two copies of the set. That quickly leads to double validation and doubled memory costs!

Exposing the set directly through the global object isn't OK either, as it would lose encapsulation. So the only workable approach available to me is to extend Object with static deepFreeze and isDeepFrozen methods which close over the set of deep-frozen objects.

Here's what it looks like:

let { freeze, getPrototypeOf, getOwnPropertyNames, getOwnPropertySymbols } = Object;
let deepFrozen = new WeakSet();

if (!Object.deepFreeze) {
  let isDeepFrozen = Object.isDeepFrozen = (value) => {
    return typeof value !== 'object' || value === null || deepFrozen.has(value);
  };

  let deepFreeze = Object.deepFreeze = (value) => {
    if (!isDeepFrozen(value)) {
      let proto = getPrototypeOf(value);

      if (proto && proto !== Object.prototype && proto !== Array.prototype)
        throw new Error('deepFreeze only supports plain objects and arrays');

      for (let name of getOwnPropertyNames(value))
        deepFreeze(value[name]);

      for (let name of getOwnPropertySymbols(value))
        deepFreeze(value[name]);

      deepFrozen.add(freeze(value));
    }

    return value;
  };
}
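To give a concrete feel for the behavior, here is a self-contained sketch: a lightly adjusted, module-local restatement of the definitions above (the prototype check is omitted for brevity, and nothing is patched onto Object) together with a small usage demo.

```javascript
// Module-local restatement of the polyfill above, plus a small demo.
const { freeze, getOwnPropertyNames, getOwnPropertySymbols } = Object;
const deepFrozen = new WeakSet();

const isDeepFrozen = (value) =>
  typeof value !== 'object' || value === null || deepFrozen.has(value);

const deepFreeze = (value) => {
  if (!isDeepFrozen(value)) {
    for (const name of getOwnPropertyNames(value)) deepFreeze(value[name]);
    for (const name of getOwnPropertySymbols(value)) deepFreeze(value[name]);
    deepFrozen.add(freeze(value));
  }
  return value;
};

// Freezing is transitive, and the WeakSet memoizes: a second call is a no-op.
const config = deepFreeze({ parser: { lazy: true }, plugins: ['cst'] });
```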

I bring this to the attention of TC39 because this isn't going into some low-rank site but into BABLR, the successor to Babel. Even TC39 will end up depending on this, so it will probably have to be standardized sooner or later.

Please do not write if (!Object.deepFreeze). If you insist on modifying built-ins, just unconditionally assign.

What's your thinking?

Conditional assignment is doing a lot of heavy lifting here in terms of making sure there's only ever one deepFrozen cache per realm...

You could add an extra field to your custom one so you can detect if it came from your package.

All this has to do with how the implementation behaves if TC39 does ever add an official method with that name, right?

If I defer to the official implementation like I do right now, my unauthorized polyfill could actually block TC39 from using that method name for a different implementation because adding a new method with that name would change the meaning of existing code.

If I use an extra field to overwrite only official implementations (but not my own), then TC39 is free to standardize Object.deepFreeze with a different implementation, but if they do, then my unconditional overwriting of that name would go from being harmless to breaking other standard code, which obviously I don't want.
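For illustration, the "extra field" idea might look something like the sketch below. The brand key and install policy are hypothetical (this is the policy described, and then criticized, above, not anything endo or TC39 defines): reuse our own branded copy so the WeakSet cache stays a singleton, but overwrite anything else, including a hypothetical future official version.

```javascript
// Hypothetical brand marker so a second copy of the library can recognize an
// installation as ours, as opposed to a future official Object.deepFreeze.
const BRAND = Symbol.for('conartist6/deepFreeze'); // assumed registry key

const install = () => {
  const existing = Object.deepFreeze;
  // Reuse our own branded copy (keeping the cache a singleton), but
  // overwrite anything else, including an unbranded official version.
  if (existing && existing[BRAND]) return;

  const deepFrozen = new WeakSet();
  const deepFreeze = (value) => {
    if (typeof value === 'object' && value !== null && !deepFrozen.has(value)) {
      for (const name of Object.getOwnPropertyNames(value)) deepFreeze(value[name]);
      deepFrozen.add(Object.freeze(value));
    }
    return value;
  };
  deepFreeze[BRAND] = true; // the detectable field
  Object.deepFreeze = deepFreeze;
};

install();
```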

Hi!

I don't see why you'd need to extend Object to achieve this. Yes, you need a global that's shared between those "two conflicting versions of the module", but nothing says it needs to be a method on Object. And you don't have to expose the set itself either.

A simple global boolean flag would be sufficient to produce a warning:

if (globalThis.__conrad_bucks_heavily_optimised_deepfreeze_lib) {
  console.warn('You appear to have multiple instances of conartist6/deepfreeze loaded, get your dependencies in order!');
} else {
  globalThis.__conrad_bucks_heavily_optimised_deepfreeze_lib = true;
}

but of course you could also use the same approach to store the function
pair and gloss over the conflict, while simply also exporting them from
the module:

export let deepFreeze, isDeepFrozen;

if (globalThis.__conrad_bucks_heavily_optimised_deepfreeze) {
  console.debug('Ignoring that you have multiple instances of conartist6/deepfreeze loaded, just using first');
  deepFreeze = globalThis.__conrad_bucks_heavily_optimised_deepfreeze;
  isDeepFrozen = globalThis.__conrad_bucks_heavily_optimised_isdeepfrozen;
} else {
  const knownFrozen = new WeakSet();
  globalThis.__conrad_bucks_heavily_optimised_deepfreeze =
    deepFreeze = (…) => { … };
  globalThis.__conrad_bucks_heavily_optimised_isdeepfrozen =
    isDeepFrozen = (…) => { … };
}

kind regards,
Bergi

Your first example isn't solving the problem because there isn't any way for the second library definition to access the cache defined in the first one.

In the second example you solve that by introducing a new globally shared function, just like I did!

The only difference from there was that I tried to choose the simplest name that accurately described the implementation.

Look I'm happy to work with the committee if there's a path forward, but my faith in the process is very limited. I've designed critically important future standards like the stream abstraction and the macro processing system, and nobody even wants to talk about them much less review the work.

You might consider adopting the approach we are using in @endo/harden.

With the most recent release, the package provides a ponyfill harden function which is based on a function installed on the Object intrinsic, Object[Symbol.for('harden')], which establishes a race to install a version and entrain its internal WeakSet. The harden you get is effectively equivalent to your proposed Object.deepFreeze by default, and it enables dependent libraries to make their interfaces tamper-resistant without obligating them to stand on deeply frozen intrinsics in every configuration. Then, if a user of your library elects to run in an environment with ses, which must win the race to install Object[Symbol.for('harden')] by calling lockdown before anyone uses harden, your library gets a fully tamper-proof interface.

That is to say, we seek to establish a convention for "hardened modules" that rely on an underlying race to install Object[Symbol.for('harden')], such that these modules are portable between locked-down environments and non-locked-down environments.

The benefit of this arrangement is that it creates a low-cost and portable way for libraries to defend the integrity of their interfaces in composition with other modules of the same Realm, where the application may choose to opt-in to greater degrees of supply chain attack resistance, but without obligating every dependent application or library to accept the trade-offs (notably, property-override-mistake footguns).
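A minimal sketch of that race convention might look as follows. This is an illustration only, not @endo/harden's actual code; the stand-in harden here is just a transitive freeze along the lines of the deepFreeze discussed above.

```javascript
// Whichever module runs first installs an implementation at the shared key;
// every later module adopts it, so the internal WeakSet stays a singleton.
const HARDEN = Symbol.for('harden');

if (!Object[HARDEN]) {
  const hardened = new WeakSet();
  const hardenImpl = (value) => {
    if (value === null || (typeof value !== 'object' && typeof value !== 'function')) {
      return value;
    }
    if (hardened.has(value)) return value;
    hardened.add(Object.freeze(value)); // mark before recursing so cycles terminate
    for (const key of Reflect.ownKeys(value)) {
      const desc = Object.getOwnPropertyDescriptor(value, key);
      if ('value' in desc) hardenImpl(desc.value);
    }
    return value;
  };
  Object.defineProperty(Object, HARDEN, { value: hardenImpl, configurable: true });
}

// Ponyfill-style: each module simply reads the winner of the race.
const harden = Object[HARDEN];
```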

@kriskowal Cool to know that I'm not even the only one who has had this exact need force them into the pattern of making unauthorized additions to Object!

One specific piece of feedback: endo's implementation of harden invokes getters. I had to revise the initial implementation I shared here so that it would not invoke getters, because in my environment a potentially massive amount of lazy computation is hidden behind getters, and I'm careful to ensure that they freeze their results anyway. I presume that protecting laziness is why Object.freeze itself does not invoke getters.

So my polyfill implementation is now simplified to this:

Object.deepFreeze = (value) => {
  if (!isDeepFrozen(value)) {
    for (let name of getOwnPropertyNames(value)) {
      Object.deepFreeze(getOwnPropertyDescriptor(value, name).value);
    }
    for (let name of getOwnPropertySymbols(value)) {
      Object.deepFreeze(getOwnPropertyDescriptor(value, name).value);
    }
    deepFrozen.add(freeze(value));
  }

  return value;
};
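To make the getter-skipping concrete, here is a self-contained check (restating the helpers locally) showing that the walk reads property descriptors without ever invoking an accessor:

```javascript
// Self-contained check that the revised implementation never invokes getters.
const { freeze, getOwnPropertyNames, getOwnPropertySymbols, getOwnPropertyDescriptor } = Object;
const deepFrozen = new WeakSet();

const isDeepFrozen = (value) =>
  typeof value !== 'object' || value === null || deepFrozen.has(value);

const deepFreeze = (value) => {
  if (!isDeepFrozen(value)) {
    for (const name of getOwnPropertyNames(value)) {
      // For accessor properties the descriptor has no .value, so the
      // recursive call sees undefined and returns immediately.
      deepFreeze(getOwnPropertyDescriptor(value, name).value);
    }
    for (const name of getOwnPropertySymbols(value)) {
      deepFreeze(getOwnPropertyDescriptor(value, name).value);
    }
    deepFrozen.add(freeze(value));
  }
  return value;
};

let getterRan = false;
const lazy = deepFreeze({
  eager: { n: 1 },
  get expensive() {
    getterRan = true; // would record if the walk forced this computation
    return {};
  },
});
```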

Also I tested it and found that this code is quite slow, with the main bottleneck appearing to be the deepFrozen weak set. I presume that if the language supported such an operation the language could make it fast, but with the tools I have at my disposal I am not able to make this fast.

That's why I'm carefully leaving room for a native implementation to replace mine though, because a native implementation could be fast.

Hi @conartist6, note that Moddable’s XS JavaScript engine does provide a native implementation of harden. I believe XS’s native implementation is indeed much faster, as you expect.

Your algorithm does the transitive walk the most natural way, and the way we tried first: JS recursion. However, this causes hugely deep stacks (for example, as deep as linked lists are long). This was problematic on some engines, so we switched from depth-first to breadth-first. Fortunately, Set iteration was already perfectly set up for breadth-first visitation!

Unfortunately, breadth-first requires us to use two sets, so we don’t add anything to the long-lived set if the algorithm aborts in the middle due to a thrown error.
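A hedged sketch of that breadth-first shape (an illustration, not XS's or endo's actual code): an ordinary Set serves as the work list, since Set iteration observes entries appended mid-iteration, and nothing is committed to the long-lived WeakSet until the whole walk has succeeded.

```javascript
const deepFrozen = new WeakSet(); // long-lived, shared across calls

const deepFreeze = (root) => {
  const toFreeze = new Set(); // per-call work list

  const visit = (value) => {
    if (value === null || typeof value !== 'object') return;
    if (deepFrozen.has(value) || toFreeze.has(value)) return;
    toFreeze.add(value);
  };

  visit(root);
  // Set iteration sees elements appended during iteration, so this single
  // loop performs the whole breadth-first traversal without recursion.
  for (const value of toFreeze) {
    for (const key of Reflect.ownKeys(value)) {
      // Reading descriptors (not values) also avoids invoking getters:
      // accessor descriptors have no .value, so visit() sees undefined.
      visit(Object.getOwnPropertyDescriptor(value, key).value);
    }
  }
  // Freeze and commit only after the traversal completed without throwing.
  for (const value of toFreeze) deepFrozen.add(Object.freeze(value));
  return root;
};
```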

@conartist6 , would you be interested in co-championing a harden proposal to tc39?

Note: our current implementation of harden makes a weird special case for typed arrays, since at the time they could not be frozen. Due to the Immutable ArrayBuffer proposal, already in progress for V8, soon we’ll be able to create freezable typed arrays. So any proposed harden should omit this special case.

@markm I'm interested as long as we can land on something that will meet both our needs! I need an implementation which does not trigger lazy computations when freezing. Triggering all lazy computations eagerly is hugely costly for me because my system uses a lazily-instantiated facade layer for perf.

Is this something you think we could come to an agreement on how to support?

It's interesting that you're looking at making this a transactional operation. I don't have any such concern in my proposed code, presumably because I'm not triggering getters so there's not really any chance at all for the operation to fail part way through.

I took a look at the frozen buffers proposal and I don't think it has any bearing at all on Object.deepFreeze. That proposal makes clear that object-based immutability is considered to be a completely separate property from immutability as applied to array buffers. I'm particularly sad that it leaves Map and Set without any working definition of immutability. And it doesn't help that a subclass of ArrayBuffer could claim to be immutable while being mutable...

Just out of curiosity how does harden deal with Map and Set?

That is not the expectation. Are you sure?

Edit: I just double checked and don't see anywhere harden would invoke the getter.

Like any object with functions. It freezes the API surface but doesn't affect any internal state (as long as that state is truly private and not pseudo private properties)

Note that this deepFreeze implementation is incomplete; it leaves the accessor functions themselves unfrozen (e.g., Object.assign(Object.getOwnPropertyDescriptor(Object.deepFreeze({ get foo(){} }), "foo").get, { late: true }).late === true). In fact, it doesn’t handle functions at all: Object.isDeepFrozen(function(){}) === true!

I strongly second @kriskowal’s suggestion to look at @endo/harden, which already handles these cases and more.
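For completeness, here is one way the two gaps could be closed, sketched hypothetically (this is not @endo/harden's implementation): treat functions as freezable values, and walk each descriptor's get/set slots so the accessor functions themselves get frozen, still without ever invoking them.

```javascript
// Sketch: handle functions and accessor properties. Accessor functions are
// frozen by walking the descriptor's get/set slots; they are never called.
const { freeze, getOwnPropertyDescriptor } = Object;
const deepFrozen = new WeakSet();

const isFreezable = (v) =>
  v !== null && (typeof v === 'object' || typeof v === 'function');

const deepFreeze = (value) => {
  if (isFreezable(value) && !deepFrozen.has(value)) {
    deepFrozen.add(freeze(value)); // mark before recursing so cycles terminate
    for (const key of Reflect.ownKeys(value)) {
      const desc = getOwnPropertyDescriptor(value, key);
      if ('value' in desc) deepFreeze(desc.value);
      deepFreeze(desc.get); // undefined for data properties; safely ignored
      deepFreeze(desc.set);
    }
  }
  return value;
};
```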

Yeah, I am looking at harden. I was incorrect in my initial assumption that it triggered getters, so it may be usable for me after all.

I would definitely consider unfrozen functions a bug in my implementation, as they could be used to trick an interface expecting immutable data into accepting mutable data, if the consuming code isn't directly checking for typeof obj === 'object' (which of course is not a common defensive check).

Unfrozen getters don't feel as much like a bug to me as I don't immediately see any way that they could be used to hoodwink code with an API defined in terms of objects.

Here is my updated definition of isDeepFrozen:

let objectTypes = ['object', 'function'];

export const isDeepFrozen = (value) => {
  return !objectTypes.includes(typeof value) || value === null || deepFrozen.has(value);
};

(edit: the ! was initially missing)
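A few quick checks of the updated predicate (a self-contained copy, restating the definition above) show the behavioral difference for functions:

```javascript
// Self-contained copy of the updated predicate, plus spot checks.
const deepFrozen = new WeakSet();
const objectTypes = ['object', 'function'];

const isDeepFrozen = (value) =>
  !objectTypes.includes(typeof value) || value === null || deepFrozen.has(value);

// Primitives and null are trivially deep-frozen; a fresh function no longer is.
const checks = [
  isDeepFrozen(42),             // true
  isDeepFrozen(null),           // true
  isDeepFrozen(function () {}), // false now, instead of true
  isDeepFrozen({}),             // false
];
```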

I just did a simple test to see if harden's behavior of freezing getter functions is in line with what Object.freeze does and it is not:

let o = Object.freeze({get test() {}});
Object.isFrozen(Object.getOwnPropertyDescriptor(o, 'test').get); // false

Well, yeah, harden is specifically a transitive freeze that by design affects not just the input object but also objects reachable from it. Which is exactly the topic that you’re raising here—there would be no point to this discussion at all if Object.freeze already fully covered your use case.

Huh? Freezing a function doesn’t change the code that executes when it is called, it just prevents the ability to add new properties or remove/replace existing property descriptors.

I admit to being completely baffled and unable to construct a scenario in which manipulation of a function’s properties is acceptable if it is an accessor but unacceptable if it is not.