[Issue] BigInt serialization

Thank you for your reply.

  • If I am understanding you correctly, your argument here is that (many?) developers have avoided creating JavaScript programs that implement large-BigInt serialization to ArrayBuffers (or that use large BigInts at all).
  • The argument is that they have avoided doing so because they cannot implement such large-BigInt serialization sufficiently quickly (see the sketch after this list).
  • This forms a chicken-and-egg problem between the ecosystem and the language committee / engine implementors.
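To make the performance concern concrete, here is a minimal sketch (my own illustration, not code from this thread) of the obvious userland serializer. Because every `&` and `>>` on an n-bit BigInt is itself O(n), the loop as a whole is O(n²) in the bit length, which is exactly the kind of cost that pushes people away from large BigInts:

```js
// Hypothetical illustration: a naive userland serializer that extracts one
// byte at a time. Each `&` and `>>=` on an n-bit BigInt is itself O(n),
// so the whole loop is O(n^2) in the bit length.
function bigIntToBytesNaive(value) {
  if (value < 0n) throw new RangeError("expects a non-negative BigInt");
  const bytes = [];
  while (value > 0n) {
    bytes.push(Number(value & 0xffn)); // low byte
    value >>= 8n;                      // O(n) shift on every iteration
  }
  return Uint8Array.from(bytes);       // little-endian byte order
}
```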

I acknowledge your frustration. It sounds like you yourself are one of those developers.

Even in cases like these, it is still important to find specific real-world use cases, such as any of the following:

  • Real JavaScript programs that would have used a performant BigInt serialization API to ArrayBuffers but did not because it could not be implemented performantly.
  • Real JavaScript programs that avoided BigInts completely because serialization of large BigInts could not be done performantly.
  • Real non-JavaScript programs that do use performant serialization of BigInt-like integers.

A previous comment here refers to companies using BigInts in general, but TC39 needs specific real-world cases and specific evidence of impact.

  • Your own specific experiences here may be valuable.
  • The examples would be particularly valuable if they come from open-source code repositories.
  • They also could be testimonies from software companies of their experiences.
  • A champion would be able to point to such examples as evidence of real-world need.
  • You’ve already seen skepticism of how great an impact this feature would have, so such clear evidence of impact would be crucial.
    • Note that, because the engines have seen very tiny usage of BigInts on the web, some engine implementors have even expressed regret that BigInts are in the core language at all!
  • I myself am interested in hearing what specific existing apps serialize such big BigInts to ArrayBuffers, although I don’t have the bandwidth to take on new proposals.

With that said, of course, even providing real-world use cases does not guarantee that a TC39 person will prioritize a feature and take it up as champion.

  • TC39 has limited attention split across many fronts.
  • And, in the end, the engine implementors have limited resources.
  • I’m not saying that built-in BigInt serialization to ArrayBuffers will never be in the language.
    • BigInt serialization may well enter the language someday…if we can find specific real-world use cases and evidence of impact and if the engines and the rest of TC39 find them compelling.
    • Without real-world use cases and evidence of impact, there are plenty of other proposals that do have them and that should be prioritized first. I hope that makes sense, and I look forward to hearing any such examples that you might have.

I fully agree with this. Currently the only O(n) approach is to use .toString(16) or something similar, which seems ugly and unnatural and also has a relatively large overhead. Serialization is a fundamental operation for BigInt if it is to interoperate better with things like Wasm.
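For anyone who hasn't seen it, here is a minimal sketch of that .toString(16) workaround (my own phrasing of it, assuming a non-negative BigInt). Hex conversion is linear in the bit length, so this avoids the quadratic shift-and-mask loop, at the cost of building a temporary string roughly twice the size of the payload:

```js
// A minimal sketch of the .toString(16) workaround: hex conversion is linear
// in the bit length, but it allocates an intermediate string.
function bigIntToBytesViaHex(value) {
  if (value < 0n) throw new RangeError("expects a non-negative BigInt");
  let hex = value.toString(16);
  if (hex.length % 2) hex = "0" + hex;             // pad to whole bytes
  const bytes = new Uint8Array(hex.length / 2);
  for (let i = 0; i < bytes.length; i++) {
    bytes[i] = parseInt(hex.slice(i * 2, i * 2 + 2), 16);
  }
  return bytes;                                     // big-endian byte order
}
```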

It's more about what is possible in JavaScript directly, not just in "userspace" generally. Being "Turing complete" only means that the language is able to simulate running other Turing-complete languages.

For example, JavaScript can implement a VM that runs Python or Java, but JavaScript itself doesn't have all the same features as those languages.

This means that there is still space for JavaScript proposals to add new fundamental capabilities that were not previously possible within JavaScript directly, as was the case when WeakRef was added.
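As a concrete illustration of that kind of capability (my own example, not from the thread): before WeakRef, there was no way in pure JavaScript to hold a reference to an object without keeping it alive.

```js
// A weak reference does not keep its target alive; no amount of userland
// code could express this before WeakRef landed in the language.
let target = { payload: "large cached value" };
const ref = new WeakRef(target);

target = null; // drop the only strong reference

// Later, after the engine has (possibly) collected the object:
const maybeAlive = ref.deref(); // the object, or undefined if it was collected
console.log(maybeAlive ?? "target was garbage collected");
```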


Yep, and with any luck, BigInt serialization too. I'd hate to keep having to simulate another kind of BigInt, as the status quo currently requires.