Multidimensional Arrays - unexpected behaviour

I have been working with large amounts of data, particularly multidimensional arrays holding many values.

// intended: a 1000 × 1000 grid initialised to null
const grid = new Array(1000).fill(new Array(1000).fill(null))

When I do something like:

grid[0][0] = 123
console.log(grid)
/*
[
    [123, null, null, null, ... null], // index 0
    [123, null, null, null, ... null],
    [123, null, null, null, ... null],
    ........
    [123, null, null, null, ... null], // index 999
]
*/

My only intention is to change the value at grid[0][0], but when I assign a value to that position, the first value of every sibling array is modified as well.

I've only tested this in Chrome. Is it a browser-specific bug, or is something else going on?

__Edit__
I get that this happens because arrays are passed by reference rather than by value.
Is there a way to fix it without writing a crazy amount of code?

Yes, you're filling an array with 1000 references to the same "inner" array. This is not a browser/engine bug but the expected behaviour of fill: the value you pass is evaluated once, and that single result is written to every slot.
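You can confirm that the rows are one and the same object with an identity check:

const grid = new Array(1000).fill(new Array(1000).fill(null));
console.log(grid[0] === grid[1]); // true: every row is the same array object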

You can instead use Array.from(…) or .fill().map(…):

const grid = Array.from({length: 1000}, () => Array.from({length: 1000}, () => null));
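The .fill().map(…) variant mentioned above would look like this; the argument-less fill() turns the empty slots into defined undefined values so that map visits them (map skips holes):

// fill() with no argument populates the holes so map() visits every slot
const grid = new Array(1000).fill().map(() => new Array(1000).fill(null));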

That will run the callback 1000 times and create 1000 distinct inner arrays. You can also use the callback's second parameter to get the index and initialise your grid values based on their coordinates, as in the sketch below.
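A minimal sketch of coordinate-based initialisation (the diagonal marker is purely illustrative):

// the second argument of the Array.from callback is the index:
// y for the outer array (rows), x for the inner one (columns)
const grid = Array.from({length: 1000}, (_, y) =>
    Array.from({length: 1000}, (_, x) => (x === y ? `${x},${y}` : null))
);
console.log(grid[3][3]); // "3,3"
console.log(grid[3][4]); // null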

Thanks for the help 👍. I was actually building a crossword puzzle mini-game with this. Does the Tuples proposal fare any better in such cases? After all, they are value types, right?

You wouldn't hit that exact problem with tuples, as they are immutable: you'd get an exception when attempting to mutate one (assuming strict mode).
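A sketch using the Record & Tuple proposal syntax (not shipped in any engine at the time of writing, so the details may change):

// under the proposal, tuples are deeply immutable primitive values
const row = #[null, null, null];
row[0] = 123; // TypeError in strict mode; silently ignored in sloppy mode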

Just as a matter of curiosity, why not take this approach?

// a single flat array instead of nested arrays
const grid = new Array(1000 * 1000).fill(null);
let x = 0, y = 0;
grid[y * 1000 + x] = 123; // cell (x, y) lives at index y * width + x

What I'm asking is: since the grid isn't sparse, why nest it at all?
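For instance, with a pair of hypothetical accessor helpers (get/set are illustrative names, not part of any API):

const WIDTH = 1000;
const grid = new Array(WIDTH * 1000).fill(null);

// translate (x, y) coordinates to a flat index
const get = (x, y) => grid[y * WIDTH + x];
const set = (x, y, value) => { grid[y * WIDTH + x] = value; };

set(0, 0, 123);
console.log(get(0, 0)); // 123
console.log(get(0, 1)); // null: the "sibling row" is untouched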

That could work as well, but I mainly wanted to understand what was going on with fill when the arrays are nested.
In the end I went with a third approach; filling an array with 1000 × 1000 nulls seemed terribly inefficient.
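For anyone curious about sparse options, one common pattern (illustrative only, not necessarily the approach I took) is a Map keyed by coordinates, which only stores cells that actually hold a value:

const cells = new Map();
const key = (x, y) => `${x},${y}`;

const setCell = (x, y, value) => cells.set(key(x, y), value);
const getCell = (x, y) => cells.get(key(x, y)) ?? null;

setCell(0, 0, 123);
console.log(getCell(0, 0)); // 123
console.log(getCell(5, 5)); // null: never stored, costs no memory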