Question:

I am relatively unfamiliar with JavaScript, and I was recently told that a JavaScript array has a `length` property of type `Number`. This length is automatically updated to the number of elements in the array as the array is modified.

However, I was also told that internally, JavaScript uses a 64-bit floating-point representation for its `Number` type. We know that floating-point arithmetic cannot exactly represent all integers within its range.

So my question is: what happens with large arrays, where `length + 1` cannot exactly represent the next integer in the sequence?

According to this, the maximum length of an Array is `4,294,967,295`. `Number.MAX_SAFE_INTEGER` is `9,007,199,254,740,991`, so you won't have to worry: the engine won't let you get that far. Example:

```
new Array(4294967296); // RangeError: Invalid array length
```

Relevant part of the spec:

- c. Let *newLen* be ToUint32(Desc.[[Value]]).
- d. If *newLen* is not equal to ToNumber(Desc.[[Value]]), throw a RangeError exception.

So given our example length `4294967296`:

```
var length = 4294967296;
var uint32length = length >>> 0; // ToUint32 conversion (not int32)
uint32length === 0;              // 4294967296 wraps to 0 as a uint32
length !== uint32length;         // therefore a RangeError is thrown
```
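For illustration, the `>>> 0` trick applies the same ToUint32 conversion the spec uses for `length`, so you can see the wraparound directly (the values below are just examples around the 2^32 boundary):

```
// `>>> 0` reduces the value modulo 2^32, like the spec's ToUint32:
console.log(4294967295 >>> 0); // 4294967295 (2^32 - 1 survives intact)
console.log(4294967296 >>> 0); // 0 (2^32 wraps around to 0)
console.log(4294967297 >>> 0); // 1
console.log(-1 >>> 0);         // 4294967295 (negative values wrap too)
```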

The maximum length of an array according to the ECMA-262 5th Edition specification is bound by an unsigned 32-bit integer due to the ToUint32 abstract operation, so the longest possible array could have 2^32 − 1 = 4,294,967,295 (about 4.29 billion) elements. This is according to Maximum size of an Array in JavaScript.

So I guess @RGraham is right.