
I'm trying to understand how a JavaScript object orders its properties. As far as I understand from ECMA-262, integer-indexed properties always come first. For example, if we print these objects using Node.js, Deno, or Bun:

console.log({ a: 0, [-1]: 1 })
console.log({ a: 0, [0]: 1 })
console.log({ a: 0, [2 ** 32 - 2]: 1 })
console.log({ a: 0, [2 ** 32 - 1]: 1 })
console.log({ a: 0, [2 ** 32]: 1 })

we get:

{ a: 0, '-1': 1 }
{ '0': 1, a: 0 }
{ '4294967294': 1, a: 0 }
{ a: 0, '4294967295': 1 }
{ a: 0, '4294967296': 1 }

It looks like an integer index is defined over the range [0, 2^32 − 2]. This matches the definition of an array index:

An array index is an integer index whose numeric value i is in the range +0 ≤ i < 2^32 - 1.

However, it's different from the definition of an integer index:

An integer index is a String-valued property key that is a canonical numeric String (see 7.1.16) and whose numeric value is either +0 or a positive integer ≤ 2^53−1.
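(As an aside, the "canonical numeric String" requirement can be observed directly. This is my own illustration, not part of the spec text: '01' is numeric but not canonical, so it is never treated as an index.)

```javascript
// '1' is a canonical numeric String and an array index, so it is listed first.
// '01' is not canonical (String(Number('01')) is '1', not '01'), so it behaves
// like an ordinary string key and keeps insertion order, after 'a'.
console.log(Object.keys({ a: 0, '01': 1, '1': 2 }))
// → [ '1', 'a', '01' ]
```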

So, my question is: should JavaScript engines use [0, 2^53 − 1], or should ECMAScript 2015 have used [0, 2^32 − 2] as the definition of an integer index? Did I miss something?

1 Answer


This algorithm was changed in 2018 to align with implementation reality: only array indices are put first, not integer indices.

No, engines implementing ES2015 should not redefine integer indices to mean something other than what the spec says. They should have used 2^53 − 1 (but didn't, at the time). On the other hand, engines implementing ES2019 do and should use array indices (up to 2^32 − 2) in the OrdinaryOwnPropertyKeys algorithm, and that's what you are seeing when trying this out.
