I think my answer there covers this question, but since it is not an exact duplicate, I am including a copy of my answer here.
Background
The problem is that decimal integer literals cannot have leading zeros:
    DecimalIntegerLiteral ::
        0
        NonZeroDigit DecimalDigits(opt)
However, ECMAScript 3 allowed, as an optional extension, parsing literals with leading zeros as base-8 numbers:
    OctalIntegerLiteral ::
        0 OctalDigit
        OctalIntegerLiteral OctalDigit
But ECMAScript 5 forbids doing this in strict mode:
A conforming implementation, when processing strict mode code (see 10.1.1), must not extend the syntax of NumericLiteral to include OctalIntegerLiteral as described in B.1.1.
ECMAScript 6 introduces BinaryIntegerLiteral and OctalIntegerLiteral, so the following literal forms exist:

- BinaryIntegerLiteral, prefixed with 0b or 0B.
- OctalIntegerLiteral, prefixed with 0o or 0O.
- HexIntegerLiteral, prefixed with 0x or 0X.

The old OctalIntegerLiteral was renamed to LegacyOctalIntegerLiteral; it is still allowed, but only in non-strict mode.
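For illustration, a minimal sketch of the three prefixed literal forms (runs in any modern engine):

```javascript
// ECMAScript 6 radix-prefixed integer literals: all three evaluate to 8.
var bin = 0b1000; // binary,      0b / 0B prefix
var oct = 0o10;   // octal,       0o / 0O prefix
var hex = 0x8;    // hexadecimal, 0x / 0X prefix

console.log(bin, oct, hex); // 8 8 8

// The legacy form 010 (a bare leading zero) also means 8, but only in
// sloppy mode; in strict mode it is a SyntaxError.
```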
Therefore, if you want to write numbers in base 8, use the 0o or 0O prefix (not supported by old browsers), or use parseInt with an explicit radix of 8.

And if you want your numbers to be parsed in base 10, remove the leading zeros, or use parseInt with an explicit radix of 10.
Examples:

- 010 is the legacy octal form (equal to 8, forbidden in strict mode); 0o10 and 0O10 are the ECMAScript 6 octal literals, also equal to 8.
- parseInt('010', 8) returns 8, while parseInt('010', 10) returns 10.
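The parseInt behaviour can be checked directly (a small sketch; works in any ES5+ engine):

```javascript
// An explicit radix removes any ambiguity caused by the leading zero.
console.log(parseInt('010', 8));  // 8: the string is read in base 8
console.log(parseInt('010', 10)); // 10: the string is read in base 10

// With no radix argument, ES5+ engines default '010' to base 10 (ES3-era
// engines could fall back to octal), so always pass the radix explicitly.
console.log(parseInt('010'));     // 10 in ES5+ engines
```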