Why can't I add a Dog Face field (U+1F436) to my object without using a string?

This:

var foo = {
  🐶 : true // Truly adorable
};

gives me an illegal character error in Firefox and Chrome, whereas this:

var foo = {
  '🐶' : true
};

works great. Why?
(You can also answer for a wider range of Unicode characters, but I really want to learn more about the Dog.)
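
For what it's worth, the string-keyed version otherwise behaves like any other property; a minimal check, assuming nothing beyond ordinary bracket notation:

var foo = {
  '🐶': true // the key is just a string
};

console.log(foo['🐶']);        // true - read it back with the same string
console.log(Object.keys(foo)); // [ '🐶' ]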

1 answer

As the ECMAScript standard defines, a valid identifier must begin with a Unicode code point that has the Unicode property ID_Start ($, _, and Unicode escape sequences are allowed as explicit exceptions).

Sadly, that does not include the poor dog. :(

You can use any of these code points as the first character of your identifier:

http://unicode.org/cldr/utility/list-unicodeset.jsp?a=[:ID_Start=Yes:]
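
If you want to test a character programmatically, here is a minimal sketch that assumes an engine supporting Unicode property escapes in regular expressions (ES2018):

// With the u flag, \p{ID_Start} matches any single code point
// that has the Unicode ID_Start property.
var isIdStart = function (ch) {
  return /^\p{ID_Start}$/u.test(ch);
};

console.log(isIdStart('a'));   // true  - letters can start an identifier
console.log(isIdStart('π'));   // true  - so `var π = 3.14;` is legal
console.log(isIdStart('🐶'));  // false - U+1F436 lacks ID_Start
console.log(isIdStart('$'));   // false - $ and _ are allowed by ECMAScript
                               //         as explicit extras, not via ID_Start

So the dog only works as a property name when written as a string ('🐶'), where the identifier rules do not apply.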
