Python: the octal escape character \033 from a dictionary value comes out of a print statement as a stray UTF-8 character instead of a color code

I was experimenting a bit with colored terminal output under Python 2.7.3. The ANSI color codes always render flawlessly in the terminal, with one small exception that I could not pin down more precisely than to the specific definition of the dictionary.

This is what causes confusion:

    import pprint

    color = {
        'white':       "\033[1,37m",
        'yellow':      "\033[1,33m",
        'green':       "\033[1,32m",
        'blue':        "\033[1,34m",
        'cyan':        "\033[1,36m",
        'red':         "\033[1,31m",
        'magenta':     "\033[1,35m",
        'black':       "\033[1,30m",
        'darkwhite':   "\033[0,37m",
        'darkyellow':  "\033[0,33m",
        'darkgreen':   "\033[0,32m",
        'darkblue':    "\033[0,34m",
        'darkcyan':    "\033[0,36m",
        'darkred':     "\033[0,31m",
        'darkmagenta': "\033[0,35m",
        'darkblack':   "\033[0,30m",
        'off':         "\033[0,0m"
    }

    yellow = "\033[1;33m"
    off = "\033[0;0m"

    print color['yellow'] + "string to render" + color['off']      # fails to render properly
    print "%(yellow)sstring to render%(off)s" % color               # ditto
    print "%sstring to render%s" % (color['yellow'], color['off'])  # ditto
    print yellow + "string to render" + off                         # as intended

    pp = pprint.PrettyPrinter(indent=6)
    pp.pprint(color)

Output for PrettyPrinter:

    {     'black': '\x1b[1,30m',
          'blue': '\x1b[1,34m',
          'cyan': '\x1b[1,36m',
          'darkblack': '\x1b[0,30m',
          'darkblue': '\x1b[0,34m',
          'darkcyan': '\x1b[0,36m',
          'darkgreen': '\x1b[0,32m',
          'darkmagenta': '\x1b[0,35m',
          'darkred': '\x1b[0,31m',
          'darkwhite': '\x1b[0,37m',
          'darkyellow': '\x1b[0,33m',
          'green': '\x1b[1,32m',
          'magenta': '\x1b[1,35m',
          'off': '\x1b[0,0m',
          'red': '\x1b[1,31m',
          'white': '\x1b[1,37m',
          'yellow': '\x1b[1,33m'}

This looks to me like the correct translation into the hexadecimal format. Despite that, the dictionary values are not passed properly to the print statement. Neither the raw nor the Unicode (tried out of desperation) string literal modifiers change anything. I must be missing something fairly obvious. On terminals without UTF-8 support, the Unicode character is simply omitted.
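A direct byte-for-byte comparison against the literal that does work (a minimal check, using only the names defined above) already hints at where the difference hides:

    # Sanity check: the dictionary lookup returns exactly what was stored,
    # so compare it against the standalone literal that renders correctly.
    print repr(color['yellow'])      # '\x1b[1,33m'
    print repr(yellow)               # '\x1b[1;33m'
    print color['yellow'] == yellow  # False -- the two strings differ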

I looked at the implementation of termcolor:

    if os.getenv('ANSI_COLORS_DISABLED') is None:
        fmt_str = '\033[%dm%s'
        if color is not None:
            text = fmt_str % (COLORS[color], text)
        if on_color is not None:
            text = fmt_str % (HIGHLIGHTS[on_color], text)
        if attrs is not None:
            for attr in attrs:
                text = fmt_str % (ATTRIBUTES[attr], text)
        text += RESET
    return text
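To see how that snippet composes one code per escape sequence, here is a hypothetical walk-through with concrete values substituted in (assuming COLORS['red'] == 31, ATTRIBUTES['bold'] == 1, and RESET == '\033[0m', as in termcolor's tables):

    text = 'hello'
    text = '\033[%dm%s' % (31, text)   # '\x1b[31mhello'
    text = '\033[%dm%s' % (1, text)    # '\x1b[1m\x1b[31mhello'
    text += '\033[0m'                  # reset all attributes at the end
    print text                         # renders as bold red 'hello'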

And at colorama:

    CSI = '\033['

    def code_to_chars(code):
        return CSI + str(code) + 'm'

    class AnsiCodes(object):
        def __init__(self, codes):
            for name in dir(codes):
                if not name.startswith('_'):
                    value = getattr(codes, name)
                    setattr(self, name, code_to_chars(value))
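A sketch of how that class gets used (simplified; the AnsiFore container below is an assumed stand-in for colorama's own code tables, with 31 = red foreground and 39 = default foreground):

    class AnsiFore:
        RED = 31
        RESET = 39

    Fore = AnsiCodes(AnsiFore)
    print Fore.RED + 'warning' + Fore.RESET   # '\x1b[31mwarning\x1b[39m'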

And a couple of others. Analytically, they all avoid defining the entire sequence in a dictionary. I agree that this approach is lexically justified. Nevertheless, the fact remains: the escape character from the dictionary value is not interpreted correctly, unlike, say, in a Perl or C++ hash, a vectorized map<string, string>, or a C struct of char *string members.

And this leads to the question: is there a specific reason, in terms of the standard if possible, why interpolation from a dictionary (albeit duplicating the plain definitions) deviates from that of a simple string?


Here's the fixed color code dict (tab-indented when editing; SO seems to break tabs for reading):

    color = {
        'white':       "\033[1;37m",
        'yellow':      "\033[1;33m",
        'green':       "\033[1;32m",
        'blue':        "\033[1;34m",
        'cyan':        "\033[1;36m",
        'red':         "\033[1;31m",
        'magenta':     "\033[1;35m",
        'black':       "\033[1;30m",
        'darkwhite':   "\033[0;37m",
        'darkyellow':  "\033[0;33m",
        'darkgreen':   "\033[0;32m",
        'darkblue':    "\033[0;34m",
        'darkcyan':    "\033[0;36m",
        'darkred':     "\033[0;31m",
        'darkmagenta': "\033[0;35m",
        'darkblack':   "\033[0;30m",
        'off':         "\033[0;0m"
    }
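With the semicolons in place, the three variants that failed earlier all render as intended:

    print color['yellow'] + "string to render" + color['off']
    print "%(yellow)sstring to render%(off)s" % color
    print "%sstring to render%s" % (color['yellow'], color['off'])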
1 answer

I looked through your source code and I think the problem is with the color definition in the dictionary.

If you observe carefully, the dictionary value for a color looks like \033[1,37m for white. However, it should be \033[1;37m. Note that you are using the , (comma) character instead of the ; (semicolon) character. As a test, I created a subset of the color dictionary and ran these checks.

    >>> color = {'white': '\033[1;37m', 'yellow': '\033[1;33m', 'off': '\033[0;0m'}
    >>> print color['white'] + 'string' + color['off']
    string                  # this string is white in color
    >>> print color['yellow'] + 'string' + color['off']
    string                  # this string is yellow in color
    >>> color['yellow'] = '\033[1,33m'   # incorrect color code - a , character instead of ;
    >>> print color['yellow'] + 'string' + color['off']
    string                  # prints the string in the console default color, i.e. not in yellow
    >>>
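The underlying reason is that an SGR sequence takes semicolon-separated numeric parameters, so the comma stops the terminal from recognizing the sequence at all. A small, hypothetical validity check along those lines:

    import re

    # Simplified ECMA-48 SGR form: ESC [ digits (; digits)* m -- semicolons only.
    SGR = re.compile(r'^\x1b\[[0-9]+(;[0-9]+)*m$')
    print bool(SGR.match('\033[1;33m'))  # True  -- semicolon separator
    print bool(SGR.match('\033[1,33m'))  # False -- comma is not valid here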

Hope this helps

