ASCII char to int conversion in C

Possible duplicate:
Convert char to int in C.

I remember learning, back in a course long ago, that converting an ASCII char to an int by subtracting '0' is bad practice.

For instance:

int converted;
char ascii = '8';

converted = ascii - '0'; /* converted is now 8 */

Why is this considered bad practice? Is it because some systems do not use ASCII? This question has been bothering me for a long time.

+5
3 answers

If you need to convert a whole string, use strtol (or one of its relatives) rather than rolling the conversion by hand. But for a single digit stored in a char, subtracting '0' is perfectly fine.
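As a quick sketch of the string case (the variable names are my own and the error handling is deliberately minimal):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *text = "1234";
    char *end;
    long value = strtol(text, &end, 10); /* parse in base 10 */

    if (end == text)
        printf("no digits found\n");
    else
        printf("parsed %ld\n", value);
    return 0;
}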

The C standard guarantees that it works.

Section 5.2.1/3 of the standard says:

In both the source and execution basic character sets, the value of each character after 0 in the above list of decimal digits [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] shall be one greater than the value of the previous.

So subtracting '0' from a decimal digit character is portable, standard C.

+7

Short answer: it is not actually bad; the C standard guarantees that the digit characters 0-9 have consecutive values.

ASCII itself is not required by C, which is probably where the warning comes from. If you are converting whole strings, use a library function such as atoi or strtol; those handle the general cases (signs, leading whitespace, error handling) that hand-rolled loops tend to get wrong. But for a single digit the subtraction is safe, because the standard requires the ten digits to be contiguous in every character set. In practice, nearly every encoding in use today is a superset of US-ASCII (UTF-8, for instance), and even EBCDIC keeps the digits contiguous (at different code points).
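To make that concrete, here is a small illustration; the code points in the comments are the standard ASCII and EBCDIC values for the digit characters:

#include <stdio.h>

int main(void)
{
    /* ASCII:  '0' is 0x30 and '8' is 0x38, so '8' - '0' == 8. */
    /* EBCDIC: '0' is 0xF0 and '8' is 0xF8, so '8' - '0' == 8 as well. */
    char digit = '8';
    printf("%d\n", digit - '0'); /* prints 8 on any conforming implementation */
    return 0;
}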

+3

I suspect it is because, when C is first taught, this is presented as a clever little trick for turning a char digit into an int.

Unfortunately, this educational toy somehow became part of the standard arsenal of most C developers, partly because C does not provide a convenient library call for this exact one-character conversion (what comes closest often depends on the platform, and I am not even sure what it would be).

As a rule, the objections are that this code is not portable to non-ASCII platforms or to future moves to other encodings, and that it is not very readable. At a minimum, wrap the trick in a function with a descriptive name.
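A minimal sketch of such a wrapper (digit_to_int is a hypothetical name of my choosing):

#include <assert.h>
#include <ctype.h>

/* Hypothetical helper: convert a decimal digit character to its value 0-9.
   The cast to unsigned char keeps isdigit's behavior defined for all inputs. */
static int digit_to_int(char c)
{
    assert(isdigit((unsigned char)c));
    return c - '0';
}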

0
