Why does this loop run endlessly in C?

I was experimenting with some C code and ran into behavior that seemed weird to me. Since I'm not an expert in C, I don't know whether this is strange or normal.

Basically, my question is about the difference between these two lines of code:

 char a = 'h';  // here the variable a is not an array of char

and

 char a = 'hi'; // here a is not an array of char either (I don't know whether the compiler treats it as one, but at least I didn't declare it that way)

I used the following code.

First:

    char a = 0;
    for (; a < 'hi'; a++) {
        printf("%d= hello world \n", a);
    }

Second:

    char a;
    for (a = 'h'; a < 'hi'; a++) {
        printf("%d= hello world \n", a);
    }

Both of the loops above run forever.

Can someone tell me why?

I may be missing a very simple programming concept. Please help me out.

2 answers

This is because 'hi' has type int, not char. It happens to have the value 26729 here (the exact value is implementation-defined). Your loop variable, on the other hand (assuming char is a signed 8-bit type), can never exceed 127: it overflows and wraps around instead, so a < 'hi' is always true.

Note that this:

    char a = 0;
    char m = 'hi';
    for (; a < m; a++) {
        printf("%d= hello world \n", a);
    }

terminates, because the value of 'hi' is truncated when it is converted to char (here 105, i.e. 'i').

'hi' is a multi-character literal. These are not common programming practice; they are a lesser-known feature of C, and the standard leaves their value implementation-defined. More information about them: http://zipcon.net/~swhite/docs/computers/languages/c_multi-char_const.html


In C (unlike C++, as noted in some comments), character literals always have type int. It does not matter whether it is an ordinary single-character literal such as 'c' or a multi-character literal such as 'hi': it always has type int, which must be at least 16 bits wide. A char, by contrast, is exactly one byte.

When integer values of different types are compared, the integer promotion rules apply and the value of the smaller type is promoted to the larger one. This is why a < 'hi' can only ever be 1 ("true"): even after promotion to int, the variable a can never hold anything greater than CHAR_MAX, while the multi-character literal 'hi' is an int whose value in your compiler's implementation is far larger than that.

The reason a < m can eventually become false is that when you declare m and initialize it with 'hi', the value is converted to char, and one char can perfectly well compare greater than or equal to another char.


Source: https://habr.com/ru/post/1212091/
