Difference and definition of literal and symbolic constants in C?

I'm having trouble understanding and using symbolic and literal constants. Could someone explain them and highlight their differences? Thanks!

4 answers

A symbol is something the compiler deals with. The compiler treats a const pretty much the way it treats a variable. A #define, on the other hand, is something the compiler doesn't even know about, because the preprocessor replaces it with its value. It works like search and replace. If you write

 #define A 5 

and then

 b += A; 

the preprocessor translates it to

 b += 5; 

and all the compiler sees is the number 5.
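To make the contrast concrete, here is a minimal compilable sketch; the const variable c is my own addition for illustration and is not part of the snippet above:

 #include <stdio.h>

 #define A 5      /* text substitution: the compiler never sees the name A */
 const int c = 5; /* a symbol the compiler itself tracks, with a type */

 int main(void)
 {
     int b = 0;
     b += A;            /* the preprocessor rewrites this as: b += 5; */
     b += c;            /* the compiler reads the const symbol's value */
     printf("%d\n", b); /* prints 10 */
     return 0;
 }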


A literal constant is a value typed directly into your program wherever it is needed. For example:

 int tempInt = 10; 

tempInt is an int variable; 10 is a literal constant. You cannot assign a value to 10, and its value cannot change. A symbolic constant is a constant that is represented by a name, just as a variable is. Unlike a variable, however, once a symbolic constant is initialized, its value cannot be changed.

If your program has one integer variable named students and another named classes, you could compute how many students you have, given a known number of classes, if you knew there were 15 students per class:

 students = classes * 15; 
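As a sketch that contrasts the two kinds of constant; the name STUDENTS_PER_CLASS and the value of classes are made up for illustration:

 #include <stdio.h>

 const int STUDENTS_PER_CLASS = 15; /* symbolic constant: the 15 now has a name */

 int main(void)
 {
     int classes = 4;                                  /* illustrative value */
     int students = classes * 15;                      /* 15 as a literal constant */
     int also_students = classes * STUDENTS_PER_CLASS; /* same value, by name */
     printf("%d %d\n", students, also_students);       /* prints: 60 60 */
     return 0;
 }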


I think what you mean is that a literal constant is a primitive expression like "string" or 2 or false, while a symbolic constant is what you get when you give such a value a name, for example const int MagicNumber = 42. Both can be used as expressions, but the latter can be referred to by name. That is useful if you use the same constant in many places.
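A short sketch of that idea; the two printf calls are just made-up uses of the same named constant:

 #include <stdio.h>

 const int MagicNumber = 42; /* symbolic constant: a literal with a name */

 int main(void)
 {
     printf("%d\n", MagicNumber);     /* the same constant referred to... */
     printf("%d\n", MagicNumber * 2); /* ...by name from several places */
     return 0;
 }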


(Borrowing from the earlier posts) A literal constant is a value typed directly into your program wherever it is needed. For example:

 int breakpoint = 10; 

The variable breakpoint is an integer (int); 10 is a literal constant. You cannot assign a value to 10, and its value cannot change. Unlike a variable, a constant cannot be changed after it is given a value (initialized).

A symbolic constant is a constant represented by a name. In the example below, TEN is a symbolic constant created with the #define directive. A #define is something the compiler doesn't even know about, because the preprocessor converts it to its assigned (defined) value. The preprocessor searches your program and replaces each symbolic constant with its value.

 #define TEN 10     /* These two lines of code become one... */
 breakpoint += TEN; /* ...after running through the preprocessor */

The preprocessor translates this to

 breakpoint += 10; 

The compiler never sees TEN, only its assigned value, 10. Why is this useful? Suppose the value needs to change from 10 to 11. Instead of combing through the entire program and changing every place the literal constant 10 was used, you change the definition of the one symbolic constant TEN to 11 and let the preprocessor insert the change for you.
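A minimal sketch of that one-line edit; the second use of TEN (limit) is made up to show that every occurrence picks up the change:

 #include <stdio.h>

 #define TEN 11 /* the single edit: was 10, now 11 */

 int main(void)
 {
     int breakpoint = 10;
     breakpoint += TEN;                    /* expands to: breakpoint += 11; */
     int limit = TEN * 2;                  /* every use of TEN now expands to 11 */
     printf("%d %d\n", breakpoint, limit); /* prints: 21 22 */
     return 0;
 }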

