Why use a macro in C when the same work can be done by a variable?

I do not see the need to use a macro to define something. For example:

 #define TELEPHONE 443 // why use this
 int telephone = 443;  // when the same job is done by a regular variable

Why does anyone use a macro like this instead of declaring a variable in C?

4 answers

There are several cases where a macro is more suitable. Off the top of my head:

1. Define a constant:

 #define PI 3.1415926 

2. Use as a constant expression.

 #define BUFSIZE 1024 

later you can use

 int buffer[BUFSIZE]; 

You cannot do this with const int bufsize = 1024; (without using a variable-length array).


EDIT

To clarify the point about const raised in the comments, in this code:

 int main() {
     const int x = 5;
     int array[x];
     return 0;
 }

array is actually a variable-length array. Compiling it with gcc test.c -std=c89 -pedantic generates a warning:

 warning: ISO C90 forbids variable-size array `array' 

The problem is that a const variable in C is not a constant expression, which is exactly why #define is used in this situation. This is one of the differences between C and C++.

Reference: C Frequently Asked Questions


Why use a macro to define (declare a variable in) C?

A macro does not define a variable; it defines a constant. To the compiler, a C program that uses the #defined TELEPHONE is indistinguishable from one that hard-codes the value 443.

Unlike the variable int telephone, the TELEPHONE macro cannot be changed at run time, and it has no address.

Finally, you can specify macro values on the command line when you invoke the compiler, which is impossible with C variables. Instead of writing #define TELEPHONE 443 in the source, you can pass the value on the command line, for example:

 gcc -DTELEPHONE=443 myprog.c 

A variable occupies space in the final image, which may be neither required nor desirable (for example, in embedded applications), while a definition is handled entirely at compile time by the preprocessor.


dasblinkenlight pretty much covers it, but I would clarify one thing: macros really don't define constants. You do that with the const qualifier, and you should use it most of the time. A macro defines substitution text; the preprocessor knows almost nothing about the language you are writing in beyond how to separate tokens. So with macros you can do something like this:

 #define CODE_SNIPPET 10; i++ ) printf( "Hello %d\n"
 for( int i = 0; i < CODE_SNIPPET , i );

which will be converted by the preprocessor to

 for( int i = 0; i < 10; i++ ) printf( "Hello %d\n" , i ); 

This is very powerful, but clearly open to abuse. I very rarely use macros for constants; instead I define the value as const in the .c file and declare it extern in the header:

 const int kAnswerToEveryThing = 42;        // .c file
 extern const int kAnswerToEveryThing;      // .h file

I have one project in which I use macros quite heavily. It is Objective-C rather than C, but the macro system is identical: I defined macros that expand into class methods which add metadata to the class for use by my library, so they serve much the same purpose as Java annotations.

