What is the difference between declaring variables as `short int` and `short`? In GCC, `sizeof(short)` gives 2 bytes, and `sizeof(short int)` also gives 2 bytes. Are they the same type or different ones? And if they are the same, is there any reason to prefer one spelling over the other?
Thanks in advance
`short` is simply shorthand for `short int`; they are equivalent in any C compiler.
The same goes for `long int` vs `long`, and `long long int` vs `long long`.
`short`, `short int`, `signed short int`, and `signed short` all name the same data type.
So `sizeof(short) == sizeof(short int)`.
The same goes for `long`.
Source: https://habr.com/ru/post/1212611/