Byte vs short vs int (and their unsigned variants) in C#?

I was told that as long as memory footprint is not a concern, it is always better to use int instead of byte or short, because the processor handles int more easily (it has to do extra work to operate on bytes and shorts). Is this true in C#?
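For the C#-specific half of the question, one language-level fact is easy to verify: arithmetic on byte and short operands is defined to promote to int, so the compiler forces an explicit cast to get a narrow result back. A minimal sketch (variable names are illustrative):

```csharp
using System;

byte a = 200;
byte b = 100;

// In C#, byte + byte is computed as int, so storing the result back
// into a byte requires an explicit cast; without it this does not compile.
byte sum = (byte)(a + b);   // unchecked narrowing: 300 wraps to 44

int total = a + b;          // no cast needed: the natural result type is int

Console.WriteLine($"{sum} {total}");   // prints "44 300"
```

Note that this is only the language rule, not a statement about machine speed; how fast each type actually is depends on the JIT and the CPU, which is what the answer below addresses.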

1 answer

It depends more on the processor than on the language. An 8-bit microcontroller will almost certainly be able to access an 8-bit char faster than a 32-bit int.

Knowing this limitation lets algorithm designers plan accordingly: one of the reasons Rijndael won the AES contest is that its designers made sure it ran fast on 8-bit processors while still performing well on 32-bit and larger ones.

But for 32-bit and 64-bit microprocessors, the keys are data alignment and word-sized access: int access is often much faster than char access, and on 64-bit systems 64-bit access can be faster still. (Conversely, 64-bit operations on a 32-bit machine are much slower, so use 64-bit data types only when the data genuinely needs 64 bits.)
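If you want to see how this plays out on your own machine, a rough micro-benchmark is a reasonable sanity check. The sketch below is illustrative, not authoritative: the array size and method names are arbitrary, Stopwatch timings are noisy, and a tool like BenchmarkDotNet would give far more reliable numbers.

```csharp
using System;
using System.Diagnostics;

class Program
{
    const int N = 10_000_000;   // arbitrary size, large enough to dominate overhead

    static void Main()
    {
        var bytes = new byte[N];
        var ints  = new int[N];
        new Random(42).NextBytes(bytes);
        for (int i = 0; i < N; i++) ints[i] = bytes[i];

        // Warm up the JIT so compilation time isn't measured.
        SumBytes(bytes); SumInts(ints);

        var sw = Stopwatch.StartNew();
        long b = SumBytes(bytes);
        sw.Stop();
        Console.WriteLine($"byte[]: {sw.ElapsedMilliseconds} ms (sum {b})");

        sw.Restart();
        long s = SumInts(ints);
        sw.Stop();
        Console.WriteLine($"int[]:  {sw.ElapsedMilliseconds} ms (sum {s})");
    }

    static long SumBytes(byte[] data)
    {
        long sum = 0;
        foreach (byte v in data) sum += v;   // each element is widened for the add
        return sum;
    }

    static long SumInts(int[] data)
    {
        long sum = 0;
        foreach (int v in data) sum += v;
        return sum;
    }
}
```

One caveat: the byte[] occupies a quarter of the memory of the int[], so for large arrays cache effects can favor the narrower type even though each element must be widened for arithmetic. That is exactly the "memory size" trade-off the question mentions.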
