Does CUDA mapped host memory consume GPU RAM?

For example, suppose I have a GPU with 2 GB of RAM, and my application allocates a large array, say 1 GB, as mapped memory (page-locked host memory mapped into the GPU address space, allocated with cudaHostAlloc()). Will the amount of available GPU memory be reduced by the 1 GB of mapped memory, or will I still have (close to) 2 GB, as before the allocation?


Mapping host memory so that it appears in the GPU address space does not consume any of the GPU's own device memory; the allocation lives in host RAM.

You can verify this yourself with cudaMemGetInfo().
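As a minimal sketch of that check (it assumes a CUDA-capable device and omits error handling on cudaMemGetInfo for brevity), you can query free device memory before and after a mapped host allocation and compare:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeBefore = 0, freeAfter = 0, total = 0;
    cudaMemGetInfo(&freeBefore, &total);

    // Allocate 1 GB of page-locked host memory mapped into the GPU address space.
    void* hostPtr = nullptr;
    const size_t bytes = 1ULL << 30;
    cudaError_t err = cudaHostAlloc(&hostPtr, bytes, cudaHostAllocMapped);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaHostAlloc failed: %s\n", cudaGetErrorString(err));
        return 1;
    }

    cudaMemGetInfo(&freeAfter, &total);

    // The mapped buffer lives in host RAM, so free device memory
    // should be (nearly) unchanged.
    printf("free before: %zu MB, free after: %zu MB\n",
           freeBefore >> 20, freeAfter >> 20);

    cudaFreeHost(hostPtr);
    return 0;
}
```

If a kernel needs to access the buffer, obtain the corresponding device pointer with cudaHostGetDevicePointer(); this still does not allocate device memory, since accesses are served over the bus from host RAM.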

