On Linux (and I assume the same thing happens on macOS), when a program allocates memory, the OS does not actually back it with physical RAM until the program uses it.
If the program never touches that memory, the OS never has to spend RAM on it. But this puts the OS in an awkward position when a program asks for a ton of memory and then actually starts using it, only for the OS to find that there is not enough RAM to satisfy the request.
When that happens, the OS can either start killing other, less important processes and hand their RAM to the requesting process, or simply kill the requesting process (which is what is happening here).
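You can watch this lazy allocation happen using only the standard library. The sketch below (assuming Linux, where `ru_maxrss` is reported in KiB; on macOS it is in bytes) maps 256 MiB of anonymous memory and shows that resident memory barely moves until the pages are actually written to:

```python
import mmap
import resource

def peak_rss_kib() -> int:
    # Peak resident set size; reported in KiB on Linux (bytes on macOS).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

SIZE = 256 * 1024 * 1024                 # ask the OS for 256 MiB

before = peak_rss_kib()
buf = mmap.mmap(-1, SIZE)                # anonymous mapping: address space only, no RAM yet
after_map = peak_rss_kib()

# Touch one byte per page: only now must the kernel back each page with real RAM.
for off in range(0, SIZE, mmap.PAGESIZE):
    buf[off] = 1
after_touch = peak_rss_kib()

print(f"mapping grew RSS by {after_map - before} KiB")
print(f"touching grew RSS by {after_touch - after_map} KiB")
```

On an overcommitting kernel the first delta is close to zero while the second is roughly the full 256 MiB, even though `mmap` "allocated" everything up front.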
The 4 GB of memory that Python uses initially is most likely just the pages in which numpy wrote the 1s of the identity matrix; the remaining pages have been allocated but never touched. Performing a mathematical operation such as 2*x starts accessing (and thus actually allocating) all of those pages, until the OS runs out of memory and kills your process.
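The same effect can be seen with numpy directly. This is a small-scale sketch (a 4096x4096 matrix, 128 MiB, rather than a 4 GB one; assuming Linux, where `ru_maxrss` is in KiB): creating the identity matrix only touches the pages holding the diagonal, while `2 * x` materializes a full result array:

```python
import resource

import numpy as np

def peak_rss_kib() -> int:
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

n = 4096                                 # 4096 x 4096 float64 = 128 MiB per array

before = peak_rss_kib()
x = np.identity(n)                       # only the pages containing the diagonal are written
after_identity = peak_rss_kib()

y = 2 * x                                # writes every page of a brand-new 128 MiB array
after_multiply = peak_rss_kib()

print(f"identity grew RSS by {after_identity - before} KiB")
print(f"2*x grew RSS by {after_multiply - after_identity} KiB")
```

The first delta is a small fraction of 128 MiB (one touched page per diagonal element); the second is roughly the full size of the result, which is why the real 4 GB case only blows up once the arithmetic starts.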
Colonel Thirty Two