Here are my two cents:
I would keep a clean copy of the original array for maximum performance. You can either keep a hard-coded reference copy:
var vectorsOrig = [ [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0] ];
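Note that simply assigning your working variable to this reference copy would alias both names to the same array object, so any later write would corrupt the "clean" copy. A quick illustration (variable names taken from the snippet above):

var vectors = vectorsOrig; // both names now point at the same array object
vectors[1][2] = 11;
console.log(vectorsOrig[1][2]); // 11 — the "clean" copy is no longer clean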
or make a clean clone of the original array dynamically using slice() (recursively, in your case, for a deep copy):
var clonedVectors = vectorsOrig.slice(0); // shallow — the inner arrays are still shared
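Since slice(0) only copies the top level, the nested row arrays are shared between the clone and the original. A minimal recursive deep-clone sketch (the helper name deepClone is mine, not from the original code):

// Hypothetical helper: recursively copies nested arrays so the
// clone shares no references with the original.
function deepClone(arr) {
    return arr.map(function (item) {
        return Array.isArray(item) ? deepClone(item) : item;
    });
}

var clonedVectors = deepClone(vectorsOrig);
clonedVectors[1][2] = 11; // vectorsOrig is untouched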
Regardless, swapping in a fresh array by dropping the reference to the old one will be faster than looping over and resetting each element. Once the old vector array is no longer referenced anywhere, JavaScript will garbage-collect it.
With that said, the question is how to get a clean copy every time. A hard-coded instance gives you one clean copy, and you would have to clone it for every reset after that. You also don't want to generate it dynamically with the same loops you would use for a reset, since that defeats the purpose. My advice is to write a factory function that simply returns a new hard-coded, zero-initialized array:
function newVector() {
    return [
        [0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0]
    ];
}

var vector = newVector();
vector[1][2] = 11;
console.dir(vector);

vector = newVector(); // the old array will be garbage-collected once nothing references it
console.dir(vector);
Ideally, you should benchmark the different approaches rather than guess.
EDIT: Thanks to Vega, I modified his test to compare three approaches. In Chrome and IE9 this solution seems to be the fastest; in Firefox (15.0.1) manual iteration is faster (perhaps memory allocation is slower there). http://jsperf.com/array-zero-test/2
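For reference, a sketch of what the "manual iteration" variant in that test likely looks like — it resets each element in place instead of allocating a new array (I'm reconstructing it from the description, not quoting the jsperf code):

// Zero every element of the existing 2D array in place,
// avoiding a new allocation (the variant that won in Firefox).
function zeroVector(vector) {
    for (var i = 0; i < vector.length; i++) {
        for (var j = 0; j < vector[i].length; j++) {
            vector[i][j] = 0;
        }
    }
}

zeroVector(vector);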