It depends on what you mean. If we are talking about plain arrays, that is, a sequence of contiguous memory locations, and by initialization you mean writing a value into each memory cell of this “matrix”, then the answer is no: better than O(n*m) is impossible, and we can prove it:
Suppose that a correct fill(A[n][m], init_val) (that is, one that writes all the memory cells of A) has a complexity g(n,m) that is less than O(n*m) (formally, g(n,m) is not in Ω(n*m)). Then for sufficiently large n and m we have g(n,m) < n*m, the number of memory locations. Since writing a memory cell requires at least one operation, the fill algorithm can write at most g(n,m) locations [actually at most half of that, because it must also perform at least one operation to “select” the next memory location, unless the hardware provides a combined operation], which is strictly less than n*m. So at least one cell is never written, which means the fill algorithm is incorrect.
The same argument holds if filling a block of k memory cells takes constant time (say, a hardware block-fill instruction): since k is a constant, you just need to choose larger values of n and m.
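In symbols, the same counting argument reads as follows (this compact restatement is my own, under the assumption that each operation writes at most one cell):

```latex
% Counting argument, assuming each operation writes at most one cell.
g(n,m) \notin \Omega(n \cdot m)
  \;\Longrightarrow\; \text{there are arbitrarily large } n, m \text{ with } g(n,m) < n \cdot m,

\#\{\text{cells written}\} \;\le\; g(n,m) \;<\; n \cdot m
  \;\Longrightarrow\; \text{some cell of } A \text{ never receives } \text{init\_val}.
```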
As already mentioned, you can use other data structures to avoid the O(n^2) initialization time. amit's suggestion uses a form of lazy evaluation, which lets you skip initializing the array altogether and do the work only when elements are accessed.
Note that this removes the Ω(n^2) cost up front, but it makes accessing the elements of the array more expensive and also requires more memory.
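For reference, here is a minimal sketch of one classic way to realize such a lazy array (the “two auxiliary arrays” trick, sometimes called the sparse-set technique). The class and names below are illustrative, not necessarily what amit proposed:

```cpp
#include <cstddef>
#include <memory>

class LazyArray {
public:
    LazyArray(std::size_t size, int default_value)
        : default_(default_value),
          data_(new int[size]),          // deliberately not filled
          from_(new std::size_t[size]),  // deliberately not filled
          to_(new std::size_t[size]),
          count_(0) {}

    // Read: untouched cells report the default value without ever
    // having been written.
    int get(std::size_t i) const {
        return initialized(i) ? data_[i] : default_;
    }

    // Write: the first write to a cell registers it in the from_/to_ pair.
    void set(std::size_t i, int value) {
        if (!initialized(i)) {
            from_[i] = count_;
            to_[count_] = i;
            ++count_;
        }
        data_[i] = value;
    }

private:
    // Cell i counts as initialized only if from_[i] and to_ point at each
    // other; leftover garbage in untouched slots cannot satisfy both checks.
    // (Strictly speaking, reading an untouched slot of from_ is reading an
    // indeterminate value, which standard C++ does not bless -- take this
    // as an algorithmic sketch rather than production code.)
    bool initialized(std::size_t i) const {
        return from_[i] < count_ && to_[from_[i]] == i;
    }

    int default_;
    std::unique_ptr<int[]> data_;
    std::unique_ptr<std::size_t[]> from_;
    std::unique_ptr<std::size_t[]> to_;
    std::size_t count_;
};
```

Construction is O(1) and each get/set is O(1), and the price is exactly the one mentioned above: three arrays' worth of memory instead of one, plus a slightly heavier access path. For an n×m matrix you would index it as row * m + col.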
It is not clear what your interviewer had in mind: converting an array to a linked list takes Ω(L) time (where L is the length of the array), so just converting the entire matrix to linked lists would already take Ω(n^2) time on top of the actual initialization. Using recursion does not help either: you just end up with recurrences such as T(n) = 2T(n/2) + O(1), which again does not improve the asymptotic complexity.
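For completeness, here is that recurrence solved with the Master theorem (my annotation; it assumes n here counts the cells being split in half):

```latex
T(n) = 2\,T(n/2) + O(1):\quad
a = 2,\; b = 2,\; f(n) = O(1) = O\!\bigl(n^{\log_2 2 - \varepsilon}\bigr)
\;\Longrightarrow\; T(n) = \Theta\bigl(n^{\log_2 2}\bigr) = \Theta(n).
```

Applied to the n*m cells of the matrix this is still Θ(n*m): the recursion only rearranges the writes, it does not remove any of them.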
As a rule, every algorithm has to scan at least all of its data, unless it has some prior knowledge about it (for example, that the items are sorted). In your case the space to be written is Θ(n^2), and therefore every algorithm that fills it must take Ω(n^2) time. Anything below that complexity either relies on an extra assumption (for example, the memory already contains the default initializer value → O(1)) or solves a different problem (for example, using lazy arrays or other data structures).
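As an aside on the “memory already contains the default value” assumption: when the default happens to be zero, one place this shows up in practice is asking the allocator for already-zeroed storage. A hedged sketch (the page-level behaviour described in the comments is a platform detail on many systems, not something the argument above relies on):

```cpp
#include <cstdio>
#include <cstdlib>

int main() {
    const std::size_t n = 10000, m = 10000;

    // Request zero-filled storage. On many platforms a large calloc is
    // served by fresh pages that the OS already guarantees to be zero, so
    // the program does not loop over all n*m cells here; the zeroing cost
    // is deferred to the first touch of each page. This only "beats"
    // O(n*m) because the default value (0) is assumed to be what the
    // memory already contains -- exactly the assumption in the text.
    int* a = static_cast<int*>(std::calloc(n * m, sizeof(int)));
    if (a == nullptr) return 1;

    std::printf("a[0] = %d, a[last] = %d\n", a[0], a[n * m - 1]);
    std::free(a);
    return 0;
}
```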