If you grow by some fixed constant C, then no, the runtime will not be O(n). Instead, it will be Θ(n²).
To see this, think about what happens when you perform a sequence of C consecutive operations. Of those operations, C - 1 of them will take O(1) time, because the space already exists. The last operation will take O(n) time, because it needs to reallocate the array, add space, and copy everything over. Therefore, any sequence of C operations takes time O(n + C).
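To make that cost pattern concrete, here is a minimal Python sketch (the function name and setup are my own, not from the answer) that simulates appends into an array grown by a fixed constant C, recording the cost of each operation: 1 for a plain write, or n + 1 when the array must first be reallocated and copied:

```python
def append_costs(n, C):
    """Simulate n appends into an array grown by a fixed constant C.

    Returns a list of per-operation costs: 1 for a plain write, or
    (current size + 1) when the array is full and must be reallocated
    and fully copied before writing.
    """
    capacity = C   # start with room for C elements
    size = 0
    costs = []
    for _ in range(n):
        if size == capacity:
            # Reallocate: copy all `size` elements, then write one more.
            costs.append(size + 1)
            capacity += C
        else:
            costs.append(1)  # space already exists: O(1)
        size += 1
    return costs

# Within each block of C operations, C - 1 cost O(1) and one costs O(n).
print(append_costs(12, 4))  # → [1, 1, 1, 1, 5, 1, 1, 1, 9, 1, 1, 1]
```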
Now consider what happens if you perform a sequence of n operations. Break those operations into blocks of size C; there will be n / C of them. The total work required to perform those operations will be
(C + C) + (2C + C) + (3C + C) + ... + (n + C)
= C(n/C) + (C + 2C + 3C + ... + (n/C)C)
= n + C(1 + 2 + 3 + ... + n/C)
= n + C(n/C)(n/C + 1)/2
= n + n(n/C + 1)/2
= n + n²/(2C) + n/2
= Θ(n²)
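Summing the per-operation costs empirically confirms this quadratic total. The helper below is my own illustration (not from the answer): with constant growth, doubling n roughly quadruples the total work, the signature of Θ(n²):

```python
def total_work(n, C):
    """Total cost of n appends when capacity grows by a constant C."""
    capacity, size, work = C, 0, 0
    for _ in range(n):
        if size == capacity:
            work += size   # reallocate: copy everything over
            capacity += C
        work += 1          # the write itself
        size += 1
    return work

# Doubling n roughly quadruples the work: Θ(n²) behaviour.
for n in (1000, 2000, 4000):
    print(n, total_work(n, C=8))
```

For C = 8 this prints totals of 63000, 251000, and 1002000: each doubling of n multiplies the work by almost exactly 4.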
Now contrast this with what happens if you double the size of the array whenever it fills up. In that case, the total work done across n operations is
1 + 2 + 4 + 8 + 16 + 32 + ... + n
= 1 + 2 + 4 + 8 + ... + 2^(log n)
= 2^(log n + 1) - 1
= 2n - 1
= Θ(n)
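The same simulation, adapted for doubling (again my own sketch), shows the total work staying within a small constant factor of n:

```python
def total_work_doubling(n):
    """Total cost of n appends when capacity doubles on overflow."""
    capacity, size, work = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            work += size   # reallocate: copy everything over
            capacity *= 2
        work += 1          # the write itself
        size += 1
    return work

# Total work stays within a constant factor of n: Θ(n) behaviour.
for n in (1000, 2000, 4000):
    print(n, total_work_doubling(n))
```

Here the totals are 2023, 4047, and 8095: roughly 2n each time, so the amortized cost per append is O(1).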
So:

1 + 2 + 4 + 8 + 16 + ...
= 2^0 + 2^1 + 2^2 + ... + 2^(n-1)
= 2^n - 1

For example, on a 32-bit machine, this sum is 2^32 - 1.
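A quick sanity check of that identity in Python (the function name is my own):

```python
def geometric_sum(n):
    """Sum of the first n powers of two: 2^0 + 2^1 + ... + 2^(n-1)."""
    return sum(2**k for k in range(n))

# The identity 2^0 + ... + 2^(n-1) = 2^n - 1, checked for the 32-bit case.
assert geometric_sum(32) == 2**32 - 1
print(geometric_sum(32))  # → 4294967295
```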