There is no such thing as a linear growth asymptotic of O(n + k) where k is a constant. If k is a constant and you go back to the definition of asymptotic growth rates, you will see that O(n + k) = O(n), because constants are absorbed by the notation.
An answer of O(n + k) only makes sense when k is a variable that is independent of the other input size n. You see this kind of thing often in the analysis of sorting algorithms, for example when comparisons and moves are counted separately.
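A concrete case where k really is a separate variable (my own illustration, not from the original answer) is counting sort: it runs in O(n + k) time, where n is the number of elements and k is the size of the key range, and k does not depend on n. A minimal Python sketch:

    def counting_sort(a, k):
        """Sort a list of integers whose values lie in the range [0, k)."""
        counts = [0] * k                     # O(k) to set up the count array
        for x in a:                          # O(n) to tally each element
            counts[x] += 1
        out = []
        for value, c in enumerate(counts):   # O(n + k) to write the output
            out.extend([value] * c)
        return out

    # n = 6 elements drawn from a key range of k = 5
    print(counting_sort([3, 1, 4, 1, 0, 2], 5))   # [0, 1, 1, 2, 3, 4]

Here neither n nor k dominates in general, so neither term can be dropped; that is different from the case where k is a fixed constant.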
To answer the question of why we drop k in Big-O notation (something that, it seems to me, is often not taught well, which leads to exactly this confusion), recall one definition (as I remember it) of O():

    f(n) is in O(g(n))  iff  there exist d and n_0 such that for all n > n_0, f(n) <= d * g(n)

Read: f(n) is in O(g(n)) if and only if we can pick a constant d and a threshold n_0 such that, for every n larger than n_0, f(n) is bounded above by d * g(n).
Let's apply it to our problem, where k is a constant: take f(n) = k and g(n) = 1.
Are there d and n_0 that satisfy these requirements? Trivially, yes: choose any d > k, and the definition is satisfied for every n > 0 (so n_0 = 0 works). Hence k is in O(1).
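To connect this back to the original question (this step is my own continuation, not part of the answer above): apply the same definition to f(n) = n + k with g(n) = n. For every n >= 1 we have n + k <= n + k*n = (1 + k) * n, so choosing d = 1 + k and n_0 = 1 satisfies the definition, and therefore n + k is in O(n). The constant k disappears into the choice of d, which is exactly why it is omitted.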