The difference between time complexity and runtime

It’s just interesting: if a question talks about the execution time of an algorithm, does that mean the same thing as time complexity, or is there a difference between the two?

+7
5 answers

Running time is the duration of a program's execution. Time complexity is a description of the asymptotic behavior of the running time as the size of the input tends to infinity.

You can say that the running time "is" O(n^2) or whatever, because that is the idiomatic way of talking about complexity classes and big-O notation. Strictly speaking, though, the running time is not a complexity class: it is either a duration, or a function that gives you the duration. "Being O(n^2)" is a mathematical property of that function, not a full characterization of it. The exact running time might be 2036*n^2 + 17453*n + 18464 processor cycles, or whatever. Not that you often need to know it in that much detail, and in any case it may depend on the actual input as well as on the size of the input.
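A minimal sketch of the distinction (in Python, with a hypothetical count_pairs routine that is not from the answer): the printed values are running times, concrete measurements on concrete inputs, while O(n^2) only describes how those measurements grow with n.

```python
import time

def count_pairs(items):
    """A deliberately quadratic routine: looks at every ordered pair."""
    count = 0
    for a in items:            # n iterations
        for b in items:        # n iterations each, so ~n*n steps in total
            if a < b:
                count += 1
    return count

# Each printed value is a running time: a concrete duration for a concrete input.
# "O(n^2)" is only a statement about how these durations grow as n grows
# (doubling n should roughly quadruple the time).
for n in (1000, 2000, 4000):
    data = list(range(n))
    start = time.perf_counter()
    count_pairs(data)
    print(n, time.perf_counter() - start)
```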

+12

Time complexity and running time are two different things.

Time complexity is a purely theoretical concept that applies to algorithms, while running time is the actual time the code takes to execute; it is not a theoretical quantity.

Two algorithms can have the same time complexity, say O(n^2), yet one can take twice as long to run as the other.
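For example, here is a small sketch (hypothetical functions, not from the answer) of two routines for the same task, counting pairs that sum to a target. Both are O(n^2), but the first does roughly twice the work of the second:

```python
def pairs_with_sum_v1(xs, target):
    """Scans all ordered pairs (i, j), i != j: about n*n comparisons."""
    hits = 0
    for i in range(len(xs)):
        for j in range(len(xs)):
            if i != j and xs[i] + xs[j] == target:
                hits += 1
    return hits // 2          # each unordered pair was counted twice

def pairs_with_sum_v2(xs, target):
    """Scans only pairs with i < j: about n*n/2 comparisons.
    Same O(n^2) time complexity, but roughly half the running time of v1."""
    hits = 0
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] + xs[j] == target:
                hits += 1
    return hits
```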

+1

"Runtime" refers to the algorithm in question:

Another algorithm might solve the same problem asymptotically faster, i.e., with a shorter running time.

The "complexity of time", on the other hand, is inherent in the problem under consideration. It is defined as the smallest runtime of any algorithm that solves this problem.

The same applies to other measures of algorithmic cost, such as memory, number of processors, communication volume, etc.

(Blum's speedup theorem demonstrates that this "least" running time need not be attainable at all...)
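As a concrete sketch of the algorithm-vs-problem distinction (my example, in Python): membership testing in a sorted list can be solved by a linear scan in O(n) or by binary search in O(log n), so the running time of the first algorithm says nothing about the problem's time complexity, which is at most O(log n) because the second algorithm exists.

```python
import bisect

def linear_search(sorted_xs, target):
    """One algorithm for membership in a sorted list: O(n) running time."""
    for x in sorted_xs:
        if x == target:
            return True
        if x > target:
            return False
    return False

def binary_search(sorted_xs, target):
    """Another algorithm for the same problem: O(log n) running time.
    Its existence bounds the problem's time complexity from above."""
    i = bisect.bisect_left(sorted_xs, target)
    return i < len(sorted_xs) and sorted_xs[i] == target
```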

+1

Analyzing an algorithm means determining the amount of resources (such as time and memory) needed to execute it. Most algorithms are designed to work with inputs of arbitrary length. Typically, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps (time complexity) or to the number of storage locations (space complexity).
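For instance (a minimal sketch with hypothetical function names, in Python), two routines over the same input length can differ in which resource grows with n:

```python
def total(xs):
    """About n steps and O(1) extra storage: one running sum."""
    s = 0
    for x in xs:
        s += x
    return s

def prefix_sums(xs):
    """Also about n steps, but O(n) extra storage: keeps every partial sum."""
    sums = []
    s = 0
    for x in xs:
        s += x
        sums.append(s)
    return sums
```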

0

From CLRS 2.2 p. 25

The running time of an algorithm on a particular input is the number of primitive operations or "steps" executed. It is convenient to define the notion of step so that it is as machine-independent as possible.

Now, from Wikipedia:

... the time complexity of an algorithm quantifies the amount of time it takes the algorithm to run as a function of the length of the string representing the input.

Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform.

Note that both descriptions emphasize the relationship between the size of the input and the number of primitive/elementary operations.
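A small sketch of that idea (insertion sort instrumented with a comparison counter; my example, in Python): the counter, viewed as a function of the input length, is exactly what the time complexity summarizes.

```python
def insertion_sort_with_count(xs):
    """Insertion sort that counts elementary comparisons.
    In the worst case (reverse-sorted input) it performs
    n*(n-1)/2 comparisons, hence the O(n^2) time complexity."""
    xs = list(xs)
    comparisons = 0
    for i in range(1, len(xs)):
        key = xs[i]
        j = i - 1
        while j >= 0:
            comparisons += 1           # one elementary operation
            if xs[j] > key:
                xs[j + 1] = xs[j]      # shift the larger element right
                j -= 1
            else:
                break
        xs[j + 1] = key
    return xs, comparisons

print(insertion_sort_with_count([5, 4, 3, 2, 1]))  # ([1, 2, 3, 4, 5], 10)
```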

I believe this makes it clear that both refer to the same concept.

In practice, though, you will find that industry jargon rarely matches academic terminology; for example, tons of people work on code optimization, but rarely on optimization problems in the formal sense.

0
