How do I find the running time given the algorithm's speed and the computer's speed?

I am currently working on an assignment related to Big-O and running time. One of the questions seems very easy, but I'm not sure I'm doing it right. The rest of the problems were quite involved, so I feel like I'm missing something here.

First, you are given the following: Algorithm A, which has a running time of 50n^3; Computer A, which has a speed of 1 millisecond per operation; Computer B, which has a speed of 2 milliseconds per operation; and an instance of size 300.

I want to find how long Algorithm A takes to solve this instance on Computer A and how long it takes on Computer B.

What I want to do is substitute 300 for n, which gives 50 * 300^3 = 1,350,000,000.

Then multiply this by 1 for the first computer and by 2 for the second computer.
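Written as a quick sketch (assuming 50n^3 really is a count of operations, which is exactly what I'm not sure about), my attempt looks like this:

```python
n = 300
operations = 50 * n ** 3          # 50 * 300^3 = 1,350,000,000

# Computer A: 1 millisecond per operation; Computer B: 2 milliseconds per operation
time_a_ms = operations * 1        # 1,350,000,000 ms
time_b_ms = operations * 2        # 2,700,000,000 ms

print(time_a_ms, time_b_ms)
```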

This seems strange to me because the problem says the "running time" is 50n^3, not "the number of operations is 50n^3", so I get the feeling I would be multiplying a time by a time, and the result would have units of milliseconds squared, which doesn't seem right.

I would like to know if I'm right, and if not, what the question really means.

3 answers

This would not make sense if you were given O(n^3), but you are not using big-O notation in your example. That is, if you were given O(n^3), you would know the complexity of your algorithm, but you would not know the exact number of operations, as you said.

Instead, it looks like you are given the exact number of operations performed (even though this is not stated explicitly). So substituting for n does make sense.

Big-O notation describes how the input size affects your running time or memory usage. But from big-O alone you could not get an exact time, even knowing the speed of each operation.

Explaining why your answer seems so simple (along the lines described above) would also be a safe bet. But I'm sure you would get the marks for the question even without that.
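To make that contrast concrete, here is a small sketch. The second count, with a made-up constant of 3, is purely hypothetical; the point is that an exact count like 50n^3 pins down a time, while O(n^3) alone does not:

```python
n = 300
ms_per_op = 1  # Computer A

# Two algorithms that are both O(n^3), but with different constant factors.
ops_given = 50 * n ** 3      # the exact count stated in the question
ops_other = 3 * n ** 3       # hypothetical: also O(n^3), far fewer operations

print(ops_given * ms_per_op)  # 1,350,000,000 ms
print(ops_other * ms_per_op)  #    81,000,000 ms -- same big-O class, very different time
```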


Well, aside from the pointlessness of working out how long something will take this way on most modern computers (though it can be meaningful on an embedded system), the way you did it looks right to me.

If an algorithm requires 50n^3 operations to do something, where n is the number of elements processed, then substituting 300 for n gives you the number of operations, not a unit of time.

So multiply the number of operations by the time per operation, and you get the time required.

It looks right.
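Just to show how the units work out (operations × milliseconds per operation = milliseconds), and what those numbers mean in human terms, a quick sketch:

```python
n = 300
operations = 50 * n ** 3                 # a pure count, no time unit attached

for name, ms_per_op in (("Computer A", 1), ("Computer B", 2)):
    total_ms = operations * ms_per_op    # operations * (ms / operation) = ms
    days = total_ms / 1000 / 60 / 60 / 24
    print(f"{name}: {total_ms} ms, roughly {days} days")
    # works out to 15.625 days on Computer A and 31.25 days on Computer B
```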


Your figure of 50n^3 is called a "running time", but that is because the model used to estimate speed assumes a machine with a few basic operations, each of which takes 1 unit of time.

In this case, running the algorithm takes 50 * 300^3 units of time. On Computer A each unit is 1 ms, and on Computer B each unit is 2 ms.
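As a rough sketch of that cost model (the helper name unit_cost is just something I made up for illustration):

```python
def unit_cost(n):
    # Cost of Algorithm A in abstract time units: each basic operation costs 1 unit.
    return 50 * n ** 3

# Wall-clock time = number of units * length of one unit on the given machine.
ms_per_unit = {"Computer A": 1, "Computer B": 2}

units = unit_cost(300)                    # 1,350,000,000 units
for machine, ms in ms_per_unit.items():
    print(machine, units * ms, "ms")
```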

Hope this gives some meaning to the units,
Asaf.

