I am currently working on an assignment about Big-O and running time. One of the questions seems very easy, but I'm not sure I'm doing it right. The rest of the problems were quite involved, so I feel like I must be missing something here.
Here is what is given: Algorithm A, which has a running time of 50n^3. Computer A, which takes 1 millisecond per operation. Computer B, which takes 2 milliseconds per operation. An instance of size 300.
I want to find how long algorithm A takes to solve this instance on computer A, and how long it takes on computer B.
What I want to do is substitute 300 for n, so you have 50 * (300^3) = 1,350,000,000.
Then multiply this by 1 for the first computer and by 2 for the second computer.
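In code, my approach would look something like the sketch below (this assumes 50n^3 is the number of operations, which is exactly the interpretation I'm unsure about; the function name and values are just my own illustration):

    # Sketch of my reasoning, assuming 50n^3 counts operations.
    def running_time_ms(n, ms_per_operation):
        operations = 50 * n ** 3              # operations performed by algorithm A
        return operations * ms_per_operation  # total time in milliseconds

    n = 300
    print(running_time_ms(n, 1))  # Computer A: 1 ms/operation -> 1,350,000,000 ms
    print(running_time_ms(n, 2))  # Computer B: 2 ms/operation -> 2,700,000,000 ms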
This seems strange to me, because the problem says the "running time" is 50n^3, not that "the number of operations is 50n^3". So I get the feeling I would be multiplying a time by a time, and end up with units of milliseconds squared, which doesn't seem right.
I would like to know whether I'm right, and if not, what the question really means.