I'm studying for the unit quiz, and there are two types of conversions that keep tripping me up.
Type one: What is the length (in ns) of one cycle on an XXX computer? Here XXX may be given in either MHz or GHz. I have trouble converting these cycle times. Example:
What is the length (in ns) of one cycle on a computer with a frequency of 50 megahertz (MHz)?
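Here is how I've been trying to set it up, as a minimal Python sketch (the function name is just mine for illustration), assuming the relation period = 1 / frequency:

```python
# Sketch of the conversion I think is right: period = 1 / frequency.
# 1 MHz = 1e6 cycles/second, and 1 second = 1e9 ns.

def cycle_time_ns(freq_hz: float) -> float:
    """Length of one clock cycle in nanoseconds."""
    return 1e9 / freq_hz  # (1 / freq) seconds, converted to ns

# Example: 50 MHz -> 1 / 50e6 s = 20 ns per cycle
print(cycle_time_ns(50e6))  # 20.0
```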
The second type of conversion I have problems with: If the average instruction on the XXX computer requires ZZ cycles, how long (in ns) does the average instruction take to execute? As in the previous case, XXX will be given in either MHz or GHz. For instance:
If the average instruction on a computer with a frequency of 2.0 gigahertz (GHz) requires 2.0 clock cycles, how long (in ns) does the average instruction take to execute?
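For this one, my understanding is that the instruction time should be cycles per instruction multiplied by the cycle time; a sketch under that assumption (again, the names are just mine):

```python
def instruction_time_ns(freq_hz: float, cycles_per_instruction: float) -> float:
    """Average instruction time in ns: cycles per instruction x cycle time."""
    cycle_time = 1e9 / freq_hz  # ns per one clock cycle
    return cycles_per_instruction * cycle_time

# Example: 2.0 GHz -> 0.5 ns per cycle; 2.0 cycles -> 1.0 ns per instruction
print(instruction_time_ns(2.0e9, 2.0))  # 1.0
```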
I don't understand what I'm doing wrong in these conversions, but I keep making mistakes. Any help would be great!