Average instruction time
One out of every 20,000,000 instructions causes a page fault. How do I find the average instruction time?
Those 20,000,000 instructions will take
(2 nanoseconds * 20,000,000) + 10 milliseconds
in total. Take that result (the total time for 20,000,000 instructions) and divide it by the number of instructions to get the average time per instruction.
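As a quick sketch, the total-time-then-divide calculation above can be checked numerically (all figures come from the question; variable names are my own):

```python
# Total time for 20,000,000 instructions: CPU time plus one page fault.
instructions = 20_000_000
cpu_ns = 2                  # each instruction takes 2 ns
fault_ns = 10 * 1_000_000   # one page fault costs 10 ms = 10,000,000 ns

total_ns = cpu_ns * instructions + fault_ns
per_instruction_ns = total_ns / instructions
print(per_instruction_ns)   # 2.5
```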
If 1 in 20,000,000 instructions causes a page fault, then the probability of a page fault is:
Page Fault Rate = (1/20000000)
Then you can calculate the average time per instruction:
Average Time = (1 - Page Fault Rate) * 2 ns + (Page Fault Rate * 10 ms)
This comes out to about 2.5 ns per instruction.
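The weighted-average formula above can be sketched directly (2 ns and 10 ms are from the question; the names are my own, and the result differs from exactly 2.5 ns only by rounding):

```python
# Expected instruction time, weighting the normal case and the fault case.
fault_rate = 1 / 20_000_000
normal_ns = 2               # 2 ns for an instruction without a fault
fault_ns = 10 * 1_000_000   # 10 ms = 10,000,000 ns for a page fault

avg_ns = (1 - fault_rate) * normal_ns + fault_rate * fault_ns
print(avg_ns)               # ≈ 2.5
```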