Average instruction time

Let's say we have on average one page fault every 20,000,000 instructions, a normal instruction takes 2 nanoseconds, and a page fault causes the instruction to take an additional 10 milliseconds. What is the average instruction time, taking page faults into account?

4 answers


20,000,000 instructions, one of which will have a page fault.

Therefore, 20,000,000 instructions will take



  (2 nanoseconds * 20,000,000) + 10 milliseconds

      

Take the result (this is the total time for 20,000,000 instructions) and divide by the number of instructions to get the time per instruction.
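For example, here is that calculation as a small Python sketch (the variable names are mine, and the 10 ms penalty is converted to nanoseconds so the units match):

  instructions = 20_000_000
  instruction_time_ns = 2                   # normal instruction cost
  page_fault_penalty_ns = 10 * 1_000_000    # 10 ms expressed in nanoseconds

  # Total time for 20,000,000 instructions, one of which faults
  total_ns = instructions * instruction_time_ns + page_fault_penalty_ns

  # Divide by the number of instructions to get the average per instruction
  average_ns = total_ns / instructions
  print(average_ns)   # 2.5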

What is the average instruction time given page faults?

Average instruction time is the total time divided by the number of instructions.



So: what is the total time for 20,000,000 instructions?
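With the values from the question filled in (note that 10 ms = 10,000,000 ns, so the units agree):

  total time   = 20,000,000 * 2 ns + 10 ms
               = 40,000,000 ns + 10,000,000 ns
               = 50,000,000 ns

  average time = 50,000,000 ns / 20,000,000 = 2.5 ns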

2.5 nanoseconds? Pretty simple arithmetic, I think.

If 1 in 20,000,000 instructions causes a page fault, then the probability of a page fault is:

Page Fault Rate = (1/20000000)

      

Then you can calculate the average time per instruction:

Average Time = (1 - Page Fault Rate) * 2 ns + Page Fault Rate * (2 ns + 10 ms)

      

Comes out to 2.5 ns / instruction.
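A quick way to check this weighted-average formula is a small Python sketch (variable names are mine; everything is converted to nanoseconds):

  page_fault_rate = 1 / 20_000_000
  normal_ns = 2
  fault_ns = 2 + 10 * 1_000_000    # faulting instruction: normal cost plus the 10 ms penalty

  average_ns = (1 - page_fault_rate) * normal_ns + page_fault_rate * fault_ns
  print(average_ns)   # ≈ 2.5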
