I am working on a simulation project where I have hundreds of CPU-bound jobs, each running for 10 to 50 milliseconds. A job is a Runnable object with a specified running time, during which it keeps the CPU busy. Ten threads wait in a thread pool for jobs to arrive. I set the request rate to 40 requests per second and every job's running time to 10 ms, but the results are terrible: every job runs for at least 15 ms, and no job runs for 10 ms. I repeated the experiment with 15 ms jobs and got correct results. Why do 10 ms jobs run for at least 15 ms? (I am using Windows 8.)
public class CpuBoundJob implements Runnable {

    private final long runningTime;

    public CpuBoundJob(long runningTime) {
        this.runningTime = runningTime;
    }

    @Override
    public void run() {
        long timeToWait = this.runningTime;
        long startTime = System.currentTimeMillis();
        // Busy-wait until the requested running time has elapsed
        while (startTime + timeToWait > System.currentTimeMillis());
    }
}
On many systems (Windows specifically, IIRC), System.currentTimeMillis() is backed by a clock that is only accurate to within 15 milliseconds or so. However, it's even worse than that: System.currentTimeMillis() measures time since the Unix epoch as reported by your system clock. So if the time on your computer changes (e.g. due to syncing your system clock to a time source, due to a leap second, or for any number of other reasons), then currentTimeMillis() can jump forward or backward by arbitrarily large amounts. If you want to measure elapsed time, never use currentTimeMillis(); use System.nanoTime() instead. As a bonus, on most systems it is also substantially cheaper to invoke and substantially more accurate.
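As a sketch of what that change looks like, here is the question's busy-wait rewritten to use System.nanoTime(). The class name mirrors the asker's CpuBoundJob, but this is an illustrative rewrite, not the asker's final code:

```java
public class CpuBoundJob implements Runnable {

    private final long runningTimeMillis;

    public CpuBoundJob(long runningTimeMillis) {
        this.runningTimeMillis = runningTimeMillis;
    }

    @Override
    public void run() {
        // nanoTime() is monotonic: it measures elapsed time rather than
        // wall-clock time, so it is unaffected by system clock adjustments
        // and by the ~15 ms granularity of currentTimeMillis() on Windows.
        long deadline = System.nanoTime() + runningTimeMillis * 1_000_000L;
        while (System.nanoTime() < deadline) {
            // busy-wait: keep the CPU occupied for the requested duration
        }
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        new CpuBoundJob(10).run();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000L;
        System.out.println("Elapsed: " + elapsedMillis + " ms");
    }
}
```

Note that nanoTime() values are only meaningful as differences between two calls on the same JVM; compute a deadline or an elapsed delta, never compare them to wall-clock time.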