speed specifies how fast simulated time is allowed to run, in percent of real time; default is 100.
check-interval specifies how often simulated time is compared against real time, in milliseconds of simulated time; default is 1000. Higher values give better performance; lower values reduce the maximum difference between real and simulated time. Setting it to less than the time quantum (see set-time-quantum) has no effect.
When simulated time drifts from real time, drift-compensate regulates how much the simulation speed may be adjusted to catch up. If set to, for example, 0.25, the simulation speed may be increased or decreased by up to 25% of its set value if necessary to make up for any accumulated drift with respect to real time. If set to zero (the default), the simulation speed may not be changed at all from its set value.
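The interplay of the three parameters can be pictured as a simple control loop: every check-interval of simulated time, the regulator measures how far simulated time has drifted from real time and picks a speed for the next interval, never straying further than drift-compensate from the configured speed. The Python sketch below illustrates this under assumed names and an assumed proportional catch-up policy; it models the semantics described above and is not the simulator's actual implementation.

```python
# A minimal sketch (not the simulator's actual algorithm) of the regulation
# described above. All function and variable names are illustrative.

def regulated_speed(set_speed, drift_compensate, check_interval_ms,
                    sim_elapsed_ms, real_elapsed_ms):
    """Speed (percent of real time) to use for the next check interval.

    set_speed        -- configured speed in percent of real time (default 100)
    drift_compensate -- maximum relative adjustment, e.g. 0.25 for +/-25%;
                        0 (the default) means the speed is never changed
    check_interval_ms -- how often this check runs, in simulated milliseconds
    sim_elapsed_ms   -- simulated time elapsed so far
    real_elapsed_ms  -- real time elapsed so far
    """
    # Accumulated drift: positive if the simulation is ahead of real time.
    drift_ms = sim_elapsed_ms - real_elapsed_ms * set_speed / 100.0

    # Try to cancel the drift over the next check interval (assumed policy),
    # then clamp the result to within +/- drift_compensate of the set speed.
    wanted = set_speed * (1.0 - drift_ms / check_interval_ms)
    low = set_speed * (1.0 - drift_compensate)
    high = set_speed * (1.0 + drift_compensate)
    return min(max(wanted, low), high)


if __name__ == "__main__":
    # 200 ms ahead of real time after 10 s: with drift-compensate = 0.25 the
    # regulator slows to 80% to cancel the drift over the next interval
    # (it could go no lower than 75%).
    print(regulated_speed(100, 0.25, 1000, 10200, 10000))  # 80.0
    # With drift-compensate = 0 (the default) the speed stays at 100.
    print(regulated_speed(100, 0.0, 1000, 10200, 10000))   # 100.0
```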