Your watch says you have a minute to go. You walk into the meeting room all set, and a sea of angry faces looks up at you: “Where have you been? You’re late!” What went wrong? The answer is your watch – it was five minutes out. This is called time error: the difference between the time reported by a clock (or watch) and the time of a reference clock (for example, the clock on the meeting room wall).
The first thing to point out is that time error is relative – it has no meaning without a reference clock to compare against. I can look at my watch, but I have no idea what its time error is until I compare it with the reference clock. Mathematically, the time error of a clock is defined as the time of the measured clock minus the time of the reference clock.
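That definition can be sketched in a couple of lines of Python. The function name and the representation of time as seconds since midnight are purely illustrative, not part of any standard:

```python
def time_error(measured, reference):
    """Time error: the measured clock's time minus the reference clock's time.
    Positive means the measured clock is ahead; negative means it is behind."""
    return measured - reference

# A watch reading 09:05 against a wall clock reading 09:00, in seconds since midnight:
print(time_error(9 * 3600 + 5 * 60, 9 * 3600))  # → 300 (the watch is 5 minutes fast)
```

The sign convention matters: a positive result means the measured clock is ahead of the reference, matching the "measured minus reference" definition above.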
The second point is that the time error of a clock can vary. For example, if my watch gains 10 seconds a day, its time error will increase by 10 seconds every day until I decide to adjust it to match the reference again. It may also vary with temperature changes and other random effects.
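The steady gain described above is a constant drift, and its effect on time error is a simple linear accumulation. A minimal sketch, using the 10-seconds-a-day figure from the text (the function name is made up for illustration):

```python
def time_error_after(days, drift_per_day=10.0, initial_error=0.0):
    """With a constant drift rate, time error grows linearly with elapsed time."""
    return initial_error + drift_per_day * days

for day in range(4):
    print(f"day {day}: {time_error_after(day):+.0f} s")
# → day 0: +0 s, day 1: +10 s, day 2: +20 s, day 3: +30 s
```

Real clocks are messier: drift itself changes with temperature and ageing, so the growth is rarely this tidy in practice.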
Therefore, to measure time error we always need a reference, and we also need to measure over a long period. It’s no use checking my watch just after I’ve adjusted it and assuming it will always be that accurate, because tomorrow it will be 10 s out, and the day after 20 s out. Some 4G and 5G mobile systems require the base stations to be synchronised to each other to within a microsecond. To verify that, we need a time reference accurate to much better than a microsecond, and we need to measure continuously over a reasonably long period, possibly a day or more.
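The verification step amounts to checking that every time-error sample collected over the measurement period stays inside the budget, not just the first one. A hypothetical sketch (the sample values and function name are invented for illustration; real test equipment would supply the measurements):

```python
BUDGET = 1e-6  # seconds: the one-microsecond limit from the mobile example

def within_budget(samples, budget=BUDGET):
    """True only if the worst-case |time error| over the whole series
    stays inside the budget; a single spot check proves nothing."""
    return max(abs(s) for s in samples) <= budget

# Made-up time-error readings (seconds) taken over a measurement period:
samples = [2e-7, -4e-7, 8e-7, -3e-7]
print(within_budget(samples))  # → True: the worst case is 0.8 microseconds
```

Taking the maximum of the absolute values is what makes a long measurement meaningful: a clock that drifts will eventually produce a sample that breaches the budget, and only a sufficiently long run will catch it.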
For further information, a white paper called “Time and Time Error” is available on our website here.