Subject: Re: microtime
To: None <jonathan@DSG.Stanford.EDU, smd@ab.use.net>
From: Sean Doran <smd@ab.use.net>
List: tech-kern
Date: 08/22/2002 22:05:46
| However, don't forget that:
| * NTP is very carefully crafted so that what matters to it
| is not absolute delay, but *variance* in delay (aka jitter).
Teensy nit: variance and *symmetry* of delay.
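To spell that out (my arithmetic, not Jonathan's): NTP has to assume
the round trip splits evenly between the two directions, so a fixed
asymmetry biases the offset estimate by half the asymmetry no matter
how well the jitter gets filtered. A throwaway sketch, with invented
delays:

/*
 * Illustrative only: the delays are made up.  NTP assumes the one-way
 * delay is half the round trip, so any constant asymmetry between the
 * two directions turns into a clock-offset error of half that
 * asymmetry, independent of jitter.
 */
#include <stdio.h>

int main(void)
{
    double d_out  = 0.040;  /* outbound delay, seconds (assumed) */
    double d_back = 0.010;  /* return delay, seconds (assumed) */

    double rtt = d_out + d_back;
    double assumed_one_way = rtt / 2.0;             /* all NTP can know */
    double offset_bias = (d_out - d_back) / 2.0;    /* pure asymmetry error */

    printf("rtt %.3fs, assumed one-way %.3fs, offset bias %.3fs\n",
        rtt, assumed_one_way, offset_bias);
    return 0;
}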
I think Jonathan knows a lot of this but...
There are some timescale people doing GPS common-view[1]-ish
stuff with a satellite. Essentially each lab gets a chunk
of time where they transmit timestamps to the others, who
listen and record offsets. The satellites are in pretty
well-known positions (although the orbits are complicated to
_strange_ at metre scales), so subtracting the one-way
delay from the recorded offset is simple. The results are
sent back by email to the originating lab, and the
"algorithms" (i.e., people's brains) take them into account
in tuning the local timescales.
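The bookkeeping at each lab comes down to roughly this (the range and
offset numbers below are invented, just to show the principle, not
anybody's real processing):

/*
 * Sketch of the correction step described above; values are made up.
 * The offset a lab records between a received timecode and its local
 * timescale still contains the propagation delay from the
 * (well-known) satellite position, so subtracting a computed one-way
 * delay leaves the clock difference.
 */
#include <stdio.h>

#define C_M_PER_S 299792458.0   /* speed of light, m/s */

int main(void)
{
    double range_m = 20200.0e3;             /* assumed range to satellite, m */
    double recorded_offset_s = 0.067400123; /* received timecode - local clock */

    double one_way_delay_s = range_m / C_M_PER_S;
    double clock_diff_s = recorded_offset_s - one_way_delay_s;

    printf("one-way delay %.9f s, clock difference %.9f s\n",
        one_way_delay_s, clock_diff_s);
    return 0;
}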
NTP doesn't have the luxury of a readily determined high-quality
one-way-delay estimate, hence Mills's fun maths to figure out
what half the RTT is, after getting rid of noise like queueing
and processing delays in intermediate systems. Otherwise,
the principle is essentially the same, with Mills's brain
mapped into some heuristics for local consumption.
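For concreteness, the per-exchange arithmetic is the usual
four-timestamp dance (a from-memory sketch; the names and numbers are
mine, not code lifted from the reference implementation):

/*
 * t1 and t4 are taken from the client's clock, t2 and t3 from the
 * server's.  The delay is the round trip minus the server's
 * processing time; the offset assumes that round trip splits evenly
 * in the two directions, which is exactly where the symmetry
 * assumption and the filtering come in.
 */
#include <stdio.h>

struct ntp_sample {
    double delay;   /* estimated round-trip delay, seconds */
    double offset;  /* estimated clock offset, seconds */
};

static struct ntp_sample
ntp_exchange(double t1, double t2, double t3, double t4)
{
    struct ntp_sample s;

    s.delay  = (t4 - t1) - (t3 - t2);
    s.offset = ((t2 - t1) + (t3 - t4)) / 2.0;
    return s;
}

int main(void)
{
    /* invented timestamps, seconds since some arbitrary epoch */
    struct ntp_sample s = ntp_exchange(100.000, 100.025, 100.026, 100.050);

    printf("delay %.3fs, offset %.3fs\n", s.delay, s.offset);
    return 0;
}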
The satellite game is getting figures that are much, much
more useful than UTC Circular T. However, these people are at
standards establishments, and are very very very very conservative
about how they do things. Sadly, this also means that a lot of them
haven't hooked up to the Internet in any useful fashion. Most of
that seems to involve a lack of programmers; however, I suspect some of it
is a little deliberate, in that they have other (national) timescale
dissemination mechanisms which they use to help maintain their ongoing
funding. Given that nearly nobody needs the sorts of precisions
the labs maintain, any justification for that funding (or any revenue
at all) is considered extremely precious.
However, I'm sure that if it impressed their governments somehow,
NTP servers closely slaved to the timescales themselves would pop
up in several more places around the world, and thus the
nanosecond-resolution timestamp you get from your kernel might have
a chance of being "right" (i.e., it can predict the next UTC Circular T).
But, see footnote...
Sean.
[1] this involves listening to GPS timecodes, and sharing differences
between the received timecodes and the local timescale (e.g. a cesium
clock or H2 maser) via email. http://www.stupi.se/Time/
kindly notice the graph, and ponder "nanosecond resolution in kernel"
versus "time of day to the nanosecond in kernel".