It is accepted that the modern physics of the quantum is inherently statistical. And Julian Barbour, in The End of Time, investigated an underlying statistical basis to General Relativity, for a consistently self-contained, self-representative understanding of the universe.

One could suggest that the classical physics of Galileo relativity, which
inter-relates observations of familiar motions far below light speed, can be
characterised as an agreement of averages, the relevant average in this case
being the arithmetic mean.

For instance, two observers moving, in a line, away from an object at, say,
four kilometres per hour and six kilometres per hour, agree that on average,
they are moving away at five kilometres per hour. That is: (4 + 6)/2 = 5. Or,
relatively speaking, they agree that, on average, the object is moving away
from them at that arithmetic mean speed of five kilometres per hour.
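The worked example above can be sketched in a few lines of Python (the speeds are the illustrative values from the text, in kilometres per hour):

```python
# A minimal sketch of the arithmetic-mean agreement described above.
v1, v2 = 4.0, 6.0           # the two observers' speeds away from the object, km/h
mean_speed = (v1 + v2) / 2  # arithmetic mean
print(mean_speed)           # 5.0
```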

For any given observer, light might seem to have an equilibrium velocity, as an average of that observer's local velocity going with and against the light. The same magnitude of velocity would cancel out, from having the opposing signs of opposing directions, like the crest and the trough of a wave above and below its equilibrium level, such as the sea level when flatly at rest.

In this respect, the equilibrium light speed would be the arithmetic mean of the positive and negative amplitudes of an observer's local velocity, u. Add the two range limits of light speed, c, plus the local observer's velocity, u, or (c+u), to light speed minus local velocity, or (c-u), then divide by two, giving light speed, c.
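A minimal sketch of this cancellation, assuming a rounded light speed and an illustrative local speed in kilometres per second:

```python
c = 3.0e5  # light speed, km/s (rounded, illustrative)
u = 30.0   # an observer's local speed, km/s (illustrative)

# The arithmetic mean of the two range limits: the local speed u cancels out.
equilibrium = ((c + u) + (c - u)) / 2
print(equilibrium == c)  # True
```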

It might be argued that all observers agree that light speed is their common
equilibrium velocity.

But to make local observers' measures agree on a given event, significantly approaching light speed, they each must take the geometric mean of their respective pair of range limits, (c+u) and (c-u), by multiplying them and taking their square root.

This is, in effect, light speed divided by the gamma factor. That factor was first employed to correct the Michelson-Morley calculation, so that it agreed with the null result of their experiment, whereby light took the same time to reflect back whether aligned or cross-ways relative to the Earth's motion.
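This identity is easy to check numerically. The sketch below (with a rounded light speed and an illustrative local speed) confirms that the geometric mean of the two range limits equals light speed divided by the gamma factor, and always stays below light speed:

```python
import math

c = 3.0e5    # light speed, km/s (rounded)
u = 0.6 * c  # an illustrative local speed

geo_mean = math.sqrt((c + u) * (c - u))   # sqrt(c^2 - u^2)
shrunk = c * math.sqrt(1 - (u / c) ** 2)  # c divided by the gamma factor

print(math.isclose(geo_mean, shrunk))  # True
print(geo_mean < c)                    # True: stays below light speed
```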

To date, the physics community has not recognised or faced the fact, which I pointed out years ago, that the geometric mean, instead of the arithmetic mean, gives the correct answer to the Michelson-Morley experiment.

I have already paid tribute to the essential experimental genius of Michelson and Morley. They were wrong, but not remiss, in their calculation, which was akin to the usual common-sense type of Galileo transformations between different observational viewpoints. Indeed, it took their experiment to prompt the new Lorentz transformations for inter-relating near-light-speed observations of a given event.

The Lorentz transformations, and the later Minkowski Interval, multiplied in differing local times, so that the generally agreed geometric mean for each local observer was extended to dimensions of distance, or time multiplied by velocity, and not just velocity.
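As a hedged sketch (in Python, using natural units with light speed set to one, and illustrative values for the event and the relative speed), one can check that the Interval agrees between two observers related by a Lorentz transformation:

```python
import math

c = 1.0      # natural units: light speed set to one
u = 0.6      # relative speed of the second observer (illustrative)
gamma = 1 / math.sqrt(1 - (u / c) ** 2)

t, x = 5.0, 3.0  # an event's time and distance for one observer (illustrative)

# Lorentz transformation to the second observer's local time and distance
t2 = gamma * (t - u * x / c ** 2)
x2 = gamma * (x - u * t)

# The Minkowski Interval (squared) is the same for both observers
interval1 = (c * t) ** 2 - x ** 2
interval2 = (c * t2) ** 2 - x2 ** 2
print(math.isclose(interval1, interval2))  # True
```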

The peculiar effect of the geometric mean, on any local measures, is that the velocity range limits, though equally above and below light speed, no longer neatly cancel, as when the arithmetic mean is taken of them. Instead, multiplying the lower by the upper limit and taking the square root results in the so-called shrinking factor (really just a geometric mean, I contend). Observed velocities never exceed light speed.

This shrinking effect does make an absolute of light speed, but it is not quite like the space, time and ether velocity absolutes of classical physics, because the Interval's geometric-mean inter-relation of local observations only makes light speed an absolute on average. Light speed becomes a (geometric-mean type) averaged absolute - not an unconditional absolute or absolute absolute!

Consider the opposite extreme to the very large-scale relativistic physics of travelling close to light speed. In the extremely small scales of quantum electrodynamics, light follows random paths, at more or less than the constant speed that they average out at, over familiar world-scale distances.

It can be seen why the Interval is an agreement of the geometric means of
different observers. This parallels averaging in ordinary statistics: provided
that the values taken from a range of values are truly representative, they
will give the true average for the whole range.

For example, a sample would not include only positive values or only negative
values, in a range that included both, or this would make the sample untypical
or unrepresentative and so not yield the true average.
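This point can be illustrated with a short simulation, assuming values drawn uniformly from a range spanning both signs. Keeping only the positive values biases the sample away from the true average:

```python
import random

random.seed(0)  # reproducible illustration
values = [random.uniform(-1, 1) for _ in range(10_000)]

full_mean = sum(values) / len(values)          # representative: near zero
positives = [v for v in values if v > 0]
biased_mean = sum(positives) / len(positives)  # unrepresentative: near 0.5

print(full_mean, biased_mean)
```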

This representative-sample requirement is the statistical version of the
physicists' quest for laws that are satisfied by any observational frame of
reference.

Within the terms of special relativity, the mathematical form of the Interval
ensures that each observer's local "sample" of measurements is truly
representative of the whole range of every possible observer's measurements.

In this respect, the sample is not the random sample, taken in statistics, to
obtain some approximate representation of the nature of the whole. The
statistical nature of special relativity may not have been appreciated because
its measurements are more than statistically accurate, so no measurement
problem makes itself felt (unlike in some - but by no means all - aspects of
quantum physics).

The whole range of possible observations, significantly approaching light speed, is a geometric series, and thus averaged by the geometric mean, because of the deceleration of observed bodies approaching light speed: bodies become ever more massive, needing exponentially more energy even to creep ever more slowly towards light speed.
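The connection between a geometric series and the geometric mean can be sketched with illustrative numbers (an arbitrary first term and common ratio): the geometric mean of such a series is its middle term, just as the arithmetic mean is the middle term of an arithmetic series.

```python
import math

# A geometric series (illustrative values: first term 2, common ratio 3)
series = [2.0 * 3.0 ** k for k in range(5)]  # 2, 6, 18, 54, 162

# The geometric mean: the n-th root of the product of the n terms
geo_mean = math.prod(series) ** (1 / len(series))
print(math.isclose(geo_mean, series[2]))     # True: the middle term, 18
```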

Each observer's local measurements are a random sample, owing to the principle of relativity, which says there is no privileged co-ordinate system. That is to say, there is no absolute space or absolute time, and thereby no absolute velocity, as postulated for a universal ether medium in which light waves were supposed to move.

When Einstein dispensed with the ether, in his theory of special relativity, he was implicitly inaugurating a statistical theory, though he believed and desired otherwise. For the meaning of relativity is that all observational frames of reference have no determined relation to some fundamental reference frame. Therefore the frames are all essentially random. In this sense, observers' reference frames can be called random "samples" of measurement. (By the way, the measurement of random distributions, like the normal distribution, is performed by what is called parametric statistics.)

However, statistics is itself deterministic enough for the random observation frames to have a determined relation to each other, even though that relation really is in terms of the statistical concept of a type of average, or representative measure, of all the possible observations of a given event.

When relativists introduced the notion of observers' space and time varying locally, they were implicitly introducing the statistical concept of variations along a range or distribution. And their observational transformations, of Lorentz and Minkowski, were the taking of the properly inter-relating average for that geometric range: the geometric mean rather than the arithmetic mean.

*Richard Lung.
13 July 2012.*