Via Satellite, February 2000
The International System of Units
Its History and Use in Science and Industry
by Robert A. Nelson
On September 23, 1999 the Mars Climate Orbiter was lost during an orbit injection maneuver
when the spacecraft crashed onto the surface of Mars. The principal cause of the mishap
was traced to a thruster calibration table, in which British units instead of metric units
were used. The software for celestial navigation at the Jet Propulsion Laboratory expected
the thruster impulse data to be expressed in newton seconds, but Lockheed Martin Astronautics
in Denver, which built the orbiter, provided the values in pound-force seconds, causing the
impulse to be interpreted as roughly one-fourth its actual value.
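The size of the mismatch follows directly from the exact unit definitions discussed later in this article. The sketch below, with values exact by the 1959 international agreement, shows why a number recorded in pound-force seconds but read as newton seconds comes out at roughly one-fourth of its true value:

```python
# Both conversion factors are exact by definition.
LBM_TO_KG = 0.45359237     # pound-mass in kilograms
STANDARD_GRAVITY = 9.80665 # m/s^2

# One pound-force in newtons: weight of one pound-mass under standard gravity.
lbf_to_newton = LBM_TO_KG * STANDARD_GRAVITY   # ~4.448 N

# A value in lbf*s misread as N*s understates the impulse by this factor:
fraction = 1.0 / lbf_to_newton                 # ~0.225, i.e. roughly one-fourth
```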
The Mars spacecraft incident renews a controversy that has existed in the United States
since the beginning of the space program regarding the use of metric or British units of
measurement. To put the issue into perspective, this article reviews the history of the
metric system and its modern version, the International System of Units (SI). The
origin and evolution of the metric units, and the role they have played in the United
States, will be summarized. Technical details and definitions will be provided for
reference. Finally, the use of metric units in the satellite industry will be examined.
ORIGIN OF THE METRIC SYSTEM
The metric system was one of many reforms introduced in France during the period between
1789 and 1799, known as the French Revolution. The need for reform in the system of weights and measures, as in other affairs, had long been recognized. No other aspect of applied science affects the course of human activity so directly and universally.
Prior to the metric system, there had existed in France a disorderly variety of measures,
such as for length, volume, or mass, that were arbitrary in size and variable from one
town to the next. In Paris the unit of length was the Pied de Roi and the unit of mass
was the Livre poids de marc. These units could be traced back to Charlemagne. However,
all attempts to impose the "Parisian" units on the whole country were fruitless, as
they were opposed by the guilds and nobles who benefited from the confusion.
The advocates of reform sought to guarantee the uniformity and permanence of the units of
measure by taking them from properties derived from nature. In 1670, the abbé Gabriel
Mouton of Lyons proposed a unit of length equal to one minute of arc on the earth’s
surface, which he divided into decimal fractions. He suggested a pendulum of specified
period as a means of preserving one of these submultiples.
The conditions required for the creation of a new measurement system were made possible
by the French Revolution, an event that was initially provoked by a national financial
crisis. In 1787 King Louis XVI convened the Estates General, an institution that had last
met in 1614, for the purpose of imposing new taxes to avert a state of bankruptcy.
As they assembled in 1789, the commoners, representing the Third Estate, declared
themselves to be the only legitimate representatives of the people, and succeeded
in having the clergy and nobility join them in the formation of the National Assembly.
Over the next two years, they drafted a new constitution.
In 1790, Charles-Maurice de Talleyrand, Bishop of Autun, presented to the National Assembly
a plan to devise a system of units based on the length of a pendulum beating seconds at
latitude 45°. The new order was envisioned as an “enterprise whose result should belong
some day to the whole world.” He sought, but failed to obtain, the collaboration of
England, which was concurrently considering a similar proposal by Sir John Riggs Miller.
The two founding principles were that the system would be based on scientific observation
and that it would be a decimal system. A distinguished commission of the French Academy of
Sciences, including J. L. Lagrange and Pierre Simon Laplace, considered the unit of length.
Rejecting the seconds pendulum as insufficiently precise, the commission defined the unit,
given the name metre in 1793, as one ten millionth of a quarter of the earth’s meridian
passing through Paris. The proposal was accepted by the National Assembly on March 26, 1791.
The definition of the meter reflected the extensive interest of French scientists in the
figure of the earth. Surveys in Lapland by Maupertuis in 1736 and in France by LaCaille in 1740 had refined the value of the earth’s radius and established definitively that the shape of the earth is oblate. To determine the length of the meter, a new survey was conducted by the astronomers Jean Baptiste Delambre and P.F.A. Mechain between Dunkirk, in France on the English Channel, and Barcelona, Spain, on the coast of the Mediterranean Sea. This work was begun in 1792 and completed in 1798, enduring the hardships of the “reign of terror” and the turmoil of revolution. We now know that the quadrant of the earth is
10 001 957 meters instead of exactly 10 000 000 meters as originally planned. The
principal source of error was the assumed value of the earth’s flattening used in
correcting for oblateness.
The unit of volume, the pinte (later renamed the litre), was defined as the volume of a
cube having a side equal to one-tenth of a meter. The unit of mass, the grave (later
renamed the kilogramme), was defined as the mass of one pinte of distilled water at the
temperature of melting ice. In addition, the centigrade scale for temperature was
adopted, with fixed points at 0 C and 100 C representing the freezing and boiling
points of water (now replaced by the Celsius scale).
The work to determine the unit of mass was begun by Lavoisier and Hauy and was completed
by Gineau and Fabbroni. They discovered that the maximum density of water occurs at 4 C,
and not at 0 C as had been supposed, so the definition of the kilogram was amended to
specify the temperature of maximum density. We now know that the intended mass was
0.999972 kg, i.e., 1000.028 cm³ instead of exactly 1000 cm³ for the
volume of 1 kilogram of pure water at 4 C.
On August 1, 1793 the National Convention, which by then ruled France, issued a decree
adopting the preliminary definitions and terms. The “methodical” nomenclature,
specifying fractions and multiples of the units by Latin prefixes, was chosen in
favor of the “common” nomenclature, involving separate names.
A new calendar was also introduced in September, 1793. Its origin was designated
retroactively as September 22, 1792 to commemorate the overthrow of the monarchy
and the inception of the Republic of France. The French Revolutionary Calendar
consisted of twelve months of thirty days each, concluded by a five or six day
holiday. The months were given poetic names that suggested the prevailing seasons.
Each month was divided into three ten-day weeks, or decades. The day itself was
divided into decimal fractions, with ten hours per day and 100 minutes per hour.
The calendar was politically, rather than scientifically, motivated, since it
was intended to weaken the influence of Christianity. It was abolished by
Napoleon in 1806 in return for recognition by the Church of his authority as
emperor of France. Although the calendar reform remained in effect for twelve
years, the new method of keeping the time of day required the replacement of
valued clocks and timepieces and was never actually used in practice.
The metric system was officially adopted on April 7, 1795. The government issued
a decree (Loi du 18 germinal, an III) formalizing the adoption of the definitions
and terms that are in use today. A brass bar was made by Lenoir to represent the
provisional meter, obtained from the survey of LaCaille, and a provisional standard
for the kilogram was derived.
In 1799 permanent standards for the meter and kilogram made from platinum were
constructed based on the new survey by Delambre and Mechain. The full length of
the meter bar represented the unit. These standards were deposited in the Archives
of the Republic. They became official by an act of December 10, 1799.
During the Napoleonic era, several regressive acts were passed that temporarily
revived old traditions. Thus in spite of its auspicious beginning, the metric
system was not quickly adopted in France. Although the system continued to be
taught in the schools, lack of funds prevented the distribution of secondary
standards. Finally, after a three year transition period, the metric system
became compulsory throughout France as of January 1, 1840.
REACTION IN THE UNITED STATES
The importance of a uniform system of weights and measures was recognized in the
United States, as in France. Article I, Section 8, of the U.S. Constitution
provides that Congress shall have the power “to coin money ... and fix the standard
of weights and measures.” However, although the progressive concept of decimal
coinage was introduced, the early American settlers both retained and cultivated
the customs and tools of their British heritage, including the measures of length
and mass. In contrast to the French Revolution, the “American Revolution” was
not a revolution at all, but was rather a war of independence.
In 1790, President George Washington referred the subject of weights and
measures to his Secretary of State, Thomas Jefferson. In a report submitted to the
House of Representatives, Jefferson considered two alternatives: if the existing
measures were retained they could be rendered more simple and uniform, or if a new
system were adopted, he favored a decimal system based on the principle of the
seconds pendulum. As it was eventually formulated, Jefferson did not endorse the
metric system, primarily because the metric unit of length could not be checked
without a sizable scientific operation on European soil.
The political situation at the turn of the eighteenth century also made
consideration of the metric system impractical. Although France under Louis XVI
had supported the colonies in the war with England, by 1797 there was manifest
hostility. The revolutionary climate in France was viewed by the external world
with a mixture of curiosity and alarm. The National Convention had been replaced
by the Directory, and French officials who had been sympathetic to the United
States either had been executed or were in exile. In addition, a treaty negotiated
with England by John Jay in 1795 regarding settlement of the Northwest Territories
and trade with the British West Indies was interpreted by France as evidence of
an Anglo-American alliance. France retaliated by permitting her ships to prey
upon American merchant vessels and Federalist President John Adams prepared
for a French invasion. Thus in 1798, when dignitaries from foreign countries
were assembled in Paris to learn of France’s progress with metrological reform,
the United States was not invited.
A definitive investigation was prepared in 1821 by Secretary of State John
Quincy Adams that was to remove the issue from further consideration for the next
forty-five years. He found that the standards of length, volume, and mass used
throughout the 22 states of the Union were already substantially uniform, unlike
the disparate measures that had existed in France prior to the French Revolution.
Moreover, it was not at all evident that the metric system would be permanent,
since even in France its use was sporadic and, in fact, the consistent terminology
had been repealed in 1812 by Napoleon. Therefore, if the metric system failed to
win support in early America, it was not for want of recognition.
Serious consideration of the metric system did not occur again until
after the Civil War. In 1866, upon the advice of the National Academy of Sciences,
the metric system was made legal by the Thirty-Ninth Congress. The Act was signed
into law on July 28 by President Andrew Johnson.
TREATY OF THE METER
A series of international expositions in the middle of the nineteenth century enabled
the French government to promote the metric system for world use. Between 1870 and 1872,
with an interruption caused by the Franco-Prussian War, an international meeting of
scientists was held to consider the design of new international metric standards that
would replace the meter and kilogram of the French Archives. A Diplomatic Conference on
the Meter was convened to ratify the scientific decisions. Formal international approval
was secured by the Treaty of the Meter, signed in Paris by the delegates of 17 countries,
including the United States, on May 20, 1875.
The treaty established the International Bureau of Weights and Measures (BIPM). It
also provided for the creation of an International Committee for Weights and Measures
(CIPM) to run the Bureau and the General Conference on Weights and Measures (CGPM)
as the formal diplomatic body that would ratify changes as the need arose. The
French government offered the Pavillon de Breteuil, once a small royal palace, to
serve as headquarters for the Bureau in Sevres, France near Paris. The grounds of
the estate form a tiny international enclave within French territory.
A total of 30 meter bars and 43 kilogram cylinders were manufactured from a single
ingot of an alloy of 90 percent platinum and 10 percent iridium by Johnson, Matthey
and Company of London. The original meter and kilogram of the French Archives in
their existing states were taken as the points of departure. The standards were
intercompared at the International Bureau between 1886 and 1889. One meter bar
and one kilogram cylinder were selected as the international prototypes. The
remaining standards were distributed to the signatories. The work was approved
by the First General Conference on Weights and Measures in 1889.
The United States received meters 21 and 27 and kilograms 4 and 20. On January 2, 1890
the seals to the shipping cases for meter 27 and kilogram 20 were broken in an official
ceremony at the White House with President Benjamin Harrison presiding. The standards
were deposited in the Office of Weights and Measures of the U.S. Coast and Geodetic Survey.
U.S. CUSTOMARY UNITS
The U.S. customary units were tied to the British and French units by a variety of indirect comparisons.
Troy weight was the standard for the minting of coins. Congress could be ambivalent
about nonuniformity in standards for trade, but it could not tolerate nonuniformity in its
standards for money. Therefore, in 1827 a brass copy of the British troy pound of 1758
was secured by Ambassador to England and former Secretary of the Treasury, Albert Gallatin.
This standard was kept in the Philadelphia mint and lesser copies were made and distributed
to other mints. The troy pound of the Philadelphia mint was virtually the primary standard
for commercial transactions until 1857 and remained the standard for coins until 1911.
The semi-official standards used in commerce for a quarter century may be
attributed to Ferdinand Hassler, who was appointed superintendent of the newly organized
Coast Survey in 1807. In 1832 the Treasury Department directed Hassler to construct and
distribute to the states standards of length, mass, and volume, and balances by which
masses might be compared. As the standard of length, Hassler adopted the Troughton
scale, an 82-inch brass bar made by Troughton of London for the Coast Survey that
Hassler had brought back from Europe in 1815. The distance between the 27th and 63rd
engraved lines on a silver inlay scale down the center of the bar was taken to be
equal to the British yard. The standard of mass was the avoirdupois pound, derived
from the troy pound of the Philadelphia mint by the ratio 7000 grains to 5760 grains.
It was represented by a brass knob weight that Hassler constructed and marked with a
star. Thus it has come to be known as the “star” pound.
The system of weights and measures in Great Britain had been in use since the reign of
Queen Elizabeth I. Following a reform begun in 1824, the imperial standard avoirdupois
pound was made the standard of mass in 1844 and the imperial standard yard was adopted
in 1855. The imperial standards were made legal by an Act of Parliament in 1855 and
are preserved in the Board of Trade in London. The United States received copies of
the British imperial pound and yard, which became the official U.S. standards from
1857 until 1893.
When the metric system was made lawful in the United States in 1866, a companion
resolution was passed to distribute metric standards to the states. The Treasury
Department had in its possession several copies derived from the meter and kilogram
of the French Archives. These included the “Committee” meter and kilogram,
which were an iron end standard and a brass cylinder with knob copied from
the French prototypes, that Hassler had brought with him when he immigrated to
the United States in 1805. He had received them as a gift from his friend, J.G.
Tralles, who was the Swiss representative to the French metric convocation in 1798
and a member of its committee on weights and measures. Also available were
the “Arago” meter and kilogram, named after the French physicist who certified
them. They were purchased by the United States in 1821 through Albert Gallatin,
then minister to France. The Committee meter and the Arago kilogram were used
as the prototypes for brass metric standards that were distributed to the states.
In 1893, under a directive from Thomas C. Mendenhall, Superintendent of Standard Weights and
Measures of the Coast and Geodetic Survey, the U.S. customary units were redefined in terms
of the metric units. The primary standards of length and mass adopted were prototype
meter No. 27 and prototype kilogram No. 20 that the United States had received in 1889
as a signatory to the Treaty of the Meter. The yard was defined as 3600/3937 meter and
the avoirdupois pound-mass was defined as 0.4535924277 kilogram. The conversion for
mass was based on a comparison between the British imperial standard pound and the
international prototype kilogram performed in 1883. These definitions were used by
the National Bureau of Standards (now the National Institute of Standards and
Technology) from its founding in 1901 until 1959. On July 1, 1959 the definitions
were fixed by international agreement among the English-speaking countries to be
1 yard = 0.9144 meter and 1 pound-mass = 0.45359237 kilogram exactly. The
definition of the yard is equivalent to the relations 1 foot = 0.3048 meter and
1 inch = 2.54 centimeters exactly.
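The 1893 Mendenhall yard (3600/3937 meter) and the 1959 international yard (0.9144 meter exactly) differ by only about two parts per million, which the following sketch verifies numerically:

```python
mendenhall_yard = 3600 / 3937   # 1893 definition, in meters
international_yard = 0.9144     # 1959 definition, exact

difference = mendenhall_yard - international_yard   # ~1.8 micrometers per yard
ppm = difference / international_yard * 1e6         # ~2 parts per million

# The 1959 yard implies the familiar exact subunits:
inch = international_yard / 36   # 0.0254 m, i.e. 2.54 cm
foot = international_yard / 3    # 0.3048 m
```

The two-micrometer shift mattered for geodetic surveying, which is why the older "survey foot" persisted in U.S. mapping long after 1959.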
The derived unit of force in the British system is the pound-force (lbf), which is
defined as the weight of one pound-mass (lbm) at a hypothetical location where the
acceleration of gravity has the standard value 9.80665 m/s² exactly. Thus
1 lbf = 0.45359237 kg × 9.80665 m/s² = 4.448 N approximately. The slug (sl) is
the mass that receives an acceleration of one foot per second squared under a
force of one pound-force. Thus 1 sl = (1 lbf)/(1 ft/s²) = (4.448 N)/(0.3048 m/s²)
= 14.59 kg = 32.17 lbm approximately.
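The chain of exact conversion factors behind these approximate values can be carried out directly:

```python
LBM_TO_KG = 0.45359237   # exact by the 1959 agreement
G0 = 9.80665             # standard gravity, m/s^2, exact
FT_TO_M = 0.3048         # exact

lbf = LBM_TO_KG * G0             # one pound-force in newtons, ~4.448 N
slug_kg = lbf / FT_TO_M          # 1 sl = (1 lbf)/(1 ft/s^2), ~14.59 kg
slug_lbm = slug_kg / LBM_TO_KG   # ~32.17 lbm; note this equals G0 / FT_TO_M
```

The last line shows why the slug is 32.17 lbm: the ratio is just standard gravity expressed in feet per second squared.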
THE ELECTRICAL UNITS
The theories of electricity and magnetism developed and matured during the early 1800s
as fundamental discoveries were made by Oersted, Ampere, Faraday, and many others. The
possibility of making magnetic measurements in terms of mechanical units, that is
in “absolute measure,” was first pointed out by Gauss in 1833. His analysis was
carried further to cover electrical phenomena by Weber, who in 1851 discussed a
method by which a complete set of absolute units might be developed.
In 1861 a committee of the British Association for the Advancement of Science, that
included William Thomson (later Lord Kelvin), James Clerk Maxwell, and James Prescott
Joule, undertook a comprehensive study of electrical measurements. This committee
introduced the concept of a system of units. Four equations were sufficient to
determine the units of charge q, current I, voltage V, and
resistance R. These
were either Coulomb’s force law for charges or Ampere’s force law for currents,
the relation between charge and current
q = I t, Ohm’s law V = I R, and the equation for electrical work
W = V q = I² R t, where t is time.
A fundamental principle was that the system should be coherent. That is, the
system is founded upon certain base units for length, mass, and time, and derived
units are obtained as products or quotients without requiring numerical factors.
The meter, gram, and mean solar second were selected as base units. In 1873 a
second committee recommended a centimeter-gram-second (CGS) system of units
because in this system the density of water is unity.
Two parallel systems of units were devised, the electrostatic
and electromagnetic subsystems, depending on whether the law of force for
electric charges or for electric currents was taken as fundamental. The ratio of the
electrostatic to the electromagnetic unit of charge or current
was a fundamental experimental constant c.
The committee also conducted research on electrical standards. It issued a wire resistance standard, the “B.A. unit,” which soon became known as the “ohm.” The idea of naming units after eminent scientists was due to Sir Charles Bright and Latimer Clark.
At the time, electricity and magnetism were essentially two distinct branches of experimental physics. However, in a series of papers published between 1856 and 1865, Maxwell created a unified theory based on the field concept introduced by Faraday. He predicted the existence of electromagnetic waves and identified the “ratio of the units” c with the speed of light.
In 1888, Heinrich Hertz verified Maxwell’s prediction by generating and detecting
electromagnetic waves at microwave frequencies in the laboratory. He also greatly
simplified the theory by eliminating unnecessary physical assumptions. Thus the form
of Maxwell’s equations as they are known to physicists and engineers today is due to
Hertz. (Oliver Heaviside made similar modifications and introduced the use of vectors.)
In addition, Hertz combined the electrostatic and electromagnetic CGS units into a
single system related by the speed of light c, which he called the “Gaussian” system of units.
The recommendations of the B.A. committees were adopted by the First International
Electrical Congress in Paris in 1881. Five “practical” electrical units were defined
as certain powers of 10 of the CGS units: the ohm, farad, volt, ampere, and coulomb.
In 1889, the Second Congress added the joule, watt, and a unit of inductance,
later given the name henry.
In 1901, Giorgi demonstrated that the practical electrical units and the MKS mechanical units
could be incorporated into a single coherent system by (1) selecting the meter, kilogram,
and second as the base units for mechanical quantities; (2) expanding the number of base
units to four, including one of an electrical nature; and (3) assigning physical
dimensions to the permeability of free space μ0, with a numerical value of
4π × 10⁻⁷ in a “rationalized” system or 10⁻⁷ in an “unrationalized” system.
(The term “rationalized,” due to Heaviside, concerned where factors of 4π
should logically appear in the equations based on symmetry.) The last
assumption implied that the magnetic flux density B and magnetic field H,
which are related in vacuum by the equation B = μ0 H, are physically distinct
with different units, whereas in the Gaussian system they are of the same
character and are dimensionally equivalent. An analogous situation occurs
for the electric fields D and E, which are related by D = ε0 E, where ε0 is
the permittivity of free space given by c² = 1/(μ0 ε0).
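The relation c² = 1/(μ0 ε0) ties the electrical constants to the speed of light, and a quick numerical check recovers the familiar value of the permittivity of free space:

```python
import math

mu0 = 4 * math.pi * 1e-7   # rationalized permeability of free space, N/A^2
c = 299_792_458.0          # speed of light, m/s (exact since 1983)

# Permittivity of free space from c^2 = 1/(mu0 * eps0):
eps0 = 1.0 / (mu0 * c**2)  # ~8.854e-12 F/m
```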
In 1908, an International Conference on Electrical Units and Standards held in London
adopted independent, easily reproducible primary electrical standards for resistance
and current, represented by a column of mercury and a silver coulombmeter, respectively.
These so-called “international” units went into effect in 1911, but they soon became
obsolete with the growth of the national standards laboratories and the increased
application of electrical measurements to other fields of science.
With the recognition of the need for further international cooperation, the 6th
CGPM amended the Treaty of the Meter in 1921 to cover the units of electricity and
photometry and the 7th CGPM created the Consultative Committee for Electricity
(CCE) in 1927. By the 8th CGPM in 1933 there was a universal desire to replace
the “international” electrical units with “absolute” units. Therefore, the
International Electrotechnical Commission (IEC) recommended to the CCE an
absolute system of units based on Giorgi’s proposals, with the practical
electrical units incorporated into a comprehensive MKS system. The choice
of the fourth unit was left undecided.
At the meeting of the CCE in September 1935, the delegate from England,
J.E. Sears, presented a note that set the course for future action. He
proposed that the ampere be selected as the base unit for electricity,
defined in terms of the force per unit length between two long parallel
wires. The unit could be preserved in the form of wire coils for
resistance and Weston cells for voltage by calibration with a current
balance. This recommendation was unanimously accepted by the CCE and
was adopted by the CIPM.
Further progress was halted by the intervention of World War II.
Finally, in 1946, by authority given to it by the CGPM in 1933, the
CIPM officially adopted the MKS practical system of absolute electrical
units to take effect January 1, 1948.
INTERNATIONAL SYSTEM OF UNITS (SI)
By 1948 the General Conference on Weights and Measures was responsible for the units
and standards of length, mass, electricity, photometry, temperature, and ionizing
radiation. At this time, the next major phase in the evolution of the metric system
was begun. It was initiated by a request of the International Union of Pure and
Applied Physics “to adopt for international use a practical international system of
units.” Thus the 9th CGPM decided to define a complete list of derived units.
Derived units had not been considered previously because they do not require
independent standards. Also, the CGPM brought within its province the unit of
time, which had been the prerogative of astronomers.
The work was started by the 10th CGPM in 1954 and was completed by the 11th CGPM in
1960. During this period there was an extensive revision and simplification of
the metric unit definitions, symbols, and terminology. The kelvin and candela
were added as base units for thermodynamic temperature and luminous intensity,
and in 1971 the mole was added as a seventh base unit for amount of substance.
The modern metric system is known as the International System of Units, with the
international abbreviation SI. It is founded on the seven base units, summarized
in Table 1, that by convention are regarded as dimensionally independent. All
other units are derived units, formed coherently by multiplying and dividing units
within the system without the use of numerical factors. Some derived units,
including those with special names, are listed in Table 2. For example, the unit
of force is the newton, which is equal to a kilogram meter per second squared,
and the unit of energy is the joule, equal to a newton meter. The expression of
multiples and submultiples of SI units is facilitated by the use of prefixes,
listed in Table 3. (Additional information is available on the Internet at the
websites of the International Bureau of Weights and Measures and the National
Institute of Standards and Technology.)
One must distinguish a unit, which is an abstract idealization, and a standard, which
is the physical embodiment of the unit. Since the origin of the metric system,
the standards have undergone several revisions to reflect increased precision as
the science of metrology has advanced.
The meter. The international prototype meter standard of 1889 was a
platinum-iridium bar with an X-shaped cross section. The meter was defined by the distance
between two engraved lines on the top surface of the bridge instead of the distance
between the end faces. The meter was derived from the meter of the French Archives
in its existing state and reference to the earth was abandoned.
The permanence of the international prototype was verified by comparison with three
companion bars, called “check standards.” In addition, there were nine measurements
in terms of the red line of cadmium between 1892 and 1942. The first of these
measurements was carried out by A. A. Michelson using the interferometer which he
invented. For this work, Michelson received the Nobel Prize in physics in 1907.
Improvements in monochromatic light sources resulted in a new standard based on a
well-defined wavelength of light. A single atomic isotope with an even atomic number
and an even mass number is an ideal spectral standard because it eliminates complexity
and hyperfine structure. Also, Doppler broadening is minimized by using a gas of
heavy atoms in a lamp operated at a low temperature. Thus a particular orange
krypton-86 line was chosen, whose wavelength was obtained by direct comparison with
the cadmium wavelength. In 1960, the 11th CGPM defined the meter as the length equal
to 1 650 763.73 wavelengths of this spectral line.
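Inverting the 1960 definition gives the wavelength of the chosen line directly, since 1 650 763.73 wavelengths span exactly one meter:

```python
wavelengths_per_meter = 1_650_763.73     # 1960 definition of the meter

wavelength = 1.0 / wavelengths_per_meter # in meters
# ~6.0578e-7 m, i.e. about 605.78 nm: the orange line of krypton-86
```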
Research on lasers at the Boulder, CO laboratory of the National Bureau of Standards
contributed to another revision of the meter. The wavelength and frequency of a
stabilized helium-neon laser beam were measured independently to determine the
speed of light. The wavelength was obtained by comparison with the krypton
wavelength and the frequency was determined by a series of measurements
traceable to the cesium atomic standard for the second. The principal source of error
was in the profile of the krypton spectral line representing the meter itself.
Consequently, in 1983 the 17th CGPM adopted a new definition of the meter based
on this measurement as “the length of the path traveled by light in vacuum during a
time interval of 1/299 792 458 of a second.” The effect of this definition is to
fix the speed of light at exactly
299 792 458 m/s. Thus experimental methods previously interpreted as measurements
of the speed of light c (or equivalently, the permittivity of free space ε0) have
become calibrations of length.
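With c fixed exactly, a frequency measurement becomes a length calibration through λ = c/f. As an illustration only (the frequency below is an approximate, assumed value for a stabilized helium-neon laser, not a metrological datum):

```python
C = 299_792_458.0   # m/s, exact by the 1983 definition of the meter

# Approximate frequency of a stabilized helium-neon laser (assumed value,
# for illustration):
f = 473.6e12        # Hz

# The wavelength follows from the definition, with no length standard needed:
wavelength = C / f  # ~633 nm
```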
The kilogram. In 1889 the international prototype kilogram was adopted as the standard
for mass. The prototype kilogram is a
platinum-iridium cylinder with equal height and diameter of
3.9 cm and slightly rounded edges. For a cylinder, these dimensions present the smallest
surface area to volume ratio to minimize wear. The standard is carefully preserved in a
vault at the International Bureau of Weights and Measures and is used only on rare
occasions. It remains the standard today. The kilogram is the only unit still defined in
terms of an arbitrary artifact instead of a natural phenomenon.
The second. Historically, the unit of time, the second, was defined in terms of the period
of rotation of the earth on its axis as 1/86 400 of a mean solar day. Meaning “second
minute,” it was first applied to timekeeping in about the seventeenth century, when
pendulum clocks were invented that could maintain time to this precision.
By the twentieth century, astronomers realized that the rotation of the earth is not constant.
Due to gravitational tidal forces produced by the moon on the shallow seas, the length of
the day is increasing by about
1.4 milliseconds per century. The effect can be measured by comparing the computed paths
of ancient solar eclipses on the assumption of uniform rotation with the recorded locations
on earth where they were actually observed. Consequently, in 1956 the second was redefined
in terms of the period of revolution of the earth about the sun for the epoch 1900,
as represented by the Tables of the Sun computed by the astronomer Simon Newcomb of the
U.S. Naval Observatory in Washington, DC. The operational significance of this definition
was to adopt the linear coefficient in Newcomb’s formula for the mean longitude of the
sun to determine the unit of time.
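The cumulative effect of so tiny a lengthening is surprisingly large, because clock time kept by the earth's rotation drifts quadratically relative to uniform time. A back-of-envelope sketch, assuming the excess day length grows linearly from zero, shows the scale of the eclipse discrepancy:

```python
# Sketch of the quadratic accumulation of a lengthening day:
# if the day gains 1.4 ms per century, Earth-rotation time
# drifts ever faster relative to uniform time.
RATE = 1.4e-3          # extra seconds per day, gained per century
DAYS_PER_CENTURY = 36525

def accumulated_offset(centuries: float) -> float:
    """Accumulated discrepancy (seconds) between Earth-rotation time
    and uniform time after the given number of centuries, assuming
    the excess day length grows linearly from zero."""
    # integrate (RATE * t) over t centuries, with DAYS_PER_CENTURY days each
    return 0.5 * RATE * DAYS_PER_CENTURY * centuries**2

# Over the ~20 centuries back to ancient eclipse records:
print(f"{accumulated_offset(20) / 3600:.1f} hours")  # ≈ 2.8 hours
```

An offset of nearly three hours corresponds to a shift of tens of degrees in longitude in where an ancient eclipse was actually observed.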
The rapid development of atomic clocks soon permitted yet another definition. Accordingly,
in 1967 the 13th CGPM defined the second as “the duration of
9 192 631 770 periods of the radiation corresponding to the transition between
the two hyperfine levels of the ground state of the cesium-133 atom.” This definition
was based on observations of the moon, whose ephemeris is tied indirectly to the
apparent motion of the sun, and was equivalent to the previous definition within the
limits of experimental uncertainty.
The ampere. The unit of electric current, the ampere, is defined as that constant current
which, if maintained in each of two parallel, infinitely long wires with a separation of 1
meter in vacuum, would produce a force per unit length between them equal to 2 × 10⁻⁷ N/m.
This formal definition serves to establish the value of the permeability of free space μ₀
as 4π × 10⁻⁷ N/A² exactly.
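The 2 × 10⁻⁷ N/m figure follows from the force law for parallel currents, F/L = μ₀I₁I₂/2πd. A quick numerical check:

```python
import math

# Force per unit length between two long parallel wires:
#   F/L = mu0 * I1 * I2 / (2 * pi * d)
# With mu0 fixed at 4*pi*1e-7 N/A^2, two 1 A currents 1 m apart
# give 2e-7 N/m, as in the definition of the ampere.
MU0 = 4 * math.pi * 1e-7  # N/A^2, exact under this definition

def force_per_length(i1: float, i2: float, d: float) -> float:
    """Force per meter (N/m) between parallel currents i1, i2 (A) at separation d (m)."""
    return MU0 * i1 * i2 / (2 * math.pi * d)

print(force_per_length(1, 1, 1))  # ≈ 2e-7 N/m
```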
Although the base unit for electricity is the ampere, the electrical units are maintained
through the volt and the ohm.
In the past, the practical representation of the volt was a group of Weston saturated
cadmium-sulfate electrochemical standard cells. A primary calibration experiment involved
the measurement of the force between two coils of an “ampere balance” to determine the
current, while the cell voltage was compared to the potential difference across a known
resistance.
The ohm was represented by a wire-wound standard resistor. Its resistance was measured against
the impedance of an inductor or a capacitor at a known frequency, where the inductance or
capacitance can be calculated from the geometrical dimensions alone. Since about 1960, a
so-called Thompson-Lampard calculable capacitor has been used, in which only a single
measurement of length is required.
Since the early 1970s, the volt has been maintained by means of the Josephson effect, a quantum
mechanical tunneling phenomenon discovered by Brian Josephson in 1962. A Josephson junction may
be formed by two superconducting niobium films separated by an oxide insulating layer. If the
Josephson junction is irradiated by microwaves at frequency f and the bias current is
progressively increased, the current-voltage characteristic is a step function, in which the
dc bias voltage increases discontinuously in discrete steps of height f/KJ, where
KJ = 2e/h is the Josephson constant, h is Planck’s constant, and e is the elementary charge.
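The step spacing follows directly from the drive frequency. As an illustration (the 75 GHz drive frequency below is a typical figure for such experiments, not one quoted in this article), using the conventional value of KJ:

```python
# Voltage steps of an irradiated Josephson junction: V_n = n * f / K_J.
# With the conventional value K_J-90 = 483 597.9 GHz/V, a 75 GHz
# microwave drive spaces the steps about 155 microvolts apart.
KJ_90 = 483_597.9e9  # Josephson constant, Hz/V

def josephson_step(n: int, f_hz: float) -> float:
    """Voltage (V) of the n-th constant-voltage step at drive frequency f (Hz)."""
    return n * f_hz / KJ_90

step = josephson_step(1, 75e9)
print(f"{step * 1e6:.1f} uV")  # ≈ 155.1 uV per step
```

Arrays of many junctions in series are used in practice to reach the 1 V or 10 V level.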
The ohm is now realized by the quantum Hall effect, a characteristic of a two-dimensional
electron gas discovered by Klaus von Klitzing in 1980. In a device such as a silicon
metal-oxide-semiconductor field-effect transistor (MOSFET), the Hall voltage VH for a
fixed current I increases in discrete steps as the gate voltage is increased. The Hall
resistance RH = VH/I is equal to an integral fraction of the von Klitzing constant,
given by RK = h/e² = μ₀c/2α, where α is the fine structure constant. In practice,
RK can be measured in terms of a laboratory resistance standard, whose resistance is
obtained by comparison with the impedance of a calculable capacitor, or it can be
obtained indirectly from the fine structure constant α.
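As a numerical check, the quotient h/e² indeed reproduces the von Klitzing constant. The values of h and e used here are the modern exact ones, which postdate this article, but the arithmetic is the same:

```python
# Hall resistance plateaus: R_H = R_K / i for integer i, with
# R_K = h / e^2, reproducing the value near 25 812.807 ohms.
H = 6.62607015e-34   # Planck constant, J s
E = 1.602176634e-19  # elementary charge, C

def hall_resistance(i: int) -> float:
    """Quantized Hall resistance (ohms) on the i-th plateau."""
    return H / E**2 / i

print(f"{hall_resistance(1):.3f} ohm")  # ≈ 25812.807 ohm
```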
A promising new method of relating the mechanical and electromagnetic units is the
“watt balance,” which has greater precision than an ordinary ampere balance. In this
experiment, a current I is passed through a test coil
suspended in the magnetic field of a larger coil so that the force F balances a known
weight mg. Next the test coil is moved axially through the magnetic field and the velocity
v and induced voltage V are measured. By the equivalence of mechanical and electrical power,
F v = V I. The magnetic field and apparatus geometry drop out of the calculation. The
voltage V is measured in terms of the Josephson constant KJ while
the current I is
calibrated by the voltage across a resistance known in terms of the von Klitzing constant
RK. The experiment determines KJ²RK (and thus h, since KJ²RK = 4/h), which yields
KJ if RK is assumed to be known in terms of the SI ohm.
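The algebra behind the watt balance can be verified numerically: since KJ = 2e/h and RK = h/e², the product KJ²RK equals (2e/h)² · h/e² = 4/h. A sketch using the 1990 conventional values for illustration:

```python
# Watt-balance relation: equating mechanical and electrical power,
# F * v = V * I, with V traced to K_J and I to K_J * R_K, determines
# the product K_J^2 * R_K = 4 / h.  Invert it to recover h.
KJ = 483_597.9e9     # conventional Josephson constant, Hz/V
RK = 25_812.807      # conventional von Klitzing constant, ohm

h = 4 / (KJ**2 * RK)  # J s, since K_J^2 R_K = (2e/h)^2 * (h/e^2) = 4/h
print(f"h ≈ {h:.4e} J s")  # close to 6.626e-34 J s
```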
The Josephson and quantum Hall effects provide highly uniform and conveniently reproducible
quantum mechanical standards for the volt and the ohm. For the purpose of practical
engineering metrology, conventional values for the Josephson constant and the von Klitzing
constant were adopted by international agreement starting January 1, 1990. These values
are KJ-90 = 483 597.9 GHz/V exactly and RK-90 = 25 812.807 Ω exactly. The best
experimental SI values, obtained as part
of an overall least squares adjustment of the fundamental constants completed in 1998,
differ only slightly from these conventional values.
METRIC UNITS IN INDUSTRY
The International System of Units (SI) has become the fundamental basis of scientific
measurement worldwide. It is also used for everyday commerce in virtually every country
of the world but the United States. Congress has passed legislation to encourage use of
the metric system, including the Metric Conversion Act of 1975 and the Omnibus Trade and
Competitiveness Act of 1988, but progress has been slow.
The space program should have been the leader in the use of metric units in the
United States and would have been an excellent model for education. Burt Edelson,
Director of the Institute for Advanced Space Research at George Washington University
and former Associate Administrator of NASA, recalls that “in the mid-‘80s, NASA made a
valiant attempt to convert to the metric system” in the initial phase of the international
space station program. However, he continued, “when the time came to issue production
contracts, the contractors raised such a hue and cry over the costs and difficulties of
conversion that the initiative was dropped. The international partners were unhappy,
but their concerns were shunted aside. No one ever suspected that a measurement
conversion error could cause a failure in a future space project.”
Economic pressure to compete in an international environment is a strong motive for
contractors to use metric units. Barry Taylor, head of the Fundamental Constants Data
Center of the National Institute of Standards and Technology and U.S. representative
to the Consultative Committee on Units of the CIPM, expects that the greatest stimulus
for metrication will come from industries with global markets. “Manufacturers are moving
steadily ahead on SI for foreign markets,” he says. Indeed, most satellite design
technical literature does use metric units, including meters for length, kilograms for
mass, and newtons for force, because of the influence of international partners,
suppliers, and customers.
As we begin the new millennium, there should be a renewed national effort to promote
the use of SI metric units throughout industry, and to assist the general public in
becoming familiar with the system and using it regularly. The schools have taught
the metric system in science classes for decades. It is time to put aside the
customary units of the industrial revolution and to adopt the measures of precise
science in all aspects of modern engineering and commerce, including the United States
space program and the satellite industry.
Dr. Robert A. Nelson, P.E. is president of Satellite Engineering
Research Corporation, a satellite engineering consulting firm in Bethesda, Maryland.
He is Via Satellite’s Technical Editor.
Table 1. SI Base Units
Quantity Unit Symbol
length meter m
mass kilogram kg
time second s
electric current ampere A
thermodynamic temperature kelvin K
amount of substance mole mol
luminous intensity candela cd
Table 2. Examples of SI Derived Units
Quantity Special Name Symbol Equivalent
plane angle radian rad 1
solid angle steradian sr 1
angular velocity rad/s
angular acceleration rad/s²
frequency hertz Hz s⁻¹
speed, velocity m/s
force newton N kg·m/s²
pressure, stress pascal Pa N/m²
energy, work, heat joule J kg·m²/s², N·m
power watt W kg·m²/s³, J/s
power flux density W/m²
linear momentum, impulse kg·m/s, N·s
angular momentum kg·m²/s, N·m·s
electric charge coulomb C A·s
electric potential, emf volt V W/A, J/C
magnetic flux weber Wb V·s
resistance ohm Ω V/A
conductance siemens S A/V, Ω⁻¹
inductance henry H Wb/A
capacitance farad F C/V
electric field strength V/m, N/C
electric displacement C/m²
magnetic field strength A/m
magnetic flux density tesla T Wb/m², N/(A·m)
Celsius temperature degree Celsius °C K
luminous flux lumen lm cd·sr
illuminance lux lx lm/m²
radioactivity becquerel Bq s⁻¹
Table 3. SI Prefixes
Factor Prefix Symbol Factor Prefix Symbol
10²⁴ yotta Y 10⁻¹ deci d
10²¹ zetta Z 10⁻² centi c
10¹⁸ exa E 10⁻³ milli m
10¹⁵ peta P 10⁻⁶ micro µ
10¹² tera T 10⁻⁹ nano n
10⁹ giga G 10⁻¹² pico p
10⁶ mega M 10⁻¹⁵ femto f
10³ kilo k 10⁻¹⁸ atto a
10² hecto h 10⁻²¹ zepto z
10¹ deka da 10⁻²⁴ yocto y
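As an illustration of how the prefix system scales a unit, here is a small helper (purely hypothetical, covering only the mid-range prefixes of Table 3) that renders a quantity with an appropriate prefix:

```python
# Express a quantity using SI prefixes, choosing the largest
# factor that does not exceed the value (mid-range prefixes only).
PREFIXES = [
    (1e12, "T"), (1e9, "G"), (1e6, "M"), (1e3, "k"),
    (1, ""), (1e-3, "m"), (1e-6, "u"), (1e-9, "n"), (1e-12, "p"),
]

def si_format(value: float, unit: str) -> str:
    """Render `value` with an SI prefix, e.g. 1.5e-9 F -> '1.5 nF'."""
    for factor, prefix in PREFIXES:
        if abs(value) >= factor:
            return f"{value / factor:g} {prefix}{unit}"
    return f"{value:g} {unit}"

print(si_format(1.5e-9, "F"))   # 1.5 nF
print(si_format(2.4e9, "Hz"))   # 2.4 GHz
```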