Meter | NIST
Original title: Meter | NIST

Analysis
- Category: AI
- Importance: 60
- Trend score: 24
- Summary: NIST's page on the meter explains how the meter is defined and why it matters. The meter is a base unit of the International System of Units (SI) and is widely used for measuring length. The page is an official government website and uses HTTPS for secure connections.
- Keywords
https://www.nist.gov/si-redefinition/meter

Whether it's the interminable distance to Grandma's house, a span of cloth, the number of yards to the goal line or the space between the unfathomably small transistors on a computer chip, length is one of the most familiar units of measurement.

Credit: K. Irvine/NIST

People have come up with all sorts of inventive ways of measuring length. The most intuitive are right at our fingertips. That is, they are based upon the human body: the foot, the hand, the fingers, or the length of an arm or a stride. In ancient Mesopotamia and Egypt, one of the first standard measures of length was the cubit. In Egypt, the royal cubit, which was used to build the most important structures, was based on the length of the pharaoh's arm from elbow to the end of the middle finger plus the span of his hand. Because of its great importance, the royal cubit was standardized using rods made from granite. These granite cubits were further subdivided into shorter lengths reminiscent of centimeters and millimeters.

Fragment of a Cubit Measuring Rod. Credit: Gift of Dr. and Mrs. Thomas H. Foulds, 1925

Later length measurements used by the Romans (who had taken them from the Greeks, who had taken them from the Babylonians and Egyptians) and passed on into Europe were generally based on the length of the human foot or a walking stride, and on multiples and subdivisions of that. For example, the pace, one left step plus one right step, is approximately a meter or yard.
(On the other hand, the yard did not derive from a pace but from, among other things, the length of King Henry I of England's outstretched arm.) Mille passus in Latin, or 1,000 paces, is where the English word "mile" comes from. However, the Roman mile was not quite as long as the modern version. The Romans and other cultures from around the world, such as those in India and China, standardized their units, but length measurements in Europe were still largely based on variable things until the 18th century. For instance, in England, for the purposes of commerce, the inch was conceived as the length of three barleycorns laid end to end.

Barleycorn. Credit: ©m-desiign/Shutterstock

A unit of length for measuring land, the rod, was the length of 16 randomly selected men's feet, and multiples of it defined an acre. In some places, the area of farmland was even measured in time, such as how much land a man, or a man with an ox, could plow in a day. This measure further depended upon the crop being grown: For example, an acre of wheat was a different size than an acre of barley.

This was fine so long as accuracy and precision were not an issue. You could build your own house using such measures, and plots of land could be roughly surveyed, but if you wanted to buy or sell anything based on length or area, collect proper taxes and duties, build more advanced weapons and machines with interchangeable parts, or perform any kind of scientific investigation, you needed a universal standard. The invention of the metric system at the end of the 18th century in revolutionary France was the result of a lengthy effort to establish such a universal system of measurement, one that wasn't based on bodily dimensions that varied from person to person or from place to place.
Rather, the French sought to create a system that would endure "for all times, for all peoples." To do this, in 1790 the French Academy of Sciences established a council of preeminent scientists and mathematicians, Jean-Charles de Borda, Joseph-Louis Lagrange, Pierre-Simon Laplace, Gaspard Monge and Nicolas de Condorcet, to study the problem. A year later, they emerged with a set of recommendations. The new system would be a decimal system, that is, based on 10 and its powers. The measure of distance, the meter (derived from the Greek word metron, meaning "a measure"), would be 1/10,000,000 of the distance between the North Pole and the equator, with that line passing through Paris, of course. The measure of volume, the liter, would be the volume of a cube of distilled water measuring 1/1,000 of a cubic meter. The unit of mass (or more practically, weight), the kilogram, would be the weight of a liter of distilled water in vacuum (completely empty space).

In 1792, astronomers Pierre Méchain and Jean-Baptiste Delambre set out to measure the meter by surveying the distance between Dunkirk, France, and Barcelona, Spain. After some seven years of effort, they arrived at their final measure and submitted it to the academy, which embodied the result in a prototype meter: a bar of platinum. It was later discovered that the scientists had made errors in calculating the curvature of the Earth, and as a result the original prototype meter was 0.2 millimeters shorter than 1/10,000,000 of the actual distance between the North Pole and the equator. While this doesn't seem like a big discrepancy, it's the kind of thing that keeps measurement scientists up at night. Nonetheless, it was decided that the meter would remain as realized in the platinum bar. Subsequent definitions of the meter have since been chosen to hew as closely as possible to the length of that first meter bar, despite its shortcomings.
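The 0.2-millimeter discrepancy is easy to check against the modern figure for Earth's meridional quadrant. A minimal sketch (the quadrant value below is a modern geodetic estimate, not a number available to the original surveyors):

```python
# Modern estimate of Earth's meridional quadrant (North Pole to equator),
# in meters. (Assumed value; the 1790s survey could not know this.)
quadrant_m = 10_001_966.0

# The meter was defined as one ten-millionth of the quadrant.
ideal_meter_m = quadrant_m / 10_000_000

# The platinum prototype realized exactly 1 m, so it fell short by:
shortfall_mm = (ideal_meter_m - 1.0) * 1000
print(f"prototype short by {shortfall_mm:.2f} mm")  # about 0.20 mm
```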
As time passed, more and more European countries adopted the French meter as their length standard. However, while the copies of the meter bar were meant to be exact, there was no way to verify this. In 1875, the Treaty of the Meter, signed by 17 countries including the U.S., established the General Conference on Weights and Measures (Conférence générale des poids et mesures, CGPM) as a formal diplomatic organization responsible for maintaining an international system of units in harmony with advances in science and industry. An intergovernmental organization, the International Bureau of Weights and Measures (Bureau international des poids et mesures, BIPM), was also established at that time. Located just outside Paris in Sèvres, France, the BIPM serves as the focal point through which its member states act on matters of metrological significance. It is the ultimate arbiter of the International System of Units (SI, Système international d'unités) and the repository of the physical measurement standards. The kilogram was the last of the artifact-based measurement standards in the SI. (On May 20, 2019, it was officially replaced with a new definition based on constants of nature.)

After that first meeting, the BIPM ordered a new prototype meter, and 30 copies were given to the member states. This new prototype would be made of platinum and iridium, an alloy significantly more durable than platinum alone. The bar would also no longer be flat but would have an X-shaped cross section to better resist distortions that could be introduced by flexing during comparisons with other meter bars. The new prototype would also not be an "end" standard, whereby the meter is defined by the ends of the bar itself. Instead, the bar would be over a meter long, and the meter would be defined as the distance between two lines inscribed on its surface.
Easier to create than an end standard, these inscriptions also allowed the definition of the meter to survive if the ends of the bar were damaged. Official measurements of the prototype meter would be made at standard atmospheric pressure at the melting point of ice.

Until 1960, the SI standard of length was disseminated using platinum-iridium meter bars such as these from the NIST Museum. Credit: NIST

And this was how it remained until 1927, when precision measurement of the meter made a quantum leap thanks to advances in a 40-year-old technique known as interferometry. In this technique, waves of light are manipulated so that they combine, or "interfere," with one another, enabling precise measurements of the length of the waves, that is, the distance between successive peaks. In an interferometer, two or more waves overlap to produce an "interference pattern," which can provide detailed information about the waves, such as their wavelengths.

In this simple, ideal setup, an individual light wave from a laser hits a beamsplitter, which creates two light waves traveling along different paths. One of the waves hits a moving mirror, which can vary the distance the wave travels to the detector. If one wave's peaks overlap with the other's valleys (left panel), they cancel out. If the two waves' peaks overlap, however, they create a bright spot (right panel). Credit: S. Kelley/NIST

It was in 1927 that NIST (then known as the National Bureau of Standards) advocated for the interference patterns of energized cadmium atoms to be made a practical standard of length. This was useful because international measurement artifacts such as meter bars could not be everywhere at once, and their copies, exquisite as they might be, are not as accurate as the real thing; with proper equipment, however, scientists anywhere could measure the meter with cadmium.
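The arithmetic behind wavelength-based length measurement can be sketched in a few lines. In a Michelson-type interferometer, moving the mirror by half a wavelength shifts the pattern by one full fringe, so a displacement L corresponds to N = 2L/λ fringes. The fringe count below is hypothetical; the wavelength is the red cadmium line adopted for length metrology (the meter was later expressed as 1,553,164.13 of these wavelengths):

```python
# Length from fringe counting in an idealized Michelson interferometer.
# One fringe passes for every half-wavelength of mirror travel: L = N * lam / 2.

lam_m = 643.84696e-9   # red cadmium line, in meters
fringes = 3_106_328    # hypothetical count, roughly two fringes per wavelength over 1 m

length_m = fringes * lam_m / 2
print(f"measured length: {length_m:.7f} m")  # just under 1 m
```

In practice the fractional part of a fringe must also be estimated, which is what limited the accuracy of early interferometric comparisons.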
Neither an artifact nor its copies are suited to every measurement one might want to make. To cite one real-world example, gage blocks are length standards commonly used in machining. Because of the extremely fine work demanded of machinists, their calibration standards must be finely crafted as well. Using cadmium (and krypton) wavelengths, gage blocks could be certified as accurate to within 0.000001 inch per inch (1 part per million), three times closer than previously possible.

In the mid-1940s, nuclear physicists aimed neutrons at gold to transform the atoms into mercury. NIST physicist William Meggers noted that aiming radio waves a