Like most of the quantities we will talk about in this section, temperature - or to be precise thermodynamic temperature - is one of the base quantities defined by the International System of Units. The internationally agreed base unit for temperature is the kelvin, named after the Irish-born mathematical physicist and engineer William Thomson (1824-1907). Thomson earned the title 1st Baron Kelvin in 1892, partly in recognition of his achievements in the field of thermodynamics, and partly due to his opposition to Irish home rule.

The SI unit for heat energy (or indeed for energy of any kind, including work) is the joule, although we sometimes refer to a unit of heat used in the centimetre-gram-second (CGS) system of physical units called the calorie. One calorie represents the amount of heat energy required to increase the temperature of one gram (approximately one cubic centimetre) of pure liquid water by one degree Celsius (1 °C), or that is lost by one gram of pure liquid water as its temperature decreases by one degree Celsius.
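The definition of the calorie is really just a special case of the standard relationship between heat energy, mass, specific heat capacity and temperature change. The following is a minimal sketch of that relationship, using the commonly quoted figure of 4.184 joules per calorie; the exact value depends on which definition of the calorie is used.

\[
Q = m \, c \, \Delta T
\]

Here Q is the heat energy transferred, m is the mass of the substance, c is its specific heat capacity (for liquid water, approximately 4.184 joules per gram per degree Celsius, i.e. one calorie per gram per degree Celsius), and \Delta T is the change in temperature. For example, raising the temperature of 250 grams of water by 10 °C requires roughly 250 × 4.184 × 10 ≈ 10,460 joules, or about 2,500 calories.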

The concept of heat is generally associated with that of temperature, and rightly so. They are of course closely related, but they are not the same thing. We often talk about a "heat wave" when temperatures are higher than normal in the summer, but even in the depths of winter, when temperatures fall below the freezing point of water, our environment is far from being devoid of heat. If it were, none of us would be here to talk about it. So what is the difference between temperature and heat? We'll come back to that question shortly.

A brief history

Man has always been aware of the fact that some things, such as fire, are very hot, and other things, like ice and snow, are very cold. The human body is very sensitive to even relatively small changes in air temperature, and we can readily determine how hot or cold an object or substance is in relative terms simply by touching it. As you are no doubt aware, however, many things are so hot (or so cold) that touching them would yield no useful information about their temperature, and could even result in serious injury. Touching a hot iron, for example, even for a moment, can give you a nasty burn (I speak from experience). Nevertheless, we have to have some way of determining temperature; the success of many modern industrial, scientific and commercial endeavours is dependent on this ability.

Measuring temperature is not as easy or straightforward as measuring something like length or mass. That said, we have been using high-temperature processes for thousands of years to create tools, weapons, and other artefacts, despite the absence of any meaningful temperature scale.

In order to make metal implements from bronze (a metal alloy consisting of about eighty-eight per cent copper and twelve per cent tin), it was necessary to heat the metals to a very high temperature so that they would melt and form an alloy. The molten metal could then be poured into a mould to form a casting. As it cooled, the casting would harden, but would still be malleable enough to be shaped by hammering.

To put things into perspective, the Bronze Age began sometime around 3300 BCE. It is generally considered to have ended around 1200 BCE, when it was succeeded by the Iron Age. It was not until the latter half of the sixteenth century CE, however, that man started to develop instruments that could actually measure temperature with a reasonable degree of accuracy. It was not for want of trying; efforts to establish some kind of yardstick for measuring temperature date back at least as far as the second century CE.

The Greek physician, surgeon and philosopher of the Roman Empire Aelius Galenus or Claudius Galenus (129 CE - circa 200 or 216 CE), probably better known as Galen of Pergamon, made a great number of contributions in the fields of human anatomy and medicine. He also made one of the first attempts on record, in 170 CE, to create a "neutral" temperature standard by mixing equal amounts by volume of ice and boiling water. From this standard, he derived four degrees of heat that lay above his neutral temperature, and four degrees of cold that lay below it.

It was not until the late sixteenth century CE, however, that the first thermometer appeared in historical records. Prior to this, temperature measurement seems to have been largely based on observation, in the sense that wax would melt, water would boil, and various metals would start to glow (or even melt) when sufficient heat was applied. Even so, there was no real temperature scale, and no reliable way to measure the temperature at which such events occurred.

The credit for the invention of the thermometer is often given to the Italian astronomer, physicist, engineer, philosopher and mathematician Galileo Galilei (1564-1642). The device, invented in around 1592, was known as a thermoscope or air thermometer. The thermoscope consisted of a long glass tube, open at one end and opening into a bulbous enclosure at the other end. The open end of the tube is immersed in a vessel containing water, as shown in the illustration below. If some of the air in the tube is allowed to escape, the lower part of the tube becomes partially flooded. If the temperature of the bulb is increased, the air inside it will expand, forcing some of the water out of the tube. Conversely, if the bulb cools, the air inside it will contract, allowing more water to enter the tube.

The level of the water in the tube will rise or fall, depending on the temperature of the air inside the bulb. There was no scale on the glass tube, and the instrument would not have been particularly accurate anyway because it is affected by changes in atmospheric pressure, but it was a step in the right direction. Indeed, the general principle on which it is based (noting the changes in the volume of a gas or a liquid in a tube due to changes in temperature) is essentially the same as that of a modern glass thermometer.
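In modern terms, the behaviour of the trapped air in the thermoscope can be sketched using what would much later become known as Charles's law: at constant pressure, the volume of a fixed quantity of gas is proportional to its absolute temperature. Galileo had no such law available to him, of course; this is offered only as a rough, after-the-fact description of why the water level moves.

\[
\frac{V_1}{T_1} = \frac{V_2}{T_2}
\]

(for a fixed quantity of gas at constant pressure), where V is the volume occupied by the air in the bulb and the upper part of the tube, and T is its absolute temperature. As T rises, V increases and water is pushed down and out of the tube; as T falls, V decreases and water is drawn back up. Since the pressure is not in fact constant - the column is open to the atmosphere - changes in atmospheric pressure also move the water level, which is precisely why the instrument was so unreliable.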




Galileo's thermoscope (circa 1592)


The Italian physician Santorio Santorio (1561-1636), best known for his studies of human physiology, improved on the design of the thermoscope in around 1612. In his version, a graduated scale was added to the glass tube, and he used the device to measure variations in human temperature with some success. Although the device was still not particularly accurate, it could be used to detect, fairly reliably, whether or not a patient had a fever.

The next major advance is credited to the Italian Ferdinand II de' Medici (1610-1670), Grand Duke of Tuscany from 1621 until his death. He had a lifelong fascination with new technology, and was responsible for the construction of the first sealed thermometer in 1641. His thermometer was a glass tube with a bulb at one end, similar in shape to Galileo's, but positioned so that the bulb was at the bottom end of the tube. The bulb was filled with coloured alcohol, and the open end of the tube was sealed. As the temperature changed, the alcohol would expand or contract, and its level in the tube would rise or fall accordingly. The graduations on the tube allowed the temperature to be read, although there was still no standardised temperature scale against which these graduations could be calibrated.

Even though still not particularly accurate by today's standards, Ferdinand's device had several advantages over its predecessors. The use of a sealed tube meant that the device was portable. Furthermore, it significantly reduced the effect of variations in atmospheric pressure on the accuracy of readings. The use of alcohol meant that the device could be used at lower temperatures, since the freezing point of alcohol is considerably lower than that of water. And of course the coloured alcohol was much easier to see than water, making it easier to take temperature readings.

During the first part of the eighteenth century, a number of temperature scales evolved. The problem now, however, was that there were simply far too many different temperature scales in use. Among those trying to make sense of this somewhat chaotic situation was the Polish-Dutch engineer, physicist and glass-blower Daniel Gabriel Fahrenheit (1686-1736). Fahrenheit had learned how to calibrate thermometers from the Danish astronomer Ole Rømer (1644-1710), who had developed one of the first temperature scales (the Rømer scale).

Rømer based his scale on the proposal made by the English polymath Robert Hooke (1635-1703) for a temperature scale that used the freezing point of water as its zero point, but he realised that at least two fixed points would be needed in order to allow interpolation between them. On Rømer's scale, the freezing point of water was designated as seven-point-five degrees (7.5°). For the second fixed point, he chose the boiling point of water, which he designated as sixty degrees (60°).
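Two fixed points are enough because any intermediate temperature can then be assigned a value by linear interpolation. As a minimal sketch of the idea, using the fixed points quoted above:

\[
R = 7.5 + (60 - 7.5) \, f = 7.5 + 52.5 \, f
\]

where R is the reading on Rømer's scale and f is the fraction of the way the temperature lies between the freezing point of water (f = 0) and its boiling point (f = 1). A temperature exactly halfway between the two fixed points, for instance, would read 7.5 + 52.5 × 0.5 = 33.75° on Rømer's scale.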

From 1708 to 1724, Fahrenheit produced a number of thermometers based on Rømer's temperature scale. Although Fahrenheit refused to disclose the exact details of his calibration methods, he is known to have used the melting point of a mixture of sea salt, ice and water as one of his calibration points, and the temperature of a healthy human being as another. He improved the design of the instrument itself by replacing the bulb-shaped reservoir with a cylindrical reservoir, and used mercury instead of alcohol (mercury expands significantly more linearly with temperature than alcohol does). For the main body of the thermometer, he used a fine capillary tube. The thermometer was partially filled with mercury, and then heated so that the mercury would expand and force all of the remaining air out of the tube, which was then sealed.
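The reason this linearity matters is that a liquid-in-glass thermometer effectively infers temperature from a change in volume. As a rough sketch (ignoring, among other things, the expansion of the glass itself):

\[
\Delta V \approx \beta \, V_0 \, \Delta T
\]

where V_0 is the initial volume of the liquid, \Delta V is its change in volume, \Delta T is the change in temperature, and \beta is the liquid's coefficient of volumetric expansion. If \beta is effectively constant over the working range - as it very nearly is for mercury - then equal temperature intervals produce equal movements of the liquid along the capillary, and a uniformly spaced scale reads correctly. If \beta itself varies appreciably with temperature, as it does for alcohol, a uniformly spaced scale will be slightly in error between the calibration points.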

Fahrenheit was evidently not happy with Rømer's temperature scale, as he subsequently created his own scale in which the freezing point of water was set at thirty-two degrees (32°), and the boiling point of water at two hundred and twelve degrees (212°). The interval between the two points was thus exactly one hundred and eighty degrees (180°) on Fahrenheit's new scale. Note that Fahrenheit originally placed the freezing point of water at thirty degrees (30°), but later adjusted the figure upwards by two degrees. The Fahrenheit temperature scale was widely used in the UK until relatively recently, and is still used in the United States.

The temperature scale most widely used today (outside of scientific circles) is the Celsius temperature scale, named after the Swedish astronomer, physicist and mathematician Anders Celsius (1701-1744). Celsius set the zero point of his scale at the so-called "steam point" of water (essentially, its boiling point). He designated the melting point of pure water ice as the second anchor point of his scale, and divided the interval between them into one hundred one-degree intervals. The scale was later inverted so that the melting point of water ice became zero degrees (0°), with the boiling point of water set to one hundred degrees (100°), which seems far more logical.
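Since the Fahrenheit scale places the freezing and boiling points of water at 32° and 212° (an interval of 180 degrees), while the Celsius scale places them at 0° and 100°, the two scales are related by a simple linear conversion:

\[
T_F = \frac{9}{5} \, T_C + 32 \qquad \qquad T_C = \frac{5}{9} \, (T_F - 32)
\]

where T_F is the temperature in degrees Fahrenheit and T_C is the same temperature in degrees Celsius. A body temperature of 37 °C, for example, corresponds to (9/5 × 37) + 32 = 98.6 °F.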

The Celsius temperature scale has in the past often been referred to as the centigrade temperature scale because it divides the interval between the melting point of ice and the boiling point of water into one hundred equal parts. The name Celsius has been the preferred name since 1948, however, when it was officially adopted by an international conference on weights and measures.
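The Celsius scale also maps directly onto the kelvin, the SI base unit mentioned at the start of this section. A temperature interval of one degree Celsius is exactly the same size as an interval of one kelvin; the two scales differ only in where they place their zero:

\[
T_K = T_C + 273.15
\]

where T_K is the thermodynamic temperature in kelvins and T_C is the temperature in degrees Celsius. The melting point of ice (0 °C) thus corresponds to 273.15 K, and absolute zero (0 K) to -273.15 °C.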

The exact nature of heat is a question that has occupied scientists for centuries, and attempts to define it have resulted in a number of theories being put forward. By the end of the eighteenth century, the predominant theory was the caloric theory, in which heat was an unseen physical entity - a somewhat mysterious fluid known as caloric. One of the main proponents of this theory was the French nobleman and chemist Antoine Lavoisier (1743-1794), an influential figure in the history of both chemistry and biology. He theorised that caloric manifested itself in one of two forms: latent caloric, which was stored in combustible materials, and free (or "sensible") caloric, whose presence could be observed as a change in temperature. When such materials burned, he reasoned, their latent caloric was released to the immediate environment in its free, observable form, raising the temperature of that environment.

Heat was thus described as a physical substance which could change its form, but which was otherwise conserved: if heat was released by some object or material, it had to be gained by another object or material, so that the overall amount of heat remained constant. There are some rather obvious problems with this theory, however. If heat were a physical substance that could be transferred from one object to another, there would surely be some accompanying transfer of mass. This would appear to be supported by the fact that, when a combustible material such as wood or coal burns, it invariably loses some mass to the environment. But what about any nearby objects that increase in temperature as a result of the combustion? According to caloric theory, they should gain caloric, and therefore gain some mass - yet no such gain in mass is ever observed.

The many accomplishments of the American-born British scientist Benjamin Thompson (1753-1814) - later Count Rumford - included designing warships for the British Navy during the American War of Independence. He wrote a scientific paper, published in 1798, that challenged the assumptions of caloric theory and sowed the seeds of a revolution in the field of thermodynamics. He spent a number of years in the service of the Bavarian government, including a spell as Army Minister, and his duties there included overseeing the boring out of cannon barrels at the arsenal in Munich.

His observations there led him to conclude that, despite the amount of heat generated during this process (subsequently immersing the cannon barrels in water would eventually heat the water to boiling point), the cannon and the material removed from them did not undergo any chemical or physical change that would support the idea of some unseen substance (Lavoisier's "caloric") being transferred from the cannon to the water. And, far from being a conserved quantity (which would surely imply that the cannon contained some fixed amount of latent caloric), heat appeared to Thompson to be generated at a continuous rate for as long as the boring process continued.

Thompson concluded that the heat generated by the boring process must have some mechanical explanation, and that it was in some way related to motion. He theorised that continuous mechanical action would produce heat indefinitely. He correctly identified the cause of the heat in the boring process as friction between the boring tool and the cannon. His 1798 paper rejected caloric theory, putting forward instead the view that heat was generated by the motion of atoms.

Although Thompson seems to have been content to let the matter lie there, the story continues with the work of the English physicist James Prescott Joule (1818-1889), after whom the SI derived unit of energy is named. Joule, who would later work with William Thomson (Lord Kelvin) to develop the Kelvin temperature scale, performed various experiments that established a relationship between heat and mechanical work. The illustration below shows the apparatus he used in one of his experiments. In this experiment, Joule used a weight suspended from a pulley to turn a spindle. A pair of paddles was attached to the bottom end of the spindle, which was mounted within an enclosure that had been filled with water.
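In energy terms, the idea behind the experiment can be sketched quite simply (the symbols below are illustrative, not Joule's own notation). If all of the mechanical work done by the falling weight ends up as heat in the water, then:

\[
m \, g \, h = m_w \, c \, \Delta T
\]

where m is the mass of the falling weight, g is the acceleration due to gravity, h is the distance through which the weight falls, m_w is the mass of the water, c is the specific heat capacity of water and \Delta T is the resulting rise in the water's temperature. By measuring the quantities on the left-hand side (mechanical work) and those on the right-hand side (heat), Joule was able to determine how many units of mechanical work correspond to one unit of heat - the so-called mechanical equivalent of heat.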