A major puzzle faced scientists in the 19th century. Volcanoes showed that the earth is molten beneath its crust. Penetration into the crust by bore-holes and mines showed that the earth's temperature increases with depth. Scientists knew that heat flows from the interior to the surface. They assumed that the source of the earth's internal heat was primordial, the afterglow of its fiery birth. Measurements of the earth's rate of cooling indicated a relatively young earth - some 25 to 30 million years in age. But geological evidence indicated an older earth. This puzzle wasn't solved until the discovery of radioactivity. Then it was learned that the interior was kept hot by the energy of radioactive decay. We now know the age of the earth is some 4.5 billion years - a much older earth.
All rock contains trace amounts of radioactive minerals. Radioactive minerals in common granite release energy at a rate of 0.03 J/kg/yr. Granite at the earth's surface transfers this energy to the surroundings practically as fast as it is generated, so we don't find granite any warmer than other parts of our environment. But what if a sample of granite were thermally insulated? That is, suppose all of the internal energy generated by radioactive decay were retained in the sample. Then it would get hotter.
How much hotter? Let's figure it out, using 790 J/kg/K as the specific heat of granite.
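Here is a minimal back-of-the-envelope sketch of that calculation in Python, assuming, as above, that the 0.03 J/kg/yr of decay energy is fully retained and applying Q = mcΔT per kilogram of granite. The 500 K target rise is our own illustrative choice, not a figure from the text.

```python
# Sketch of the insulated-granite warming estimate, using the values quoted above.
# Assumes all decay energy stays in the sample (perfect thermal insulation).

ENERGY_RATE = 0.03       # J/(kg*yr), energy released by radioactive decay in granite
SPECIFIC_HEAT = 790.0    # J/(kg*K), specific heat of granite

# Per kilogram, Q = c * dT, so the yearly temperature rise is the
# energy released per kilogram each year divided by the specific heat.
rise_per_year = ENERGY_RATE / SPECIFIC_HEAT   # K/yr

# Illustrative question (our own choice of target, not from the text):
# how long would it take the insulated sample to warm by 500 K?
target_rise = 500.0                           # K
years_needed = target_rise / rise_per_year

print(f"Temperature rise per year: {rise_per_year:.2e} K")
print(f"Years to warm by {target_rise:.0f} K: {years_needed:.2e}")
```

The rate works out to roughly 4 × 10⁻⁵ K per year, so warming the insulated sample by 500 K takes on the order of 13 million years: slow by human standards, but brief compared with the earth's 4.5-billion-year age, which is why decay heat can keep the interior hot over geologic time.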