I’m wondering how Earth’s surface temperatures evolved to support biological life for a cricket-click in time.
Energy from the Sun adapts to support life by changing the behaviour of radiation particles: instead of moving in specific directions, they cluster in, and as, matter. Pieces of matter and energy move because it is impossible for mutually exclusive force-fields to be positioned perfectly evenly, and because it is impossible for their surroundings, in 3D and other dimensions, to present the same levels of exclusion force in all directions.
I believe the universe is a particle in a larger sky. If this were not so, there would be nowhere for the sum of radiant energy to get away to, and the sky would be a wall of energy; instead we see space as empty except for the Sun and a few bright spots. The universe must be radiating energy that can peak, coincidentally including energy from other universes, at places where the quantities of movement prevent any particle from associating with another particle long enough to occupy space exclusively. At such focus points in space, no particle could oscillate fast enough concentrically in multiple dimensions, nor move in 3D, swapping with other particles in any dimension, for any team of particles to occupy a place in 3D space with stability. In some places in our universe and beyond, matter is being pushed into alternative dimensions: within and outside our observable universe there are places where the heat from all the stars and universes is too great, so matter changes dimensions. Our place in space – maybe our whole universe – is in a perfect-for-life temperature area, away from the biggest of the places that are disappearing to be recycled into other sets of three dimensions. Such imagined mechanisms do seem to exist in our universe – pulsars, black holes, all sorts of mysteries.
Temperature measurement cannot assess the quantity of radiant-energy particles inside a subject that are both causing movement and being reflected within the subject, yet are unable to escape through the surface. The temperature of a body is an assessment of the energy escaping from the body, and is not necessarily accurate relative to the energy that could be released from that body. I believe that after rocks have been in the same place for a long time, they can be cold yet hold heat that comes out when they are disturbed enough.
An example of the reason for believing that energy can be trapped in a body is shown by putting a clean container of water into a microwave oven. I used a glass teapot; it had to be clean to prevent the phenomenon from happening gradually all the time, like a kettle boiling, rather than explosively. For the experiment, the glass teapot is filled with water close to the top and microwaved until it boils, or just before; it is then taken out while still. Once out of the oven, the water boils explosively (if the physical setup is correct) at any disturbance – salt, sugar, even a teaspoon – erupting violently, with most of the water ending up out of the teapot. People have had the water for their cup of tea burst out of the cup when a spoon of sugar was put in. The reason the teapot needs to be glass is that tiny sharp points on ceramics, for example, allow tensions to dissipate, whereas smooth surfaces prevent the static-type forces (the heat energy) from concentrating and radiating away. The experiment works well because the round glass (the best shape) keeps the radiant energy circulating and reflecting rather than getting out of the teapot.
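As a rough check on how much energy such superheated water can release at once, here is a small Python sketch. The 1 kg of water and the 5 °C of superheat above the normal boiling point are my own assumptions for illustration, not measurements from the teapot experiment; the heat capacity and latent heat are standard handbook values.

```python
# Rough estimate of the energy released when superheated water flash-boils.
# Mass and degree of superheat are illustrative assumptions.

C_P = 4186.0      # specific heat of liquid water, J/(kg*K)
L_VAP = 2.26e6    # latent heat of vaporization of water, J/kg

mass = 1.0        # kg of water in the pot (assumed)
superheat = 5.0   # degrees above the normal boiling point (assumed)

# Energy stored above 100 degrees C, released all at once on disturbance
excess_energy = mass * C_P * superheat          # joules

# Mass of water that energy can flash to steam in an instant
steam_mass = excess_energy / L_VAP              # kg

print(f"Excess energy: {excess_energy:.0f} J")
print(f"Water flashed to steam at once: {steam_mass * 1000:.0f} g")
```

Even a few degrees of superheat flashes several grams of water to steam instantly, and since steam at atmospheric pressure occupies over a thousand times the volume of the liquid, the eruption out of the pot is easy to understand.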
All of nature works on harmony and induction, with anything incompatible either becoming compatible or getting pushed away. I believe that ancient rocks could hold a lot more energy than their temperatures indicate. Disturbing the rocks could release more energy than just the energy expended cracking into them. Experiments will find that temperature gradients with depth below Earth's surface increase where there is human activity.
People are unlikely to drill holes into the Earth in places of known volcanic activity close to the surface. In many places the depth of rock is tens of kilometres. Allowing for the insulative nature of rock, heat from below would have less ability to make sub-surface rock hot than that rock's heat has to reach the surface and escape into space.
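To put a number on how slowly rock conducts heat, here is a back-of-envelope Fourier's-law sketch in Python. The conductivity and the geothermal gradient are typical textbook values, not measurements from any particular site.

```python
# Conductive heat flux through rock via Fourier's law: q = k * dT/dz.
# Both input values are typical textbook figures (assumptions).

k = 2.5            # thermal conductivity of crustal rock, W/(m*K)
gradient = 25e-3   # geothermal gradient, K/m (i.e. 25 K per km)

flux = k * gradient   # conductive heat flux, W/m^2

# Peak sunlight delivers roughly 1000 W/m^2 at the surface, so the
# conductive flux from below is more than ten thousand times smaller.
print(f"Conductive flux through rock: {flux * 1000:.1f} mW/m^2")
```

On these textbook numbers the upward conductive flux is about 60 milliwatts per square metre, which illustrates just how insulating thick rock is.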
Space is cold. Aiming a temperature-measuring device into the sky reads about minus 50 °C (one of those hand-held instruments that reads temperature instantly and needs adjustment for emissivity). The meter cannot specifically pick up the thin layers of hot, low-pressure air of a few hundred degrees C, because such a layer cannot radiate much. The high temperatures of some layers of air above Earth would have been found by absorptive methods of measurement, not radiative colour-temperature measurements; there is almost no radiation from such thin matter (air).
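A reading of minus 50 °C can be turned into an equivalent radiant power using the Stefan-Boltzmann law, which is what such instruments work back from. This Python sketch assumes an effective emissivity near 1, which is an assumption of mine for illustration.

```python
# Convert an IR thermometer's sky reading of -50 C into the equivalent
# downwelling radiance via the Stefan-Boltzmann law, P = sigma * T^4.
# Emissivity is taken as 1 (an assumption).

SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W/(m^2*K^4)

t_sky = 273.15 - 50.0       # reading converted to kelvin (223.15 K)
radiance = SIGMA * t_sky ** 4

print(f"Equivalent sky radiance: {radiance:.0f} W/m^2")
```

So a sky reading of minus 50 °C corresponds to roughly 140 W/m² of incoming radiation, far from zero, which is why the reading is nowhere near minus 273 °C.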
The Sun is on average half there and half not there, so the net amount of energy entering the Earth from the Sun is likely zero (within a tiny spark of time). Space radiates almost no energy in quantities relevant to the tiny time that biological life has been as it presently is on Earth. The stars are just points in the sky (per the logic above – if the universe were not just another star in a larger sky, we would face an infinitely large amount of radiant energy with nowhere to go). So I feel that any heat inside Earth gets away from Earth faster than it gets close to the surface from inside the Earth. The feeling (not logic, in this case) is that to support biological life this situation has to be in place: the Sun is the source of life in the atmosphere, at the surface of the Earth, and a short distance under that. For biological life to be on Earth, the rocks under the surface have to stay within less than one degree of their present temperatures. I say "biological" life because Earth and every rock and every atom has life relative to what it wants to experience in its situation.
When people drill down far enough, they will get to hot areas. What is below that, though, is less certain. I am amazed that the reading of minus 50 °C or so is not lower, relative to -273 °C; where does the observed heat come from – probably the hot layers, and the fact that air can radiate slightly, but…?
On Mon, 25 Feb 2013 03:34:18 +0800, Charles Odendhal <...> wrote:
Like so many others, I was taught and long believed that temperatures within the Earth increased with increasing depth – that is, until I became more familiar with caves and mines. That, plus my work at Shell Oil dealing with oil fields cooling off over time and needing steam injection to loosen the flow of now relatively cool oil, caused me to question the long-standing theory of internal heat flow increasing with depth. However, I must agree with Warren Hunt in that all well-drilling borehole temperature readings I knew about did show elevated temperatures with increasing depth. This also proves the ability of solid rock to retain the enormous heat energy injected into it by drill bits powered by engines of hundreds of horsepower. I can only wonder what the borehole temperatures will be in the years after a well is abandoned.
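As a rough, purely illustrative check on how much drilling heat might end up in the rock, here is a Python sketch. The rig power, running time, fraction of power converted to rock heating, and the geometry of the warmed zone are all assumptions of mine, not Shell figures.

```python
import math

# Crude upper-bound estimate of drilling heat deposited in borehole rock.
# Every input below is an illustrative assumption.

HP_TO_W = 745.7                  # watts per mechanical horsepower
power_w = 500 * HP_TO_W          # assumed rig power, W
hours = 100.0                    # assumed total drilling time
fraction_to_rock = 0.25          # assumed share of power ending up as rock heat

heat_j = power_w * hours * 3600 * fraction_to_rock
print(f"Heat deposited in rock: {heat_j / 1e9:.1f} GJ")

# Average warming of a 1 m radius annulus of rock around a 0.1 m hole,
# over a 1 km drilled length (geometry assumed for illustration).
RHO_C = 2.2e6                    # volumetric heat capacity of rock, J/(m^3*K)
volume = math.pi * (1.0**2 - 0.1**2) * 1000.0   # m^3

dT = heat_j / (RHO_C * volume)
print(f"Average warming of that zone: {dT:.1f} K")
```

On these assumed numbers the rock immediately around the hole warms by several kelvin, which is the right order of magnitude to contaminate a borehole temperature log taken soon after drilling.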
I also agree with Warren Hunt in that there are many sub-surface regions where intermetals may combine with oxides and release considerable heat, as well as create water: The Geysers of California, for example. I am also aware of some caves with extreme heating that will not allow humans to remain inside for more than a limited period – minutes, in some cases. Most caves, though, are relatively cold.
A close reading of Robert Boyle's journal of 1671 onward clearly indicates his original formula for increasing heat with increasing depth was only a pay-scale formula for tin miners in Wales. In fact, he concluded that the increasing air temperatures in which the poor bastards were required to work were more likely the result of an input of work energy – the men's labour and their lighting. Later, Lord Kelvin and, it seems, many others apparently overlooked Boyle's conclusion, and probably due to Kelvin's overriding obsession with all things mathematical, he made Boyle's formula into scientific dogma. Soon, radioactivity was largely accepted as the "scientific" explanation for the generation of internal heating. Eventually, later scientists extrapolated Boyle's formula down to Earth's centre, and the insanely hot core theory became dogma too, despite the fact that no one really knows the actual temperatures below Earth's crust.
Even later, Lord Rutherford calculated that there was enough radioactive material in the crust alone to provide the heat flow often measured in the field, and that no mantle heating by radioactivity was necessary. Naturally, few scientists accepted such a challenge to the famed Lord Kelvin's assumption. Since then, they have equally ignored the fact that no radioactive materials have been found in the deeper portions of Earth's crust. Witness the Russian Kola Superdeep Borehole: with some heating, perhaps largely due to the energy of drilling, it found only hydrogen gas and water at the end – no radioactive materials. The "scientific" rule appears to be: don't bother us with facts, we know what we believe, and you must believe as we do.
When I was a student of Geology, later Physical Geography, at the University of Oklahoma, heat-flow measurements were made to demonstrate the concept of internal heating. Students used a gas-powered drill motor on top of a large tripod. Sections of drill rod were forced [key word: forced] into rock layers to a fair depth, and measurements were taken of the borehole temperatures. Sure enough, the temperatures increased with increasing depth. When some maverick student, name unknown, rechecked a borehole made during the previous semester by others, the bottom-hole temperature was found to be much less. I concluded the originally recorded increasing temperatures mainly resulted from the heat energy created by the drilling. This was not accepted by the professor, and the decreased temperatures in the last semester's borehole were not recorded or discussed.
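A quick diffusion-time estimate, t ≈ r²/α, suggests why a borehole rechecked a semester later could read much cooler: heat deposited in a small zone around a drill hole should fade on a timescale of days. The diffusivity below is a typical rock value and the warmed-zone radius is my guess, both assumptions for illustration.

```python
# Characteristic thermal diffusion time for drilling heat to spread away
# from a small borehole: t ~ r^2 / alpha. Inputs are assumptions.

alpha = 1.0e-6      # thermal diffusivity of rock, m^2/s (typical value)
r = 0.5             # radius of the noticeably warmed zone, m (assumed)

t_seconds = r ** 2 / alpha
print(f"Characteristic decay time: {t_seconds / 86400:.1f} days")
```

On these numbers the drilling heat decays in roughly three days, so a measurement months later would see little of it – consistent with the maverick student's much cooler reading.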
I also tried to get the professor to allow us to drill into a rock layer located next to the shaded wall of a nearby gully, which had never been exposed to sunlight. This was not allowed. However, somewhere in my records I have just such a heat-flow measurement done in Europe, which resulted in the conclusion that solar radiation played a significant role in heat flow in surface layers. Another study involved measuring radial heat flow inside a tunnel being built for train travel in Europe. I remember this study concluded that heat flow increased with depth – up, down, and sideways – and that the heat energy of drilling appeared to be the main contribution to internal heat flow. When our latest snowstorm, including a ground blizzard which may last another day or two, gives way to more reasonable temperatures, I'll drag out some old boxes last opened 50 years ago and see if I can find these references.
Meanwhile, an observation in support of Neil Christianson's reference about deep caves: ALL of the deepest known caves are said to be "freezing" inside. This is NOT caused by cold air descending and being trapped in their depths – the usual "scientific" explanation for low temperatures in caves. In fact, some of these caves have "sumps" which completely block the flow of outside air and must be traversed with scuba gear (with which I have enjoyed observing underwater activity in the past). One would think that, being some 7,000 feet below the surface, more or less, if the dogma of increasing temperature with increasing depth were valid, then ALL these caves would be extremely heated, not "freezing."
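For scale, applying the textbook gradient of about 25 °C per kilometre to a 7,000-foot depth shows the kind of heating the conventional picture would predict there. The gradient is the usual textbook figure and the mean surface temperature is an assumption.

```python
# Temperature the standard geothermal gradient would predict at cave depth.
# Gradient and surface temperature are textbook/assumed values.

depth_m = 7000 * 0.3048      # 7,000 feet converted to metres
gradient = 25.0 / 1000.0     # K per metre (25 K per km, textbook figure)
t_surface = 10.0             # assumed mean surface temperature, degrees C

t_expected = t_surface + gradient * depth_m
print(f"Depth: {depth_m:.0f} m, expected temperature: {t_expected:.0f} C")
```

The conventional gradient predicts something over 60 °C at that depth, so a "freezing" cave there is indeed a sharp contradiction of the standard expectation.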