In this chapter, we use quantum statistics to help us understand the distinctive electrical properties of conductors, semiconductors, and insulators. We have previously learned that although the occupation number has the same form for all systems, the spectrum of accessible states varies from one to the next. For this reason, we begin this chapter with a brief and simplified overview of band structure.
Band structure
The splitting of levels
As atoms are brought close together, the overlapping of their electron clouds allows electrons to move from one atom to another. These interactions with their neighbors cause shifts in the allowed electron energies. What were initially states of identical energies in isolated identical atoms turn into “bands” of very closely spaced states for the shared electrons in groups of atoms.
Band widths and structure
In general, the outer states of higher energy experience greater overlap, which usually results in greater splitting and wider bands (Figure 23.1). Within any band, the density of states usually is largest near the middle and falls off near the edges (Figure 23.2). Electrons preferentially fill the lowest energy states, so at low temperatures the lower bands are full and the higher bands are empty.
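The splitting of one atomic level into a band of closely spaced states can be illustrated with a simple one-dimensional tight-binding sketch. This is a standard toy model, not the text's own calculation; the chain length and overlap values are assumptions chosen purely for illustration. Each isolated atom contributes one level at energy zero; coupling N atoms through a nearest-neighbor overlap spreads that single level into N levels, and a larger overlap gives a wider band.

```python
import numpy as np

def chain_levels(n_atoms, overlap):
    # Tight-binding Hamiltonian for a 1D chain of identical atoms.
    # Each atom contributes one level at energy 0; the nearest-neighbor
    # overlap lets electrons hop between adjacent atoms.
    H = -overlap * (np.eye(n_atoms, k=1) + np.eye(n_atoms, k=-1))
    return np.linalg.eigvalsh(H)   # the n_atoms energies of the band

# One isolated atom has one level; 50 coupled atoms have 50 levels
# spread into a band whose width grows with the overlap.
for t in (0.1, 0.5):
    levels = chain_levels(50, t)
    print(f"overlap {t}: {len(levels)} levels, "
          f"band width = {levels.max() - levels.min():.3f}")
```

For this model the band width approaches four times the overlap as the chain grows, so doubling the overlap doubles the width, in line with the statement that greater overlap produces wider bands.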
The highest completely filled band in the T → 0 limit is called the “valence band.” Because it is full, there are no empty states into which these valence electrons can move. (Although pairs could trade places, they are identical particles, so it is the same as staying put.)
We now examine interacting systems. We will find that the number of states for the combined system is extremely sensitive to the distribution of energy among the interacting subsystems, with a very sharp, narrow peak at some “optimum” value (Figure 7.1). Configurations corresponding to a greater number of accessible states are correspondingly more probable, so the energy is most probably distributed at this peak. Even a slight deviation from it would cause a dramatic reduction in the number of accessible states and is therefore very improbable.
This chapter is devoted to developing this statement of probabilities, which underlies the most powerful tools of thermodynamics. We elevate it to the stature of a “law.” Even though there is some small probability that the law may be broken, it is so minuscule that we can rest assured that we will never see it violated by any macroscopic system. Rivers will flow uphill and things will freeze in a fire if the law is broken. No one has ever seen it happen, and you can bet that you won't either.
Microscopic examples
We now investigate some examples of how the number of states is affected by the distribution of energy between interacting systems. Consider the situation of Figure 7.1, where an isolated system A0 is composed of two subsystems, A1 and A2, which may be interacting in any manner.
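A concrete numerical sketch shows how sharp the peak of Figure 7.1 is. Here the two subsystems are taken to be Einstein solids, an assumption made purely for illustration (the text's A1 and A2 are generic); the multiplicity formula is the standard combinatoric count of ways to distribute q quanta among N oscillators.

```python
from math import comb

def multiplicity(n_osc, q):
    # Number of ways to distribute q energy quanta among n_osc
    # oscillators (an Einstein solid): the "stars and bars" count.
    return comb(q + n_osc - 1, q)

N1, N2, q_total = 300, 300, 200   # invented sizes for illustration
omega = [multiplicity(N1, q1) * multiplicity(N2, q_total - q1)
         for q1 in range(q_total + 1)]

peak = omega.index(max(omega))
print("most probable q1:", peak)   # 100: equal subsystems most
                                   # probably share the energy equally
print("peak vs. 40-off-peak ratio:", omega[peak] // omega[peak - 40])
```

Even for these modest subsystems, shifting only 40 of the 200 quanta away from the optimum reduces the number of accessible states by a factor of more than a hundred thousand; for macroscopic systems the peak is sharper still.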
Engines convert heat into work. Thermodynamics owes both its name, “heat-motion”, and much of its early development to the study of engines. The working system for most engines interacts both thermally and mechanically with other systems, so its properties depend on two independent variables. Most engines are cyclical, so that the working system goes through the following stages:
it is heated;
it expands and does work, pushing a piston or turbine blades;
it is cooled further;
it is compressed back into its original state, ready to begin the cycle again.
The expansion occurs while the working system is hot and at high pressure, and the compression occurs while it is cooler and at lower pressure. Therefore, the work done by the engine while expanding is greater than the work done on the engine while being compressed, so there is a net output of work by the engine during each cycle. This is what makes engines useful. If you can understand this paragraph, you understand nearly all engines.
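This net-work argument can be made numeric with a minimal sketch, assuming an ideal gas taken around a Stirling-like cycle (isothermal expansion when hot, isothermal compression when cold). The temperatures, volumes, and mole number below are invented for illustration; the chapter's engines are generic.

```python
import math

R = 8.314                       # gas constant, J/(mol K)
n = 1.0                         # moles of working gas
T_hot, T_cold = 500.0, 300.0    # kelvin
V_small, V_large = 1e-3, 3e-3   # m^3

# Isothermal ideal-gas work: W = nRT ln(V_final / V_initial).
W_expand = n * R * T_hot * math.log(V_large / V_small)     # done BY the gas, hot
W_compress = n * R * T_cold * math.log(V_large / V_small)  # done ON the gas, cold
W_net = W_expand - W_compress
print(f"net work output per cycle: {W_net:.0f} J")
```

Because the expansion happens at the higher temperature, W_expand exceeds W_compress and the net work per cycle is positive, exactly as the paragraph above argues.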
The details vary from one engine to the next. The working system could be any of a large variety of gases or volatile liquids. The source of heating could be such things as a flame, a chemical explosion, heating coils, steam pipes, sunlight, or a nuclear reactor. The cooling could be provided by such things as air, water, ice, evaporation, or radiative coils.
The preceding chapters introduced the fundamental ideas that connect the microscopic and macroscopic behavior of systems. They also gave an overview of the three types of interactions between systems and how the second law controls them. These concepts form the statistical basis of thermodynamics, and the tools are so general that they can be applied to almost any system imaginable. This is the single most impressive feature of the subject. Unfortunately, it is also the single most confusing feature of the subject. There are so many different kinds of systems and such a variety of parameters – internal energy, temperature, pressure, entropy, volume, chemical potential, number of particles, and many more. Furthermore, the interdependence among these parameters varies from one system to the next and in ways that are usually not specified. Consequently, we often deal with general and abstract expressions, each involving many parameters whose interrelationships are either vague or unknown.
But the large number of parameters can be turned to our advantage. We don't need them all, so we can choose to use whichever we wish and ignore the rest. Furthermore, their behaviors and interrelationships are heavily constrained. In this and the following chapters we learn how to make order out of chaos through a judicious choice of parameters and the application of constraints.