
Improvement of performance and shrinkage of device sizes in microelectronics have been major driving forces for scientific and economic progress over the past 40 years. Developments in semiconductor processing and surface sciences have allowed precise control over critical dimensions with desirable properties for solid-state devices. In the past 30 years, there have been tremendous developments in micro- and nanoelectromechanical systems (MEMS and NEMS), microfluidics and nanofluidics, quantum structures and devices, photonics and optoelectronics, nanomaterials for molecular sensing and biomedical diagnosis, and scanning probe microscopy for measurement and manipulation at the molecular and atomic levels.

Nanotechnology opens new frontiers in science and engineering, and it has become an integral part of almost all natural science and engineering disciplines. Back in 2007, about 10% of the faculty members at Georgia Tech were conducting research related to nanoscience and nanoengineering, and the number of faculty and research projects related to micro/nanoscales has since grown significantly. The same can be said for most major research universities in the United States and in many other countries. Furthermore, the study of nanoscience and nanoengineering requires, and has resulted in, close interactions across the boundaries of many traditional disciplines. Knowledge of physical behavior at the molecular and atomic levels has played and will continue to play an important role in our understanding of the fundamental processes occurring in the macro world. This will enable us to design and develop novel devices and machines, ranging from a few nanometers all the way to the size of automobiles and airplanes [1, 2]. Ten to fifteen years ago, many people either had never heard the word “nanotechnology” or had doubts about its usefulness in the real world. Today, we depend on and enjoy nanotechnology in our daily life, including cell phones, laptops, computers, the internet, medicine and medical devices, energy harvesting, transportation, lighting, batteries, smart clothes, and so on. The advancement of nano/microscale science and engineering will continue to restructure the technologies currently used in manufacturing, energy production and utilization, communication, transportation, space exploration, and medicine.

A key issue associated with miniaturization is the tremendous increase in the heat dissipation per unit volume. Micro/nanostructures may enable engineered materials with unique thermal properties that allow significant enhancement or reduction of the heat flow rate. Therefore, knowledge of thermal transport from the micrometer scale down to the nanometer scale and of the thermal properties of micro/nanostructures is of critical importance to future technological growth. Solutions to more and more problems in small devices and systems require a solid understanding of the heat (or, more generally, energy) transfer mechanisms in reduced dimensions and/or at short time scales, because classical equilibrium and continuum assumptions are no longer valid. Examples are the thermal analysis and design of micro/nanodevices, thermal management in flexible electronics, ultrafast laser interaction with materials, micromachined thermal sensors and actuators, thermoelectricity in nanostructures, photonic crystals, microscale thermophotovoltaic devices, battery thermal management, and so on [3, 4].

This book was motivated by the need to understand the thermal phenomena and heat transfer processes in micro/nanosystems and at very short time scales for solving problems occurring in contemporary and future technologies. Since its first publication in 2007, many universities have offered micro/nanoscale heat transfer courses using it as either the textbook or a major reference. Significant progress has been made in the last decade, and this second edition reflects a major update.

1.1 Limitations of the Macroscopic Formulation

As an ancient Chinese philosopher put it, suppose you take a foot-long wooden stick and cut off half of it each day; you will never reach an end, even after thousands of years, as shown in Fig. 1.1. Modern science has taught us that, at some stage, one would reach the molecular and even the atomic level, below which the physical and chemical properties are completely different from those of the original material. The wooden stick or slice would eventually become something else that is not distinguishable from the other constituents in the atmosphere. Basically, properties of materials at very small scales may be quite different from those of the corresponding bulk materials. Note that 1 nm (nanometer) is one-billionth of a meter. The diameter of a hydrogen atom H is on the order of 0.1 nm, and that of a hydrogen molecule H2 is approximately 0.3 nm. Using the formula \(l_{n} = 0.3048/2^{n - 1}\) m, where n is the number of days, we find \(l_{30} = 5.7 \times 10^{ - 10}\) m (or 0.57 nm) after just a month, which is already approaching molecular and atomic dimensions.
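The arithmetic of this thought experiment is easy to verify. The short Python sketch below, provided only as an illustration, evaluates \(l_{n}\) for a few days and shows how quickly the remaining length approaches molecular dimensions:

```python
# Daily halving of a one-foot (0.3048 m) stick: l_n = 0.3048 / 2**(n - 1)
def stick_length(n_days: int) -> float:
    """Length (in meters) remaining on day n, starting from 1 ft on day 1."""
    return 0.3048 / 2 ** (n_days - 1)

for n in (1, 10, 20, 30):
    print(f"day {n:2d}: {stick_length(n):.2e} m")
# Day 30 gives about 5.7e-10 m (0.57 nm), already close to molecular dimensions.
```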

Fig. 1.1

The length of the wooden stick: \(l_{1} = 1\) ft on day 1, \(l_{2} = 1/2\) ft on day 2, and \(l_{n} = 1/2^{n - 1}\) ft on day n

While atoms can still be divided with large and sophisticated facilities, our ability to observe, manipulate, and utilize them is very limited. On the other hand, most biological processes occur at the molecular level. Many novel physical phenomena happen at the length scale of a few nanometers and can be integrated into large systems. This is why a nanometer is a critical length scale for the realization of practically important new materials, structures, and phenomena. For example, carbon nanotubes with diameters ranging from 0.4 to 50 nm or so have dramatically different properties. Some researchers have shown that these nanotubes hold promise as the building block of nanoelectronics. Others have found that the thermal conductivity of single-walled carbon nanotubes at room temperature could be an order of magnitude higher than that of copper. Therefore, carbon nanotubes have been considered as a candidate material for applications that require a high heat flux.

In conventional fluid mechanics and heat transfer, we treat the medium as a continuum, that is, indefinitely divisible without changing its physical nature. All the intensive properties can be defined locally and continuously. For example, the local density is defined as

$$\rho = \mathop {\lim }\limits_{\delta V \to 0} \frac{\delta m}{\delta V}$$
(1.1)

where \(\delta m\) is the mass enclosed within a volume element \(\delta V\). When the characteristic dimension is comparable with or smaller than the mechanistic length scale, such as the molecular mean free path (the average distance that a molecule travels between two collisions), the continuum assumption will break down. The density defined in Eq. (1.1) will depend on the size of the volume, \(\delta V\), and will fluctuate with time even at macroscopic equilibrium. Noting that the mean free path of air at standard atmospheric conditions is about 70 nm, the continuum assumption is well justified for many engineering applications until the submicrometer regime or the nanoscale is reached. Nevertheless, if the pressure is very low, as in an evacuated chamber or at a high elevation, the mean free path can be very large; thus, the continuum assumption may break down even at relatively large length scales.
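The mean free path quoted above can be estimated from elementary kinetic theory as \(\Lambda = k_{\rm B}T/(\sqrt{2}\,\pi d^{2}P)\), where d is an effective molecular collision diameter. The following Python sketch is illustrative only; the value d of about 0.37 nm is an assumed effective diameter for air molecules:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(T: float, p: float, d: float) -> float:
    """Kinetic-theory mean free path: Lambda = k_B*T / (sqrt(2)*pi*d**2*p)."""
    return k_B * T / (math.sqrt(2) * math.pi * d**2 * p)

# Air at 300 K and 1 atm, taking d ~ 0.37 nm as an effective collision diameter
print(mean_free_path(300.0, 101325.0, 3.7e-10))  # ~6.7e-8 m, i.e., about 70 nm
# At 1 Pa (rough vacuum) the same formula gives several millimeters,
# so the continuum assumption can fail even at large length scales.
```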

Within the macroscopic framework, we calculate the temperature distribution in a fluid or solid by assuming that the medium under consideration is not only a continuum but also at thermodynamic equilibrium everywhere. The latter condition is called the local-equilibrium assumption, which is required because temperature can be defined only for stable-equilibrium states. With extremely high temperature gradients, at sufficiently small length scales, and/or during very short periods of time, the assumption of local equilibrium may be inappropriate. An example is the interaction between short laser pulses and a material. Depending on the type of laser, the pulse duration or width can vary from a few tens of nanoseconds down to several femtoseconds (1 fs = \(10^{-15}\) s). In the case of ultrafast laser interaction with metals, free electrons in the metal can quickly gain energy and reach an excited state corresponding to an effective temperature of several thousand kelvins, whereas the crystalline lattice remains near room temperature. After a lapse of time on the order of the electron relaxation time, the excess energy of the electrons is transferred to phonons, which are energy quanta of lattice vibration, thereby causing a heating effect that raises the temperature or changes the phase of the material under irradiation.
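The electron–lattice nonequilibrium just described is often modeled with a two-temperature approach, in which the electron and lattice subsystems exchange energy through a coupling factor G. The Python sketch below integrates the coupled energy balances with an explicit Euler scheme; the pulse parameters and the material constants (roughly representative of a gold film) are illustrative assumptions only:

```python
import math

# Illustrative order-of-magnitude parameters for a thin metal (gold-like) film
gamma = 70.0    # electron heat capacity coefficient, J/(m^3 K^2): C_e = gamma*T_e
C_l   = 2.5e6   # lattice heat capacity, J/(m^3 K)
G     = 2.6e16  # electron-phonon coupling factor, W/(m^3 K)

def two_temperature(S0=1e21, t_pulse=100e-15, dt=1e-16, t_end=20e-12):
    """Explicit-Euler sketch of the two-temperature model with a Gaussian heating pulse S(t)."""
    T_e = T_l = 300.0
    T_e_peak = T_e
    t = 0.0
    while t < t_end:
        S = S0 * math.exp(-((t - 5 * t_pulse) / t_pulse) ** 2)  # volumetric heating, W/m^3
        dTe = (-G * (T_e - T_l) + S) / (gamma * T_e) * dt       # electron energy balance
        dTl = (G * (T_e - T_l)) / C_l * dt                      # lattice energy balance
        T_e, T_l, t = T_e + dTe, T_l + dTl, t + dt
        T_e_peak = max(T_e_peak, T_e)
    return T_e_peak, T_e, T_l

# Electrons spike to a few thousand kelvins while the lattice warms by only tens of kelvins,
# and the two temperatures relax toward each other within a few picoseconds.
print(two_temperature())
```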

Additional mechanisms may affect the behavior of a system as the physical dimensions shrink or as the excitation and detection times are reduced. A scale-down of the theories developed from macroscopic observations often proves to be unsuitable for applications involving micro/nanoscale phenomena. Examples are reductions in the conductivity of thin films or thin wires due to boundary scattering (size effect), discontinuous velocity and temperature boundary conditions in microfluidics, wave interferences in thin films, and tunneling of electrons and photons through narrow gaps. In the quantum limit, the thermal conductance of a nanowire will reach a limiting value that is independent of the material that the nanowire is made of. At the nanoscale, the radiation heat transfer between two surfaces can exceed that calculated from the Stefan-Boltzmann law by several orders of magnitude. Another effect of miniaturization is that surface forces (such as shear forces) will scale down with \(L^{2}\), where L is the characteristic length, while volume forces (such as buoyancy) will scale down with \(L^{3}\). This will make surface forces predominant over volume forces at the microscale.
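The surface-versus-volume scaling argument can be made quantitative: since surface forces scale as \(L^{2}\) and volume forces as \(L^{3}\), their ratio grows as \(1/L\) upon miniaturization. A minimal illustration (the reference length of 1 m is arbitrary):

```python
# Relative increase of the surface-to-volume force ratio when shrinking from L_ref to L
def surface_to_volume_force_ratio(L: float, L_ref: float = 1.0) -> float:
    return L_ref / L  # (L^2 / L^3) relative to the reference scale

for L in (1.0, 1e-3, 1e-6):  # 1 m, 1 mm, 1 um characteristic lengths
    print(f"L = {L:g} m -> surface forces enhanced by {surface_to_volume_force_ratio(L):.0e}")
# Shrinking from 1 m to 1 um boosts the relative importance of surface forces by a factor of 1e6.
```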

1.2 The Length Scales

It is instructive to compare the length scales of different phenomena and structures, especially against the wavelengths of the electromagnetic spectrum. Figure 1.2 compares the wavelength ranges with some characteristic dimensions. One can see that MEMS generally involve micromachined features ranging from several millimeters down to a few micrometers. Currently, the smallest feature of integrated circuits is well below 100 nm. The layer thickness of thin films ranges from a few nanometers up to several micrometers. The wavelengths of visible light are in the range from approximately 380 to 760 nm. Thermal radiation, on the other hand, covers a part of the ultraviolet, the entire visible and infrared, and a portion of the microwave region. The thickness of human hair is between 50 and 100 μm, while the diameter of red blood cells is about 6–8 μm. A typical optical microscope can magnify 100 times with a resolution of 200–300 nm, which is about half the wavelength and is limited by the diffraction of light. Therefore, optical microscopy is commonly used to study micrometer-sized objects. Atoms and molecules, on the other hand, are on the order of 1 nm, which falls in the x-ray and electron-beam wavelength region. Therefore, x-ray and electron microscopes are typically used for determining crystal structures and defects, as well as for imaging nanostructures. The development of scanning probe microscopes (SPMs) and near-field scanning optical microscopes (NSOMs) in the 1980s enabled unprecedented capabilities for the visualization and manipulation of nanostructures, such as nanowires, nanotubes, nanocrystals, single molecules, and individual atoms, as will be discussed in Sect. 1.3.4. Figure 1.2 also shows that the mean free path of heat carriers (e.g., molecules in gases, electrons in metals, and phonons or lattice vibrations in dielectric solids) often falls in the micrometer to nanometer range, depending on the material, temperature, and type of carrier.
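The diffraction-limited resolution quoted above follows from the Abbe criterion, \(d = \lambda/(2\,\mathrm{NA})\), where NA is the numerical aperture of the objective. A brief Python illustration with assumed NA values:

```python
def abbe_resolution(wavelength_nm: float, numerical_aperture: float) -> float:
    """Diffraction-limited resolution d = lambda / (2 * NA), in the units of the wavelength."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Visible light near 550 nm with typical objectives (NA values assumed for illustration)
for NA in (0.9, 1.4):  # dry objective vs. oil immersion
    print(f"NA = {NA}: ~{abbe_resolution(550, NA):.0f} nm")
# Roughly 200-300 nm, consistent with the half-wavelength estimate in the text.
```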

Fig. 1.2

Characteristic length scales as compared with the wavelengths of the electromagnetic spectrum

A brief historical retrospective is given next on the development of modern science and technologies, with a focus on the recent technological advances leading to nanotechnology. The role of thermal engineering in this technological advancement process is outlined.

1.3 From Ancient Philosophy to Contemporary Technologies

Understanding the fundamental composition of all things in the universe, their movement in space and time, and their interactions with one another is a human curiosity and the inner drive that makes us different from other living beings on the earth. The ancient Chinese believed that everything was composed of five elements: metal, wood, water, fire, and earth (or soil), which generate and overcome one another in a certain order and time sequence. These simple beliefs were not merely used for fortune-telling but helped the development of traditional Chinese medicine, music, military strategy, astronomy, and the calendar. In ancient Greece, the four elements (fire, earth, air, and water) were considered the realm wherein all things existed and whereof all things consisted. These classical element theories prevailed in several other countries, in somewhat different versions, for over 2000 years, until the establishment of modern atomic theory that began with John Dalton’s experiments on gases some 200 years ago. In 1811, the Italian chemist Amedeo Avogadro introduced the concept of the molecule, which consists of a stable system or bound state of atoms. A molecule is the smallest particle that retains the chemical properties and composition of a pure substance. The first periodic table was developed by the Russian chemist Dmitri Mendeleev in 1869. Although the original meaning of atom in Greek is “indivisible,” subatomic particles have since been discovered. For example, the electron, a subatomic particle, was discovered in 1897 by J. J. Thomson, who won the 1906 Nobel Prize in Physics. An atom is the smallest unit of one of the 118 confirmed elements.

The first industrial revolution began in the late eighteenth century and boosted the economy of western countries from manual labor to the machine age through the introduction of machine tools and textile manufacturing. Following the invention of the steam engine, the second industrial revolution in the mid-nineteenth century had an even bigger impact on human life through the development of steam-powered ships and trains, along with internal combustion engines and the generation of electrical power. Newtonian mechanics and classical thermodynamics played an indispensable role in the industrial revolutions. The development of machinery and the understanding of the composition of matter have allowed unprecedented precision in the experimental investigation of physical phenomena, leading to the establishment of modern physics in the early twentieth century.

The nature of light has long been debated. In the seventeenth century, Isaac Newton formulated the corpuscular theory of light and observed with his prism experiment that sunlight is composed of different colors. In the early nineteenth century, the discovery of infrared and ultraviolet radiation and Young’s double-slit experiment confirmed Huygens’ wave theory, which had been overshadowed by Newton’s corpuscular theory for over 100 years. With the establishment of Maxwell’s equations, which fully describe electromagnetic waves, and Michelson’s interferometric experiment, the wave theory of radiation had been largely accepted by the end of the nineteenth century. While the wave theory was able to explain most of the observed phenomena, it could explain neither thermal emission over a wide spectrum nor the photoelectric effect. In 1900, Max Planck used the hypothesis of radiation quanta, or oscillators, to successfully derive the blackbody spectral distribution function. It appears that the energy of light is not indefinitely divisible but must exist in multiples of the smallest massless quanta that we now call photons. In 1905, Albert Einstein explained the photoelectric effect based on the concept of radiation quanta. To knock an electron out of the metal surface, the energy of each incoming photon (hν) must be sufficiently large because one electron can absorb only one photon. This explained why photoemission could not occur at frequencies below the threshold value, no matter how intense the incoming light might be. In 1924, Louis de Broglie hypothesized that particles should also exhibit wavelike characteristics. Electron diffraction experiments later showed that electrons indeed behave like waves, with a wavelength inversely proportional to the momentum. Electron microscopy is based on this wave nature of electrons. The wave–particle duality was essential to the establishment of quantum mechanics in the early twentieth century. Quantum mechanics describes the phenomena occurring in minute particles and structures and their interaction with radiation, for which classical mechanics and electrodynamics are not applicable. The fundamental scientific understanding gained during the first half of the twentieth century facilitated the development of contemporary technologies that have transformed the industrial economy into a knowledge-based economy and the machine age into the information age. The major technological advancements in the last half of the century are highlighted in the following sections.

1.3.1 Microelectronics and Information Technology

In his master’s thesis at MIT published in 1940, Claude Shannon (1916–2001) used Boolean algebra to show how TRUE and FALSE states could represent the functions of switches in electronic circuits. Digital computers were invented during the 1940s in several countries, including the IBM Mark I, which was 2.4 m high and 16 m long. In 1948, while working at Bell Labs, Shannon published an article, “A Mathematical Theory of Communication,” which marked the beginning of modern communication and information technology [5]. In that paper, he laid out the basic principles underlying the communication of information with two symbols, 1 and 0, and coined the term “bit” for a binary digit. His theory made possible the digital storage and transmission of pictures, sounds, and so forth.

In December 1947, scientists at Bell Labs invented the semiconductor point-contact transistor with germanium. Earlier computers and radios were based on bulky vacuum tubes that generated a huge amount of heat. The invention of the transistor by William Shockley, John Bardeen, and Walter Brattain was recognized by the Nobel Prize in Physics conferred on them in 1956. Intensive research on semiconductor physics based on atomic theory, together with studies of the point-contact mechanism, was needed before the fabrication of transistors became possible. The invention of the transistor ushered in the information age and created a whole new industry.

In 1954, Gordon Teal at Texas Instruments built the first silicon transistor. The native oxide of silicon appeared to be particularly suitable as the electric insulator. In 1958, Jack Kilby (1923–2005) at Texas Instruments was able to cram all the discrete components onto a silicon base and later onto one piece of germanium. He filed a patent application the next year on “Miniaturized Electronic Circuits,” where he described how to make integrated circuits and connect the passive components via gold wires. Working independently, Robert Noyce at Fairchild Electronics in California found that aluminum adhered well to both silicon and silicon oxide and filed a patent application in 1959 on “Semiconductor Device-and-Lead Structure.” Kilby and Noyce are considered the co-inventors of integrated circuits. Noyce was one of the founders of Intel and died in 1990. Kilby was awarded half of the Nobel Prize in Physics in 2000 “for his part in the invention of the integrated circuit.” The other half was shared by Zhores Alferov and Herbert Kroemer for developing semiconductor heterostructures used in optoelectronics, to be discussed in the next section.

In 1965, around 60 transistors could be packed on a single silicon chip. Seeing the fast development and future potential of integrated circuits, Gordon Moore, a co-founder of Intel, made a famous prediction that the number and complexity of semiconductor devices would double every year [6]. This is Moore’s law, well known in the microelectronics industry [7]. By the mid-1970s, the number of transistors on a chip had increased from 60 to 5000. By 1985, the Intel 386 processor contained a quarter million transistors on a chip. In 2001, the Pentium 4 processor reached 42 million transistors, and by 2006 the number had exceeded 1 billion per chip. When the device density is plotted against time on a log scale, the growth almost follows a straight line, suggesting that the packaging density has doubled approximately every 18 months until recent years [7, 8]. Reducing the device size and increasing the packaging density have several advantages. For example, the processor speed increases as the distance between transistors is reduced. Furthermore, new features can be added to the chip to enhance performance. The cost for the same performance also decreases. Advanced supercomputer systems have played a critical role in enabling the modeling and understanding of micro/nanoscale phenomena.
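The 18-month doubling period can be checked against the transistor counts quoted above. The short Python sketch below computes the doubling period implied by pairs of those data points; it is intended only as a rough consistency check:

```python
import math

def implied_doubling_period(n1: float, n2: float, years: float) -> float:
    """Doubling period (years) implied by exponential growth from n1 to n2 over a time span."""
    return years * math.log(2) / math.log(n2 / n1)

# Approximate data points quoted in the text:
# 60 transistors in 1965, 42 million in 2001, about 1 billion in 2006
print(implied_doubling_period(60, 42e6, 2001 - 1965))   # ~1.9 years
print(implied_doubling_period(42e6, 1e9, 2006 - 2001))  # ~1.1 years
# Both values bracket the often-quoted 18-month doubling period.
```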

The fabrication process starts with the growth of high-quality silicon crystals, which are then diced and polished into wafers. Devices are usually made on a SiO2 layer that can be grown by heating the wafer to sufficiently high temperatures in a furnace with a controlled oxygen partial pressure. The wafers are then patterned using photolithographic techniques combined with etching processes. Donors and acceptors are added to the wafer by ion implantation to form n- and p-type regions, which are then annealed in a thermal environment. Metals or heavily doped polycrystalline silicon are used as gates, with proper coverage and patterns defined through lithography. A schematic of a metal-oxide-semiconductor field-effect transistor (MOSFET) is shown in Fig. 1.3. Billions of transistors can be packed into the size of a fingernail in several layers through very-large-scale integration (VLSI), with the smallest features on the order of 5 nm. As mentioned earlier, managing heat dissipation is a challenge, especially as the device dimensions continue to shrink. Local heating or hot spots on the scale of 10 nm could cause device failure if not properly handled. The principles governing heat transfer at the nanoscale are very different from those at large scales. A fundamental understanding of phonon transport is required for device-level thermal analysis. Furthermore, understanding heat transfer in microfluidics is necessary to enable reliable device cooling at the micro- and nanoscales.

Fig. 1.3

Schematic of a metal-oxide-semiconductor field-effect transistor (MOSFET)

The progress in microelectronics would not be possible without advances in materials processing, such as crystal growth and thermal processing during semiconductor manufacturing, as well as deposition and photolithographic technologies. Rapid thermal processing (RTP) is necessary during annealing and oxidation to prevent deep diffusion of ions into the wafer. Thermal modeling of RTP must consider the combined conduction, convection, and radiation modes. A lightpipe thermometer is commonly used to monitor the temperature of the wafer. In an RTP furnace, the thermal radiation emitted by the wafer is collected by the light pipe and then transmitted to a radiometer for inferring the surface temperature [9]. In some cases, the wafer surface is rough with anisotropic features. A better understanding of light scattering by anisotropic rough surfaces is also necessary.

As the process node continues to shrink, high-intensity Ar or Xe arc lamps with millisecond optical pulses are considered a suitable annealing tool following ion implantation in ultra-shallow junction fabrication. Because the optical energy is absorbed within milliseconds, thermal diffusion cannot distribute the heat uniformly across the wafer surface. Therefore, temperature uniformity across the nanometer-patterned wafer is expected to be a critical issue. To reduce the feature size further, deep-UV lithography and x-ray lithography have also been developed. It is inevitable that Moore’s law will reach its limit when the critical dimensions become less than a few nanometers. Further reduction will face serious barriers due to problems associated with gate dielectrics and fabrication difficulties. Beyond Moore’s law, there are continuous challenges in improving the energy efficiency, overall performance, stability, flexibility, and cost efficiency. Two-dimensional (2D) and 3D very-large-scale integration (VLSI) architectures using stacked or sequential integrated systems/circuits may offer future technological solutions. Molecular nanoelectronics using self-assembly has been sought as an alternative, along with quantum computing. Therefore, nanoelectronics and quantum computing are anticipated to brighten the future of electronics and computing.

1.3.2 Lasers, Optoelectronics, and Nanophotonics

It is hard to imagine what current technology would look like without lasers. Lasers of different types have tremendous applications in metrology, microelectronics fabrication, manufacturing, medicine, and communication. Examples are laser printers, laser bar code readers, laser Doppler velocimetry, laser machining, and laser corneal surgery for vision correction. The concept of the laser was demonstrated in the late 1950s independently in the United States and the Soviet Union during the cold war. The Nobel Prize in Physics of 1964 recognized the fundamental contributions in the field of quantum electronics by Charles Townes, Nicolay Basov, and Aleksandr Prokhorov. The first working laser was the ruby laser built by Theodore Maiman at Hughes Aircraft Company in 1960. The principle of the laser dates back to 1917, when Einstein elegantly depicted his conception of stimulated emission of radiation by atoms. Unlike thermal emission and plasma emission, lasers are coherent light sources and, with the assistance of an optical cavity, can emit nearly monochromatic light in the same direction with little divergence. Lasers enabled the branch of nonlinear optics, which is important for understanding the fundamentals of light–matter interactions, communication, as well as optical computing. In 1981, Nicolaas Bloembergen and Arthur Schawlow received the Nobel Prize in Physics for their contributions to laser spectroscopy. There are a variety of nonlinear spectroscopic techniques, including Raman spectroscopy, as reviewed by Fan and Longtin [10]. Two-photon spectroscopy has become an important tool for molecular detection [11]. Furthermore, two-photon 3D lithography has also been developed for microfabrication [12, 13].

Gas lasers such as He–Ne (red) and Ar (green) have been extensively used for precision alignment, dimension measurements, and laser Doppler velocimetry due to their narrow linewidths. On the other hand, powerful Nd:YAG and CO2 lasers are used in thermal manufacturing, where the heat transfer processes include radiation, phase change, and conduction [14, 15]. Excimer lasers create nanosecond pulses in the ultraviolet and have been extensively used in materials processing, ablation, eye surgery, dermatology, as well as photolithography in microelectronics and microfabrication. High-energy nanosecond pulses can also be produced by Q-switching, typically with a solid-state laser such as the Nd:YAG laser at a wavelength near 1 μm. On the other hand, the mode-locking technique allows pulse widths from picoseconds down to a few femtoseconds. Pulse durations less than 10 fs have been achieved since 1985. Ultrafast lasers have enabled the study of reaction dynamics and formed a branch in chemistry called femtochemistry. Ahmed Zewail of Caltech received the 1999 Nobel Prize in Chemistry for his pioneering research in this field. In 2005, John Hall and Theodor Hänsch received the Nobel Prize in Physics for developing laser-based precision spectroscopy, in particular, the frequency comb technique. Short-pulse lasers can facilitate fabrication, the study of electron–phonon interactions in nonequilibrium processes, the measurement of thermal properties including interface resistance, nondestructive evaluation of materials, and so forth [16–20].

Room-temperature continuous-operation semiconductor lasers were realized in May 1970 by Zhores Alferov and co-workers at the Ioffe Physical Institute in Russia, and independently by Morton Panish and Izuo Hayashi at Bell Labs a month later. Alferov received the Nobel Prize in Physics in 2000, together with Herbert Kroemer, who conceived the idea of the double-heterojunction laser in 1963 and was also an early pioneer of molecular beam epitaxy (MBE). Invented in 1968 by Alfred Cho and John Arthur at Bell Labs and developed in the 1970s, MBE is a high-vacuum deposition technique that enables the growth of highly pure semiconductor thin films with atomic precision. The name heterojunction refers to two layers of semiconductor materials with different bandgaps, such as a GaAs/Al\(_x\)Ga\(_{1-x}\)As pair. In a double-heterojunction structure, a lower-bandgap layer is sandwiched between two higher-bandgap layers [21]. When the middle layer is made thin enough, on the order of a few nanometers, the structure is called a quantum well because of the discrete energy levels and enhanced density of states. Quantum well lasers can achieve better performance with a smaller driving current. Multiple quantum wells (MQWs), also called superlattices, which consist of periodic structures, can be used to further improve the performance. In a laser setting, an optical cavity is needed to confine the laser bandwidth as well as to enhance the intensity at a desired wavelength with a narrow linewidth. Distributed Bragg reflectors (DBRs) are used on both ends of the quantum well (active region). DBRs are the simplest photonic crystals, made of periodic dielectric layers of different refractive indices; each layer thickness is equal to a quarter of the wavelength in that medium, i.e., \(\lambda /(4n)\), where n is the refractive index of the layer. DBRs are dielectric mirrors with nearly 100% reflectance, except at the resonance wavelength λ, where light will eventually escape from the cavity. Figure 1.4 illustrates a vertical cavity surface emitting laser (VCSEL), where light is emitted through the substrate (bottom of the structure). The energy transfer mechanisms through phonon waves and electron waves have been extensively investigated [22]. Further improvement in the laser efficiency and control of the wavelength has been made using quantum wires and quantum dots (QDs) [21].
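The quarter-wave condition for DBR layers can be illustrated with a short calculation. The refractive indices below are nominal values assumed for a GaAs/AlAs mirror designed near 980 nm and are not taken from the text:

```python
def quarter_wave_thickness(wavelength_nm: float, refractive_index: float) -> float:
    """Quarter-wave layer thickness d = lambda / (4 n) for a distributed Bragg reflector."""
    return wavelength_nm / (4.0 * refractive_index)

# Illustrative GaAs/AlAs mirror for a 980 nm VCSEL (n values are assumed nominal values)
for material, n in (("GaAs", 3.5), ("AlAs", 2.9)):
    print(f"{material}: {quarter_wave_thickness(980, n):.0f} nm per layer")
# Roughly 70 nm and 84 nm; many such pairs stacked together give near-unity reflectance
# around the design wavelength.
```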

Fig. 1.4

Schematic of a VCSEL made of a heterogeneous quantum well structure. The smallest layer thickness can be about 3 nm, and there can be as many as several hundred layers

Semiconductor lasers are the most popular lasers (in quantity), with several hundred million units sold each year. Their applications include CD/DVD reading/writing, optical communication, laser pointers, laser printers, bar code readers, and so forth. A simpler device is the light-emitting diode (LED), which emits incoherent light from a two-layer p-n junction without DBRs. LEDs have been used for lighting, including traffic lights with improved efficiency and decorative lights. The development of wide-bandgap materials, such as GaN and AlN epitaxially grown through metal-organic chemical vapor deposition (MOCVD), allows the LED and semiconductor laser wavelengths to be pushed into the blue and ultraviolet. For their invention of efficient blue LEDs, Isamu Akasaki, Hiroshi Amano, and Shuji Nakamura were recognized by the 2014 Nobel Prize in Physics. Organic light-emitting diodes (OLEDs) based on electroluminescence are being developed as a promising candidate for the next generation of computer and TV displays.

Alongside the development of light sources, there have been continuous development and improvement in photodetectors, mainly in focal plane arrays, charge-coupled devices (CCDs), quantum well detectors, readout electronics, data transfer and processing, compact refrigeration and temperature control, and so forth. On the other hand, optical fibers have become an essential and rapidly growing technology in telecommunication and computer networks. The optical fiber technology for communication was developed in the 1970s along with the development of semiconductor lasers. In 1978, Nippon Telegraph and Telephone (NTT) demonstrated the transmission of 32 Mbps (million bits per second) through 53 km of graded-index fiber at 1.3-μm wavelength. By 2001, \(3 \times 10^{11}\) m of fiber-optic wires had been installed worldwide, roughly the round-trip distance between the earth and the sun. In March 2006, NEC Corporation announced a 40-Gbps optical-fiber transmission system. The 2009 Nobel Prize in Physics was conferred to Charles K. Kao for his achievements concerning the transmission of light in fibers for optical communications and to Willard S. Boyle and George E. Smith for the invention of the CCD sensor. Optical fibers have also been widely applied as sensors for biochemical detection as well as temperature and pressure measurements. The fiber drawing process involves complicated heat transfer and fluid dynamics at different length scales and temperatures [23–25].

Nanophotonics (or nano-optics) is an emerging frontier that integrates photonics with physics, chemistry, biology, materials science, manufacturing, and nanotechnology. The foundation of nanophotonics is to study interactions between light and matter, to explore the unique characteristics of nanostructures for utilizing light energy, and to develop novel nanofabrication and sensing techniques. Recent studies have focused on photonic crystals, nanocrystals, plasmonic waveguides, nanofabrication and nanolithography, light interaction with organic materials, biophotonics, biosensors, quantum electrodynamics, nanocavities, quantum dot and quantum wire lasers, solar cells, and so forth. In the field of thermal radiation, new workshops have been established [26–28] and new experimental discoveries have been made [29–31].

1.3.3 Microfabrication and Nanofabrication

Richard Feynman, one of the best theoretical physicists of his time and a Nobel Laureate in Physics, delivered a visionary speech at Caltech in December 1959, entitled “There’s plenty of room at the bottom.” At that time, lasers did not yet exist, integrated circuits had just been invented and were not practically useful, and a single computer, not as fast as a present-day handheld calculator, would occupy a whole classroom and generate an enormous amount of heat. Feynman envisioned the future of controlling and manipulating things on very small scales, such as writing (with an electron beam) the whole 24 volumes of Encyclopedia Britannica on the head of a pin and rearranging atoms one at a time [32]. Many of the things Feynman predicted were once considered science fiction or jokes but have by now been realized in practice, especially since the 1980s. In 1983, Feynman gave a second talk about the use of a swimming machine as a medical device (the surgeon that you could swallow), as well as about quantum computing [33]. In the 1990s, micromachining and MEMS emerged as an active research area, with great success exemplified by the commercialization of micromachined accelerometers for automobile airbags. Using etching and lithographic techniques, engineers were able to manufacture microscopic machines with moving parts, as shown in Fig. 1.5, such as gears with a size less than the cross section of a human hair. The technologies used in microfabrication have been extensively discussed in the text of Madou [34]. These MEMS devices were later developed as tools for biological and medical diagnostics, such as the so-called lab-on-a-chip, with pump, valve, and analysis sections on the 10–100 μm scale. In aerospace engineering, an application is to build micro air vehicles or microflyers, with sizes ranging from a human hand down to a bumblebee, that could be used for surveillance and reconnaissance under extreme conditions. Microchannels and microscale heat pipes have also been developed and tested for electronic cooling applications. The study of microfluidics has naturally become an active research area in mechanical engineering. The development of SPM and MEMS technologies, together with materials development through self-assembly and other techniques, has led to the further development of even smaller structures and the bottom-up approach of nanotechnology. Laser-based manufacturing, focused ion beam (FIB), and electron-beam lithography have also been developed to facilitate nanomanufacturing. In NEMS, quantum behavior becomes important, and quantum mechanics is indispensable for understanding the device behavior.

Fig. 1.5

Courtesy of Sandia National Laboratories, https://www.sandia.gov/mesa/mems/

MEMS structures. a A dust mite on a microfabricated mirror assembly, where the gears are smaller than the thickness of human hair. b Drive gear chain with linkages, where coagulated red blood cells are on the upper left and the lower right and a grain of pollen is on the upper right.

Robert Curl, Harold Kroto, and Richard Smalley were winners of the Nobel Prize in Chemistry in 1996 for their discovery of fullerenes in 1985 at Rice University, during a period when Kroto was visiting from the University of Sussex. The group used pulsed laser irradiation to vaporize graphite and form a carbon plasma in a pressurized helium gas stream. The results, as diagnosed by time-of-flight mass spectrometry, suggested that self-assembled C60 molecules were formed and would be shaped like a soccer ball with the 60 carbon atoms at the 60 vertices [35]. It was later confirmed that these were indeed C60 molecules, with a diameter on the order of 1 nm, and that they exhibit wave–particle duality. This type of carbon allotrope is called a buckminsterfullerene, or fullerene, or buckyball, after the famous architect Buckminster Fuller (1895–1983), who designed geodesic domes. In 1991, Sumio Iijima of NEC Corporation synthesized carbon nanotubes (CNTs) using arc discharge. Soon his group and an IBM group were able to produce single-walled carbon nanotubes (SWNTs) with diameters on the order of 1 nm. There have been intensive studies of CNTs for hydrogen storage, nanotransistors, field emission, light emission and absorption, quantum conductance, nanocomposites, and high thermal conductivity. Figure 1.6a shows CNTs grown in a room-temperature environment by chemical vapor deposition on a heated cantilever tip with a size around 5 μm [36]. Figure 1.6b shows the synthesized SWNTs with encapsulated metallofullerenes of Gd:C82 (i.e., a gadolinium atom inside a fullerene molecule). The high-resolution transmission electron microscope (TEM) image suggests that the diameter of the SWNT is from 1.4 to 1.5 nm [37]. It should be noted that electron microscopes, including SEM and TEM, have become a powerful tool for imaging micro/nanoscale objects with magnifications up to 2 million times. The first electron microscope was built by Ernst Ruska and Max Knoll in Germany during the early 1930s, and Ruska shared the Nobel Prize in Physics in 1986 for his contributions to electron optics and microscopy.

Fig. 1.6

Reprinted with permission from Gao et al. [38]; copyright (2005) AAAS (image courtesy of Prof. Z. L. Wang, Georgia Tech)

Examples of nanostructures. a SEM image of CNTs grown on a heated cantilever tip. Reprinted with permission from Sunden et al. [36]; copyright (2006) American Institute of Physics. b Buckyballs inside a SWNT (the lower panel is a TEM image in which the nanotube diameter is 1.4–1.5 nm). Reprinted with permission from Hirahara et al. [37]; copyright (2000) American Physical Society. c TEM images of ZnO nanobelts that are coiled into nanohelices or nanosprings.

Various nanostructured materials have been synthesized, such as silicon nanowires, InAs/GaAs QDs, and Ag nanorods. Figure 1.6c shows images of nanohelices or nanosprings made of ZnO nanobelts or nanoribbons grown using a solid-vapor process [38, 39]. These self-assembled structures grown under controlled conditions could be fundamental to the study of electromagnetically coupled nanodevices for use as sensors and actuators, as well as of the growth dynamics at the nanoscale.

Since 2004, graphene and other two-dimensional (2D) sheet materials have received great attention due to their amazing and unusual properties. The combination of these materials with micro/nanofabrication holds enormous potential to revolutionize current microelectronic, optoelectronic, and photonic devices as well as energy harvesting systems [40–48]. For their groundbreaking experiments exfoliating graphene and characterizing its properties, Andre Geim and Konstantin Novoselov were awarded the 2010 Nobel Prize in Physics. As a layered 2D material with carbon atoms arranged in a honeycomb lattice, graphene has unique electronic, thermal, mechanical, and optical properties. Unlike conventional metals, free electrons in graphene are massless quasi-particles that exhibit a linear energy-momentum dispersion governed by the Dirac equation for 2D relativistic fermions. As such, graphene offers exotic characteristics such as extremely high mobility, large thermal conductivity, a universal conductance in the optical frequency region, and unique plasmonic characteristics with 2D graphene patches and ribbons [40–42]. Furthermore, the infrared conductance of graphene can be tuned by chemical doping or voltage gating, leading to promising high-speed photodetectors, transistors, solar cells, as well as optical modulators [43, 44]. More recently, a large number of 2D materials have been synthesized chemically or isolated using mechanical or liquid-phase exfoliation from their layered crystalline forms. These 2D materials and their heterostructures have great potential for photodetectors, nanophotonics, transparent electrodes, and energy conversion and storage [45–48].

One of the successful technologies that operate in the quantum mechanical domain is the giant magnetoresistive (GMR) head and hard drive. The GMR head is based on ferromagnetic layers separated by an extremely thin (about 1 nm) nonferromagnetic spacer, such as Fe/Cr/Fe and Co/Cu/Co. MBE enabled the metallic film growth with the required precision and quality. The electrical resistance of GMR materials depends strongly on the applied magnetic field, which affects the spin states of electrons. IBM first introduced this technology in 1996, only about 10 years after the publication of the original research results [49, 50]. GMR materials have been extensively used in computer hard drives and read/write heads. Albert Fert and Peter Grünberg won the 2007 Nobel Prize in Physics for this discovery. Overheating due to friction with the disk surface can render the data unreadable for a short period until the head temperature stabilizes; such an effect is called thermal asperity. Yang et al. [51, 52] performed a detailed thermal characterization of Cu/CoFe superlattices for GMR head applications using MEMS-based thermal metrology tools. An infrared near-field transducer is a key element for heat-assisted magnetic recording (HAMR) to achieve a density of 1 Tb/in\(^{2}\). Datta and Xu [53] designed different nanostructures for the near-field transducer to boost the coupling efficiency.

1.3.4 Probe and Manipulation of Small Structures

Tunneling by elementary particles is a quantum mechanical phenomenon, or wavelike behavior. Quantum tunneling refers to the penetration of a particle through a potential barrier whose height (the potential energy that the particle would have at the top of the barrier) is greater than the total energy of the particle. When the barrier is thin enough, quantum tunneling can occur and particles can transmit through the barrier, as if a tunnel were dug through a mountain. An example is the tunneling of electrons through an insulator between two metal strips. Trained in mechanical engineering, Ivar Giaever performed the first tunneling experiment with superconductors in 1960 at the General Electric Research Laboratory and received the 1973 Nobel Prize in Physics, together with Leo Esaki of IBM and Brian Josephson. Esaki made significant contributions to semiconductor tunneling, superlattices, and the development of MBE technology. He invented a tunneling diode, called the Esaki diode, which is capable of very fast operation in the microwave region. Josephson further developed the tunneling theory and a device, called the Josephson junction, which is used in superconducting quantum interference devices (SQUIDs) for measuring extremely small magnetic fields. SQUIDs are used in magnetic resonance imaging (MRI) for medical diagnostics.

In 1981, Gerd Binnig and Heinrich Rohrer of the IBM Zurich Research Laboratory developed the first scanning tunneling microscope (STM) based on electron tunneling through vacuum [54]. This invention has enabled the detection and manipulation of surface phenomena at the atomic level and, thus, has largely shaped nanoscale science and technology through the further development of similar instrumentation [55]. Binnig and Rohrer shared the Nobel Prize in Physics in 1986, along with Ruska, who developed the first electron microscope as mentioned earlier. The STM uses a sharp stylus probe tip and piezoelectric elements for motion control. When the tip is within about 1 nm of the surface, electrons can tunnel between the tip and the conductive substrate. The tunneling current is very sensitive to the gap. Therefore, by maintaining the tip in position and scanning the substrate in the xy direction with a constant current (or distance), the height variation can be obtained with extremely good resolution (0.02 nm). Using STM, Binnig et al. [56] soon obtained real-space images of the 7 × 7 reconstruction of the Si(111) surface. In 1993, another group at the IBM Almaden Research Center was able to manipulate iron atoms to create a 48-atom quantum corral on a copper substrate [57]. The images have appeared on the front cover of many magazines, including Science and Physics Today. STM can also be used to assemble organic molecules and to study DNA molecules [2].

In 1986, Gerd Binnig, Calvin Quate, and Christoph Gerber developed another type of SPM, the atomic force microscope (AFM), which can operate without a vacuum environment and on electrical insulators [58]. The AFM uses a tapered tip at the end of a cantilever and an optical position sensor, as shown in Fig. 1.7. The position sensor is very sensitive to the bending of the cantilever (with a 0.1-nm vertical resolution). When the tip is brought close to the surface, there exist intermolecular forces (repulsive or attractive) between the tip and the atoms on the surface underneath. In the contact mode, the cantilever is maintained in position by using the servo signal from the position-sensing diode to adjust the height of the sample while it scans in the lateral direction. Surface topographic data can be obtained in an ambient environment for nonconductive materials. Other SPMs have also been developed, and the family of SPMs is rather large nowadays. Wickramasinghe and co-workers first investigated thermal probing by attaching a thermocouple to the cantilever tip [59–61]. Later, Arun Majumdar’s group developed several types of scanning thermal microscopes (SThM) for nanoscale thermal imaging of heated samples, including microelectronic devices and nanotubes [62]. Researchers have also modified the SThM for measuring and mapping thermoelectric power at the nanoscale [63, 64].

Fig. 1.7

Schematic of an atomic force microscope (AFM)

Because of its simplicity, the AFM has become one of the most versatile tools in nanoscale research, including friction measurements, nanoscale indentation, dip-pen nanolithography, and so forth. Heated cantilever tips were proposed for nanoscale indentation or writing on a polymethyl methacrylate (PMMA) surface, either using a laser or by heating the cantilever legs [65, 66]. The method was further developed to concentrate the heat dissipation at the tip by using heavily doped legs as electrical leads, resulting in writing (with a density near 500 Gb/in\(^{2}\)) and erasing (with a density near 400 Gb/in\(^{2}\)) capabilities. The temperature signal measured by the tip resistance can also be used to read the stored data, owing to the difference in heat loss as the tip scans the area [67, 68]. In an effort to improve the data-writing speed, IBM initiated the “millipede” project in 2000 and succeeded in making a 32 × 32 heated-cantilever array in which each cantilever was separately controlled [69, 70]. Obviously, heat transfer and mechanical characteristics are at the center of these systems. Heated AFM cantilever tips have been used as a local heating source for a number of applications, including the above-mentioned CVD growth of CNTs and thermal dip-pen nanolithography [71].

1.3.5 Energy Conversion and Storage

Nanostructures may have unique thermal properties that can be used to facilitate heat transfer for heat removal and thermal management applications. An example mentioned earlier is the utilization of nanotubes with high thermal conductivity, although nanotube bundles often suffer from interface resistance and phonon scattering by defects and boundaries. There have been a large number of studies on nanofluids, which are liquids with suspensions of nanostructured solid materials, such as nanoparticles, nanofibers, and nanotubes with diameters on the order of 1–100 nm [72]. Enhanced thermal conductivity and increased heat flux have been demonstrated, with a wide range of applications from solar energy harvesting to medical applications, and from electronic cooling to fuel cell thermal management [73–75].

Thermoelectricity utilizes principles of irreversible thermodynamics for thermal-to-electrical energy conversion and can be used for cooling in microelectronics as well as for miniaturized power generation. A critical issue is to enhance the figure of merit, for example, by reducing the thermal conductivity. Multilayer heterogeneous structures create heat barriers due to size effects and the boundary resistance. These structures have been extensively studied in the literature and demonstrate enhanced performance. Understanding the thermal and electrical properties of heterogeneous structures is critically important for future design and advancement [76–78].
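The figure of merit mentioned above is commonly written in dimensionless form as \(ZT = S^{2}\sigma T/k\), where S is the Seebeck coefficient, σ the electrical conductivity, and k the thermal conductivity. A brief Python illustration with assumed, bulk-like property values:

```python
def thermoelectric_ZT(seebeck_V_per_K: float, elec_conductivity_S_per_m: float,
                      thermal_conductivity_W_per_mK: float, T_kelvin: float) -> float:
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / k."""
    return (seebeck_V_per_K ** 2 * elec_conductivity_S_per_m * T_kelvin
            / thermal_conductivity_W_per_mK)

# Illustrative values roughly in the Bi2Te3 class (assumed for demonstration only):
# S ~ 200 uV/K, sigma ~ 1e5 S/m, k ~ 1.5 W/(m K) at room temperature
print(thermoelectric_ZT(200e-6, 1e5, 1.5, 300))  # ~0.8
# Halving k (e.g., via boundary scattering in a superlattice) while keeping S and sigma
# unchanged would double ZT, which is why reduced thermal conductivity is sought.
```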

Nanostructures can also help increase the energy conversion efficiency and reduce the cost of solar cells [79]. Furthermore, nanomaterials have been used to develop novel photovoltaic devices. Figure 1.8 shows the device structure of a ZnO-nanowire array for dye-sensitized solar cells [80]. This structure can greatly enhance the absorption or quantum efficiency over nanoparticle-based films. Improvement of photon-to-electron conversion efficiency may be achieved using photonic crystals [81]. In recent years, solar cells based on organic–inorganic halide perovskites have emerged with rapidly increased efficiency and various fabrication processes and structure designs [82].

Fig. 1.8

ZnO nanowires for dye-sensitized solar cells. Reprinted with permission from Law et al. [80]; copyright (2005) Macmillan Publishers Ltd. The height of the wires is near 16 μm and their diameters vary between 130 and 200 nm. a Schematic of the cell with light incident through the bottom electrode. b SEM image of a cleaved nanowire array

Fast-depleting reserves of conventional energy sources have resulted in an urgent need for increasing energy conversion efficiencies and recycling waste heat. One of the potential candidates for fulfilling these requirements is the thermophotovoltaic device, which generates electricity from either the complete combustion of different fuels or the waste heat of other energy sources, thereby saving energy. The thermal radiation from the emitter is incident on a photovoltaic cell, which generates electrical currents. Applications of such devices range from hybrid electric vehicles to power sources for microelectronic systems. At present, thermophotovoltaic systems suffer from low conversion efficiency. Nanostructures have been extensively used to engineer surfaces with designed absorption, reflection, and emission characteristics. Moreover, at the nanoscale, the radiative energy transfer can be greatly enhanced due to tunneling and the enhanced local density of states. A viable solution to increase the thermophotovoltaic efficiency is to apply microscale radiation principles in the design of different components to utilize the characteristics of thermal radiation at small distances and in microstructures [83].

Concentrated solar power (CSP) has regained interest with significant governmental investments in countries such as Spain, the United States, Australia, China, the United Arab Emirates, and India. Solar energy is reflected by a large array of mirrors to a central receiver (power tower) to create a high-temperature source, which heats a working fluid that is then used to generate electricity through a steam turbine or a gas turbine power plant [84]. CSP may be combined with a thermal storage system for operation at night or during bad weather; therefore, it can potentially offer a high-efficiency and cost-effective renewable energy solution. Challenges remain in the selection of materials for energy storage, the receiver efficiency, and system integration [85]. Various spectrally selective absorbers/emitters that can operate at high temperatures are being developed using multilayers and nanostructures [86, 87]. Thermochemical cycles with redox reactions may be used for high-temperature storage to further boost the efficiency of CSP [88, 89]. Further research is needed to understand the materials properties, develop and identify suitable storage substances, as well as develop high-temperature, high-pressure power systems [84, 85].

Alternatives to traditional turbomachinery have also been developed to improve the cost effectiveness [85]. One such approach is the solar thermophotovoltaic (STPV) system [90, 91]. As illustrated in Fig. 1.9, an intermediate absorber/emitter assembly absorbs the concentrated solar radiation and re-emits it at a relatively lower temperature over a larger area toward a PV cell, which then generates electricity. The prototype by Lenert et al. [90] used a carbon nanotube array to fully absorb the solar irradiation and a 1D Si/SiO2 photonic crystal as the selective emitter to match the PV cell bandgap of 0.55 eV (\(\lambda = 2.26\,\upmu{\text{m}}\)). An efficiency of 3.2% was demonstrated, and an efficiency of 6.8% was later achieved using an improved PC design with a bandpass filter [91]. These devices are scalable, and significant improvement may be made in order to approach the theoretical conversion efficiency limit of about 60%.
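The correspondence between the 0.55 eV bandgap and the 2.26 μm wavelength follows from the standard photon energy relation \(\lambda\,(\upmu\text{m}) = 1.2398/E_{g}\,(\text{eV})\). A one-line check in Python (the silicon value is included only for comparison):

```python
def cutoff_wavelength_um(bandgap_eV: float) -> float:
    """Photon wavelength corresponding to a bandgap energy: lambda (um) = 1.2398 / E_g (eV)."""
    return 1.2398 / bandgap_eV

print(cutoff_wavelength_um(0.55))  # ~2.25 um, matching the value quoted for the 0.55 eV cell
print(cutoff_wavelength_um(1.12))  # ~1.11 um for silicon, whose larger gap suits a hotter emitter
```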

Fig. 1.9

Reprinted with permission from Lenert et al. [90]; copyright (2014) Springer Nature

A nanophotonic solar TPV: a Schematic of absorber/emitter with the PV cell for harvesting concentrated solar power; b Microscopic image of cross-section.

Hydrogen technologies are being considered and actively pursued as the energy source of the future [92]. There are two ways in which hydrogen (H2) may be used: one is in a combustion heat engine, where hydrogen reacts with oxygen intensively while releasing heat; the other is in a fuel cell, where an electrochemical reaction occurs quietly to generate electricity, just as in a battery. Because the only reaction product is water, hydrogen-powered automobiles can in principle be made pollution free. Grand challenges exist in the generation, storage, and transport of hydrogen. If all hydrogen is obtained from fossil fuels, there will be no reduction in either the fossil fuel consumption or the carbon dioxide emission, except that the emission is centralized at the hydrogen production plant. Alternatively, hydrogen may be produced from water with other energy sources, such as renewable energy sources. Nanomaterials are being developed to address several key issues related to hydrogen technologies, such as hydrogen storage using nanoporous materials, effective hydrogen generation by harvesting solar energy with inexpensive photovoltaic materials, and fuel cells based on nanostructured catalysts [93, 94]. Effective thermal management and cooling are also very important to improve the performance and reliability of fuel cell technology [75, 95].

Lithium-ion batteries are commonly used in cell phones, laptops, and electric cars, and competing technologies have also been developed and commercialized. The 2019 Nobel Prize in Chemistry honored John B. Goodenough, M. Stanley Whittingham, and Akira Yoshino for their pioneering research toward the development of Li-ion batteries during the 1970s and 1980s. Overheating or thermal runaway has been known to cause device failure as well as fire disasters. Therefore, understanding the thermal properties and thermal transport at the microscale is critically important for improving performance and reliability [96, 97]. Nanobatteries and nanogenerators have also been actively explored [98, 99].

1.3.6 Biomolecule Imaging and Molecular Electronics

Optical microscopy has played an instrumental role in medical diagnosis because it allows us to see bacteria and blood cells. Optical wavelengths are more desirable than x-rays or electron beams because they are less invasive and more convenient. However, the resolution of a traditional microscope is on the order of half the wavelength due to the diffraction limit. While the concept of near-field imaging existed in the literature before 1930, it was largely forgotten because of the inability to build the required structures and control their motion. With microfabrication and precision-positioning capabilities, near-field scanning optical microscopes (NSOMs, also called SNOMs) were realized in the early 1980s by different groups and have been extensively used for biomolecule imaging with a resolution of 20–50 nm [100]. The principle is to bring light through a tapered fiber with a very small aperture at its end, or through an aperture of very small diameter. The beam exiting the fiber tip or aperture diverges quickly if the sample is placed in the far field, that is, away from the aperture. However, high resolution can be achieved by placing the sample in close proximity to the aperture, within a distance much less than the wavelength, that is, in the near field, such that the beam size is almost the same as the aperture. An apertureless metallic tip can also be integrated with an SPM to guide the electromagnetic wave via surface plasmon resonance, achieving a spatial resolution as high as 10 nm for high-resolution imaging and processing. There have since been extensive studies on near-field interactions between electromagnetic waves and nanostructured materials, from semiconductor QDs and metallic nanoaperture and nanohole arrays to DNA and RNA structures. The 2014 Nobel Prize in Chemistry was awarded to Eric Betzig, Stefan W. Hell, and William E. Moerner for the development of super-resolved fluorescence microscopy for imaging individual molecules. The 2017 Nobel Prize in Chemistry recognized the development of cryo-electron microscopy by Jacques Dubochet, Joachim Frank, and Richard Henderson for probing biomolecules with high resolution.
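To put these resolution figures in perspective, a rough estimate based on the standard Abbe criterion (a textbook result, not a value taken from [100]) gives the far-field diffraction limit as

\[
d_{\min} \approx \frac{\lambda}{2\,\mathrm{NA}} \approx \frac{\lambda}{2} \approx 250\ \mathrm{nm}
\quad \text{for}\ \lambda = 500\ \mathrm{nm}\ \text{and a numerical aperture}\ \mathrm{NA} \approx 1,
\]

whereas in the near field the resolution is set by the aperture (or tip) size and the tip-sample spacing rather than by the wavelength, which is how the 20–50 nm and 10 nm values mentioned above become attainable.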

Nanoparticles are among the earliest known nanostructures; they have been used for centuries in making stained glass with gold or other metallic nanoparticles, as well as photographic films with silver nanoparticles. A QD has a spherical core encapsulated in a shell made of another semiconductor material, such as a CdSe core in a ZnS shell. The outer shell is only several monolayers thick, and the diameters of QDs range from 2 to 10 nm. The material of the inner core has a smaller bandgap. Quantum confinement in the core results in size-dependent fluorescent properties. Compared with the molecular dyes conventionally used for fluorescent labeling in cellular imaging, the emission from QD fluorophores is brighter with a narrower spectral width. QDs also allow excitation at shorter wavelengths, making it easier to separate the fluorescent signal from the scattered one, and are resistant to the photobleaching that causes dyes to lose fluorescence. Furthermore, the emission wavelength can be selected by varying the core size of QDs to provide multicolor labeling. It was first demonstrated in 1998 that QDs could be conjugated to biomolecules such as antibodies, peptides, and DNA, enabling surface passivation and water solubility. In recent years, significant development has been made to employ QDs for in vivo and in vitro imaging, labeling, and sensing [101, 102].
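A rough sketch of why the emission is size dependent can be obtained from a particle-in-a-sphere estimate in the spirit of the Brus model (the exciton Coulomb attraction and material-specific band parameters are neglected here for simplicity): the lowest transition energy of a core of radius \(R\) is approximately

\[
E(R) \approx E_g + \frac{\hbar^2 \pi^2}{2R^2}\left(\frac{1}{m_e^*} + \frac{1}{m_h^*}\right),
\]

where \(m_e^*\) and \(m_h^*\) are the electron and hole effective masses. Shrinking the core radius blue-shifts the emission, which is why smaller QDs emit at shorter wavelengths and the emission color can be tuned by the core size alone.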

CMOS technology is a top-down semiconductor fabrication process, in which patterns are created by first making a mask and then printing the desired features onto the surface of the wafer via lithography. Integrated circuits have dominated technological and economic progress in the past 40 years, and complex, high-density devices have been manufactured on silicon wafers. However, this technology is approaching a limit, as the smallest feature size is now less than a few nanometers, or only about ten unit cells. While opportunities still remain in semiconductor technology as discussed previously [8], molecular electronics is considered a promising alternative [103]. A 3D assembly with short interconnect distances would greatly increase the information storage density and transfer speed while reducing the power consumption and the amount of heat dissipated. Self-assembly refers to naturally occurring processes, from biological growth to galaxy formation. In materials synthesis, self-assembly implies that the end products or structures are formed under favorable conditions and environments. An example is the growth of bulk crystals from a seed. Fullerenes and nanotubes are formed by self-assembly, not by slicing a piece of graphite and then rolling and bending it into the shape of a tube or a shell. Self-assembly is referred to as a bottom-up process, like constructing an airplane model with LEGO pieces. Biological systems rely on self-assembly and self-replication to develop. Since 2000, CNT-based transistors have been built by several groups and found to be able to outperform Si-based ones. Transistors have also been created using a single molecule of a transition-metal organic complex as a nanobridge between two electrodes [104]. Because of the small dimensions, quantum mechanics should govern the electrical and mechanical behaviors [105]. Figure 1.10 illustrates an engineered DNA strand between metallic atoms; note that the width of a DNA strand is around 2 nm. Such a structure could function as a sensor or other electronic component. Molecular electronics, while still in its infancy, is expected to revolutionize the electronics industry and to enable continuous technological progress through the twenty-first century.

Fig. 1.10

Courtesy of NASA Ames Center of Nanotechnology

An engineered DNA strand between metal and atom contacts that could function as a molecular electronics device.

Nano/microscale research and discoveries have been instrumental to the development of technologies used today in microelectronics, photonics, communication, manufacturing, and biomedicine. However, systematic and large-scale government investment in nanoscience and engineering did not start until the late 1990s, when the Interagency Working Group on Nanoscience, Engineering, and Technology (IWGN) was formed under the National Science and Technology Council (NSTC). The first report was released in fall 1999, entitled "Nanostructure Science and Technology," followed by the report "Nanotechnology Research Directions." In July 2000, NSTC published the "National Nanotechnology Initiative (NNI)." A large number of nanotechnology centers and nanofabrication facilities have been established since then; see www.nano.gov. In the United States, government spending on nanotechnology R&D exceeded $1 billion in 2005, as compared to $464 million in 2001 and approximately $116 million in 1997. The total government investment worldwide was over $4 billion in 2005, with Japan and European countries investing similar amounts as the United States. Over 60 countries have launched nanotechnology research programs. The NNI funding totaled nearly $29 billion from 2001 to 2020. Recognizing the increasing impact on engineering and science, the American Society of Mechanical Engineers established the ASME Nanotechnology Institute in mid-2001 and sponsored a large number of international conferences and workshops. Understanding thermal transport and thermal properties at the nanoscale is extremely important, as mentioned earlier. In 2008, Professor "Bob" D. Y. Tzou initiated the ASME International Conference on Microscale/Nanoscale Heat and Mass Transfer (MNHMT) in Tainan, which was followed by five successive MNHMTs in Shanghai, Atlanta, Hong Kong, Singapore, and Dalian [106–110]. These conferences have provided a highly interactive forum for researchers, educators, and practitioners around the world to exchange and promote knowledge and new advances in state-of-the-art research and development in this interdisciplinary field. The ASME Heat Transfer Division established its committee on Nanoscale Thermal Transport in 2012 and has organized many focused research sessions at various ASME conferences.

Engineers have the responsibility to translate basic science findings into technological advances, to design and develop better materials with desired functions, to build systems that integrate from small to large scales, to perform realistic modeling and simulation that facilitate practical realization of improved performance at continuously reduced cost, and to conduct quantitative measurements and tests that determine material properties and system performance. Like any other technology, nanotechnology may also have adverse effects, such as toxic products and biochemical hazards, which are harmful to human health and the environment. There are also issues and debates concerning security, ethics, and religion. Governmental and industrial standards organizations, as well as universities, have paid great attention to the societal implications and education issues in recent years. Optimists believe that we can continue to harness nanobiotechnology to improve the quality of human life and benefit social progress, while overcoming the adverse effects, as we have done with electricity, chemical plants, and space technology.

1.4 Objectives and Organization of This Book

Scientists, engineers, entrepreneurs, and lawmakers must work together for research outcomes to be transferred into practical products that advance technology and benefit society. Nanotechnology is still at an early stage and holds tremendous potential; therefore, it is important to educate a large number of engineers with a solid background in nanoscale analysis and design so that they will become tomorrow's leaders and inventors. There is a growing demand for educating mechanical engineering students, at both the graduate and undergraduate levels, with a background in thermal transport at the micro/nanoscale. Micro/nanoscale heat transfer courses have been introduced in a number of universities; however, most of these courses are limited to the graduate level. While an edited book on Microscale Energy Transport has been available since 1998 [3], it is difficult to use as a textbook due to the lack of examples, homework problems, and sufficient detail on each subject. Some universities have introduced nanotechnology-related courses for freshmen and sophomores, with no in-depth coverage of the underlying physics. A large number of institutions have introduced joint mechanical-electrical engineering courses on MEMS/NEMS, with a focus on device-level manufacturing and processing technology. To understand the thermal transport phenomena and thermophysical properties at small length scales, learning the concepts and principles of quantum mechanics, solid-state physics, and electrodynamics is inevitable, though difficult, for engineering students.

The aim of this book is to introduce the much-needed physics knowledge without overwhelming readers with mathematical operators or notions that are unfamiliar to engineering students. Therefore, this book can be used as the textbook not only in a graduate-level course but also in an elective for senior engineering undergraduates. While the book contains numerous equations, the mathematics required mostly does not exceed engineering calculus, including series, differential and integral equations, and some vector and matrix algebra. The reason for including such a large number of equations is to provide the necessary derivation steps, so that readers can follow and understand them clearly. This is particularly helpful for practicing engineers who do not have a large number of references at hand. The emphasis of this book is placed on a fundamental understanding of the phenomena and properties: that is, why do we need particular equations and how can we apply them to solve thermal transport problems at the prescribed length and time scales? Selected and refined examples are provided that are both practical and illustrative. At the end of each of the remaining nine chapters, a large number of exercises are given at various levels of complexity and difficulty. Numerical methods are not presented in this book. Most of the problems can be solved with a personal computer using typical software or a spreadsheet. Some open-source codes are accessible and downloadable from the author's website. For course instructors, the solutions of many homework problems can be obtained from the author.

The field of micro/nanoscale heat transfer was cultivated and fostered by Professor Chang-Lin Tien beginning in the late 1980s, along with the rapid development in microelectronics, MEMS, and nanotechnology. His long-lasting and legendary contributions to thermal science research have been summarized in a volume of Annual Review of Heat Transfer [111]. As early as the 1960s, Professor Tien investigated the fundamentals of the radiative properties of gas molecules, the size effect on the thermal conductivity of thin films and wires, and radiation tunneling between closely spaced surfaces. In 1971, he and John H. Lienhard authored a book entitled Statistical Thermodynamics, which provides inspiring discussions of early quantum mechanics and models of the thermal properties of gases, liquids, and crystalline solids. While thermodynamics is a required course for mechanical engineering students, the principles of thermodynamics cannot be understood without a detailed background in statistical thermodynamics. Statistical mechanics and kinetic theory are also critical for understanding thermal properties and transport phenomena.

Chapter 2 provides an overview of equilibrium thermodynamics, heat transfer, and fluid mechanics. Building on the undergraduate mechanical engineering curriculum, the material is introduced in a rather different sequence to emphasize thermal equilibrium, the second law of thermodynamics, and thermodynamic relations. The concept of entropy is rigorously defined and applied to analyze conduction and convection heat transfer problems in this chapter. It should be noted that an extensive discussion of the entropy of radiation is given in Chap. 8.

Chapter 3 introduces statistical mechanics and derives the classical (Maxwell–Boltzmann) statistics and the quantum (i.e., Bose–Einstein and Fermi–Dirac) statistics. The first, second, and third laws of thermodynamics are presented with a microscopic interpretation, leading to a discussion of the Bose–Einstein condensate and laser cooling of atoms. The classical statistics is used extensively to obtain the ideal gas equation, the velocity distribution, and the specific heat. A concise presentation of elementary quantum mechanics is then provided, which helps students gain a deeper understanding of the earlier parts of the chapter; for example, the quantization of energy levels underlies the energy storage mechanisms of translation, rotation, and vibration used in modeling the specific heat of ideal polyatomic gases. The combined knowledge of quantum mechanics and statistical thermodynamics is important for subsequent studies. The concept of the photon as an elementary particle and how it interacts with an atom are discussed according to Einstein's 1917 paper on the atomic absorption and emission mechanisms. Finally, the special theory of relativity is briefly introduced to help understand the limitation of mass conservation and the generality of the law of energy conservation.

Chapter 4 begins with a very basic kinetic theory of dilute gases and provides a microscopic understanding of pressure and shear. With the help of the mean free path and the average collision distance, the transport coefficients such as viscosity, thermal conductivity, and the mass diffusion coefficient are described. Following a discussion of intermolecular forces, the Boltzmann transport equation (BTE) is presented in detail, from which the hydrodynamic equations as well as Fourier's law of heat conduction can be fully derived under appropriate approximations. The regimes of microflow are then described based on the Knudsen number, and current methods for dealing with microfluidics are summarized. The heat transfer associated with slip flow and temperature jump is presented in more detail for a simple planar geometry. Then, gas conduction between two surfaces in the free molecule flow regime is derived. These examples, while simple, capture some of the basics of microfluidics. No further discussion is given on the properties of liquids or multiphase fluids. It should be noted that several books on microflow already exist in the literature.

The next three chapters provide a comprehensive treatment of nano/microscale heat transfer in solids, with an emphasis on the physical phenomena as well as material properties. The material covered in Chap. 5 is based on the simple free-electron model, kinetic theory, and the BTE, without requiring a detailed background in solid-state physics, which is discussed afterward in Chap. 6. This not only helps students comprehend the basic, underlying physical mechanisms but also allows the instructor to integrate Chap. 5 into a graduate heat conduction course. For an undergraduate elective, Chap. 6 can be treated as reading material or reference without spending too much time going through the details in class. In Chap. 5, the theory of specific heat is presented with a detailed treatment of the quantum size effect. Similarly, the theory of thermal conductivity of metals and dielectric solids is introduced. Because of the direct relation between the electrical and thermal conductivities and the importance of thermoelectric effects, irreversible thermodynamics and thermoelectricity are also introduced. The classical size effect on thermal conductivity due to boundary scattering is elaborated. Finally, the concept of quantum conductance (both electric and thermal) is introduced.

Chapter 6 introduces the electronic band structures and phonon dispersion relations in solids. It helps in understanding semiconductor physics and some of the difficulties of the free-electron model for metals. Photoemission, thermionic emission, and electron tunneling phenomena are introduced. Electrical transport in semiconductors is described with applications in energy conversion and optoelectronic devices. Chapter 7 focuses on nonequilibrium energy transport in nanostructures, including non-Fourier equations for transient heat conduction. The equation of phonon radiative transfer is presented and solved for thin-film and multilayer structures. The phenomenon of thermal boundary resistance is studied microscopically. A regime map is developed in terms of the length scale and the time scale from macroscale to microscale to nanoscale heat conduction. Additional reading materials on multiscale modeling, atomistic modeling, and thermal metrology are provided as references.

The last three chapters give a comprehensive discussion of nano/microscale radiation, with extensive background on the fundamentals of electromagnetic waves, the optical and thermal radiative properties of materials and surfaces, and recent advances in nanophotonics and nanoscale radiative transfer. Chapter 8 presents the Maxwell equations of electromagnetic waves and the derivation of Planck's law and radiation entropy. The electric and magnetic properties of a newly developed class of materials, namely, negative-refractive-index materials, are also discussed. A more extensive discussion of the radiative properties of thin films, gratings, and rough surfaces is given in Chap. 9. The wave interference, partial coherence, and diffraction phenomena are introduced with detailed formulations. Furthermore, various types of surface polaritons and localized excitations in nanostructures and 2D materials are extensively discussed. The focus of Chap. 10 is on near-field thermal radiation, with formulations ranging from simple semi-infinite parallel plates to complicated systems. In addition, Chap. 10 reviews contemporary numerical simulation methods for computing nanoscale thermal radiation and recent experimental techniques for measuring nanoscale radiative transfer. These advancements will continue and are expected to have a huge impact on energy conversion devices, sensors, and nanoscale photothermal manufacturing.

It is hoped that the present text can be used either as a whole in a one-semester course, or in part, integrated into an existing thermal science course for several weeks on a particular topic. Examples are graduate-level thermodynamics (Chaps. 2 and 3), convection heat transfer (Chap. 4), conduction heat transfer (Chaps. 5–7), and radiation heat transfer (Chaps. 8–10). Selected materials may also be used to introduce nanoscale thermal sciences in undergraduate heat transfer and fluid mechanics courses. Some universities offer a second course on thermodynamics at the undergraduate level, for which statistical thermodynamics and quantum theory can also be introduced. This text can also be used for self-study by researchers or practicing engineers who graduated from a traditional engineering discipline. A large effort has been made to balance depth with breadth so that the text is easy to understand and contains sufficient coverage of both the fundamentals and advanced developments in the field. Readers will gain the background necessary to understand contemporary research in nano/microscale thermal engineering and to solve a variety of practical problems using the approaches presented in the text, along with the codes accessible from the author's website [112].