
1 Introduction

Cosmology has intrigued humanity since very early times. Questions like “How did the Universe begin?”, “How old is it?” or “How big is it?” accompanied society and religion well before modern science had a say in our understanding of nature. Despite all this interest, and despite the efforts of the greatest minds of their times, it was not until the sixteenth and seventeenth centuries that science finally established that it is the Earth that goes around the Sun, demolishing many centuries of dogmatic thinking that placed the Earth at the center of the Universe. Among other factors, one was key in making this possible: Galileo's use of the telescope drastically expanded the horizons of astronomical observation, producing the experimental evidence required to settle such a fundamental question. This event marked an inflection point in cosmology, followed by a series of great discoveries, always tied to new technologies allowing for finer and deeper observations. It was in this way that, some three centuries later (1929), Edwin Hubble, observing from the world’s largest telescope at Mount Wilson, made one of the most remarkable discoveries by showing that the Universe is not static but expanding, giving a new twist to Einstein's recently developed theory of gravity (General Relativity) and providing the basis for the revolutionary ideas that gave birth to Big Bang cosmology. Another 35 years of advances in electronics and telecommunications made possible yet another astonishing discovery, when two engineers, Arno Penzias and Robert W. Wilson, while testing a new transatlantic communication system, fortuitously detected the afterglow of the Big Bang, the Cosmic Microwave Background (CMB), settling the idea that the Universe experienced a hot and dense era before expanding into its current state. These examples show how scientific development is tightly linked to the incorporation of new technologies that improve the observation of nature, providing new experimental evidence to support theoretical ideas.

The study of the Cosmic Microwave Background has revolutionized Cosmology, as it provides a snapshot of the early state of the Universe with detailed information about its composition, structure and physical state. This signal, the oldest we can observe in the form of electromagnetic radiation, reaches us as a 2.725 K blackbody spectrum, carrying most of the relevant information in tiny fluctuations of order tens of micro-Kelvin in temperature and a few micro-Kelvin in polarization. Moreover, the signal is extended on the sky, requiring large high-fidelity maps to extract the relevant information. Measuring this signal is a huge technical challenge which has only been met recently thanks to the development of high-sensitivity detectors and readout systems, cryogenics, well-understood reflective and refractive optical designs, and powerful computer systems able to analyze huge volumes of data.

These pages summarize four lectures given at the II JPBCosmo School (2014) on the state of the art of CMB observation, centered on the case of the Atacama Cosmology Telescope, and including a description of the instrument, observing techniques, data reduction and analysis, and new prospects for observing the CMB polarization signal.

2 Precision Cosmology

Historically, Cosmology was long considered a very speculative and uncertain science, characterized by bold hypotheses and little evidence. The term “Precision Cosmology” appeared after CMB observations brought in superb experimental evidence, allowing us to strongly constrain our cosmological models (to better than 1 % precision) and producing this change of paradigm.

After Penzias and Wilson’s discovery in 1964 (Penzias and Wilson 1965), almost 30 years had to pass until two instruments on board the COBE satellite unambiguously measured the spectrum and anisotropies of the CMB for the first time (Smoot et al. 1992; Mather et al. 1994). COBE’s discovery was followed by a rush of ground-based and balloon-borne experiments (for examples see de Bernardis et al. 1999; Hanany et al. 2000; Devlin et al. 1999; Carlstrom & DASI Collaboration 2000) which systematically contributed to the development of the new technologies and techniques required to dig into the rich information frozen in the CMB signal. Most of these experiments were intended to measure the tiny angular anisotropies of the CMB, requiring ever-increasing sensitivities and angular resolutions. In 2001 the WMAP satellite was launched by NASA, later producing the most precise measurement of the CMB to date. After 9 years of observations “the allowed volume of cosmological parameters was reduced by a factor in excess of 68,000” (http://map.gsfc.nasa.gov, 2013), justifying the concept of precision cosmology.

WMAP produced a full map of the sky with an angular resolution of 0.2°, corresponding to multipoles below ℓ ≈ 500, which is enough to measure the first three peaks of the CMB power spectrum, and with enough sensitivity to determine those multipoles to their cosmic variance limit. Further exploration of the CMB required mapping at finer resolutions and measuring the polarization of the signal, which was only done to a limited degree by WMAP. In 2009 the European Space Agency launched Planck (http://www.esa.int/Our_Activities/Space_Science/Planck, 2013), another space observatory with sensitivity and resolution improved over WMAP’s; its resolution reached 5 arcmin, measuring multipoles well down the diffusion-damping tail of the CMB power spectrum, up to ℓ ≈ 2000. Planck confirmed and substantially improved upon the WMAP results, further constraining cosmological parameters and characterizing the microwave signal across the sky, and continues to release results today. One of the most anticipated results from Planck is its polarization maps and analysis, which may shed light on the very early, inflationary epochs of the Universe.

The spatial resolution of CMB observations from space is limited by the size of the optics that can be put on a satellite, which means that finer angular resolutions are better achieved from the ground. This brings another set of great challenges, especially because the atmosphere interacts strongly with millimeter wavelengths, mostly through its water vapor content. Observations must therefore be made at sites where the atmosphere is extremely dry and thin, such as Antarctica or the highlands of the Atacama Desert in Chile. This complication is balanced by the possibility of deploying large arrays of high-sensitivity detectors at a much lower cost than in space. In 2007, the South Pole Telescope (SPT, http://pole.uchicago.edu, 2013) and the Atacama Cosmology Telescope (ACT, http://www.princeton.edu/act, 2014), with 10 and 6 m main apertures, began observing from the South Pole and from the Atacama Desert respectively. Their higher resolution, combined with the extreme sensitivity provided by their kilo-pixel class cameras, allowed these experiments to produce very fine maps of the CMB over thousands of square degrees of sky. With this finer resolution (of order 1 arcmin), these experiments measured the CMB power spectrum all the way down the diffusion tail and into the secondary-anisotropy region, dominated by the cosmic infrared background, thermal and kinetic Sunyaev-Zel’dovich signals and radio sources, dramatically expanding the available science to later epochs of the Universe, while improving the “standard” CMB science by measuring these foreground signals and decoupling them from the CMB.

Figure 1 compares the maps from COBE, WMAP, Planck and ACT, illustrating the effect of increasing the map resolution.

Fig. 1

Direct comparison of the resolution achieved by the COBE, WMAP, Planck and ACT maps. The increase in resolution is evident when moving from left to right. The finer resolution of ACT (lower right circle) clearly reveals foreground galaxies and clusters of galaxies through their thermal Sunyaev-Zel’dovich effect signature, as indicated

In the following sections we describe technical aspects of the Atacama Cosmology Telescope, as a reference for the technologies and techniques required to observe the CMB from the ground.

3 The Atacama Cosmology Telescope

The Atacama Cosmology Telescope is a 6 m telescope installed at 5200 m altitude on Cerro Toco, a few kilometers away from the ALMA site on the Chajnantor plateau. It is an instrument dedicated to measuring the CMB at millimeter wavelengths over large areas of the sky. It started operation in 2007 with MBAC, a 3 kilo-pixel TES bolometer camera observing in three bands (150, 220 and 280 GHz) with unprecedented sensitivity (Swetz et al. 2011). At these frequencies the resolution is better than 1 arcmin, permitting the direct detection of a large number of extragalactic point sources. Moreover, the three bands are chosen to probe the decrement, null and increment frequencies of the Sunyaev-Zel’dovich (SZ) effect, clearly distinguishing clusters of galaxies from the background and producing a redshift-independent blind survey of these objects limited only by their mass.
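
To see why those bands are well suited to the SZ effect, the short Python sketch below evaluates the standard non-relativistic thermal SZ spectral function (a textbook formula, not part of the ACT pipeline) at the three MBAC frequencies; its sign flips from negative (decrement) to positive (increment) across the null near 217 GHz.

```python
import numpy as np

# Non-relativistic thermal SZ spectral dependence, Delta T / T_CMB = y * f(x),
# with x = h * nu / (k_B * T_CMB). f(x) is negative below ~217 GHz (decrement),
# close to zero near 217 GHz (null) and positive above (increment).
h, k_B, T_cmb = 6.626e-34, 1.381e-23, 2.725   # SI units, T_CMB in kelvin

def f_sz(nu_ghz):
    x = h * nu_ghz * 1e9 / (k_B * T_cmb)
    return x * (np.exp(x) + 1.0) / (np.exp(x) - 1.0) - 4.0

for nu in (150, 220, 280):                    # the three MBAC bands [GHz]
    print(f"{nu} GHz: f(x) = {f_sz(nu):+.2f}")
# Expected signs: negative at 150 GHz, near zero at 220 GHz, positive at 280 GHz.
```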

MBAC observed for four seasons before being decommissioned in 2011. During those years it mapped nearly 2000 square degrees over two long stripes of the sky. The scientific outcome was a long list of publications (for the main results and references see Niemack et al. 2008; Menanteau et al. 2010, 2013; Sherwin et al. 2011; Hand et al. 2012; Dünner et al. 2013; Sifón et al. 2013; Calabrese et al. 2013; Sievers et al. 2013; Hasselfield et al. 2013a,b; Dunkley et al. 2013; Sehgal et al. 2013; Das et al. 2014; Louis et al. 2014; Marsden et al. 2014), ranging from new cosmological parameter constraints, through the detection of dozens of galaxy clusters and extragalactic sources, the first detection of the kinetic SZ effect, the first detection of gravitational lensing of the CMB, and the first detection of Dark Energy effects using CMB data alone, to the new observing and data-processing techniques required to achieve these results.

At the relevant frequencies, radiation interacts strongly with the rotational and vibrational modes of water molecules, so observing the CMB requires extremely dry atmospheric conditions, quantified as Precipitable Water Vapor (PWV). Typical PWV values around the Earth range from 2 to 50 mm, while the median PWV at the ACT site is only 0.5 mm, justifying such an extreme location for the telescope.

To measure temperature fluctuations of only a few micro-Kelvin we need extremely sensitive detectors, which implies extremely low noise levels. Any electronic element produces noise proportional to its temperature, so to measure such low-temperature radiation the detectors must themselves operate at very low temperature. In the case of MBAC, the detectors operated at only 0.3 K, together with most of the superconducting readout electronics. These temperatures are achieved with cryogenic systems based on the helium isotopes He-4 and He-3, operating in much the same way as a household refrigerator. For this, the entire refractive optics and the detectors are kept under vacuum inside the camera, which is built like a Matryoshka doll, with nested layers at decreasing temperatures until the lowest temperature is reached at the center, where the detectors sit.

The detectors in MBAC were Transition Edge Sensors (TES): superconducting devices operated right at the transition between their normal and superconducting states. In this regime, tiny changes in the temperature of the device produce large changes in its resistance, which is measured by running a current through it; in practice they are extremely sensitive thermometers. When radiation falls on one of these detectors it warms up, so very small changes in the radiation loading can be detected. Since they do not discriminate the wavelength of the incoming radiation, responding only to the total power absorbed by the device, these detectors are called bolometers.
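
As a rough illustration of this operating principle, the following Python sketch integrates a highly simplified, voltage-biased bolometer model with made-up parameter values (not those of the real MBAC detectors): absorbed power warms the TES island, a weak thermal link drains heat to the 0.3 K bath, and the steep resistance-versus-temperature transition turns picowatt-level changes in optical loading into measurable resistance changes.

```python
import numpy as np

# Highly simplified, voltage-biased TES bolometer model with illustrative
# (made-up) parameters -- not the actual MBAC detector values.
T_bath = 0.3       # bath temperature [K]
T_c    = 0.5       # superconducting transition temperature [K]
dT_c   = 0.002     # transition width [K]
R_n    = 0.03      # normal-state resistance [ohm]
G      = 1e-11     # thermal conductance to the bath [W/K]
C      = 1e-12     # heat capacity of the TES island [J/K]
V_bias = 1.22e-7   # constant voltage bias [V]

def resistance(T):
    """Steep, smooth resistance-vs-temperature transition."""
    return R_n / (1.0 + np.exp(-(T - T_c) / dT_c))

def steady_state(P_optical, t_max=0.05, dt=1e-5):
    """Integrate dT/dt = (P_opt + V^2/R - G*(T - T_bath)) / C and return the final R."""
    T = T_c                                   # start on the transition
    for _ in range(int(t_max / dt)):
        R = resistance(T)
        P_joule = V_bias**2 / R               # bias (Joule) heating; gives negative feedback
        T += (P_optical + P_joule - G * (T - T_bath)) / C * dt
    return resistance(T)

# A 0.1 pW change in optical loading produces a measurable resistance change.
R_low  = steady_state(P_optical=1.0e-12)      # 1.0 pW loading
R_high = steady_state(P_optical=1.1e-12)      # 1.1 pW loading
print(f"resistance change for a 0.1 pW step: {R_high - R_low:.2e} ohm")
```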

An important property of TESs is that they can be micro-fabricated using techniques similar to those developed by the electronics industry, so they can be cheaply packed into large arrays, increasing the telescope's sensitivity. For instance, MBAC contained three detector arrays, one per band, each with 1024 detectors.

The optical design of the telescope and camera is very important for achieving a good result. As the CMB is an extended source, and considering that at these wavelengths the images produced are diffraction limited (unlike optical telescopes, where the image resolution is normally limited by atmospheric turbulence, or seeing), the shape of the features in the maps is directly affected by the properties of the optics. Moreover, given the faintness of the signal, stray radiation from the surrounding landscape can enter the telescope and produce ghost images, motivating the installation of large baffles around the telescope to direct all stray light to the sky. To reduce diffraction features, the optical design is off-axis, clearing the optical path. To achieve the large focal plane necessary to accommodate many detectors, the camera contains refractive optics composed of filters, lenses and a Lyot stop that limits the illumination of the primary mirror and minimizes spillover. These lenses are analogous to those of optical telescopes, but are made of very different materials because they must operate at these long wavelengths.

Observations are performed by continuously scanning the sky at constant elevation while the Earth's rotation slowly moves the sky across the field of view. Scanning at constant elevation keeps the airmass constant, which is important because atmospheric water vapor is the main source of radiation, as mentioned before. The same patch of the sky is observed while rising and while setting, producing a cross-linked observation pattern which is important for reconstructing modes in all directions during map making. Unlike optical telescopes, which integrate the radiation falling on their detectors while staring at a single place in the sky, ACT reads its detectors continuously while scanning, producing long time-streams of data occupying many terabytes of disk space. We call these time-streams Time Ordered Data, or TODs. All this data must then be reduced offline to produce an image of the sky. A minimal sketch of such a constant-elevation scan is shown below.
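
The following Python sketch (using astropy; the site coordinates, date, elevation and scan throw are rough illustrative values, not the real ACT observing schedule) generates ten minutes of a constant-elevation azimuth scan and shows how the Earth's rotation drifts the scanned strip in celestial coordinates.

```python
import numpy as np
import astropy.units as u
from astropy.coordinates import AltAz, EarthLocation, SkyCoord
from astropy.time import Time

# Toy constant-elevation scan. Site coordinates, date and scan parameters are
# rough illustrative values, not the actual ACT observing schedule.
site = EarthLocation(lat=-22.96 * u.deg, lon=-67.79 * u.deg, height=5190 * u.m)
t0 = Time("2014-06-01 03:00:00")
dt = np.linspace(0.0, 600.0, 2000)              # 10 minutes of pointing samples [s]
times = t0 + dt * u.s

# The azimuth sweeps back and forth (triangle wave) while the elevation stays
# fixed, so the airmass -- and hence the atmospheric loading -- stays constant.
az_center, az_throw, period, elevation = 150.0, 5.0, 20.0, 50.0   # deg, deg, s, deg
phase = (dt / period) % 1.0
az = az_center + az_throw * (2.0 * np.abs(2.0 * phase - 1.0) - 1.0)
alt = np.full_like(az, elevation)

# Earth rotation slowly drags the scanned strip across the celestial sphere,
# producing the drift that, combined with rising/setting scans, gives cross-linking.
pointing = SkyCoord(AltAz(az=az * u.deg, alt=alt * u.deg, obstime=times, location=site))
radec = pointing.icrs
print("RA  range [deg]:", radec.ra.deg.min(), "-", radec.ra.deg.max())
print("Dec range [deg]:", radec.dec.deg.min(), "-", radec.dec.deg.max())
```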

The raw data from the telescope are very noisy. Most of the power comes from atmospheric turbulence, which clearly dominates at low frequencies in the time-streams and is strongly correlated between detectors (a strong common mode). At higher frequencies the TOD is dominated by detector noise, which is mostly white (Gaussian). The CMB signal is thus buried under the noise in a single TOD and can only be recovered after combining many TODs.
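
The character of this noise can be illustrated with a toy simulated time-stream (Python; all amplitudes and spectral shapes are invented for illustration, not fitted to real ACT data): a red, atmosphere-like drift dominates at low frequency, white detector noise dominates at high frequency, and a CMB-sized signal sits far below both.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-detector time-stream; all numbers are illustrative, not real ACT data.
f_samp, t_len = 400.0, 600.0                    # sampling rate [Hz], duration [s]
n = int(f_samp * t_len)
t = np.arange(n) / f_samp
freqs = np.fft.rfftfreq(n, d=1.0 / f_samp)

# Atmosphere-like drift with a steep ("red") spectrum, dominating at low frequency.
amp = np.zeros(len(freqs))
amp[1:] = freqs[1:] ** -1.5
phases = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, len(freqs)))
atmosphere = np.fft.irfft(amp * phases, n=n)
atmosphere *= 5000.0 / atmosphere.std()         # ~5 mK rms drift [uK]

white = 300.0 * rng.standard_normal(n)          # white detector noise per sample [uK]
cmb   = 50.0 * np.sin(2.0 * np.pi * 1.3 * t)    # stand-in for the scanned sky signal [uK]
tod = atmosphere + white + cmb

def band_power(x, f_lo, f_hi):
    """Mean spectral power of x between f_lo and f_hi [Hz]."""
    p = np.abs(np.fft.rfft(x)) ** 2
    sel = (freqs > f_lo) & (freqs < f_hi)
    return p[sel].mean()

print("rms [uK] -- tod: %.0f, atmosphere: %.0f, white: %.0f, cmb: %.0f"
      % (tod.std(), atmosphere.std(), white.std(), cmb.std()))
for lo, hi in [(0.01, 0.1), (0.1, 1.0), (10.0, 100.0)]:
    ratio = band_power(atmosphere, lo, hi) / band_power(white, lo, hi)
    print(f"{lo:5.2f}-{hi:6.2f} Hz  atmosphere/white power ratio: {ratio:10.1f}")
```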

The map-making procedure consists in estimating the best possible map of the sky given all the data in the time-streams. This amounts to maximizing the likelihood (equivalently, minimizing the χ²) given the best noise model we can produce. The map-making equation can be written as

$$\displaystyle{ \mathbf{M}^{T}\,\mathbf{N}^{-1}\,\mathbf{M}\,\mathbf{x} = \mathbf{M}^{T}\,\mathbf{N}^{-1}\,\mathbf{d} }$$
(1)

where M is the pointing matrix that relates every sample of the TODs to a pixel in the map, N is the noise covariance matrix of the TODs, d is the vector of time-ordered data samples and x is the map. Given the huge amount of data used to produce the map, it is impossible to build and invert this equation in its full form, so iterative methods are used instead. These methods use only a block-diagonal version of the noise matrix, grouping TOD samples obtained simultaneously by different detectors, while the frequency dependence is modeled by computing the noise matrix in frequency bins. The whole process can only be done in a reasonable time on a supercomputer with thousands of cores.
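
As a concrete illustration of Eq. (1), the toy Python sketch below (hypothetical sizes and noise levels, not the actual ACT pipeline) builds a sparse pointing matrix, simulates white-noise time-ordered data, and solves the map-making equation with a conjugate-gradient solver; with a correlated (non-diagonal) noise matrix the structure is the same, but each product with N⁻¹ becomes a filtering operation on the time-streams.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(1)

# Toy map-making problem (illustrative, not the ACT pipeline): n_pix sky pixels
# observed n_samp times, with a simple white-noise model (diagonal N).
n_pix, n_samp = 500, 20000
true_map = 70.0 * rng.standard_normal(n_pix)              # stand-in sky [uK]

# Pointing matrix M: each time sample hits exactly one pixel.
hit_pix = rng.integers(0, n_pix, n_samp)
M = csr_matrix((np.ones(n_samp), (np.arange(n_samp), hit_pix)),
               shape=(n_samp, n_pix))

sigma = 300.0                                              # white noise per sample [uK]
d = M @ true_map + sigma * rng.standard_normal(n_samp)     # time-ordered data
Ninv = 1.0 / sigma**2                                      # diagonal N^-1 (equal weights)

# Solve (M^T N^-1 M) x = M^T N^-1 d iteratively with conjugate gradients.
# With correlated noise, Ninv would become a filter acting on the time-streams.
def lhs(x):
    return M.T @ (Ninv * (M @ x))

A = LinearOperator((n_pix, n_pix), matvec=lhs)
b = M.T @ (Ninv * d)
x, info = cg(A, b)

hits = np.bincount(hit_pix, minlength=n_pix)
print("CG converged:", info == 0)
print("map rms error [uK]: %.1f  (expected ~ %.1f)"
      % ((x - true_map).std(), sigma / np.sqrt(hits.mean())))
```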

4 Polarization Sensitive Maps (ACTpol)

MBAC was decommissioned in 2011 to make way for a new polarization-sensitive camera called ACTpol. This new camera also uses TES detectors, but operating at only 0.1 K thanks to a dilution refrigerator, a device that exploits the heat of mixing of He-3 and He-4 to reach these very low temperatures. Several improvements were also introduced to the optics and to the detector coupling, which is now done with a feed-horn array fabricated from silicon wafers. Polarization sensitivity is achieved by coupling each feed-horn to two orthogonal microstrip antennas, each connected to its own TES detector, so that each detector responds to a single polarization. The antenna pairs are distributed across the array with several orientations to evenly cover the Q and U polarizations on the sky.
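
A minimal sketch of how such orthogonal detector pairs yield Q and U is given below (Python; the sky values, noise level and pair angles are invented for illustration, and this is not the ACTpol pipeline): differencing a pair cancels the much larger unpolarized intensity, and pairs at several orientations allow Q and U to be solved for.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy recovery of Q and U from orthogonal detector pairs (illustrative values only).
I_sky, Q_sky, U_sky = 100.0, 3.0, -2.0         # stand-in sky signal [uK]
psi = np.deg2rad([0.0, 22.5, 45.0, 67.5])      # assumed pair orientation angles
sigma = 0.5                                    # noise per detector sample [uK]

# Each detector measures I plus the polarization projected onto its own axis:
#   d_a = I + Q cos(2 psi) + U sin(2 psi),   d_b = I - Q cos(2 psi) - U sin(2 psi)
d_a = I_sky + Q_sky*np.cos(2*psi) + U_sky*np.sin(2*psi) + sigma*rng.standard_normal(psi.size)
d_b = I_sky - Q_sky*np.cos(2*psi) - U_sky*np.sin(2*psi) + sigma*rng.standard_normal(psi.size)

# The pair difference cancels the (much larger) unpolarized intensity...
diff = 0.5 * (d_a - d_b)                       # = Q cos(2 psi) + U sin(2 psi) + noise

# ...and pairs at different angles let us solve for Q and U by least squares.
A = np.column_stack([np.cos(2*psi), np.sin(2*psi)])
(Q_hat, U_hat), *_ = np.linalg.lstsq(A, diff, rcond=None)
print(f"recovered Q = {Q_hat:.2f}, U = {U_hat:.2f}  (input Q = {Q_sky}, U = {U_sky})")
```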

One advantage of measuring polarization is that the atmosphere is essentially unpolarized, significantly reducing the level of atmospheric noise in the data. On the other hand, the polarized signal is about an order of magnitude fainter than the temperature signal, increasing the experimental challenge of measuring it correctly.

ACTpol began observing in 2013, mapping nearly 200 square degrees of sky. The polarization maps were used to measure the E-mode CMB polarization (Naess et al. 2014), although the sensitivity reached in a single year of observations was not enough to detect the B-mode polarization signal. ACTpol is expected to achieve the sensitivity needed to measure B-mode polarization after the next 2 years of observations, ending in early 2016.

5 What is Coming

As mentioned before, the field of CMB observations is developing very rapidly, with its prime focus on reaching finer angular resolutions and higher polarization sensitivities. Associated with these observational goals is an ambitious set of scientific aims, such as characterizing the lensing signal in the CMB to measure the distribution of matter at redshifts peaking around z ≈ 2, and detecting primordial B-modes produced by gravitational waves generated during the epoch of inflation.

Inflation is one of the greatest mysteries still to be resolved, as it concerns the very beginning of the Universe. Several experiments are now competing to be the first to detect this signal, among them BICEP, ABS, CLASS, Planck, SPTpol and AdvACT, the successor of ACTpol on the ACT telescope. The challenge is huge, as primordial B-modes are expected to be a tiny signal on degree scales on the sky, strongly contaminated by foreground signals such as synchrotron emission and polarized dust from our Galaxy. Disentangling these signals will require measuring the sky in several frequency bands and over large areas. Moreover, measuring large features on the sky requires mapping large areas, favoring low-latitude locations for ground-based telescopes.

Wrapping up, mapping large areas of the sky requires excellent atmospheric conditions, access to a large fraction of the sky, and very fast mapping speeds. The latter implies increasing the sensitivity of the detectors, which is now approaching its fundamental (photon-noise) limit. The way to overcome this is to add more and more detectors to the focal plane, which in turn requires optical designs with large focal planes. In my opinion, the implementation of these new techniques will require at least a few iterations of these lower-cost ground-based experiments before new satellite missions are justified, while the quest for finer resolution implies large telescopes, which are also impractical in orbit. The field is therefore likely to continue with a rapid development of ground-based and balloon-borne experiments, accompanied by a significant amount of technological development.