Dark Energy Space Missions and Type Ia Supernovae Research
background anomaly, known as the Cold Spot. However, the simulations that could test this hypothesis are still at too uncertain a stage to confirm or reject it.
CHAPTER 11 - Eyes in the Sky
The range of possible interpretations of the Universe's accelerating expansion leaves cosmologists eagerly awaiting experiments capable of discriminating among, and rejecting some of, the scenarios described above. As one might imagine, the experiments proposed in this field are extremely numerous, but the most interesting results are expected from a better description of the Universe's expansion. It is in this context that JDEM, the Joint Dark Energy Mission, is positioned: a joint space mission of the American Department of Energy (DOE) and the American space agency NASA. Alongside it sits the LSST project, the Large Synoptic Survey Telescope, a data-collection and analysis program realized in collaboration with Google. To understand the purpose of these projects and what advantages they hope to obtain, we must first consider how the expansion of the Universe is detected through the analysis of Type Ia supernovae.
11.1 Why Type Ia Supernovae?
Type Ia supernovae result from the violent explosion of a white dwarf. They arise in a binary system in which a white dwarf accretes mass from its companion star until it approaches the limit of about $1.4$ solar masses, the Chandrasekhar limit, beyond which the star can no longer support itself and a runaway thermonuclear explosion destroys it. The key point is that, since these stars slowly accrete mass until reaching the Chandrasekhar limit at which the explosion occurs, all Type Ia supernovae explode in essentially the same way. In particular, the peak luminosity of these explosions is practically always the same, regardless of where they are spotted. This information proves extremely valuable to astronomers who, by comparing the known intrinsic brightness of the explosion with the brightness actually detected, can determine the supernova's distance. For this reason, Type Ia supernovae are considered the standard candles of choice for measuring extragalactic distances.
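As a rough sketch of the standard-candle idea, the comparison between intrinsic and observed brightness can be written as the astronomers' distance modulus, $m - M = 5\log_{10}(d/10\,\mathrm{pc})$. The peak absolute magnitude used below ($M \approx -19.3$) is a typical textbook value for Type Ia supernovae, and the example apparent magnitude is an illustrative assumption, not data from any specific mission:

```python
import math

# Typical peak absolute magnitude of a Type Ia supernova (an assumed,
# textbook-level value used here only for illustration).
M_SN_IA = -19.3

def luminosity_distance_pc(apparent_magnitude, absolute_magnitude=M_SN_IA):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10.0 ** ((apparent_magnitude - absolute_magnitude + 5.0) / 5.0)

# A supernova observed at apparent magnitude m = 24 lies at a few
# gigaparsecs, i.e. a genuinely cosmological distance.
d = luminosity_distance_pc(24.0)
print(f"{d / 1e9:.2f} Gpc")
```

The fainter the supernova appears relative to its known intrinsic brightness, the farther away it must be; this is all the formula encodes.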
Once the distance is established, it is possible to analyze the star's motion by measuring the redshift of its emission, that is, the shift of its light toward longer wavelengths, indicated by the letter $z$.
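The redshift itself has a simple definition: it is the fractional stretching of a spectral line's wavelength, $z = (\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}})/\lambda_{\mathrm{emit}}$. A minimal sketch (the wavelengths below are illustrative assumptions, not measurements of any real supernova):

```python
def redshift(lambda_observed_nm, lambda_emitted_nm):
    """Fractional shift of a spectral line toward longer wavelengths."""
    return (lambda_observed_nm - lambda_emitted_nm) / lambda_emitted_nm

# A line emitted at 440 nm and observed at 660 nm has been stretched
# by 50%, i.e. z = 0.5.
z = redshift(660.0, 440.0)
print(z)  # 0.5
```

A redshift of $z = 1$, the threshold discussed below, means the wavelength has doubled on its way to us.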
SNAP and JDEM
Among all these Type Ia supernovae, the most interesting are certainly those with the highest redshift, that is, the oldest and most distant from us. These allow one to discriminate among fundamental scenarios that look identical when compared only against low-redshift supernovae. To give an example, the Big Rip scenarios we discussed earlier can be diagnosed with certainty only with supernovae of redshift greater than $1$. With this objective, the SNAP probe (SuperNova/Acceleration Probe) was proposed, one of the candidate projects for the JDEM mission mentioned above. This probe, which at the moment seems to be the favored candidate for JDEM, was specifically designed and optimized for the infrared band, so as to dedicate itself entirely to the detection of supernovae with redshift greater than $1$, whose light is shifted into that part of the spectrum. The launch was officially announced for 2013, but more realistic voices expect it to slip to 2016-2017.
LSST and Google
A project in some ways complementary to the SNAP probe is the LSST, the Large Synoptic Survey Telescope, whose construction should begin in 2010 and finish in 2015. This is a systematic sky-scanning project whose purpose is to collect an enormous volume of data and thereby greatly increase the number of sample points on the Type Ia supernova distance-redshift curve, in order to better understand the acceleration. The project is extremely ambitious, not only for the construction of the telescope but above all for the data analysis. The heart of the telescope is a $3.2$ gigapixel camera that takes a photograph with a 15-second exposure every 20 seconds, for a total of about $30$ terabytes of acquired data per night. This enormous flow of data has led the project managers to request the active collaboration of the IT giant Google in pre-selecting the data deemed interesting. At present, however, the project is still seeking federal funding, which should arrive starting in 2010. Here too, the first useful images would appear no earlier than 2016.
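A back-of-envelope check shows where a figure of this order of magnitude comes from. The pixel depth (2 bytes per sample) and the 10-hour observing night are assumptions made for this sketch; raw pixels alone already account for roughly a third of the quoted total, with calibration frames and processed data products plausibly making up the rest:

```python
# Rough estimate of LSST's nightly raw data volume (a sketch; the
# 2-bytes-per-pixel depth and 10-hour night are assumptions).
pixels = 3.2e9            # 3.2 gigapixel camera
bytes_per_pixel = 2       # assuming 16-bit raw samples
seconds_per_night = 10 * 3600
cadence_s = 20            # one 15-second exposure every 20 seconds

images_per_night = seconds_per_night / cadence_s          # 1800 exposures
raw_bytes = images_per_night * pixels * bytes_per_pixel
print(f"{raw_bytes / 1e12:.1f} TB of raw pixels per night")  # ~11.5 TB
```

Even the raw-pixel floor of this estimate makes clear why pre-selecting interesting data, rather than archiving everything, was a central concern of the project.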
In the Meantime
It is possible that some results from the LHC will favor certain scenarios over others even before 2016. Until then, however, it seems that real innovations in the field of dark energy will have to come from conceptual advances rather than from new experimental data. From this perspective, every date is a good one, so let's see what the near future holds for us.
APPENDIX A - Quintessence, Phantoms and Scalar Fields
The main merit, and at the same time defect, of constants is that they do not vary in time or space: they are not dynamical quantities. If this has the merit of greatly simplifying the mathematics, it also leads to some undeniable conceptual difficulties. Suppose, for example, that the vacuum energy density is constant per unit volume, as is the case when a cosmological constant is introduced. Since the Universe is expanding while the density per unit volume remains constant, one must suppose a continuous creation of dark energy. This and countless other arguments have led cosmologists to consider other, less rigid possibilities in which the cosmological constant, or vacuum energy, is not constant at all but varies in time or space, that is, it is essentially