What To Look Forward To In The Next Decade Of Science

Originally published January 2020

In 1965, Gordon Moore, co-founder of Intel, hypothesised that “the number of transistors that can be placed on an affordable integrated circuit will double approximately every two years”. Moore’s Law, as it was christened, has by and large proven correct, as computers have grown smaller yet faster every year. Even within this last decade we’ve seen the rise of the smartwatch, perhaps the ultimate realisation of Moore’s prediction: a computer so small it fits on one’s wrist. Yet there was always destined to be a time when Moore’s Law would no longer apply, when transistors could physically be made no smaller. Moore himself predicted this point of saturation in 2015: “I see Moore’s Law dying here in the next decade or so”. But with the planet’s appetite for computing power showing no sign of slowing, especially in scientific research, where the answers to life’s questions can be obscured simply because computers are not fast enough, what is to be done once this transistor limit is reached?

The answer for the likes of IBM and Google is quantum computing. Traditionally, a classical computer relies on data being stored as binary “bits”, 1s and 0s, which in turn represent high and low voltages within its circuits. Each circuit element can store either a 1 or a 0 (hold a high or a low voltage), but common sense dictates it cannot store both a 1 and a 0 simultaneously (the voltage cannot be both high and low). Yet that is exactly what a quantum computer can do, as it exploits the phenomena of quantum mechanics to allow its “qubits” to be a superposition of 1 and 0 at once. This gives quantum computers enormous potential processing power, as they can represent vastly more states at once. Consider two conventional bits: they can be in one of four combinations, 00, 01, 10 and 11. Two qubits can inhabit all four of these states simultaneously, whereas eight classical bits would be required to represent all four combinations at once. For certain problems, this allows a quantum processor with a modest number of qubits to explore possibilities that would overwhelm even a vast classical chip.
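The two-qubit counting argument above can be made concrete with a few lines of linear algebra. This is a minimal illustrative sketch, not a real quantum processor: a two-qubit register is just four complex amplitudes, and applying a Hadamard gate to each qubit puts the register into an equal superposition of all four basis states.

```python
import numpy as np

# Two classical bits hold exactly one of the four combinations at a time.
classical_state = (1, 0)  # just "10"

# A two-qubit register is described by four complex amplitudes,
# one for each basis state |00>, |01>, |10>, |11>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

# Start in |00> and put both qubits into superposition.
state = np.kron(H, H) @ np.array([1.0, 0.0, 0.0, 0.0])

# All four basis states are now present with equal probability.
probabilities = np.abs(state) ** 2
print(probabilities)  # [0.25 0.25 0.25 0.25]
```

Measuring the register still yields only one of the four outcomes; the power comes from manipulating all four amplitudes at once before measuring.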

Indeed, just last year Google claimed to have achieved “quantum supremacy”, when their quantum computer Sycamore performed a task virtually impossible for even the most powerful classical computers: a calculation completed in 200 seconds that, by the best estimates, would take the most powerful supercomputers 10,000 years. And in the next decade, quantum computing is only going to grow in its ability; Hartmut Neven, director of Google’s Quantum AI Lab, predicts that the power of quantum computers will increase at a double-exponential rate. (If this whole section has left you, much like the author, rather wishing you hadn’t started down this rabbit hole now you realise how utterly befuddling the world of quantum computing is, you shouldn’t worry; as the great physicist Richard Feynman himself said, “I think I can safely say that nobody understands quantum mechanics”.)
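To see just how much faster double-exponential growth is than Moore’s familiar doubling, here is a small numerical sketch (the growth laws as popularly stated, not a fitted model of any real hardware):

```python
# Moore's Law: capability doubles each step          -> 2**n
# Neven's Law (as popularly stated): double-exponential -> 2**(2**n)
for n in range(1, 6):
    moore = 2 ** n
    neven = 2 ** (2 ** n)
    print(f"step {n}: Moore = {moore}, Neven = {neven}")
```

After just five steps, ordinary doubling gives a factor of 32 while double-exponential growth gives a factor of over four billion, which is why Neven expects quantum machines to pull away from classical ones so abruptly.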

Neven’s Law sets a precedent for the rate at which science and technology will no doubt develop over the next ten years, but sometimes a look back at where we have been is necessary to see how we can move forward. It was almost five decades ago, in 1972, that the last manned lunar mission, Apollo 17, set off from the Moon and returned to Earth. Since then, NASA and the world’s space programmes have had their sights set closer to home for manned missions, most notably the International Space Station, a monumental symbol of what is possible through international cooperation. Its completion, however, spelt the end for the Space Shuttle, as Atlantis flew the final mission in July 2011. Since then, the only manned spaceflights have been aboard the Russian space agency Roscosmos’ Soyuz rockets, which fly astronauts and cosmonauts to the ISS every three to six months.

That’s all about to change, however, as NASA have announced plans for a 2024 Lunar Mission to pave the way toward a permanent settlement on our closest celestial neighbour, which will act as a gateway to the stars. The Artemis Programme is not the only moonshot aiming high for the 2020s; Amazon tycoon Jeff Bezos’ private spaceflight company, Blue Origin, is aiming to put a potentially manned lander in Shackleton crater by 2024.
This, along with fellow private rocket and space-vehicle companies such as SpaceX, Virgin Galactic and Rocket Lab, raises the intriguing possibility that space tourism will be developed, and its viability put to the test, across the next decade. Indeed, many wait with bated breath for the tests of SpaceX’s latest vessel, Starship, designed to carry humans to the Moon, Mars and beyond, as these rockets may become the new airliners: the norm for holiday travel. There may be doubt over these companies’ claims of launching to the Moon before even half of the decade is out (SpaceX’s founder Elon Musk believes Starship will be completing lunar missions by 2022), but without a doubt this sector will make leaps and bounds in the coming ten years, and become even more prolific as it strives to make the Solar System a “must-see” holiday destination.

But how about a completely new way of observing the Universe? Since the dawn of astronomy, scientists have been limited by the properties of light in their observations of the cosmos. First it was visible light with Galileo’s telescope, and with the power of our eyes and some shaped orbs of glass we discovered numerous wonders of the galaxy. But soon visible light gave up all the secrets it had to tell, and astronomers had to shift to a different wavelength — quite literally. Radio waves, microwaves, infrared, ultraviolet, X-rays and gamma rays all exposed invisible yet wondrous objects in the night sky, and provided insights into the origins of our Universe that we would never have gained otherwise. However, all these forms of observation are essentially the same: they are merely stretches and squashes of visible light, together forming the electromagnetic spectrum. But there are only so many answers to our questions that can come in the form of electromagnetic waves, especially when we are asking questions of the monsters of the Universe that feast on visible light and its siblings…

These are, of course, black holes: fallen stars, too massive for their own good, crushed under the force of their own gravity into an infinitesimally small point of infinitely high density. They exert a gravitational pull so strong that not even light can escape their clutches, and any stray photons that find themselves beyond the opaque curtain known as the event horizon are devoured in a void beyond our current understanding. Conversations with these beasts in the language of electromagnetic waves, then, are futile; if we are to probe the light-killers for answers to our questions, we must use a parlance they are better acquainted with — gravity.

Einstein predicted in 1916 that some of the most violent and energetic events in the Universe would cause ‘ripples’ in the very fabric of space-time, distorting matter as they radiate through space at the speed of light. He called these ripples gravitational waves, and it took almost a hundred years for them to finally be discovered: in 2015, the Laser Interferometer Gravitational-Wave Observatory (LIGO) detected gravitational waves caused by the collision of two black holes almost 1.3 billion light years away. Though it took a century to reach this milestone, gravitational-wave astronomy will no doubt advance rapidly within the next ten years as the observational technology develops and improves.

Currently, to detect the waves, scientists at LIGO use a large-scale interferometer: two 4 km long arms, constructed at a right angle to one another, with mirrors at each end of each arm. A laser beam is fired down each arm and reflected back, and is used to measure the distance between the mirrors with extraordinary precision. When a gravitational wave passes through the interferometer, it distorts space, stretching it in one direction and compressing it in the perpendicular direction. This means that in one arm the mirrors move slightly apart while in the other they move slightly together. These absolutely minuscule movements of the mirrors can be detected thanks to the incredible precision of the laser measuring system, enabling scientists to observe these otherwise imperceptible events from billions of light years away.
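A quick back-of-the-envelope calculation shows just how minuscule these movements are. A gravitational wave is characterised by its strain h, the fractional change in length it induces, so an arm of length L changes by ΔL = h × L. The strain value below is an illustrative order-of-magnitude figure for a black-hole merger, not a measurement from a specific event:

```python
# A gravitational wave with strain h changes an arm of length L by dL = h * L.
arm_length = 4_000   # metres, the length of each LIGO arm
strain = 1e-21       # illustrative strain from a distant black-hole merger

delta_L = strain * arm_length
print(f"{delta_L:.0e} m")  # 4e-18 m
```

That is around 4 × 10⁻¹⁸ metres, hundreds of times smaller than the width of a single proton, which is why the laser system’s precision is so remarkable.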

But from the incomprehensibly far to issues closer to home: the Earth, and its battle with a changing climate. If we are to reduce the potentially devastating effects of two centuries of reliance on fossil fuels, an increase in renewable energy sources is essential. One key revolution required is of our transport system: cars, aeroplanes and boats will have to transform from behemoth “gas guzzlers” into clean electric vehicles. The shift has already begun in the automotive industry, as battery electric vehicles have come to fruition in the last half-decade. However, to provide a truly viable alternative to the combustion engine, there needs to be a radical improvement in battery technology, with greater storage capacity giving vehicles greater range.
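The link between storage capacity and range is simple arithmetic: a pack’s energy is cell voltage × cell capacity × number of cells, and range is that energy divided by the car’s consumption. All the figures below are illustrative round numbers, not the specification of any real vehicle:

```python
# Back-of-the-envelope EV pack capacity (all figures illustrative).
cell_voltage = 3.6    # volts, nominal for a lithium-ion cell
cell_capacity = 5.0   # amp-hours per cell
cells_in_pack = 4_000

pack_energy_kwh = cell_voltage * cell_capacity * cells_in_pack / 1000
consumption = 0.2     # kWh per mile (illustrative)
range_miles = pack_energy_kwh / consumption
print(f"{pack_energy_kwh} kWh -> {range_miles:.0f} miles")  # 72.0 kWh -> 360 miles
```

The arithmetic makes the engineering trade-off plain: to add range you must either fit more cells into the same chassis or store more energy per cell, which is exactly where better battery chemistry comes in.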

Lithium-ion cells, which range in size from the AAs in a TV remote to the giant battery packs in electric cars, work by containing a liquid electrolyte between two lithium-compound electrodes. The electrolyte allows positively charged lithium ions to flow from the anode to the cathode, and this movement of ions frees electrons at the anode; in turn, this creates a charge that flows as a current through the device and powers it. It’s a tried-and-tested technology, but not one without its flaws: Li-ion batteries degrade quickly, contain highly toxic chemicals and can be flammable. They can also only be made so small; there is a point at which the necessary components and volume of electrolyte cannot be shrunk any further. The solution could be solid-state batteries, a technology we could see rise to the forefront in the 2020s, in which the liquid electrolyte is replaced with a highly conductive solid. Solids, being much denser than liquids, allow the cells to be compacted further, so more batteries can fit into a given space (say, a car chassis), allowing for greater energy storage capacity. Furthermore, the removal of toxic and flammable liquids would make solid-state batteries safer and less harmful to the environment. In 2017, Toyota announced plans to integrate solid-state battery technology into its vehicles this year, and since as far back as 2012 Volkswagen has been working with SSB start-up QuantumScape. Both these automotive giants, and many more, are in a race for what has been dubbed “the golden fleece” of car technology: the first efficient and practical solid-state battery electric vehicle.
Even non-automotive companies have tried to force their way into the battle for SSB vehicles: Dyson, the vacuum giant, announced plans for an SSB EV in 2017, and even bought Hullavington Airfield as an R&D test track in 2018, but last year founder James Dyson announced the project was “no longer commercially viable”. However, he says the company will shift its focus to the development of solid-state batteries themselves, which he terms a “fundamental” technology for our future.

But many argue that simply changing the way we store our energy is not enough to curb the dangers of climate change. Since the industrial revolution, the majority of the UK’s energy has been generated by burning fossil fuels, mainly natural gas. The combustion of fossil fuels is, of course, the leading producer of the greenhouse gases that have a direct causal link to climate change. Nuclear energy, which does not produce harmful greenhouse gases such as carbon dioxide and methane, is the next-largest contributor, but many nuclear power stations are closing because they are no longer economically viable; all but one are expected to have closed by 2025. Of course, nuclear reactors create their own waste, which can be equally damaging to the environment if not handled properly.

Or so it may seem: in fact, nuclear energy comes in two variations, fission and fusion, and it is only the former that we have developed the technology to exploit. Nuclear fission reactors contain large amounts of enriched uranium, fuel containing higher-than-natural proportions of the isotope uranium-235. A neutron fired at a U-235 nucleus is absorbed, creating U-236, an unstable isotope that immediately splits. The split, or fission, of the atom produces multiple products: two lighter elements, some free neutrons ejected at high speed that repeat the process and cause a chain reaction, and immense amounts of heat and gamma radiation. It is this energy that is converted into electricity in much the same way as in a fossil fuel power station: large amounts of water are heated into steam, which turns a turbine connected to a generator.
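The scale of the energy released by this chain reaction is worth putting into numbers. Using the standard physics figures of roughly 200 MeV per fission event and 235 grams per mole of U-235, fissioning a single kilogram yields:

```python
# Rough energy yield from fissioning one kilogram of U-235.
AVOGADRO = 6.022e23          # atoms per mole
MEV_TO_J = 1.602e-13         # joules per MeV
ENERGY_PER_FISSION_MEV = 200 # typical energy released per U-235 fission

atoms_per_kg = 1000 / 235 * AVOGADRO   # 235 g of U-235 per mole
energy_joules = atoms_per_kg * ENERGY_PER_FISSION_MEV * MEV_TO_J
print(f"{energy_joules:.1e} J per kg")  # ~8.2e13 J per kg
```

Around 8 × 10¹³ joules per kilogram is millions of times the energy density of coal, which is why a reactor needs so little fuel compared with a fossil fuel power station.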

Of course, the overarching downside of this process is the radioactive material involved: all the facilities must be suitably shielded against radiation, human workers could be exposed to radiation that leads to greater risks of cancer, and once the uranium fuel is spent it remains radioactive and dangerous and must be carefully disposed of. Nuclear fusion, on the other hand, provides a much cleaner alternative generation method, and is touted by many as a potential solution to the world’s energy crisis. It exploits the vast amounts of energy released when two hydrogen nuclei fuse into helium, three to four times the energy garnered, mass for mass, from fission. It is the process that powers the stars, and if we could harness it on Earth using our abundant hydrogen (certainly far more abundant than uranium) it could solve all our energy woes.
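That “three to four times” comparison can be checked with textbook figures. Each deuterium-tritium reaction consumes one deuteron (2 atomic mass units) and one triton (3 u) and releases 17.6 MeV; dividing through by the fuel mass gives the energy per kilogram, which can then be set against the standard value for U-235 fission:

```python
# Energy per kilogram of deuterium-tritium fuel, compared with U-235 fission.
U_MASS = 1.66054e-27   # kg per atomic mass unit
MEV_TO_J = 1.602e-13   # joules per MeV

# Each D-T reaction consumes 2 u + 3 u of fuel and releases 17.6 MeV.
reactions_per_kg = 1 / (5 * U_MASS)
fusion_j_per_kg = reactions_per_kg * 17.6 * MEV_TO_J

fission_j_per_kg = 8.2e13  # standard figure for fissioning 1 kg of U-235
ratio = fusion_j_per_kg / fission_j_per_kg
print(f"{fusion_j_per_kg:.1e} J/kg, about {ratio:.1f}x fission")
```

The result, roughly 3.4 × 10¹⁴ joules per kilogram, comes out at about four times the fission figure, in line with the comparison above.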

However, nuclear fusion is not an easy process to get started: it requires temperatures of around 100 million kelvin (roughly six times the temperature of the Sun’s core) to give the hydrogen nuclei enough energy to overcome their mutual repulsion. The nuclei must also be put under colossal pressure, as they have to come within a few femtometres (1×10⁻¹⁵ m) of one another to fuse. Stars have the advantage of incredible mass and gravity to meet these extreme requirements, conditions that are simply impossible to replicate on Earth; instead, scientists use a combination of lasers, microwaves, ion beams and magnetic fields to heat and squeeze the hydrogen nuclei together. Even then, our technology is only capable of creating an environment for one of the three fusion schemes, deuterium-tritium fusion, rather than the more efficient deuterium-deuterium fusion. And even then, our current fusion reactors struggle to generate more energy than is put into the system. But this is all set to change, as the likes of MIT claim “fusion power [will be] on the grid in 15 years”. Constant developments in materials science, such as those needed to produce the powerful magnets that hold the hydrogen plasma in place, are sure to make vast strides in improving the effectiveness of fusion reactors.

Then there is the Holy Grail of energy generation: cold fusion. A completely hypothetical system currently confined to the realms of science fiction, it would offer all the benefits of conventional “hot” fusion and its potentially limitless energy, but at or near room temperature. Exactly how such a system could work is still a mystery, and the likelihood of seeing cold fusion generators on the grid before 2030 is minuscule to say the least, but it is exciting to ponder a world that runs on limitless, clean fusion energy.

Perhaps, as a final thought, we should make it our mission for these next ten years to ponder more, to ponder everything, to ask “why does that happen?”, “what if we did this?”. The Universe has a wickedly mischievous soul; even after we spend generations pondering one of its great mysteries, to the point where we allow ourselves to think we might have just gone and cracked it, it presents to us ten more questions to occupy another generation of thinkers and dreamers and ponderers. In the next decade, could you make yourself a solver?