EMBARGO Wednesday 19 Jul 1600 BST | 1500 GMT | Thursday 20th July 0100 AEST
When the universe was very young, not much was happening chemically. There was hydrogen, along with some helium and a few traces of other things. Heavier elements did not arrive until stars had formed, lived and died.
So imagine the consternation of scientists when, using the James Webb Space Telescope to look back into the far reaches of the Universe, they discovered significant amounts of carbon dust less than a billion years after the Big Bang.
The discovery suggests there was a way to speed up carbon production in the tumultuous early universe — most likely through the deaths of massive stars, which spewed the element into space as they expired.
“Our detection of carbonaceous dust at redshifts 4-7 places important constraints on dust production models and scenarios in the early Universe,” write a team led by cosmologist Joris Witstok of the University of Cambridge in the UK.
The first billion years after the Big Bang 13.8 billion years ago, a period known as the Cosmic Dawn, was a critical time. The first atoms formed; the first stars ignited; the first light bloomed in the darkness. But it took the stars themselves to forge significant amounts of elements heavier than hydrogen and helium.
In the hot, dense nuclear furnaces of their cores, stars smash atoms together, fusing them into heavier elements in a process called stellar nucleosynthesis. But these heavier elements largely accumulate inside the star until it runs out of fusion fuel and dies, spewing its contents out into the space around it. It is a process that usually takes some time.
Witstok and his colleagues used the JWST to study lingering dust from the Cosmic Dawn and found something strange: an unexpectedly strong spectral feature associated with the absorption of light by carbon-rich dust, in galaxies dating to as early as 800 million years after the Big Bang.
The problem is that these dust grains are thought to take several hundred million years to form, and the characteristics of the galaxies suggest that they are too young for this formation timescale. But it is not an impossible problem to solve.
The first stars in the universe are thought to have been much more massive than the stars we see around us today. Because heavier stars burn through their fuel reserves more quickly, they would have had relatively short lives, exploding in supernovae that could have dispersed heavier material relatively early.
There are also stars today that are absolute dust factories. Known as Wolf-Rayet stars, these are massive stars that have reached the end of their lives and stand on the brink of going supernova. They don’t have much hydrogen left, but they are rich in nitrogen or carbon, and they shed mass at a furious rate — ejecta that can carry a great deal of carbon into space.
The discovery of large amounts of carbon in multiple galaxies during the Cosmic Dawn could be evidence that these processes not only occurred, but were more common in the early Universe than they are today.
This, in turn, suggests that massive stars were the norm among the first generation — which would help explain why none of them linger in the universe today.
The research has been published in Nature.