77 Evolution News Articles
for November 2021
Click on the links below to get the full story from its source


11-30-21 Can omicron-specific vaccines arrive fast enough to make a difference?
Vaccine-makers are already adapting vaccines to fight the omicron coronavirus variant, but it will probably already have swept the world by the time these arrive. Is omicron better at evading existing vaccines than older coronavirus variants? Vaccine-makers are racing to produce omicron-specific versions, just in case. But if omicron is as transmissible as some fear, it could rapidly sweep around the world, triggering another wave of covid-19 cases before people can be protected with an updated vaccine. mRNA vaccines can be updated more quickly than other types of vaccine, and the two main manufacturers of the covid-19 mRNA vaccines, Moderna and Pfizer/BioNTech, say they will be able to start production within the next few months. But the updated vaccines will still need to go through some tests in people before they can be rolled out more widely. This is why authorities around the world are urging people to get vaccinated now if they haven’t already, or to get a booster shot if they have been. High levels of antibodies might provide decent enough protection against the variant, even with a less-than-perfect vaccine. This is true of the delta variant: vaccine efficacy against symptomatic infections is around 15 per cent lower with delta, but booster shots can raise it to more than 90 per cent. To get an idea of how good omicron is at evading existing immunity, researchers will need to test how well antibodies taken from people who have been vaccinated or previously infected work against omicron in the lab. These neutralisation tests can be done in a matter of days, but they require live samples of omicron, which are hard to come by for now. At best, however, neutralisation tests will only give us an idea of how much more likely people are to get symptomatic infections. What we really want to know is if omicron is more transmissible and more likely to cause severe disease.

11-30-21 Here’s the chemistry behind marijuana’s skunky scent
Newly identified sulfur compounds in cannabis flowers give the plant its telltale funky odor. Scientists have finally sniffed out the molecules behind marijuana’s skunky aroma. The heady bouquet that wafts off of fresh weed is actually a cocktail of hundreds of fragrant compounds. The most prominent floral, citrusy and piney overtones come from a common class of molecules called terpenes, says analytical chemist Iain Oswald of Abstrax Tech, a private company in Tustin, Calif., that develops terpenes for cannabis products (SN: 4/30/18). But the source of that funky ganja note has been hard to pin down. Now, an analysis is the first to identify a group of sulfur compounds in cannabis that account for the skunklike scent, researchers report November 12 in ACS Omega. Oswald and colleagues had a hunch that the culprit may contain sulfur, a stinky element found in hops and skunk spray. So the team started by rating the skunk factor of flowers harvested from more than a dozen varieties of Cannabis sativa on a scale from zero to 10, with 10 being the most pungent. Next, the team created a “chemical fingerprint” of the airborne components that contributed to each cultivar’s unique scent using gas chromatography, mass spectrometry and a sulfur chemiluminescence detector. As suspected, the researchers found small amounts of several fragrant sulfur compounds lurking in the olfactory profiles of the smelliest cultivars. The most dominant was a molecule called prenylthiol, or 3-methyl-2-butene-1-thiol, which gives “skunked beer” its notorious flavor (SN: 11/27/05). These sulfur compounds have been found in nature, but never before in cannabis, says Amber Wise, an analytical chemist with Medicine Creek Analytics in Fife, Wash., who was not involved in the study. Oswald was surprised to find that prenylthiol and many of the other sulfurous suspects in cannabis share structural similarities with molecules found in garlic. And like these alliaceous analogs, a little goes a long way.

11-29-21 Living robots made from frog cells can replicate themselves in a dish
Swarms of tiny "xenobots" can self-replicate in the lab by pushing loose cells together – the first time this form of reproduction has been seen in multicellular organisms. Swarms of tiny living robots can self-replicate in a dish by pushing loose cells together. The xenobots – made from frog cells – are the first multicellular organisms found to reproduce in this way. Xenobots were first created last year, using cells taken from the embryo of the frog species Xenopus laevis. Under the right lab conditions, the cells formed small structures that could self-assemble, move in groups and sense their environment. Now, the researchers behind the work have found that xenobots can also self-replicate. Josh Bongard at the University of Vermont and Michael Levin at Tufts University in Massachusetts and their colleagues began by extracting rapidly dividing stem cells that are destined to become skin cells from frog embryos. When the cells are brought together in clumps, they form spheres of around 3000 cells within five days. Each clump is around half a millimetre wide and covered in minuscule hair-like structures. These act like flexible oars, propelling the xenobots forward in corkscrew paths, says Bongard. The team noticed that individual clumps of cells appeared to work together in a swarm, pushing other loose cells in the dish together. The resulting piles of cells gradually formed new xenobots. Further experiments revealed that groups of 12 xenobots placed in a dish of around 60,000 single cells appear to work together to form either one or two new generations. “One [xenobot] parent can begin a pile and then, by chance, a second parent can push more cells into that pile, and so on, generating the child,” says Bongard. Each round of replication creates slightly smaller xenobot offspring, on average. Eventually, offspring that comprise fewer than 50 cells lose their ability to swim and reproduce.

11-29-21 Canine teeth shrank in human ancestors at least 4.5 million years ago
The extra-large, dagger-like canine teeth seen in male great apes have been missing from human ancestors for at least 4.5 million years – possibly because females opted for less aggressive partners. Male hominins may have lost the extra-large canine teeth that are seen in most other male primates at least 4.5 million years ago – relatively early in our evolution. This suggests that male human ancestors became less aggressive with each other around the same time, possibly because females preferred less aggressive mates, says a researcher behind the finding. Modern-day human males have proportionately the smallest canines of all male great apes. For most other primates, such as gorillas and chimpanzees, males have significantly bigger canines than females. Larger canines have been linked with more fighting between males for access to females. It is unclear when in our evolutionary history male canines shrank, because fossils that are several million years old lack DNA that could be sequenced and assigned to a sex. The ancestors of humans and chimpanzees split about 7 million years ago, so the change in tooth size is thought to have happened at some point since then. Gen Suwa at the University of Tokyo in Japan and his colleagues measured the dimensions of more than 300 fossil teeth spanning 6 million years of hominin evolution. These included 24 from Ardipithecus ramidus, one of the earliest known hominins, which lived about 4.5 million years ago. The A. ramidus canines didn’t clearly fall into two distinct groups, so the team developed a statistical technique for analysing subtle variations to distinguish male and female teeth. To check its accuracy, the group tested their technique on modern samples from primate teeth for which the sex was known.
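The general idea of teasing apart two overlapping size distributions can be illustrated with a simple two-component mixture model. This is only a sketch of that broad approach, not the team's actual technique, and the tooth measurements below are invented for illustration:

```python
import math
import random
import statistics

random.seed(1)
# Hypothetical canine-crown diameters in millimetres: two overlapping
# groups, loosely "female" and "male" (values invented for illustration)
teeth = [random.gauss(8.6, 0.25) for _ in range(40)] + \
        [random.gauss(9.4, 0.25) for _ in range(40)]

def fit_two_gaussians(xs, iters=300):
    """Fit a two-component 1-D Gaussian mixture with plain EM."""
    mu = [min(xs), max(xs)]            # start the means at the extremes
    sd = [statistics.pstdev(xs)] * 2
    w = [0.5, 0.5]
    def pdf(x, m, s):
        return math.exp(-(x - m) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
    for _ in range(iters):
        # E-step: responsibility of the second component for each tooth
        r = [w[1] * pdf(x, mu[1], sd[1]) /
             (w[0] * pdf(x, mu[0], sd[0]) + w[1] * pdf(x, mu[1], sd[1]))
             for x in xs]
        # M-step: re-estimate weights, means and spreads
        n1 = sum(r)
        n0 = len(xs) - n1
        w = [n0 / len(xs), n1 / len(xs)]
        mu = [sum((1 - ri) * x for ri, x in zip(r, xs)) / n0,
              sum(ri * x for ri, x in zip(r, xs)) / n1]
        sd = [max(1e-3, math.sqrt(sum((1 - ri) * (x - mu[0]) ** 2 for ri, x in zip(r, xs)) / n0)),
              max(1e-3, math.sqrt(sum(ri * (x - mu[1]) ** 2 for ri, x in zip(r, xs)) / n1))]
    return mu, sd, w

mu, sd, w = fit_two_gaussians(teeth)
print([round(m, 1) for m in mu])  # two group means, near 8.6 and 9.4
```

Each tooth ends up with a probability of belonging to the larger-toothed group, which is one way to assign a likely sex without knowing it in advance.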

11-29-21 A single vaccine could protect against many mosquito-borne diseases
A vaccine that changes the way our body responds to mosquito bites could protect us from diseases the insects carry, and also seems to make mosquitoes lay fewer eggs. A vaccine designed to protect against all mosquito-borne diseases by changing the way our immune system responds to bites has been shown to be safe in a small human trial. The results suggest it should reduce infections by the Zika virus, and also show that mosquitoes that feed on vaccinated people lay fewer eggs. “The results are very positive,” says Olga Pleguezuelos at PepTcell, one of the companies involved in developing the vaccine with funding from the UK and US governments. The hope is that the vaccine could provide protection against all mosquito-borne diseases including Zika, West Nile, chikungunya, dengue, yellow fever and malaria. It might work best when combined with other vaccines targeting these pathogens directly, Pleguezuelos says, but for some of these diseases no vaccines are currently available. When a mosquito inserts its proboscis into our skin, it secretes saliva containing a complex mix of proteins that not only stops blood clotting, but also changes our immune response in a way that makes us more likely to be infected by any diseases the mosquito carries. If you inject malaria parasites into people with a needle, it takes thousands to infect them, Pleguezuelos says. But a mosquito bite that transfers as few as five parasites can infect people. Together with researchers at the National Institutes of Health in the US, Pleguezuelos has developed a vaccine containing synthetic proteins that match parts of some of the key proteins in mosquito saliva. The aim is to change our immune response to mosquito saliva in a way that reduces infections.

11-29-21 A sailor’s story captures the impact of rising serious fungal infections
Some fungi have emerged as recent threats to human health, causing problems that are hard to diagnose and hard to treat. Tyson Bottenus once captained an 80-foot schooner called the Aquidneck. He sailed tourists off the coast of Newport, R.I., discussing the area’s history and sites. In January 2018, he had finished another season at the schooner’s helm and had recently gotten engaged to his partner of many years, Liza Burkin. To celebrate, the couple, avid cyclists who’ve ridden through New Zealand and Japan, set off for a bike tour of Costa Rica. “We were on this very, very dusty road for a long time,” Bottenus remembers of a ride to Montezuma on the Nicoya Peninsula. “It was a dirty, sandy, hard-packed road.” While going downhill, Bottenus crashed, badly scraping his elbow. The next morning, a doctor spent about an hour picking out little rocks and cleaning dirt from the wound before she bandaged it up. The injury kept him from swimming but otherwise didn’t disrupt the trip. About a month after returning to Rhode Island, Bottenus started having headaches. He couldn’t control his mouth properly — his speech was off, and he was drooling. Eventually his doctor ordered an MRI, which revealed a lesion in his brain. “My first thought was, I must have some sort of cancer,” he says. “I’m only 31…. I’m way too young for this.” It wasn’t cancer. Nor was it any of the infections proposed as doctors searched for a diagnosis. Two brain biopsies didn’t provide enough tissue to identify the problem. In August 2018, Bottenus became very ill and was hospitalized. He couldn’t walk. His mouth muscles weren’t working. And he could no longer tie the drawstring of his pants with a square knot, a common knot sailors use to fasten two ropes together. It’s a knot Bottenus has taught others and could previously do with his eyes closed. “I’m a captain of boats, supposedly,” he remembers thinking. “I am not the person I think I am.”

11-27-21 NHS England to test Netflix-style subscriptions for antibiotics
A Netflix-style scheme to fund new antibiotics through a subscription – through which the health service pays drug companies a set annual fee, regardless of how many doses are used – will start next year in England. The approach tackles a key element of the antibiotic resistance crisis: that few new drugs are being developed, while existing ones are failing. The field is currently unprofitable for pharmaceutical firms because when new antibiotics reach the clinic, health services use them sparingly, to slow the spread of bacterial resistance. Under the new scheme, manufacturers Pfizer and the Japanese firm Shionogi will receive fixed payments from England’s National Health Service from April for two recently launched antibiotics, called Zavicefta and Fetcroja, respectively. The amount to be paid hasn’t yet been announced, but the UK government has previously said it could be as much as £10 million a year. “It doesn’t matter if we use no pills, one pill or 10,000 pills,” says Colin Garner at Antibiotic Research UK, a charity that wasn’t involved in the scheme. “[Funding] is not at all related to the number of antibiotics used.” Bacterial resistance to antibiotics, used to treat conditions such as pneumonia, sepsis and wound infections, is seen as one of the biggest threats to public health. Resistant microbes have become more common as antibiotic use has increased, raising the prospect that simple infections will become untreatable, and common operations and cancer treatments will be far riskier. For years, various schemes have been debated to try to provide an incentive for companies to do research and development in this field. The UK subscription model is the first designed explicitly to reimburse firms an amount that reflects an antibiotic’s overall value to the health service, even if it is kept as a medicine of last resort and used little.

11-25-21 Mammoth ivory pendant is oldest decorated jewellery found in Eurasia
A pendant carved with mysterious dots and unearthed in a Polish cave is thought to be over 40,000 years old. A pendant carved from mammoth ivory is the oldest known ornate jewellery made by humans in Eurasia. The discovery is shaking up our understanding of the emergence of so-called symbolic behaviours in the region. The oval-shaped pendant, 4.5 centimetres long and 1.5 cm wide, was unearthed in Stajnia cave in Poland. It has two holes drilled into it, presumably to be used for thread, and is decorated with a sequence of more than 50 small indents in a looping curve. “It’s a beautiful piece of past work from Homo sapiens, an amazing piece of jewellery,” says Sahra Talamo at the University of Bologna, Italy, who led the team that analysed the pendant. Using a new radiocarbon dating technique, the researchers discovered that the pendant was created 41,500 years ago, making it the oldest of its kind found in Eurasia. “We were quite shocked,” Talamo says. This predates other objects and personal ornaments with punctured dot motifs found in France and Germany by 2000 years. It also highlights Poland as an important region for artistic innovation for the first wave of modern humans in Europe who developed new types of decoration for their bodies as a marker of personal or cultural identity. The pattern of dots on the ivory forms an asymmetrical loop, but exactly what they mean is still an open question, says Talamo. “The most beautiful interpretation is that it is a lunar calendar,” she says. The motif is similar to the one found on the Blanchard plaque from France, an engraved bone dated to around 30,000 years ago, which has been postulated to be a hunting tally to count the number of animals killed, or a marker of the position of the moon over time. The excavations at the Stajnia cave also reveal that modern humans were in Poland around 10,000 years earlier than previously thought. “Poland was not supposed to have Homo sapiens there at this time,” says Talamo.
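Radiocarbon dating at this age is demanding because so little carbon-14 survives. A back-of-envelope decay calculation using the standard carbon-14 half-life (this illustrates the physics only, not the team's calibration method):

```python
HALF_LIFE_C14 = 5730.0  # carbon-14 half-life in years

def fraction_c14_remaining(age_years):
    """Fraction of the original carbon-14 left after age_years of decay."""
    return 0.5 ** (age_years / HALF_LIFE_C14)

# At the pendant's estimated age, well under 1% of the carbon-14 is left
print(round(fraction_c14_remaining(41_500), 4))  # 0.0066
```

With less than one per cent of the original isotope remaining, even tiny amounts of modern contamination can skew a date, which is why improved pretreatment and dating techniques matter for samples this old.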

11-24-21 How complex does an organism have to be to exhibit free will?
The way I see it, the illusion of having free will (whatever that means) stems from our inability to take into account the many factors that go into our every thought and action. In that sense, the bare minimum that an organism would seem to need to exhibit free will is a consciousness that is complex enough to show signs of introspection, but not so complex as to be able to process all those factors.

I am intrigued that this question on free will was originally posed in the same issue of New Scientist as an article about sophisticated AI (9 October, p36). If the brain is indeed a super-complex computer, then free will is certainly an illusion. There is no evidence that there is anything in the brain other than synapses firing in response to stimuli. In which case, the best we can do to assert something akin to free will might be to choose to determine an outcome based on the toss of a coin. Only by doing this can we be sure that the decision wasn’t predetermined by a combination of various inputs, from education to indigestion!

A brain exists to receive inputs from external and internal sensors. From these, it forms a model of the world, with the organism in it, and takes decisions as to what to do based on that model. The more sophisticated the brain, the more sophisticated the model and the more options the organism has to choose from – and the less likely it is to produce the same decision in similar circumstances. The moth isn’t capable of perceiving an option of not banging into a hot light bulb, but a fox or a human can update their models of the world to change their options and their decisions in the light of experience.

11-24-21 Can bullets fired upwards cause injuries when they return to earth?
If you see a soldier shooting into the air, run for shelter first and cringe later. Falling bullets cause injuries and deaths. A bullet shot straight up into the sky will fly upwards until its initial kinetic energy is exhausted. It will then start falling and accelerate towards the ground under the influence of gravity until it reaches its terminal velocity, which is limited by air resistance. According to the US Centers for Disease Control and Prevention, falling bullets can hit the ground at speeds greater than 61 metres per second (m/s). Bullets travelling between 46 and 61 m/s penetrate skin. Faster than this, and they can penetrate the skull. Celebratory gunfire can cause injuries that require emergency room treatment, and even deaths. In Puerto Rico, 19 people were injured by celebratory gunfire on New Year’s Eve 2003, and one died. In the US too, New Year’s Eve sees guns fired into the air, and deaths have occurred in the likes of Maryland, Ohio and Texas. Celebratory gunfire is common in many countries and sadly there are hundreds of documented cases of people being killed and injured by falling bullets. In September this year, there was a report of 17 deaths and 41 injuries in Kabul, Afghanistan, following gunfire celebrations after the Taliban claimed to have captured the Panjshir valley. An individual’s chance of being hit by a falling bullet is small, but if hit, the likelihood of being killed is up to five times greater, at 32 per cent, than it is from a direct gunshot. This is because injury typically occurs to the head and shoulders rather than to less critical body parts. If fired vertically into the air, a bullet can reach a height of up to around 2 miles (about 3.2 kilometres). But because of the various forces acting on a projectile that is fired in this way, the shooter is extremely unlikely to be hit by one of their own bullets as it comes back down.
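The figures above can be sanity-checked with basic projectile formulas. The sketch below deliberately ignores air resistance, which is exactly why its answers come out far larger than the observed ones; the 900 m/s muzzle velocity is an illustrative assumption, not a number from the article:

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def apex_height_no_drag(v0):
    """Peak height of a vertical shot if there were no air resistance."""
    return v0 ** 2 / (2 * g)

def impact_speed_no_drag(height):
    """Return speed from a given fall height if there were no air resistance."""
    return math.sqrt(2 * g * height)

v0 = 900.0  # assumed rifle muzzle velocity, m/s
print(round(apex_height_no_drag(v0)))     # 41284 m -- far above the ~2 miles observed
print(round(impact_speed_no_drag(3200)))  # 251 m/s -- far above the 46-61 m/s observed
```

The gap between these drag-free numbers and the measured terminal velocities shows how strongly air resistance limits both a bullet's climb and its return speed.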

11-24-21 Probiotics may prevent bacterial infection of the blood in mice
Feeding mice a probiotic of harmless bacteria helps prevent harmful microbes entering the blood where they could build up and potentially cause a condition called sepsis. Consuming a type of bacterium that is commonly found in soil helps mice avoid a blood infection that could potentially lead to sepsis, and the research might one day lead to treatments for people too. Sepsis results from the activity of bacteria, including Enterococcus faecalis. These microbes can live in the human gut without causing disease, but in people who take antibiotics for prolonged periods, or treatments that weaken their immunity, E. faecalis can spread into the blood where it can cause body-wide infection. This is sepsis. Now, for the first time, there is evidence from experiments in mice that consuming a probiotic can prevent blood infection. The probiotic was in the form of spores from another type of bacterium, Bacillus subtilis. These spores are dormant forms of the bacterium that don’t reproduce themselves and are highly resistant to environmental damage. On entering the gut, they activate and grow, influencing the growth of other bacteria in the intestine. Michael Otto at the National Institute of Allergy and Infectious Diseases in Maryland and his colleagues first mimicked the treatment that people with blood cancer often receive, by giving mice the chemotherapeutic drug cyclophosphamide for a few days and then following this up with a cocktail of antibiotics. The team then fed mice with two doses of either B. subtilis spores or a salt solution before a dose of E. faecalis the following day. The next day, the mice that had received the salt solution placebo treatment did have E. faecalis in their blood, where it could potentially cause sepsis, but those that had received the probiotic avoided blood infection.

11-24-21 Survival of the friendliest? Why Homo sapiens outlived other humans
We once shared the planet with at least seven other types of human. Ironically, our success may have been due to our deepest vulnerability: being dependent on others. HUMANS today are uniquely alone. For the majority of the existence of Homo sapiens, we shared the planet with many other types of human. At the time when our lineage first evolved in Africa some 300,000 years ago, there were at least five others. And if you were going to place a bet on which of those would outlast all the rest, you might not have put your money on us. The odds would have seemed more favourable for the Neanderthals, who had already adapted to live in colder conditions and expanded to inhabit much of Eurasia. Or Homo erectus, who had made a success of living in south-east Asia. By contrast, our direct Homo sapiens ancestors were the new kids on the block, and wouldn’t successfully settle outside of Africa until more than 200,000 years later. Yet, by 40,000 years ago, or possibly a bit more recently, we were the only humans left standing. Why? Many explanations have been put forward: brainpower, language or just luck. Now, a new idea is building momentum to explain our dominance. Ironically, it may be some of our seemingly deepest vulnerabilities – being dependent on others, feeling compassion and experiencing empathy – that could have given us the edge. Today, surrounded by computers, phones and all the other clever things we have invented, it is easy to pin our success on our cognitive abilities. But the more we learn about other types of human, the more they seem similar to us in this regard. In the case of Neanderthals, and possibly the mysterious Denisovans, this includes the ability to make sophisticated tools, such as projectile spears that enabled them to hunt large game. Similarly, we are discovering that artistic flair – a marker for the ability to think symbolically, and thought to be another vital ingredient for our dominance – wasn’t just the preserve of our species. 
Homo erectus etched patterns onto shells some 500,000 years ago and Neanderthals drew on cave walls.

11-24-21 What can we learn from being the last type of human left standing?
IT IS sobering to think that if the Neanderthals had continued for 2000 more generations, they would still be sharing the planet with us today. Our other close relatives, the mysterious Denisovans, came even closer to surviving to modern times, and would have needed just 750 more generations of their lineage. Instead, we Homo sapiens find ourselves alone, the sole survivors out of the seven or more types of human that we once shared a planet with. It is easy to assume that we killed the others off, but the most likely explanation for their demise is that dramatic swings in the climate left these other humans – who evidence suggests lived in small, isolated groups – vulnerable to dying out. Another common assumption is that our early Homo sapiens ancestors led lives that were “nasty, brutish and short” in the words of English philosopher Thomas Hobbes who, in his 1651 book Leviathan, reflected rather pessimistically on the nature of humans. But if the latest archaeological and genetic research is to be believed, we shouldn’t be so hard on ourselves. In fact, our soft skills – compassion, tolerance and the desire to make connections with others – may have been the secret to our survival through those climate swings when all the other types of human died out. The social networks formed as a result of our emotional nature were a vital insurance policy for tougher times, allowing our early Homo sapiens ancestors to share not only food and resources but ideas. This, in turn, left us better able to adapt to the vagaries of the climate, as we discuss in our cover story. It wasn’t survival of the brutish, but survival of the compassionate and sociable.
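The generation counts can be turned into years with simple arithmetic. Assuming roughly 20 years per generation (an assumption; published estimates for archaic humans vary):

```python
YEARS_PER_GENERATION = 20  # assumed generation length in years

def generations_to_years(generations):
    """Convert a count of generations into elapsed years."""
    return generations * YEARS_PER_GENERATION

print(generations_to_years(2000))  # 40000 -- the Neanderthal shortfall
print(generations_to_years(750))   # 15000 -- the Denisovan shortfall
```

At 20 years per generation, 2000 generations is about 40,000 years, which lines up with how long ago Neanderthals are generally thought to have disappeared.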

11-24-21 Red light therapy could improve eyesight that has declined due to age
Exposure to deep red or near-infrared light can improve the function of the eye’s mitochondria, the powerhouses in cells, resulting in slight but lasting improvement to declining eyesight. An unusual experimental treatment for fading sight involves shining a red light into the eyes for a few minutes to boost the activity of mitochondria, microscopic structures that provide energy inside cells. In the first small test of the approach in 24 people, one short exposure to the light slightly improved people’s performance in tests of colour vision for several days. Deep red light and near-infrared light have previously been shown to enhance the function of mitochondria in a range of cell-based and animal experiments. These wavelengths seem to work by improving the performance of key molecular structures within mitochondria, called ATP synthase pumps. These pumps manufacture a molecule called ATP, which cells use for energy, by rotating within the watery environment of the mitochondria. Deep red light has just the right wavelength, at 670 nanometres, to be absorbed by water molecules, which gives them more energy. This makes the water surrounding each pump less viscous, letting the structure rotate faster. “It is like heating up jam to make it easier to stir,” says Glen Jeffery at University College London. Although making cells more energy efficient could affect a wide range of bodily systems, Jeffery’s group has been investigating cells of the retina, a patch of light-sensitive tissue at the back of the eye, as they are packed with more mitochondria than any other cell in the body. Impaired mitochondria may contribute to declining eyesight with age and have been implicated in several causes of blindness.
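For context, the energy carried by each 670-nanometre photon follows from the Planck relation E = hc/λ. This is standard physics rather than a calculation from the study:

```python
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electronvolt

wavelength = 670e-9  # deep red light used in the study, metres
energy = h * c / wavelength
print(round(energy / eV, 2))  # 1.85 eV per photon
```

At about 1.85 electronvolts per photon, deep red light is low-energy compared with blue or ultraviolet light, which is consistent with its proposed gentle, non-damaging effect on tissue.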

11-24-21 Neanderthals may have grown their baby teeth faster than we do
A tooth from a Neanderthal child who lived 120,000 years ago suggests that our cousin species began cutting their baby teeth at 4 months – earlier than for the average modern human. The first estimate for when a milk front tooth erupted from the gum in the upper jaw of a Neanderthal child is revealing new information about the development of our extinct cousins. The young owner of the deciduous tooth – sometimes known as a baby tooth – lived in what is now Krapina, Croatia, about 120,000 years ago, and the tooth may have emerged sooner after birth than we expect for our own species – from around 4 months of age rather than from about 7 months. Front teeth, or incisors, are generally the first to erupt from the gum. They enable infants to start eating harder foods. Until now, very little was known about how milk teeth developed in Neanderthal children. “Milk teeth are a unique window on the prenatal life and early childhood of past populations. They grow as part of a developing organism. So, we can use teeth to get information on the growth rates of children,” says Alessia Nava at the University of Kent, UK. Nava and her colleagues used high-energy X-rays to take three-dimensional pictures of the Neanderthal tooth, from the white top part of the tooth called the crown, down to a small part of the tooth root. A tooth’s crown is made of enamel. In the enamel of milk teeth, there is a faint mark called the neonatal line below which there is enamel produced before the baby was born and above which lies enamel laid down postnatally. Enamel is deposited by cells in a daily cycle, which gives it a pattern of stripes called cross-striations. The distance between adjacent stripes represents the amount of tooth growth in a day.
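The counting logic the researchers rely on can be sketched in a few lines: divide the thickness of enamel laid down after the neonatal line by the daily growth increment. The numbers below are invented for illustration, not the study's measurements:

```python
# Hypothetical measurements (illustrative only):
postnatal_enamel_um = 280.0  # enamel above the neonatal line, micrometres
daily_growth_um = 4.0        # spacing of the daily cross-striations, micrometres

days_recorded_after_birth = postnatal_enamel_um / daily_growth_um
print(days_recorded_after_birth)  # 70.0 -- about ten weeks of growth captured
```

Combining this count with the point at which the tooth erupted is what lets researchers estimate an age in months from a single fossil tooth.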

11-23-21 3D-printed 'living ink' is full of microbes and can release drugs
A living ink made entirely from bacterial cells can be 3D-printed to make structures that release anti-cancer drugs or mop up toxins from the environment. An ink made using engineered bacterial cells can be 3D-printed into structures that release anti-cancer drugs or capture toxins from the environment. The microbial ink is the first printable gel to be made entirely from proteins produced by E. coli cells, without the addition of other polymers. “This is the first of its kind… a living ink that can respond to the environment. We have repurposed the matrix that these bacteria normally utilise as a shielding material to form a bio-ink,” says Avinash Manjula-Basavanna at the Massachusetts Institute of Technology in Boston. By embedding another kind of genetically modified E. coli within the gel, Manjula-Basavanna and his colleagues built living structures that either released the anti-cancer drug azurin or captured the toxin bisphenol A (BPA) from the environment. BPA is commonly used to make plastics and has been linked to infertility and cancer. The researchers made the ink from protein polymer molecules called curli nanofibres. First, they genetically engineered E. coli cells to produce subunits of curli nanofibres that had one of two oppositely charged modules, known as either a “knob” or a “hole”, attached to them. By growing a mixture of the two types of cells, they produced curli fibres that crosslinked with each other when the knobs from one fibre locked into the oppositely charged holes from another fibre. The team then filtered the bacteria through a nylon membrane to concentrate the crosslinked fibres, before removing the cells from the mixture. This produced a gel that had a suitable viscosity and elasticity for printing.

11-23-21 New Australopithecus sediba bones suggest extinct hominin was bipedal
The discovery of new Australopithecus sediba fossils mean we can now reconstruct most of the spine of one individual, and strengthen the case that the species was bipedal at least some of the time. Spinal bones of an extinct human relative have been found in lumps of rock blasted out of a South African cave and used to reconstruct one of the most complete back fossils of any hominin. The spine was curved, suggesting that Australopithecus sediba spent a lot of time walking on two legs. A. sediba was first described in 2010 by Lee Berger at the University of the Witwatersrand in Johannesburg, South Africa, and his team. They described two partially preserved individuals: a male child called Karabo and an adult female. Both were found in the Malapa cave system and lived about 2 million years ago. Malapa was first excavated by miners around a century ago, and some of the first A. sediba bones were found in chunks of rock that had been blasted from the cave with dynamite. The miners used some of the blocks to build a road. For the past decade, Berger’s colleagues have been chipping away at these blocks. One has now yielded four vertebrae from the lower back of the female, plus a bone called the sacrum that links the spine to the pelvis. The team has named the female Issa, which means “protector” in Swahili. The new bones mean that most of Issa’s lower back has now been excavated. After fitting the vertebrae together, the team concluded that her spine was curved so that, when viewed from the side, it forms a gentle S-shape, a characteristic of humans that keeps the body’s mass centred over the pelvis for efficient bipedal walking. However, previous studies of A. sediba had also found clear adaptations in the upper body for tree climbing, suggesting it was halfway between tree-dwelling apes and fully bipedal humans.

11-21-21 Milk allergy could be treated with gradual exposure to baked milk
Children who were gradually exposed to baked milk powder learned to tolerate higher doses in a small clinical trial. Children who are severely allergic to milk may be able to start tolerating it if they are given tiny amounts of baked milk followed by progressively larger doses, a small clinical trial suggests. Larger studies are needed to confirm the effect and the therapy shouldn’t be attempted without medical supervision, doctors say. About 3 per cent of preschool-aged children are allergic to proteins in cow’s milk, making it the most common food allergy in young children. Most naturally outgrow it by the time they go to school, but one in five of them continue to be allergic as they get older. Children with persistent milk allergies and their parents have to be vigilant because milk can be hidden in unexpected products like crisps and breakfast cereals. Milk has now overtaken peanuts to become the most common cause of death from food allergy in schoolchildren in the UK. Jennifer Dantzer at Johns Hopkins University in Baltimore, Maryland, and her colleagues tested whether children with a severe cow’s milk allergy – who react even to trace amounts of milk – could learn to tolerate it if they were gradually exposed to baked milk, which is typically less allergenic than raw milk. The team randomly assigned 30 people aged between 3 and 18 with a severe cow’s milk allergy to either slowly have baked milk powder introduced to their diets or a placebo powder made of tapioca flour. Each day, children ate a baked cupcake or muffin made with ingredients including some of the powder, starting with a dose of 0.1 milligrams and building up to 2 grams over the course of a year. They didn’t know whether they were consuming the baked milk or placebo powder.

11-19-21 Body odour chemical makes men calmer but women more aggressive
A chemical that is sometimes emitted from human skin, breath and faeces has no detectable smell, but it appears to influence people’s behaviour, with men becoming calmer and women becoming more aggressive. Although scientists have yet to determine when or under what conditions people and other mammals release hexadecanal, it seems clear that humans are “communicating” with each other subconsciously through their body odours, says Eva Mishor at the Weizmann Institute of Science in Israel. “Humans [smell] each other – their children, their romantic partners, strangers – all the time,” says Mishor. “Our study gives more power to the notion that humans communicate from the chemical volatiles they emit, and that we get lots of information from them.” Previous studies had shown that hexadecanal acts as a “social buffer” that reduces stress in mice, she says. In humans, EEG-based studies have suggested the compound triggers brain activity differently in men and women, although it wasn’t known how. To find out, Mishor and her colleagues asked 67 men and 60 women – ranging in age from 21 to 34, all identifying as their gender assigned at birth – to play an online economics game against other human or robot players. All the participants had been asked to sniff clove oil prior to playing; for half the people, the clove oil was mixed with hexadecanal, which didn’t affect the oil’s perceived smell. When provoked during the game, players could retaliate by blasting their opponent with noise. Women who had inhaled hexadecanal responded more aggressively than those who hadn’t, while the men responded with 18.4 per cent less volume, she says. Functional MRI scans of the participants’ brains showed that after sniffing hexadecanal, both men and women had increased activity in the parts of the brain associated with recognising social cues. Then, when they felt provoked by the game, women had increased neural activity linking those regions to brain areas responsible for aggressive behaviour. The same activity along neural connections in men, meanwhile, was reduced.

11-19-21 Response to anaesthetic can predict if people will recover after coma
In a first small test, the brainwaves of people in a minimally conscious state changed in a characteristic way when given an anaesthetic, showing whether they were likely to recover. A new type of test may be able to predict if people who enter a state of impaired consciousness after a brain injury will eventually recover. The test, which involves seeing how people’s brainwaves respond to a general anaesthetic, was highly accurate when carried out on 12 people in a state of impaired consciousness – but now it needs to be trialled on more such people. People who are in a coma after a brain injury, such as from a car accident or a stroke, sometimes emerge into a condition between coma and consciousness called a vegetative state. In this condition, their eyes may sometimes open, but they cannot respond to commands or move purposefully. It is currently very hard for doctors to predict which of these people will recover, leading to great distress for families wondering if life support should be continued or stopped. “You need to make life-and-death decisions about someone you love, with minimal evidence – it’s typically agonising,” says Stefanie Blain-Moraes at McGill University in Montreal, Canada. The new test exploits the fact that the brains of healthy people respond to anaesthesia with characteristic changes in their brainwaves, shown by placing electrodes on their scalp to carry out an electroencephalogram (EEG). When healthy people are awake, their alpha brainwaves – which have a frequency of around 7 to 14 hertz – peak at the front of the brain first, with subsequent peaks happening progressively towards the back of the brain. If they lose consciousness under a general anaesthetic, the pattern flips, with the back leading the front. Blain-Moraes wondered if this might be a useful sign of consciousness in people in a vegetative state. “I suspect that nobody has tried this before because most people equate unresponsiveness with unconsciousness,” she says. 
“Why would you anaesthetise someone who you thought was already unconscious?”

11-18-21 Analysis of earliest covid-19 cases points to Wuhan market as source
A fresh look at what we know about the first covid-19 cases shows that the earliest known instance was in a person who worked at the Huanan market in Wuhan, which was suspected as the source from the start of the pandemic. An analysis of what we know about the earliest covid-19 cases has strengthened the argument that the coronavirus pandemic began when animals at the Huanan market in Wuhan, China, passed the virus on to people. Among other things, it concludes that the first case was a woman who worked as a seafood vendor at the market, who became ill on 11 December 2019. It is clear that the SARS-CoV-2 virus derives from bat coronaviruses. What isn’t clear is where, when and how it got from bats into people. It has been suspected right from the start that live animals at the Huanan market might be the intermediate host, as the initial cases were clustered around this site. A World Health Organization report on the origins of SARS-CoV-2, published earlier this year, states that the first person known to have covid-19 became ill on 8 December and had no connection to the market. This was partly why the report concluded that no firm conclusion about the role of the market could be drawn. However, this man – a 41-year-old accountant living 30 kilometres away from the market – went to hospital on 8 December because of dental problems and only developed covid-19 symptoms on 16 December, says Michael Worobey at the University of Arizona in his analysis. These dates are confirmed by media interviews with the accountant, hospital records and a scientific paper, Worobey says. That means that the earliest known cases were indeed linked to the market, with the first being the seafood vendor who became ill on 11 December. Why this isn’t in the WHO report is unclear, as the team did speak to the accountant. “My guess is that they were told that this was the ‘December 8’ patient and just accepted it as read,” Worobey says. 
“But it would be interesting to learn more about that interview, for sure.”

11-18-21 Cancer cells steal energy-generating parts from immune cells
Cancer cells use tiny tubes to reach out to nearby immune cells and capture their energy-generating mitochondria. Cancer cells can boost their own growth by stealing energy-generating parts from nearby immune cells. We already knew that some cell types grow nanotubes, tentacle-like structures made of a protein called actin. The nanotubes can let one cell link itself to another so the two can transport components including mitochondria – energy-generating structures – between them. Now we have our first evidence that cancer cells can do something similar, using nanotubes to hijack mitochondria from two types of immune cells called T-cells and natural killer T-cells, both of which can kill cancer cells. “The fact that cancer cells send out nanoscale tentacles and suck out the mitochondria is a rather surprising finding,” says Shiladitya Sengupta at Harvard Medical School. He and his colleagues put immune cells and cancer cells from mice in the same dish for 16 hours before taking pictures of their interactions using a microscope. They found that, on average, each cancer cell formed one nanotube with a T-cell, while most of the nanotubes were between 50 and 2000 nanometres wide. By labelling the mitochondria inside the immune cells with a fluorescent chemical marker, the team discovered that the mitochondria were transferred towards the cancer cells along the nanotubes. Significantly, cancer cells consumed oxygen at around double the rate and reproduced more often when they were placed in contact with T-cells for 16 hours, compared with a control group of cancer cells that were grown in the presence of T-cells but were physically separated from them. This suggests that stealing mitochondria helps cancer cells generate energy and grow. Consistent with this idea, cancer cells grown in the presence of physically separated T-cells reproduced and respired at a similar rate to cells grown in the absence of T-cells.

11-17-21 Why are we irrational? How a logical flaw stops us solving problems
Myths and stories trump rational reasoning when it comes to analysing distant threats like climate change. But we have tools to combat that – and it’s a myth irrationality is on the rise. TAKE a look at the data to the right, showing crime rates in US cities according to whether or not they ban concealed handguns. Based on these numbers, would you conclude that gun control reduces crime? Take as much time as you want. If you answered no, give yourself a pat on the back. Most people answer yes, dazzled by the large number of cities with gun control and decreasing crime. But what matters is the proportion of cities with falling crime. That’s 75 per cent for cities with gun control and 84 per cent for those without. The rational conclusion is that gun control increases crime, or at least doesn’t decrease it. Before you punch the air or a passer-by, the data is fake. But faced with it, supporters of gun control are more likely to jump to the wrong conclusion. Opponents of gun control scrutinise the data more cautiously and more often spot the real pattern. The test is designed to winkle out a pervasive and intractable source of human irrationality, the myside bias. It expresses the tribal thinking that evolution has gifted us (see “Why are we good and evil?”): a tendency to seek and accept evidence that supports what we already believe. “You direct your reasoning to end up with a conclusion that is already a sacred belief or a shibboleth in your side, your team, your coalition, your party, your posse,” says Steven Pinker at Harvard University, author of Rationality: What it is, why it seems scarce, why it matters. On an individual level, such “motivated reasoning” is generally fairly harmless, but once promoted to group level, it can unleash chaos. The obvious example is climate change, where positions are determined almost completely by politics and denial is impervious to scientific facts.
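The trap in the gun-control puzzle is comparing raw counts instead of proportions. A few lines of arithmetic make it concrete; the city counts below are hypothetical, chosen only to reproduce the 75 and 84 per cent figures quoted in the article:

```python
# Hypothetical city counts, picked to match the stated proportions:
# 75 per cent of gun-control cities and 84 per cent of the rest saw crime fall.
falling_with_ban, rising_with_ban = 300, 100     # cities banning concealed handguns
falling_without_ban, rising_without_ban = 42, 8  # cities without a ban

# The intuitive (wrong) reading compares raw counts of falling-crime cities.
print(falling_with_ban, "vs", falling_without_ban)  # 300 vs 42 looks decisive

# The rational reading compares proportions within each group.
p_ban = falling_with_ban / (falling_with_ban + rising_with_ban)
p_no_ban = falling_without_ban / (falling_without_ban + rising_without_ban)
print(p_ban, p_no_ban)  # 0.75 0.84 - the no-ban cities fare slightly better
```

The large gun-control group dominates the raw counts, but only the within-group proportions answer the question being asked.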

11-17-21 Why does evolution happen? The rules on Earth may well be universal
Dig down, and evolution by natural selection is just about spontaneous, sustained accumulation of complexity – if life elsewhere exists, it’s likely to develop in the same way. EVOLUTION is a fact of life, at least of life as we know it. Here on Earth, organisms that just so happen to be better adapted, or “fit”, for their environment, perhaps by virtue of a fortuitous mutation, tend to survive longer and leave more offspring. The less fit leave fewer descendants and the unfit none at all. Whatever it was that made the winners fit thus accumulates in the next generation, a cruel and random Squid Game called evolution by natural selection. As to why it happens, on one level that’s simple. According to biologist Richard Dawkins, evolution is simply a change in gene frequencies in populations. If a gene in a colony of woodlice living under a dead log becomes more or less common for some reason, evolution has happened. But must it be like that? All life on Earth that we know of comes from the same origin and uses the same biochemical operating system based on DNA. Putative life on other planets, or “shadow life” from an independent origin on Earth, might conceivably operate under very different rules. Does life have to evolve – and if so, does that have to be by natural selection? “That’s a very interesting and large question,” says Dawkins. Arik Kershenbaum at the University of Cambridge, author of The Zoologist’s Guide to the Galaxy: What animals on Earth reveal about aliens – and ourselves, thinks the alternatives are limited. One might be to postulate an intelligent designer, as perhaps we ourselves might someday design self-replicating organisms. But that only gets you so far, says Kershenbaum. “It’s definitely a possibility we could find planets covered in artificial life,” he says. “But then you have to explain where the designers, the original life, came from.”

11-17-21 Why are we conscious? The answer lies in other animals’ heads
It’s easy to think human conscious experience is unique, but a better understanding of consciousness’s mysteries comes by tracing it back in the evolutionary tree. THE smell of coffee, the blue of the sky, the anticipation of seeing a loved one: it is impossible to imagine our lives without the vivid conscious experiences of our every waking moment. And yet they have vexed philosophers for centuries. “The nature of consciousness is extraordinarily difficult to define,” says Eva Jablonka at Tel Aviv University in Israel. It was once thought of as an immaterial force, a “ghost in the machine” separate from physical reality. Today, however, many neuroscientists argue that our felt experience is simply the product of our brain’s inner workings. That makes the question of “why?” loom large. Many actions controlled by the brain occur unconsciously, beneath the level of our awareness. Why make exceptions? Grasping this means thinking outside our own box, says Anil Seth at the University of Sussex, UK. “Human consciousness is not the only form of being conscious,” he says. We tend to emphasise conscious experiences that make us think we are better and smarter than other animals, like our ability to recognise ourselves in a mirror, he says. “This is not very helpful.” The absolute fundamental of consciousness – having an actual experience of things – is something seemingly shared by many other organisms. “In my view, there are grades and varieties of awareness, and there is no principled dividing line about which – SHAZAM! – the light of consciousness is turned on,” says Daniel Dennett at Tufts University in Massachusetts. With a broader view of consciousness, we can look back along the tree of life to get an idea about its earliest glimmers. 
Jablonka, together with Simona Ginsburg at the Open University of Israel, has done this with a concept the pair call unlimited associative learning, a capacity to learn about and connect new stimuli, even when experienced at different times.

11-17-21 Why do we grieve? The surprising origin of the feeling of loss
The debilitating pain we sometimes feel at the loss of those we love is an evolutionary mystery. It could all come down to what happens in our childhoods. “’Tis better to have loved and lost than never to have loved at all,” wrote Alfred Tennyson. Try telling that to someone in the throes of grief. “It’s so awful and so debilitating. People don’t eat and they don’t sleep, and they don’t function,” says Randolph Nesse at Arizona State University. Aside from the overwhelming emotional pain and sadness, grief is bad for our physical health too: those who have been recently bereaved are more likely to have health problems and even die in the weeks and months following a loss. Evolution is famously all about survival (see “Why does evolution happen?”). So if grief is so debilitating that it leaves us unable to cope with life, why did we evolve this trait? “It doesn’t make that much sense for people to be so dramatically impaired for so long,” says Nesse. One popular explanation starts with childhood. When we are young and vulnerable, forming strong attachments and staying close to others is a smart survival move. The reactions of children separated from their mothers – an intense “protest” phase, followed by a withdrawn period known as “despair” – are also seen in grieving adults. More recently, neuroimaging studies have backed up this idea. When grieving people think about the deceased, a reward centre in the brain associated with social bonding lights up. The protest phase of loss is also characterised by behaviours such as grieving people needing to find or see the body, thinking they have seen the deceased alive and even believing in ghosts. This “searching” behaviour for someone you know is dead might sound pointless, but it may have been different in our evolutionary past. “If you’re a hunter-gatherer and your 3-year-old disappears, you’re not just going to say, ‘Too bad’, you’re going to go looking for that 3-year-old for days and weeks and months,” says Nesse. 
“You’re not going to give up.”

11-17-21 Why are we good and evil? A single quality may be at the root of it
The human capacity for both good and evil has long mystified philosophers. Evolutionary biology suggests they are both offshoots of one of our oddest character traits. “THE evil that men do lives after them; the good is oft interred with their bones. So it will be with Dzhokhar Tsarnaev.” So said Judge George O’Toole before sentencing Tsarnaev to death for his part in the 2013 Boston Marathon bombing. During the trial, it emerged that the killer was well liked by his teachers and friends, had been compassionate to people with disabilities and had apologised to victims and their families. But, said O’Toole, his goodness would always be overshadowed by his hateful act. The human capacity for both good and evil, often within the same person, has long been recognised and puzzled over; O’Toole was quoting the Roman general Mark Antony in Shakespeare’s Julius Caesar. What is it about us that endows us with such diametrically opposite propensities? Evolutionary biology has an answer, and it doesn’t reflect well on human nature. Acts of both good and evil are driven by altruism – and that is ultimately selfishness in disguise. For a long time, altruism was a biological mystery. The prime directive of evolution is to pass on our genes to the next generation. Engaging in costly behaviours with no obvious survival pay-off seems to go against that grain. The polymath J. B. S. Haldane eventually twigged it: individuals mostly make sacrifices for close relatives, and hence help to usher copies of their own genes into the next generation. As Haldane put it: “I would lay down my life for two brothers or eight cousins.” Acts of true selflessness exist, but these are explained as reciprocal altruism, where kindness to strangers (who may in fact be relatives) is banked for the future. That’s all good, but what about evil? Evildoers often see their acts as being for the greater good. 
This “pathological altruism” lies behind some of the worst atrocities in human history, including wars of aggression and genocide. The Boston Marathon bomber apparently thought that radical Islam was a good enough cause to maim and kill for.

11-17-21 Calls to mental health helplines rose by a third in covid-19 lockdowns
Though calls for mental health services spiked by 35 per cent early in the pandemic, the proportion of people seeking help for suicidal thoughts remained the same as before the covid-19 restrictions were put in place. Calls to mental health helplines in 19 countries rose by about a third, on average, shortly after the start of lockdowns brought in during the early months of the covid-19 pandemic, before subsiding to pre-pandemic levels. During the surge, there were small increases in the proportion of calls made by people feeling lonely and by people fearful of becoming infected with the coronavirus, but the nature of people’s concerns stayed broadly similar to calls made before the pandemic. Several studies have suggested that more people have felt anxious or depressed since the pandemic began. These levels are usually assessed using mental health surveys and suicide statistics, but Marius Brülhart at the University of Lausanne in Switzerland wondered if there was another way to chart changes in people’s mental well-being. “We thought: ‘What can we do to get a measure of the mental health of the population?’,” says Brülhart. Most phone helpline services for those who are in mental distress keep logs of their calls, including brief notes on the reasons people rang. Brülhart and his colleagues analysed anonymous data from 8 million calls to helplines in 19 countries, including the US, China, Israel and several European countries, but not the UK, looking at the period from early 2019 to early 2021. They found that the number of calls peaked about six weeks after each country’s coronavirus restrictions began, spiking to 35 per cent higher than before the pandemic – although some helplines initially lacked enough capacity to answer all calls and so may have missed some of the earlier rise.

11-17-21 Anonymised genomes cannot be linked to faces as previously claimed
In theory, genomes shared anonymously could be linked to people on social media because the DNA can be used to predict facial features, but the risk is vanishingly small. What your face looks like is determined almost entirely by the DNA you inherit. This has led to the claim that the millions of anonymised genomes shared for medical research could be linked to specific individuals via photos shared on social media – but the risk is very low, according to Rajagopal Venkatesaramani at Washington University in St Louis, Missouri, and his colleagues. The researchers studied the genomic data and online photos of 126 individuals, then tried to match faces to genomes. They worked backwards from the faces, using AI to analyse the photos and predict gene variants, then looking for genomes with those predicted variants. Given a subset of just 10 individuals, the team was able to identify a quarter of them. However, as the number of people increased, accuracy plummeted. For groups larger than 100 people, it was negligible. Venkatesaramani and his colleagues say a key reason for this is that social media images are much lower quality than the studio photographs used in previous studies. Daniel Crouch at the University of Oxford, who has studied the genetics of facial features, agrees that the risk is low. But he says the team’s analysis shows that this is actually due to the difficulty of linking gene variants with specific facial features, rather than image quality. “It is not really the quality of photos that matters that much,” says Crouch. “We are still only really just starting to understand the genetics of facial variation.” “Once our understanding of facial genetics improves, our ability to link faces and DNA will improve too,” he says. “However, I suspect we will never quite get to a point where we can predict whether a DNA sample belongs to a specific person, drawn from anyone on the planet, at least in our lifetimes.”

11-17-21 mRNA vaccine against tick bites could help prevent Lyme disease
An mRNA vaccine that causes a red, itchy skin rash in response to bites by ticks may allow them to be removed before they transmit Lyme disease-causing bacteria. An mRNA vaccine designed to create an immune response to ticks so they can be removed before they transmit Lyme disease has been shown to be effective in guinea pigs. It is hoped the finding will pave the way for clinical trials in people. Lyme disease is caused by a bacterium called Borrelia burgdorferi that is transmitted through tick bites. If left untreated, it can cause lifelong health problems like Lyme arthritis and nerve pain. Erol Fikrig at Yale University and his colleagues have developed a vaccine that trains the immune system to respond to tick bites, by exposing it to 19 proteins found in tick saliva. The vaccine contains mRNA molecules that instruct cells to make these proteins, in the same way that mRNA covid-19 vaccines direct cells to make coronavirus proteins. Guinea pigs given the anti-tick vaccine developed red, itchy rashes when they were later bitten by ticks, suggesting their immune systems were responding. The ticks also tended to detach early without sucking as much blood as they normally would. The researchers then placed ticks carrying Lyme disease-causing bacteria on vaccinated and unvaccinated guinea pigs. The ticks were removed from the vaccinated animals when their skin rashes emerged – usually in the first 18 hours – and none became infected with the bacteria. In contrast, half the unvaccinated animals became infected. If the vaccine works the same way in people, it will enable us to “readily detect a tick bite early, due to redness at the bite site, and likely itching”, says Fikrig. This is important because tick bites are often painless and go unnoticed. The tick could then be pulled off before transmitting any Lyme disease-causing bacteria, which normally takes about 36 hours.

11-17-21 World's largest mass extinction may have begun with volcanic winter
The end-Permian mass extinction 252 million years ago might have begun when eruptions triggered a volcanic winter. For decades, we have been trying to unravel the causes of the end-Permian mass extinction, the most devastating extinction event in our planet’s history. The prevailing view is that global warming played a part, but now there is evidence that the warming was preceded by a volcanic winter – a long, global cold spell caused by volcanic activity that would have destabilised ecosystems. About 252 million years ago during the end-Permian extinction, life on Earth came dangerously close to a terminal collapse. In the geologic blink of an eye, roughly 85 per cent of the species on the planet vanished. This is thought to have begun when lava oozed across modern-day Siberia in a series of eruptions that pumped enough carbon dioxide and methane into the atmosphere to raise global temperatures and starve the oceans of oxygen. Now, a study suggests that the so-called Siberian Traps aren’t the only eruptions to blame for the extinction. “In southern China, there are unusual levels of copper and mercury embedded in ash layers right at the mass extinction boundary,” says Michael Rampino at New York University, one of the authors of the study. The ash layers are also rich in sulphur, which hints at the style of volcanic eruption: “This suggests explosive volcanism in the region,” he says. These explosive eruptions – which were distinct from the non-explosive Siberian eruptions – were catastrophic enough that the ensuing ash cloud likely heralded the beginning of what Rampino refers to as a “volcanic winter”, a rapid period of global cooling that the researchers think may have preceded the warming caused by the Siberian Traps. “There would have been global effects on climate as material from the eruptions would have been carried around the globe by stratospheric winds,” says Rampino.

11-17-21 New high-speed video reveals the physics of a finger snap
Friction plus compressibility of the finger pads are key to a speedy snap. It all happens in a snap. New high-speed video exposes the blink-and-you’ll-miss-it physics behind snapping your fingers. The footage reveals the extreme speed at which the gesture occurs, and shows that friction plus the compressibility of the finger pads are key to humans’ ability to snap properly, researchers report November 17 in the Journal of the Royal Society Interface. Finger snaps last only about seven milliseconds — that’s roughly 20 times as fast as the blink of an eye, says biophysicist Saad Bhamla of Georgia Tech in Atlanta. After slipping off the thumb, the middle finger rotates at a rate up to 7.8 degrees per millisecond, nearly what a professional baseball pitcher’s arm can achieve, the team found. And a snapping finger accelerates almost three times as fast as pitchers’ arms. When covered with high-friction rubber or low-friction lubricant, fingers made snaps that fell flat, the team found, indicating that bare fingers have a level of friction ideal for a speedy snap (SN: 8/1/19). That friction between thumb and middle finger allows energy to be stored before it’s suddenly unleashed. Too little friction means less pent-up energy and a slower snap. But too much friction impedes the finger’s release, also slowing the snap. Bhamla and colleagues were inspired by a scene in the 2018 movie Avengers: Infinity War. The supervillain Thanos snaps his fingers while wearing a supernatural metal glove, obliterating half of the universe’s life. The team wondered if it would be possible to snap while wearing a rigid glove. Typically, when the fingers press together in a snap, they compress, increasing the contact area and friction between them. So the researchers tested snapping with fingers covered by hard thimbles. Sure enough, the snaps were sluggish.
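The snap statistics above are easy to sanity-check with a back-of-the-envelope calculation. The blink duration is an assumed typical value of about 150 milliseconds (the article gives only the "roughly 20 times" ratio):

```python
# Quick unit conversions for the snap statistics quoted above.
snap_ms = 7.0        # duration of a finger snap, in milliseconds
blink_ms = 150.0     # assumed typical eye-blink duration, in milliseconds

# How many times faster than a blink is a snap?
print(round(blink_ms / snap_ms))  # about 21, i.e. "roughly 20 times as fast"

# Peak rotation rate of the middle finger, converted to friendlier units.
deg_per_ms = 7.8
deg_per_s = deg_per_ms * 1000          # 7800 degrees per second
revs_per_min = deg_per_s / 360 * 60    # about 1300 rpm
print(deg_per_s, round(revs_per_min))
```

Put another way, a snapping finger briefly spins at the rate of an idling car engine's crankshaft, which is why high-speed video is needed to see it at all.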

11-17-21 An ancient exploding comet may explain why glass litters part of Chile
An airburst over the Atacama Desert 12,000 years ago melted the ground into glass, scientists say. Scattered across a swath of the Atacama Desert in Chile lie twisted chunks of black and green glass. How the glass ended up there, sprinkled along a 75-kilometer-long corridor, has been a mystery. Now, analyses of space dust in the glass show that the glass probably formed when a comet, or its remnants, exploded over the desert 12,000 years ago, researchers report November 2 in Geology. This corridor is the best evidence yet of a comet impact site on Earth, says Peter Schultz, a planetary geologist at Brown University in Providence, R.I. There are only about 190 known impact craters on Earth (SN: 12/18/18). Falling space rocks carved out these sites, but none are known to have been created by a comet. That’s because comets, which are made of mostly ice and some rock, tend to explode before reaching the ground, a fate they share with some small asteroids. These fiery events — known as airbursts — are dramatic, generating massive amounts of heat and strong winds. But the effects are temporary and often fail to leave lasting imprints, like craters, behind. That’s especially true in wet environments. In 1908, an airburst from an asteroid or comet over a remote part of Russia flattened trees and generated a shock wave that knocked people off their feet hundreds of kilometers away. The trees have since grown back over the site of what’s now known as the Tunguska blast, leaving just a marsh (SN: 6/5/08). “If it hadn’t been observed, no one would know it happened,” says Mark Boslough, a physicist at the University of New Mexico in Albuquerque who wasn’t involved in the new research. The Atacama, the world’s driest desert, is better suited to preserving impact sites. And it’s full of sand — the raw material for making glass, which forms when sand is heated to high temperatures. Heat from volcanic activity is responsible for almost all of the naturally derived glass on Earth.

11-16-21 Some genes in the brain may make 100 different proteins
Researchers have sequenced the full "transcriptome" of part of the human brain, revealing all the ways its cells use genes to make proteins. The genetic code for the human brain is much more complex than we realised, with some genes potentially encoding tens or even hundreds of different proteins. The finding comes from the first sequencing of the full “transcriptome” – a readout of all the different ways genes may be used to make proteins – of part of the human brain. “From the same set of genetic information, you can derive a lot more endpoints,” says Jonathan Mill at the University of Exeter, UK. When cells make proteins, first the relevant gene is used as a template to build a copy in the form of mRNA, a process known as transcription. The mRNA sequence is then used to create the protein. Not all of every gene is used, though, as genes consist of several stretches of protein-coding DNA called exons, interspersed with segments called introns that are normally thought to be snipped out from the mRNA, allowing the exons to be “spliced” together. It has long been known that some genes can produce slightly different mRNA sequences, because not all the exons are used. More recently, it was discovered that some introns may not be removed. The importance of such alternative splicing processes has been unclear. Now, Mill’s group has used new sequencing techniques to characterise and quantify all the different mRNA in cells from the cerebral cortex, an outer part of the brain that is important in complex thought processes. The team studied tissue samples from people who had agreed to donate their brain for research after death. From the nearly 13,000 genes active in the cerebral cortex, the researchers found almost 33,000 different mRNA molecules. About a fifth of the genes produced mRNAs containing introns, and more than 200 genes made between 10 and 100 different mRNAs. This shows that alternative splicing in the brain is more important than we realised, says Mill. 
“The brain is this very complex organ, so it would make sense.”
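To make the isoform numbers concrete, here is a toy sketch (illustrative only, not the study's pipeline or data) of how distinct mRNA isoforms can be tallied per gene from long-read sequencing records:

```python
from collections import Counter

# Hypothetical (gene, mRNA isoform) records, as might come from
# long-read transcriptome sequencing. Gene and isoform names are invented.
records = [
    ("GENE_A", "iso1"), ("GENE_A", "iso2"), ("GENE_A", "iso3"),
    ("GENE_B", "iso1"),
    ("GENE_C", "iso1"), ("GENE_C", "iso2"),
]

# Deduplicate, then count how many distinct isoforms each gene produces.
isoforms_per_gene = Counter(gene for gene, _ in set(records))
print(dict(isoforms_per_gene.most_common()))
# {'GENE_A': 3, 'GENE_C': 2, 'GENE_B': 1}
```

Scaled up, a tally like this is what yields the study's headline statistic: roughly 13,000 active genes mapping to almost 33,000 distinct mRNAs.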

11-16-21 ‘Life as We Made It’ charts the past and future of genetic tinkering
A new book explores humans’ enduring role as meddlers in evolution. With genetic engineering, humans have recently unleashed a surreal fantasia: pigs that excrete less environment-polluting phosphorus, ducklings hatched from chicken eggs, beagles that glow ruby red under ultraviolet light. Biotechnology wields unprecedented power and potential — but also follows a course thousands of years in the making. In Life as We Made It, evolutionary biologist Beth Shapiro pieces together a palimpsest of human tinkering. From domesticating dogs to hybridizing endangered Florida panthers, people have been bending evolutionary trajectories for millennia. Modern-day technologies capable of swapping, altering and switching genes on and off inspire understandable unease, Shapiro writes. But they also offer opportunities to accelerate adaptation for the better — creating plague-resistant ferrets, for instance, or rendering disease-carrying mosquitoes sterile to reduce their numbers (SN: 5/14/21). For anyone curious about the past, present and future of human interference in nature, Life as We Made It offers a compelling survey of the possibilities and pitfalls. Shapiro is an engaging, clear-eyed guide, leading readers through the technical tangles and ethical thickets of this not-so-new frontier. Along the way, the book glitters with lively, humorous vignettes from Shapiro’s career in ancient DNA research. Her tales are often rife with awe (and ripe with the stench of thawing mammoths and other Ice Age matter). The book’s first half punctures the misconception that we “have only just begun to meddle with nature.” Humans have meddled for 50,000 years: hunting, domesticating and conserving. The second half chronicles the advent of recent biotechnologies and their often bumpy rollouts, leading to squeamishness about genetically modified food and a blunder that resulted in accidentally transgenic cattle.

11-15-21 AI can quickly identify structure of drugs designed for ‘legal highs’
An artificial intelligence can identify designer drugs that have similar effects to substances such as cocaine and heroin, but which can’t be detected by current tests. An AI tool can quickly suggest possible candidates for the chemical structures of psychoactive “designer drugs” from a simple analysis. The tool could fast-track the development of lab tests that screen for the use of drugs with similar effects to substances such as cocaine and heroin, but which aren’t detectable with current tests. “Our method could cut down the time required to identify a new designer drug from weeks or months to just hours,” says Michael Skinnider at the University of British Columbia in Canada. Skinnider and his colleagues created a machine learning tool called DarkNPS by training it with chemical structures of around 1700 known designer drugs, collected from forensic labs around the world. The training set included tandem mass spectrometry results for each drug; this common analytical technique provides information on the mass of a molecule and the elements it contains. This allowed the AI to identify patterns between tandem mass spectrometry data and chemical structures. Given tandem mass spectrometry data for a previously unseen drug, DarkNPS could then guess the molecular structure with an accuracy of 51 per cent. This number increased to 86 per cent if the AI could give its top 10 predictions for the structure, meaning that it could be most useful for narrowing the search. “This could save an enormous amount of time and make it possible to identify new designer drugs much sooner after they’ve hit the market,” says Skinnider. The researchers also used the tool to look at drugs that could be created in the future by using the AI to generate 1 billion possible chemical structures. Afterwards, the team acquired data for 194 new designer drugs and found that 176 of these appeared in the set generated by the AI.
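The 51 per cent and 86 per cent figures are what machine learning researchers call top-1 and top-10 accuracy. A minimal sketch of how such metrics are computed (our illustration, not DarkNPS code; the data and names are hypothetical):

```python
# Top-k accuracy: a prediction counts as correct if the true answer
# appears anywhere in the model's k highest-ranked candidates.
def topk_accuracy(ranked_lists, actual, k):
    """ranked_lists: one ranked candidate list per query; actual: true answers."""
    hits = sum(1 for ranked, truth in zip(ranked_lists, actual) if truth in ranked[:k])
    return hits / len(actual)

# Toy data: three "spectra", each with a ranked list of candidate structures.
preds = [["A", "B", "C"], ["X", "Y", "Z"], ["M", "N", "O"]]
actual = ["A", "Z", "Q"]

print(topk_accuracy(preds, actual, 1))  # ~0.33: only the first query's truth is ranked first
print(topk_accuracy(preds, actual, 3))  # ~0.67: the second query's truth is within the top 3
```

This is why a larger k always gives an accuracy at least as high as a smaller one, and why a top-10 figure of 86 per cent is most useful for narrowing a search rather than settling it.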

11-14-21 Having impostor syndrome may actually make you better at your job
People who are less confident at work were rated as having better interpersonal skills, suggesting there may be upsides to impostor syndrome. People with “impostor syndrome”, who feel underqualified for their jobs, tend to make better employees because they compensate by striving to be likeable, empathetic and collaborative, new research suggests. The term impostor syndrome was coined in 1978 by two psychologists who studied women with illustrious careers. These women still believed they were “really not bright” and thought they had risen to their distinguished positions through luck or error. These impostor thoughts have since been found to affect people from all backgrounds, although they tend to be more common among women and ethnic minority groups. Impostor syndrome can be detrimental to a person’s well-being, as it is associated with anxiety and low self-esteem. It has long been assumed to hinder work performance too as a consequence, but no one had experimentally verified this. Basima Tewfik at the MIT Sloan School of Management in Cambridge, Massachusetts, measured levels of impostor syndrome among 155 employees at an investment advisory firm in the US. The participants were presented with written statements like “At work, others think I have more knowledge or ability than I think I do” and asked to rate how frequently they felt the statements applied to them. Tewfik then asked their supervisors to rate the participants’ performance and interpersonal skills by asking how much they agreed with sentences like “This employee creates effective working relationships with colleagues”. Employees with impostor syndrome were generally rated as having better interpersonal skills than their more confident peers and were considered just as competent. “People with impostor syndrome were basically the ones you’d want to work with,” says Tewfik.

11-11-21 How worried should we be about covid-19 spreading among wild animals?
Studies in the US have revealed a "silent epidemic" of human coronavirus in wild white-tailed deer and it could be circulating in other wild animals in other parts of the world. Recent studies suggest SARS-CoV-2, the virus that causes covid-19, is rife among the 30 million white-tailed deer in North America. This means there is a risk of deer infecting other animals, and also of new variants emerging in animals and jumping back to people. So what does this mean for the pandemic and how concerned should we be? It has long been clear that people infected with SARS-CoV-2 are occasionally infecting pets, farm and zoo animals. Until now, however, it was thought outbreaks in animals had either died out or been eliminated. For instance, in November 2020 Denmark culled millions of mink after the virus began spreading in farmed mink, and these mink then infected a few farm workers. Now Suresh Kuchipudi and Vivek Kapur at Pennsylvania State University and colleagues have found an astonishingly high rate of infection among white-tailed deer in Iowa. The team has been testing 5000 samples taken up to January 2021. After a third of PCR tests on the first 300 samples came up positive, the team decided to make their results public. “This is the first evidence of any free-living wild animal species having widespread SARS-CoV-2 infection,” says Kuchipudi. The team think what they have found is the tip of the iceberg. It is likely the coronavirus is common in white-tailed deer across North America, they say. It is likely to keep circulating indefinitely because deer populations have a high turnover rate. The extent of the problem may have gone unnoticed for so long because white-tailed deer show few symptoms when infected. And it is quite possible that SARS-CoV-2 is also spreading unnoticed in other wild species elsewhere in the world, Kapur says. “The search for wildlife reservoirs has not been as much or as comprehensive as one might have hoped,” he says. 
“We could have these effectively silent epidemics and, who knows, pandemics going on among wild species that we are completely unaware of.”
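For a sense of what "a third of the first 300 PCR tests positive" implies about the wider sampled population, a standard Wilson score interval can be computed (our back-of-envelope illustration, not an analysis from the study; it treats the samples as a simple random draw, which field sampling rarely is):

```python
import math

def wilson_interval(positives, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = positives / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - margin, center + margin

# Roughly a third of the first 300 Iowa deer samples tested positive.
lo, hi = wilson_interval(100, 300)
print(f"observed prevalence 33.3%, 95% CI roughly {lo:.1%} to {hi:.1%}")
```

Even under these idealised assumptions, the interval stays far above zero, which is why 100 positives out of 300 was striking enough for the team to publish early.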

11-11-21 Using tools helps you understand language and vice versa
Language and tool use seem to be governed by the same brain region, suggests a study involving an fMRI scanner. Practising a tool-using task helps people do better in a test of complex language understanding – and the benefits go the other way too. The crossover may happen because some of the same parts of the brain are involved in tool use and language, says Claudio Brozzoli at the National Institute of Health and Medical Research in Lyon, France. One idea is that language evolved by co-opting some of the brain networks involved in tool use. Both abilities involve sequences of precise physical movements – whether of the hands or of the lips, jaws, tongue and voice box – which must be done in the right order to be effective. Brozzoli’s team asked volunteers to lie in a brain scanner while carrying out tasks involving either tool use or understanding complex sentences. The tool-based task involved placing small, key-shaped pegs into a tray of holes using a pair of long pliers, while viewing the hands through an arrangement of mirrors. The language test involved understanding sentences such as: “The writer that the poet admires writes the paper.” During both tasks, the fMRI scanner showed higher activity in small structures deep in the brain called the basal ganglia. The pattern of activity was similar during both tasks. To see how the two activities affect each other, new groups of volunteers were asked to do tool-use and language tasks, sometimes with similar but less complex tasks switched in for comparison. For instance, to see how tool use influenced language comprehension, 52 people did two complex language tasks. In between the two language tasks, half the group did the pegs and pliers task, and their scores on the second language task were, on average, about 30 per cent higher than on the first language task.

11-11-21 Paralysed mice walk again after gel is injected into spinal cord
A self-assembling gel that stimulates nerve regeneration has shown promise as a treatment for paralysis in mice. A self-assembling gel injected at the site of spinal cord injuries in paralysed mice has enabled them to walk again after four weeks. The gel mimics the matrix that is normally found around cells, providing a scaffold that helps cells to grow. It also provides signals that stimulate nerve regeneration. Samuel Stupp at Northwestern University in Chicago and his colleagues created a material made of protein units, called monomers, that self-assemble into long chains, called supramolecular fibrils, in water. When they were injected into the spinal cords of mice that were paralysed in the hind legs, these fibrils formed a gel at the injury site. The researchers injected 76 paralysed mice with either the fibrils or a sham treatment made of salt solution, a day after the initial injury. They found that the gel enabled paralysed mice to walk by four weeks after the injection, whereas mice given the placebo didn’t regain the ability to walk. The team found that the gel helped regenerate the severed ends of neurons and reduced the amount of scar tissue at the injury site, which usually forms a barrier to regeneration. The gel also enhanced blood vessel growth, which provided more nutrients to the spinal cord cells. “The extent of functional recovery and solid biological evidence of repair we observed using a model that truly emulates the severe human injury makes the therapy superior to other approaches,” says Stupp. Other experimental treatments being developed for paralysis use stem cells, genes or proteins and have questionable safety and effectiveness, says Stupp. The walking ability of mice was assessed in two ways. First, the mice were given an overall score to represent their ankle movement, body stability, paw placement and steps. Mice treated with the gel had a score three times higher than sham-treated mice.

11-11-21 University of Oxford starts new Ebola vaccine trials
Clinical trials have begun for a new Ebola vaccine developed by the University of Oxford. The jab has been designed to tackle the Zaire and Sudan types of Ebola, which together have caused nearly all Ebola outbreaks and deaths worldwide. The University of Oxford has launched phase one of its trials, testing the vaccine in human volunteers. Ebola vaccines exist for the Zaire species but Oxford researchers hope the new jab will have a wider reach. Teresa Lambe, lead scientific investigator at the University of Oxford, said: "Sporadic Ebolavirus outbreaks still occur in affected countries, putting the lives of individuals, especially frontline health workers, at risk. We need more vaccines to tackle this devastating disease." There are four species of Ebola virus that have been known to cause disease in humans. Of these, Zaire is the most lethal, causing death in 70% to 90% of cases if left untreated. The new vaccine developed by Oxford scientists is based on a weakened version of a common cold virus that has been genetically modified so that it is impossible for it to replicate in humans. This method has already been used successfully in the Oxford-AstraZeneca Covid-19 vaccine. Phase one of the trials will see 26 people aged 18 to 55 receive one dose of the ChAdOx1 biEBOV Ebola vaccine at the university. They will then be monitored over a six-month period, with results expected in the second quarter of 2022.

11-11-21 ‘Ghosting’ in casual relationships linked to some personality types
Ceasing contact with a partner abruptly is considered more acceptable by people with the personality traits of Machiavellianism, narcissism and psychopathy, at least in short-term relationships. Ghosting, or breaking up with someone by stopping contact without warning, is considered more acceptable in short-term relationships, and may be linked with certain personality types, a study suggests. When someone ends a relationship by abruptly stopping answering phone calls and messages, it can be very painful for their ex-partner, even when the relationship was short-lived. But according to Peter Jonason at the University of Padua in Italy, such a strategy may seem rational to people who have higher scores for the so-called dark triad of personality traits: Machiavellianism, being manipulative and cynical; narcissism, being self-centred or unempathetic; and psychopathy, being socially callous and antagonistic. “This kind of cold and detached form of break up – one that doesn’t take anyone else’s feelings into consideration – is an easily reasoned outcome of the way in which these people’s brains work,” he says. “They prefer to just kind of bail.” Jonason and his colleagues asked 341 volunteers to complete a 27-point questionnaire that scored them on their dark triad traits. Participants were asked how much they agree with statements such as “many group activities tend to be dull without me”, “you should wait for the right time to get back at people” and “I’ll say anything to get what I want”. Volunteers were aged 18 to 72 and were 76 per cent female, 42 per cent undergraduate students, and 72 per cent white, the rest being primarily African American. The team then asked the participants to rank how acceptable ghosting is in different situations on a 10-point scale, and say if they had ever ghosted anyone in the past.

11-11-21 Our Human Story newsletter: The patterns of domestication
Why did ancient humans begin to domesticate animals? Plus, a new hominin species has been named – but it may not stick. Hello, and welcome to Our Human Story, New Scientist’s monthly newsletter all about human evolution and the origin of our species. To receive this free monthly newsletter in your inbox, sign up here. This month, prompted by the arrival of our family’s new kitten Peggy, I’m gently pawing at humanity’s relationship with animals. In recent years, we’ve learned a lot about when and where different species were domesticated – but to me this just raises even more questions. It’s a truism that humans have exerted an outsized influence on the natural world. We have domesticated dozens of animals and plants. There are the familiar examples like cats, chickens and maize, but also many that aren’t so familiar in the Western world, like the dozens of crops domesticated by farmers (if that is exactly the right word) in the Amazon rainforest over millennia. As with many aspects of prehistory, the more we learn, the older domestication looks. Until relatively recently, it was thought that every domestication took place within the past 11,000 years. This period is known as the Holocene, when the climate has been relatively stable and when some humans took up habits like sedentary farming, urban living and writing. But one domestication preceded it: dogs. We still haven’t pinned down when and where this happened, but dogs were being buried alongside people as if they were pets at least around 14,000 years ago, and they may have split from wolves up to 40,000 years ago. There was possibly more than one domestication event, with only some leaving living descendants. But what’s clear is that it was pre-Holocene and before the advent of permanent settled farming. It may have begun with a form of cooperative hunting. Set against this are the many clear examples of domestication during the Holocene. 
For example, I recently wrote about a massive genetic study of horses, which showed that modern domestic horses are descended from a population that lived in what is now Russia, around the Volga and Don rivers, about 4200 years ago. The domestication may have begun a little earlier, but only by a few centuries. How can we explain why domestication happened so late?

11-11-21 Chan Chan: Mass grave found in ancient Peruvian city
Archaeologists in Peru have uncovered the remains of 25 people in the ancient city of Chan Chan. The skeletons were found in a small space measuring 10 sq m in what was once the capital of the Chimú empire. The Chimú ruled parts of present-day Peru. Their empire reached its height in the 15th Century before their defeat by the Incas. Chan Chan, where the mass grave was found, was the largest mud citadel in pre-Columbian America. Experts think that the mass grave may have been a burial place where members of the Chimú elite were laid to rest. Archaeologist Sinthya Cueva told the Reuters news agency that most of the remains belonged to young women. "None of them are over 30 years old." While it is known that the Chimú carried out human sacrifices, including of children, archaeologist Jorge Meneses Bartra said that there was no evidence those in the newly discovered grave died that way. Scientists will carry out tests to try to determine their cause of death. Mr Meneses said that the position of one of the skeletons suggested that it had been buried there shortly after the person's death, while other bones appeared bleached by the elements and were jumbled together - suggesting that they had been moved to the grave site later. According to Ms Cueva, the latter shows that the Chimú handled and moved the remains of their loved ones. The bodies were wrapped in a sitting position in several layers of fabric, the first of cotton and the second of vegetable matter. Almost 50 pieces of ceramics were also found in the grave, local media reported.

11-11-21 New species of UK dinosaur was 8 metres long with a bulbous nose
The Isle of Wight in the UK was once home to an ecosystem rich in dinosaurs, including Brighstoneus simmondsi, an 8-metre-long relative of Iguanodon. Not all new dinosaur species are fresh finds from the field. The latest to step onto the palaeontological scene was originally excavated in 1978 on the Isle of Wight in the UK, but was confused with its more famous relative Iguanodon for decades until researchers took a second look. The roughly 128 million-year-old dinosaur was found as an accumulation of bones including some from the skull, spine and hips. All that material was enough for Jeremy Lockwood at the University of Portsmouth, UK, and his colleagues to name the dinosaur Brighstoneus simmondsi, honouring the nearby village of Brighstone and the dinosaur’s discoverer, Keith Simmonds. In the grand span of the dinosaur family tree, the approximately 8-metre-long Brighstoneus is an iguanodont related to the thumb-spiked Iguanodon itself. In fact, the family resemblance partially obscured the dinosaur’s identity. Various iguanodont bones have been attributed to Iguanodon, but only recently have palaeontologists begun to realise that many of these animals were physically distinct and lived at different times. Some of the most salient features that mark Brighstoneus as a new species are in the dinosaur’s skull. “An obvious difference that distinguishes Brighstoneus is the bulbous nose,” says Lockwood, as well as a greater number of teeth in its lower jaw. The dinosaur lived about 4 million years earlier than a related animal, called Mantellisaurus, that has also been found in the same area. “Palaeontology definitely went through a lumping period, where everything vaguely similar to Iguanodon got assigned to that dinosaur,” says Karen Poole at the New York Institute of Technology. Brighstoneus, however, truly does appear to be new and indicates that there were many more iguanodonts than previously known. 
“This has prompted a major review of existing material,” says Lockwood, and early indications are that there are even more species awaiting discovery in museum drawers.

11-10-21 Covid-resistant people point way to universal coronavirus vaccine
Many groups worldwide are trying to develop vaccines that protect against a wide range of coronaviruses and prevent another pandemic. These efforts have now been boosted by the discovery that some healthcare workers had pre-existing immunity to the SARS-CoV-2 virus during the first wave of the pandemic. During the first half of 2020, around 700 healthcare workers in the UK were tested weekly as part of a crowdfunded study called COVIDsortium. Most of these people, who wore protective equipment, never tested positive for covid-19 in PCR tests or developed covid-19 antibodies – proteins that bind to the outside of viruses, preventing cells from being infected. However, when Leo Swadling and Mala Maini at University College London and their colleagues looked more closely, they found some of those who tested negative had a protein in their blood that is linked to covid-19 infection, as well as T cell responses to the SARS-CoV-2 virus. T cells are part of the immune system. It appears these people had what Swadling calls an “abortive infection”, where a strong, early T cell response enabled them to get rid of the virus very quickly. Cells infected by viruses sound the alarm by displaying viral proteins on their surface, and T cells are the immune cells that learn to recognise these proteins and destroy infected cells. Crucially, while antibodies can only target proteins on the outside of a virus, T cells can learn to recognise any viral proteins. When the team looked at early blood samples from the people who had an abortive infection, they found that even before being exposed to SARS-CoV-2, they had some T cells that could recognise the proteins that this virus uses to replicate itself inside infected cells. The most likely explanation is that these people were often exposed to the existing human coronaviruses that cause around 10 per cent of colds, says Maini. 
“We don’t know the historic infections of these individuals, so we don’t know for sure where the T cells are coming from,” she says.

11-10-21 How a universal flu vaccine could prevent the next pandemic
Even now, the risk of a deadly flu pandemic is an ever-present danger. But progress on a universal vaccine could protect us and remove the need for annual jabs. FED up with this whole pandemic business? Unfortunately, another one could start any day. In particular, a flu pandemic remains an ever-present threat. “It’s not a question of if, but when,” says Peter Palese at the Icahn School of Medicine at Mount Sinai, New York. That is why he and many others are trying to develop a vaccine to tackle the next flu pandemic before it even starts – a so-called universal flu vaccine that protects against all flu viruses. “The advances in technology have been extraordinary,” says Palese. “We know a lot about the structure [of flu viruses], we know a lot about the immunology, which we didn’t know five years ago. I think we are poised to have a universal flu vaccine.” There is no doubt about the seriousness of the threat. Seasonal flu viruses already in circulation kill around 400,000 people worldwide every year, despite an annual campaign to vaccinate against them. And when animal flu viruses jump to people, they can be far more deadly. A bird flu virus was the cause of the 1918 H1N1 flu, which may have killed 1 in 20 people alive at the time. There have been four other flu pandemics since then, though fortunately all were much less lethal. But the ever-growing number of farmed birds and pigs provides a breeding ground for flu viruses and opportunities for them to infect humans, meaning the danger hasn’t gone away. Already this year, there have been 25 human cases of H5N6 bird flu in Asia. Between 2014 and 2020, there were just 26 cases in total. Half of those known to be infected have died. With every person infected, there is a risk of the virus mutating and acquiring the ability to spread readily from person to person, sparking a pandemic.

11-10-21 The surprising upsides of the prions behind horrifying brain diseases
Infectious proteins called prions that turn brains to sponge have been implicated in some horrible diseases, but it turns out that we couldn't survive without them. I was at my laboratory bench one morning in 1980 when a colleague walked in and declared that he had identified the cause of scrapie, a mysterious and fatal infection that leaves tiny holes in the brains of sheep and goats. Stanley Prusiner had been studying the disease for some years and was stirring up controversy with his outlandish claim that the scrapie agent lacked genes or indeed any genetic material. It was, he said, an infectious protein – something never heard of before. His issue that morning was what to call this unique protein. He had two candidates: “piaf” and “prion”. I have forgotten what piaf stood for, but I remember pointing out that the name was already taken by a popular French singer. Fine, he said, in any case he preferred prion, a contraction of protein and infection. I agreed. What I didn’t say was that in my native French tongue prions means “let us pray” – and that if he persisted with his idea of infectious proteins, he would need prayers. Prusiner held strong in the face of adversity and, in 1997, won a Nobel prize for his discovery. By then, prions had been linked to Creutzfeldt-Jakob disease (CJD) in humans and to bovine spongiform encephalopathy or “mad cow disease”. There were also suggestions that they were involved in common neurodegenerative diseases including Alzheimer’s. What nobody predicted was the existence of “good” prions. We now know that prions emerged early in the evolution of life and play essential biological roles, from giving yeasts the ability to rapidly adapt to allowing you to form long-term memories.

11-10-21 Psilocybin therapy steps closer to credibility with largest trial yet
Promising results from a psilocybin trial suggest that psychedelic therapies for depression could help some – but not all – people who don't respond to conventional antidepressants. Is psychedelic medicine finally ready to live up to the hype? Yesterday, promising results were announced from the largest clinical trial of psilocybin for depression to date. They suggest that, while psilocybin therapy is far from a panacea, it can help some people for whom current medicines are ineffective. The study was led by Compass Pathways, a UK-based company that holds patents for two synthetic formulations of psilocybin, the active ingredient in magic mushrooms. It involved 233 people with treatment-resistant depression, meaning to be eligible they had to have tried two other treatments without success. The participants were randomly assigned one of three doses: 1, 10 or 25 milligrams. The 1 milligram dose is considered so small that it is effectively a placebo, but it meant all participants knew they would get psilocybin, creating an expectation of some benefit. All were given psychological support before, during and after a single dosing session. In the 25 milligram group, 36.7 per cent of patients had improved depression severity scores three weeks after dosing, and 24.1 per cent were still responding after 12 weeks. These numbers may seem low: in another trial published earlier this year, 70 per cent of patients who received psilocybin therapy showed a response at six weeks. But that study was smaller and involved two doses of the drug. “There is a general trend in science that the first small studies have huge effect sizes, and as you study more, they get less and less. The hope that we all have is that it doesn’t disappear,” says Allan Young at King’s College London, who worked on the Compass study. 
“We need to do a lot more work looking at the duration of the effect and see how it pans out in the clinic, but the fact that a group has a persistent benefit to 12 weeks, to my mind, is really heartening.”

11-10-21 Medicine must stop using race and ethnicity to interpret test results
SHOULD your race or ethnicity influence the prescription you get from your doctor? Both are still used in medicine to interpret test results and guide treatment decisions, but the evidence is questionable and the approach can cause serious harm. Medical guidelines in the US, UK and elsewhere often recommend the use of algorithms that contain adjustments for a person’s race or ethnicity, from tools used to assess bone fracture risk to devices containing embedded racial or ethnic adjustments for measuring lung function. The latter can be partly traced back to the suggestion by US slaveholder Samuel Cartwright in the 1800s that Black people had naturally low lung capacity and so were healthier when enslaved. These algorithms are finally coming under significant scrutiny. Recently, the US National Kidney Foundation and the American Society of Nephrology formally established a consensus against the use of race adjustment in kidney function equations. A similar race-based kidney test adjustment was also removed from UK medical guidance set by the National Institute for Health and Care Excellence (NICE). These decisions came in response to growing concerns that the race adjustment was contributing to underdiagnosis and undertreatment of kidney disease among Black people. Yet race-based decisions are still permeating other parts of medicine with little evidence to support them. NICE, for example, has declined to review its guidance on high blood pressure treatment that recommends different drugs for Black people compared with everyone else. The guidance currently says that doctors should prescribe drugs called ACE-inhibitors to people under the age of 55 with high blood pressure – unless they are of “black African or African-Caribbean family origin”, in which case they should receive different drugs.
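The kidney test at the centre of this debate estimates glomerular filtration rate (eGFR) from a blood creatinine measurement, and older versions of the equation multiplied the result by a fixed coefficient if the patient was recorded as Black. As a rough illustration of how a single race coefficient shifts clinical results – a sketch using the published 2009 CKD-EPI coefficients, with a function name of our own choosing, not code from any guideline body – consider:

```python
def egfr_ckd_epi_2009(creatinine_mg_dl, age, female, black):
    """2009 CKD-EPI eGFR estimate (mL/min/1.73 m^2), including the
    race coefficient that has since been removed from US and UK guidance."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    scr_k = creatinine_mg_dl / kappa
    egfr = (141
            * min(scr_k, 1.0) ** alpha
            * max(scr_k, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # race coefficient: inflates estimated function by ~16%
    return egfr

# Identical blood results yield a ~16 per cent higher estimate of kidney
# function for a patient recorded as Black, which can delay diagnosis,
# referral and transplant listing.
print(egfr_ckd_epi_2009(1.0, 50, False, False))
print(egfr_ckd_epi_2009(1.0, 50, False, True))
```

The newer 2021 CKD-EPI equation, endorsed by the bodies mentioned above, simply drops the final multiplier so the same labs give the same answer for everyone.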

11-10-21 Origins of Japanese and Turkish language family traced back 9000 years
Millet farmers living 9000 years ago in what is now north-east China may have spoken a proto-Transeurasian language that gave rise to Japanese, Turkish and other modern tongues. A vast Transeurasian language family that contains the Japanese, Korean, Mongolian, Turkish and Tungusic languages has had its origins traced back 9000 years, to early farming communities in what is now north-east China. Transeurasian languages are spoken across a wide region of Europe and northern Asia. Until now, researchers assumed that they had spread from the mountains of Mongolia 3000 years ago, spoken by horse-riding nomads who kept livestock but didn’t farm crops. Martine Robbeets at the Max Planck Institute for the Science of Human History in Jena and her colleagues used linguistic, archaeological and genetic evidence to conclude instead that it was the onset of millet cultivation by farmers in what is now China that led to the spread of the language family. The team did this by studying the linguistic features of the languages and using computational analysis to map their spread through space and time based on their similarities to each other. Doing so allowed Robbeets and her team to trace the proto-Transeurasian language back to the Liao river area of north-east China around 9000 years ago. This is the exact time and place that millet is known to have been domesticated, according to archaeological evidence, says Robbeets. By adding genetic information and carbon-dating millet grains, the team revealed that the proto-Transeurasian-speaking population split into separate communities that then started adopting early forms of Japanese, Korean and the Tungusic languages to the east of the original site, as well as early forms of Mongolic languages to the north and of Turkic languages to the west.

11-9-21 Lost capital city of the Mongol Empire was far bigger than thought
The city, built by the son of Genghis Khan, was once thought to be about one-tenth as big as it actually was. The capital of the Mongol Empire has been mapped in unprecedented detail. It turns out that the city of Karakorum was far larger than once thought and was quite unlike medieval European cities in its layout. In the late 1100s and early 1200s, the Mongol leader Temüjin established a vast empire spanning much of Asia and Europe. Temüjin became known as Chinggis Khan, and is also remembered as Genghis Khan in many nations today. After his death in 1227, his son Ögödei became the new Khan. He established Karakorum as the capital of the empire on a site that his father had used as a camp. However, the Mongol Empire only lasted a few decades before fracturing into smaller states and eventually dissolving. As a result, by the early 1400s, Karakorum was largely abandoned. It was never forgotten, but for centuries its location was lost, until it was rediscovered in 1889 by Russian explorer Nikolai Yadrintsev. The city was mapped a few years later and further expeditions have since added data, but its overall scale remained unclear. Jan Bemmann at the University of Bonn in Germany and his colleagues mapped the city using a superconducting quantum interference device (SQUID), which senses disturbances in the magnetic field caused by underground structures. They also explored the area on foot. The team concludes that Karakorum’s size has been underestimated. There is a central walled city with an area of 133 hectares that we already knew about, but there were also many buildings and structures outside the walls. This is unlike medieval European cities, which were confined within walls. The researchers roughly estimate that this outer region spans 1180 hectares, but they say that number is “highly speculative”.

11-9-21 Regular 10pm bedtime linked to lower heart risk
There appears to be an optimal bedtime - between 10pm and 11pm - linked to better heart health, say researchers who have studied 88,000 volunteers. The team behind the UK Biobank work believe synchronising sleep to match our internal body clock may explain the association found with a reduced risk of heart attacks and strokes. The body's natural 24-hour rhythm is important for wellbeing and alertness. It can also impact things like blood pressure. For the study, which is published in the European Heart Journal, the researchers collected data on sleep and wake times over seven days using a wristwatch-like device worn by the volunteers. And they followed up what happened to the volunteers in terms of heart and circulatory health over an average of six years. Just over 3,000 of the adults developed cardiovascular disease. Many of these cases occurred in people who went to bed later or earlier than the "ideal" 10pm to 11pm. The link persisted after adjustment for sleep duration and sleep irregularity. The researchers tried to control for other factors known to affect a person's heart risk, such as their age, weight and cholesterol levels, but stress their study cannot prove cause and effect. Study author Dr David Plans, from the University of Exeter, said: "While we cannot conclude causation from our study, the results suggest that early or late bedtimes may be more likely to disrupt the body clock, with adverse consequences for cardiovascular health. "The riskiest time was after midnight, potentially because it may reduce the likelihood of seeing morning light, which resets the body clock." Regina Giblin, senior cardiac nurse at the British Heart Foundation, said: "This large study suggests that going to sleep between 10 and 11pm could be the sweet spot for most people to keep their heart healthy long-term. "However, it's important to remember that this study can only show an association and can't prove cause and effect. 
"More research is needed into sleep timing and duration as a risk factor for heart and circulatory diseases. Getting enough sleep is important for our general wellbeing as well as our heart and circulatory health, and most adults should aim for seven to nine hours of sleep per night," she said.

11-9-21 ‘The Dawn of Everything’ rewrites 40,000 years of human history
A new book recasts social evolution as surprisingly varied. Concerns abound about what’s gone wrong in modern societies. Many scholars explain growing gaps between the haves and the have-nots as partly a by-product of living in dense, urban populations. The bigger the crowd, from this perspective, the more we need power brokers to run the show. Societies have scaled up for thousands of years, which has magnified the distance between the wealthy and those left wanting. In The Dawn of Everything, anthropologist David Graeber and archaeologist David Wengrow challenge the assumption that bigger societies inevitably produce a range of inequalities. Using examples from past societies, the pair also rejects the popular idea that social evolution occurred in stages. Such stages, according to conventional wisdom, began with humans living in small hunter-gatherer bands where everyone was on equal footing. Then an agricultural revolution about 12,000 years ago fueled population growth and the emergence of tribes, then chiefdoms and eventually bureaucratic states. Or perhaps murderous alpha males dominated ancient hunter-gatherer groups. If so, early states may have represented attempts to corral our selfish, violent natures. Neither scenario makes sense to Graeber and Wengrow. Their research synthesis — which extends for 526 pages — paints a more hopeful picture of social life over the last 30,000 to 40,000 years. For most of that time, the authors argue, humans have tactically alternated between small and large social setups. Some social systems featured ruling elites, working stiffs and enslaved people. Others emphasized decentralized, collective decision making. Some were run by men, others by women. The big question — one the authors can’t yet answer — is why, after tens of thousands of years of social flexibility, many people today can’t conceive of how society might effectively be reorganized.

11-9-21 ‘Penis worms’ may have been the original hermits
Soft-bodied critters inhabited abandoned shells about 500 million years ago, researchers say. Hermit crabs have been taking shelter in abandoned shells for millions of years, but scientists now have evidence suggesting that the “hermit” lifestyle has existed far longer than that. Besides hermit crabs, a few modern-day species of crustaceans and worms inhabit the cast-off shells of other marine creatures, mostly for protection against predators, says Martin Smith, a paleontologist at Durham University in England. Until recently, the oldest known fossils suggesting hermiting behavior were about 170 million years old, he says. Now, Smith and his colleagues say that they have unearthed fossils of hermiting creatures almost three times that age, from a geologic period dubbed the Cambrian. Remains of the ancient squatters were preserved in rocks laid down as seafloor sediments about 500 million years ago in what is now southern China. The cone-shaped shells that seem to hold the occupants probably had belonged to hyoliths, a once-common group of ancient marine invertebrates that died out more than 250 million years ago (SN: 1/11/17). The marine creatures that then took shelter in those vacant shells, the researchers say, belong to a group called priapulid worms — commonly known as penis worms, thanks to their suggestive body shape. The Chinese rocks contain dozens of empty shells, Smith says. But four of those shells appear to have been inhabited by penis worms, he and his colleagues report November 8 in Current Biology. Because there were no free-ranging priapulids preserved in the ancient sediments, the researchers propose that the worms were living inside the shells. A relatively consistent ratio between the size of a worm and the shell it was preserved within suggests that the animals picked a shell based on its size and then moved to another when they outgrew their adopted home, Smith says. 
Modern-day hermit crabs use the same strategy, though none of the 20 species of penis worms around today have this hermiting behavior.

11-8-21 Penis worms had hermit crab-like defence system 530 million years ago
More than half a billion years ago, tiny penis worms had learned to protect themselves by grabbing and living inside snail-like shells, like hermit crabs do today. Penis worms started living like hermit crabs hundreds of millions of years before hermit crabs even existed, suggesting that the world’s earliest animal ecosystems were more ecologically sophisticated than previously thought. Priapulids are tiny, toothed sea worms that are sometimes carnivorous. Colloquially named “penis worms” for their phallus-like shape, they apparently started inhabiting empty cone-shaped seashells 530 million years ago. This implies that the animals were protecting themselves from predators in surprisingly advanced ways for the time, says Martin Smith at Durham University, UK. Previous research has suggested that true multicellular animals evolved rather abruptly through massive evolutionary developments and life form diversification in an event known as the Cambrian explosion, roughly half a billion years ago. While the evolution and diversification of predators certainly helped fuel this event, scientists have generally believed that life forms didn’t really begin to become as ecologically and behaviourally complex as animals today until several hundred million years later, says Smith. His group’s finding challenges that belief, he says. “Grabbing a shell… takes a level of behavioural complexity to say, ‘Well, I need to find a shell that I fit in’,” says Smith. “And it requires a reasonably sophisticated neural processing level, which isn’t something we’ve associated at all with [these] worms that just slime around on the sea floor, [nor] with the Cambrian.” “But fossil records keep throwing us these curve balls, and making us think, ‘Whoa, OK, this was even more of an explosion than we thought’,” he says.

11-5-21 Covid-19 vaccine tested with suction technique similar to cupping
Studies in rats suggest a device that applies suction to the skin may make cells take up more vaccine particles and enhance the immune response. A device that creates suction against the skin, in a similar manner to the alternative medicine technique of cupping, is being investigated as a new type of covid-19 vaccine delivery method. The suction device is being used in human trials of a DNA-based experimental vaccine against the coronavirus. Work in rats has now found that the approach enhances the immune response. Cupping uses heated cups placed on the skin to create a partial vacuum next to the body as the air inside the cups cools down. It is used in several types of alternative therapies, such as traditional Chinese medicine, for purposes such as reducing pain and inflammation, although there is no good evidence that it works. When it comes to vaccines, however, suction against the skin seems to make cells of the dermis take up more vaccine particles. The suction device is being used in trials of a covid-19 vaccine made by South Korean biotech firm GeneOne Life Science. The vaccine is based on a small circle of DNA called a plasmid, which encodes the coronavirus spike protein. First, the vaccine is injected into the skin of the arm as normal. Then, the suction machine, which has a 6-millimetre orifice, is applied at the injection site for 30 seconds. It isn’t painful and leaves no mark, says Hao Lin at Rutgers University in New Jersey, who has tried the device on himself. Studies on rats, published today, show that using the suction boosted the amount of antibodies made by the animals 100-fold. This may happen because stretching and then relaxing the skin cells encourages their cell membranes to pull inwards, taking in particles that were previously outside of the cell, says Lin.

11-5-21 Only 45% of US parents give peanuts to infants despite latest advice
Parents have long avoided giving children peanuts early in their life for fear of making them allergic, but updated US advice published in 2017 recommends early introduction. Less than half of US children are given peanuts to eat in the first 11 months of their lives, even though their early introduction lowers the risk of developing peanut allergies. Ruchi Gupta at Northwestern University in Illinois and her colleagues surveyed more than 3000 households with children who were 7 months to 3.5 years old. Participants were demographically representative of the US caregiver population. “High allergy risk children who eat peanuts early in life are five times less likely to develop a peanut allergy,” says Carina Venter, a co-author of the study, at the University of Colorado in Denver. That finding in a 2015 study led to a swift change in US guidelines regarding the early introduction of peanuts. Parents had long avoided giving their children peanuts for fear it may make them allergic, and official guidance on the topic was unclear. But in 2017, the US National Institute of Allergy and Infectious Diseases recommended that high allergy risk children in the US be given peanuts between four and six months of age, while other children should also be given peanuts ‘freely’ and early in their lives. However, in the new analysis, the team found that only 58 per cent of participants said their doctors had advised them about the benefits of early peanut introduction. Only 40 per cent of these respondents said that their doctors had told them to introduce peanuts to their child as early as the first 11 months. Just 44.7 per cent of participants reported that they had given their children peanuts that early. Gupta and Venter presented this work at a meeting of the American College of Allergy, Asthma & Immunology on 5 November.

11-5-21 CRISPR-based 'antibiotic' eliminates dangerous bacterium from the gut
Genetically engineered bacteria armed with CRISPR could help combat antibiotic-resistant infections and also allow doctors to edit people's microbiomes. A benign bacterium armed with a designer, CRISPR-based weapon has been used to eliminate a harmful bacterium from the guts of mice while leaving all other microbes unharmed. The approach could give us a new way of tackling antibiotic-resistant infections of the gut and skin, says Sébastien Rodrigue at the University of Sherbrooke in Canada, and also help treat a wide range of diseases by editing the microbiome. Others have shown that this approach works in cells growing in dishes but Rodrigue’s team is the first to get it to work effectively in animals. “And if it works in mice, it should also work in other animals, including people,” he says. CRISPR is best known as a gene-editing tool, but it can also be programmed to kill bacterial cells that have specific bits of DNA inside them. The hard part is that doing this requires getting a CRISPR system inside every single one of the bacterial cells that you want to kill. “The real challenge is the delivery,” says Rodrigue. One way to deliver CRISPR is to exploit circular bits of DNA within bacteria known as conjugative plasmids. These carry genes that make the bacteria pass them on to other bacterial cells via a process called conjugation. Rodrigue’s team tested lots of different conjugative plasmids in a common group of bacteria to find the one that was most effective at transferring itself. The group then evolved it in the lab to make it even more efficient. The team added the genes for a CRISPR system targeting an antibiotic-resistant strain of E. coli, and put the plasmid inside a benign bacterium used as a probiotic. When the CRISPR-armed probiotic bacteria were given to mice, they eliminated 99.9 per cent of the E. coli bacteria in four days.

11-5-21 50 years ago, scientists were on the trail of ‘memory molecules’
Excerpt from the November 6, 1971 issue of Science News. The first memory molecule has been isolated, characterized and synthesized … [from the brains of] rats that had been shocked in the dark…. It is a protein and dubbed “scotophobin,” after the Greek words for “fear of the dark.” [One researcher] has injected synthetic rat scotophobin into the brains of hundreds of goldfish. While the fish indeed exhibited fear of the dark and resisted learning to swim into the dark, the fear was of brief duration. The idea that scotophobin stores memories and can be used to transfer them between organisms was met with intense skepticism and was eventually discredited by neuroscientists. But the search for a physical basis of memory continues. Over the last few decades, other memory molecule candidates have popped up, including a protein called PKM-zeta, which may help with memory retrieval, and even RNA (SN: 6/9/18, p. 9). Still, the dominant theory is that memories are stored in synapses, connections between nerve cells in the brain (SN: 2/3/18, p. 22).

11-5-21 Brainless sponges contain early echoes of a nervous system
Cells crawling around digestive chambers might help coordinate feeding. Brains are like sponges, slurping up new information. But sponges may also be a little bit like brains. Sponges, which are humans’ very distant evolutionary relatives, don’t have nervous systems. But a detailed analysis of sponge cells turns up what might just be an echo of our own brains: cells called neuroids that crawl around the animal’s digestive chambers and send out messages, researchers report in the Nov. 5 Science. The finding not only gives clues about the early evolution of more complicated nervous systems, but also raises many questions, says evolutionary biologist Thibaut Brunet of the Pasteur Institute in Paris, who wasn’t involved in the study. “This is just the beginning,” he says. “There’s a lot more to explore.” The cells were lurking in Spongilla lacustris, a freshwater sponge that grows in lakes in the Northern Hemisphere. “We jokingly call it the Godzilla of sponges” because of the rhyme with Spongilla, says Jacob Musser, an evolutionary biologist in Detlev Arendt’s group at the European Molecular Biology Laboratory in Heidelberg, Germany. Simple as they are, these sponges have a surprising amount of complexity, says Musser, who helped pry the sponges off a metal ferry dock using paint scrapers. “They’re such fascinating creatures.” With sponges procured, Arendt, Musser and colleagues looked for genes active in individual sponge cells, ultimately arriving at a list of 18 distinct kinds of cells, some known and some unknown. Some of these cells used genes that are essential to more evolutionarily sophisticated nerve cells for sending or receiving messages in the form of small blobs of cellular material called vesicles. One such cell, called a neuroid, caught the scientists’ attention. After seeing that this cell was using those genes involved in nerve cell signaling, the researchers took a closer look.
A view through a confocal microscope turned up an unexpected locale for the cells, Musser says. “We realized, ‘My God, they’re in the digestive chambers.’”

11-5-21 A child’s partial skull adds to the mystery of how Homo naledi treated the dead
Found in a narrow opening, the remains stoke the possibility of ancient, deliberate cave disposals. A child’s partial skull found in a remote section of a South African cave system has fueled suspicion that an ancient hominid known as Homo naledi deliberately disposed of its dead in caves. An international team led by paleoanthropologist Lee Berger of the University of the Witwatersrand, Johannesburg, pieced together 28 skull fragments and six teeth from a child’s skull discovered in a narrow opening located about 12 meters from an underground chamber where cave explorers first found H. naledi fossils (SN: 9/10/15). Features of the child’s skull qualify it as H. naledi, a species with an orange-sized brain and skeletal characteristics of both present-day people and Homo species from around 2 million years ago. “The case is building for deliberate, ritualized body disposal in caves by Homo naledi,” Berger said at a November 4 news conference held in Johannesburg. While that argument is controversial, there is no evidence that the child’s skull was washed into the tiny space or dragged there by predators or scavengers (SN: 4/19/16). Berger’s group describes the find in two papers published November 4 in PaleoAnthropology. In one, Juliet Brophy, a paleoanthropologist at Louisiana State University in Baton Rouge, and colleagues describe the youngster’s skull. In the other, paleoanthropologist Marina Elliott of Canada’s Simon Fraser University in Burnaby and colleagues detail new explorations in South Africa’s Rising Star cave system. Researchers nicknamed the new find Leti, short for a word in a local South African language that means “the lost one.” Leti likely dates to the same time as other H. naledi fossils, between 335,000 and 236,000 years ago (SN: 5/9/17). Berger’s team suspects Leti died at about age 4 to 6 years based on the rate at which children grow today. But that’s a rough approximation, as the scientists can’t yet say how fast H. naledi kids grew.

11-4-21 Homo naledi infant skull discovery suggests they buried their dead
The partial skull of a Homo naledi child from around 250,000 years ago has been found in a deep, inaccessible cave – suggesting it was placed there by other H. naledi. The skull of a small child belonging to a different human species has been found deep in a cave system in South Africa. The team that made the discovery has named the child Leti and believes the skull shows that the Homo naledi species buried their dead. Leti’s skull was found in a narrow fissure that is almost impossible to access. For that reason, the team argues that the skull was placed there deliberately, as a form of funerary practice. Presenting their findings at a virtual press conference, the researchers said it is evidence that hominins have been performing funerary rites for hundreds of thousands of years – even hominins with brains much smaller than ours. “We can see no other reason for this small child’s skull being in the extraordinarily difficult position,” said Lee Berger at the University of the Witwatersrand in Johannesburg, South Africa. Berger and his colleagues have been exploring the Rising Star cave system in South Africa for several years. In 2015, they described Homo naledi, a new species of hominin, found in the caves. More than a thousand bones were found strewn over the floor of the system’s Dinaledi Chamber, which could only be reached by expert cavers able to fit through small spaces. H. naledi had some features that resembled modern humans, but in other respects it looked like an older species: in particular, its brain was small. Two years later, the researchers found a remarkably complete H. naledi skeleton in another part of the cave, the Lesedi Chamber. They called the individual Neo. Crucially, the team also managed to narrow down how long ago H. naledi lived. The remains are only about 250,000 years old, meaning
H. naledi existed at the same time as our species and other big-brained hominins like the Neanderthals – yet they retained features from species that lived millions of years earlier.

11-4-21 Brainless sponges have cells that might be the precursors of neurons
Sponges are arguably the simplest animals and they lack a nervous system, but peculiar cells in their digestive chambers may be evolutionary precursors of neurons. Sponges lack anything resembling brains, but they nevertheless may have played a key role in the early evolution of the nervous system. A new study finds that sponges contain cells that have some of the capabilities of neurons – and these may be the evolutionary precursors of true brain cells. “The nervous system came about very early in animals and this transition is completely enigmatic so far,” says Detlev Arendt at the European Molecular Biology Laboratory in Heidelberg, Germany. Most animals have brains, or at least neurons, the cells that are their building blocks. Neurons carry electrical signals along their length and can communicate with each other by releasing chemicals called neurotransmitters, often at specialised junctions known as synapses. However, sponges are the exception. They are one of the oldest animal groups still extant – possibly the very oldest. And they don’t have a nervous system. “Sponges don’t have anything that looks like neurons, synapses or brains,” says co-author Jacob Musser, also at the European Molecular Biology Laboratory. But his team has found they might have precursors of these things. Musser, Arendt and their colleagues studied a freshwater sponge called Spongilla lacustris. They broke apart sponges and tracked individual cells to see which genes were active. This revealed that the sponges were made up of 18 distinct cell types, each with a different pattern of gene activity. The team then stained the different cells to figure out where they were within the body. One cell type stood out. The team calls them “neuroid” because they had long tendrils, resembling those of neurons. They were found in the sponge’s digestive chamber and made contact with many of the other cells within. 
Their gene activity pattern suggested they were secreting signalling chemicals, similar to those that neurons release at synapses to communicate with their neighbours.

11-4-21 Water-absorbing material inspired by plant roots could power robots
Soft robots could one day be powered by a material that absorbs water to become strong and stiff, mimicking the physics of the cells in plant roots. Plants may have no muscles, but they can grow upwards against the strain of gravity and their roots can even shift soil and rocks – because their cells can absorb water to form strong structures. Now an artificial material which mimics this ability could help to create better soft robots and medical implants. Shelby Hutchens and her colleagues at the University of Illinois Urbana-Champaign formed so-called plant tissue analogues (PTAs) by fabricating closed cells from a compound of silicon called polydimethylsiloxane, which is semi-permeable like plant cell walls. Researchers used varying salt levels inside the cells to control how much pure water from outside was absorbed through the polydimethylsiloxane cell walls via osmosis. The higher the salt concentration, the more water was absorbed and the stiffer and larger the cells became. This had to be tuned carefully, as at very high salt concentrations the artificial cells ruptured. The team found that if a layer of this material was bonded to a less expandable substance, the increase in size of the PTA caused it to move and bend into an arched shape as one side expanded and the other remained at its original size. This movement required no electrical power, only a source of moisture, and could be used to power soft robots or medical devices in the future. Previous materials like hydrogels have been shown to exhibit the same expanding behaviour, but they lose stiffness as they swell. In contrast, the PTA was found to strengthen as it took in water. In one experiment, the team showed how this affects the potential application of the materials. They took a strip of PTA and a strip of hydrogel, each bonded to a material that expanded less, and exposed them to water. Both exhibited the same swelling and deformation, curling upwards. 
However, when the experiment was repeated with the addition of a 5-gram weight to the end of each strip, the PTA curled upwards while “holding” the weight, but the hydrogel lacked the internal strength to do so.
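The driving force these artificial cells exploit is osmotic pressure, which for dilute solutions can be estimated with the van 't Hoff relation, Π = icRT (i is the number of dissolved particles per formula unit, c the molar concentration, R the gas constant and T the temperature). As a back-of-envelope illustration of why even modest salt concentrations generate large internal stresses – our own estimate of the general physics, not a calculation from the paper:

```python
R = 8.314   # gas constant, J/(mol*K)
T = 298.0   # room temperature, K

def osmotic_pressure(molarity, ions_per_unit=2):
    """van 't Hoff estimate of osmotic pressure in pascals.
    NaCl dissociates into two ions, so ions_per_unit defaults to 2."""
    conc_mol_m3 = molarity * 1000.0  # convert mol/L to mol/m^3
    return ions_per_unit * conc_mol_m3 * R * T

# A cell holding just 0.1 M NaCl against pure water sustains a pressure
# of roughly half a megapascal - about five atmospheres - which is why
# water-swollen cells can act as stiff structural elements.
print(f"{osmotic_pressure(0.1) / 1e5:.1f} bar")
```

The pressure scales linearly with salt concentration, which matches the article's observation that stiffness is tuned by the salt level inside the cells, and that too high a concentration can rupture the cell walls.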

11-4-21 HPV vaccine cutting cervical cancer by nearly 90%
The human papillomavirus, or HPV, vaccine is cutting cases of cervical cancer by nearly 90%, the first real-world data shows. Cancer Research UK described the findings as "historic", and said it showed the vaccine was saving lives. Nearly all cervical cancers are caused by viruses, and the hope is vaccination could almost eliminate the disease. The researchers said the success meant those who were vaccinated may need far fewer cervical smear tests too. Cervical cancer is the fourth most common cancer in women around the world, killing more than 300,000 each year. Almost nine in 10 deaths are in low and middle income countries where there is little access to cervical cancer screening. The hope is vaccination will have an even bigger impact in those countries than in wealthier nations such as the UK. More than 100 countries have started using the vaccine as part of World Health Organization plans to get close to eliminating cervical cancer. In the UK, girls are offered the vaccine between the ages of 11 and 13, depending on where they live. The vaccine has also been offered to boys since 2019. The HPV vaccine can only prevent an infection; it cannot rid the body of the virus once it has been caught. The viruses are so widespread that immunisation has to be aimed at children before they become sexually active. The study, published in the Lancet, looked at what happened after the vaccine was introduced for girls in England in 2008. Those pupils are now adults in their 20s. The study showed a reduction in both pre-cancerous growths and an 87% reduction in cervical cancer. "The impact has been huge," said Prof Peter Sasieni, one of the researchers at King's College London. The reductions were less dramatic when older teenagers were immunised as part of a catch-up campaign. This is because fewer older teenagers decided to have the jab and they may already have been sexually active.
Overall, the study estimated the HPV programme has prevented about 450 cancers and 17,200 pre-cancers. Prof Sasieni said that was "just the tip of the iceberg" because those vaccinated were still young to be getting cancer, so the numbers would only grow with time.

11-4-21 HPV vaccine cuts cervical cancer by 87 percent in 'historic' UK study
A British initiative to vaccinate teenage girls against the human papillomavirus (HPV) slashed cervical cancer rates by 87 percent when the vaccine was administered at age 12 and 13, 62 percent when offered at age 14 to 16, and 34 percent among women vaccinated at 16 to 18, researchers reported Thursday in the medical journal The Lancet. Cancer Research UK, which funded the study, called the results "historic" and said it proves the HPV vaccine saves lives. Britain's National Health Service began offering the HPV vaccine to girls as young as 11 in 2008, and the new study compares cervical cancer outcomes between vaccinated and unvaccinated women now that the first cohort is in their 20s. Most cervical cancer is caused by one of two HPVs blocked by the vaccine, and immunization is much more effective if administered before teens become sexually active. The Lancet study estimated that by June 2019, the vaccine had prevented 450 cases of cervical cancer in the immunized groups and 17,200 cases of precancerous cervical carcinomas. Cervical cancer is the No. 4 most common cancer in women worldwide and kills 300,000 each year, BBC News reports. Almost 90 percent of those deaths are in low- to middle-income countries, where the vaccine could have the biggest impact. "We've known for many years that HPV vaccination is very effective in preventing particular strains of the virus, but to see the real-life impact of the vaccine has been truly rewarding," said lead author Peter Sasieni, of King's College London. "Assuming most people continue to get the HPV vaccine and go for screening, cervical cancer will become a rare disease."

11-4-21 Tiny region of human brain that helps regulate sleep studied at last
Our sleep cycles are thought to be regulated partly by the suprachiasmatic nucleus, a 2mm-wide structure in the brain that has now been imaged for the first time with a brain scanner. We have taken our closest look at the activity of a tiny brain region thought to be involved in the human circadian clock. Johanna Meijer at Leiden University in the Netherlands and her colleagues have been studying the suprachiasmatic nucleus, which sits in the hypothalamus and is thought to play a role in regulating our sleep cycles. However, because this structure is less than 2 millimetres wide it is difficult to image using an MRI – and that means it is challenging to record its activity. “Due to the low spatial resolution of [most] MRI machines, it is not possible to determine if the [neuronal] signals are coming from the suprachiasmatic nucleus or from another nearby nucleus,” says Meijer. Her team got around this problem using an MRI machine with a particularly powerful 7-tesla magnetic field. This offers sufficiently high resolution to image this tiny part of the brain. There are only about 90 7-tesla MRI scanners in the world. Meijer’s team used one of them to study suprachiasmatic nuclei in the brains of 12 men. She says these nuclei are the smallest brain structures to have been imaged in living people. However, to then study a suprachiasmatic nucleus’s activity pattern, Meijer’s team needed to be able to shine lights of various frequencies into the eyes of the individual inside the MRI scanner and monitor how the nucleus responded. This was a challenge because standard LED lights would be affected by the unusually powerful magnetic field and radio pulses inside a 7-tesla MRI scanner. The researchers custom-built an LED with voltage suppressors and shielding so it could function under such conditions.

11-3-21 A new kind of brain scan is letting us understand how toddlers think
Technological advances mean that we can finally tackle an age-old question: what's going on in the minds of children? THREE-year-old Sophie is sitting at a low table, trying to build a house out of large plastic bricks, as a nearby adult gives gentle encouragement. It could be a scene from any nursery school, but for the incongruous apparatus that Sophie wears: a snugly fitting black cap studded with sensors and sprouting multiple thick, black wires. It looks slightly sinister, but the harmless cap is letting researchers do something that has never been done before: peer inside the brains of active toddlers. Sophie is a participant at the ToddlerLab, a state-of-the-art facility at Birkbeck, University of London that is investigating child development. The wires from the cap she is wearing run from the top of her head into two small recording units tucked into her backpack. This apparatus enables the team to image her brain as she moves around, while a pair of motion-capture gloves and 16 discreet cameras evenly spaced around the ceiling record the movement of each of her fingers down to one-hundredth of a second. Brain imaging has taught us a lot in the past two decades about the structure and function of the brain in sickness and in health, but most approaches have limitations. The standard magnetic resonance imaging (MRI) device is a huge noisy machine that people have to lie inside, quiet and still, for up to an hour at a time. This means MRI can’t be used easily on young children or to study any activities that require moving around – which are significant chunks of human existence. Now the technology being used at the ToddlerLab, called functional near-infrared spectroscopy or fNIRS, is changing that. With equipment small enough to sit inside a lightweight cap, it beams infrared light through the skull, so that it is scattered into nearby receivers, also sited in the cap.

11-3-21 Why Alzheimer’s is not a single disease – and why that matters
Despite decades of research, there’s no consensus on what causes Alzheimer’s. But a new way of thinking is transforming how we study the condition, and could finally deliver effective treatments. FOR the first time in nearly two decades, a new treatment for Alzheimer’s disease was approved by the US Food and Drug Administration in June. But instead of joy and relief, the announcement was largely met with frustration and even anger. Some experts pointed out that the clinical trial that was the basis of approval for Biogen’s drug aducanumab didn’t conclusively show that it reduced cognitive decline. Instead, the FDA based its unprecedented decision on evidence that the drug treats the underlying cause of Alzheimer’s. The trouble is, it is far from clear that the target of this drug – clumps of beta-amyloid protein – is truly the cause. Drugs aimed at beta-amyloid have failed time and time again. Large-scale clinical trials representing billions of dollars of research have shown no positive impact; some experimental medicines even seemed to make cognition worse. With this new drug, there were also concerns about the FDA approval process. Ultimately, the agency’s acting commissioner Janet Woodcock requested an investigation into her own agency’s decision-making. But the controversy isn’t limited to this decision. The big worry is that the continued focus on beta-amyloid is a dangerous distraction, that it may actually obscure the complex nature of the disease and waste precious time. The growing consensus is that there is no single cause of Alzheimer’s, but a complex web of contributing factors. That may not seem like good news, but there is a silver lining: many different factors can provide many paths for treatment.

11-3-21 Treating Alzheimer's as having many causes may help us beat it
FOR nearly three decades, we have waited anxiously for a blockbuster drug that could defeat Alzheimer’s disease. We believed we had identified the culprit behind this debilitating condition: sticky clumps of the protein beta-amyloid in the brain. Even as drug after drug homing in on this target failed to make a difference to symptoms, we continued to pour more money into the effort. Regrettably, it is now becoming clear that this time could have been better spent zooming out from beta-amyloid, to look at the big picture of possible Alzheimer’s causes. Doing so reveals a far more complicated and insidious illness. It seems to be a condition that doesn’t have a lone underlying trigger, but instead results from multiple overlapping processes and risk factors, which you can read about in detail in our cover story. By thinking of Alzheimer’s in the same way as we do multifaceted conditions like heart disease, researchers are now combining knowledge from across disciplines to identify, and tackle, the many known risk factors. “There is a real possibility that we could dismantle Alzheimer’s by a thousand tiny cuts” This new approach comes not a moment too soon, because 10 million new cases of dementia are diagnosed globally each year. The vast majority of these, between 60 and 70 per cent, are Alzheimer’s disease. As people are living longer than ever, the number of people living with dementia is predicted to almost double every 20 years. Accepting that Alzheimer’s is more complicated than we thought might seem disheartening. And yet, targeting the many factors implicated in the disease, including the role of infections, diet, sleep habits and inflammation, puts at least some control back in our own hands, because these are things we can all do something about. It means we don’t have to simply wait for pharmaceutical companies to deliver: we can also cut our own chances of getting dementia.

11-3-21 Exercising more often doesn't increase your risk of knee arthritis
Previous research has found conflicting results on a link between exercise and knee arthritis, but now it seems that the amount of physical activity you do has no impact - though more strenuous workouts might. There seems to be no link between the amount of exercise people do and whether they develop painful osteoarthritis in their knees, according to a large study of activity levels and arthritis pain. But the research couldn’t rule out that high-impact forms of exercise like running bring on the condition. Osteoarthritis is more common as people get older and is sometimes referred to as a “wear and tear” condition. Arthritic knees often have visible damage to their cartilage, a rubbery layer that covers the ends of bones. Previous studies have found conflicting results on whether exercise can make arthritis more likely. So Lucy Gates at the University of Southampton in the UK and her colleagues combined the results of six such investigations, involving over 5000 people who initially had no knee pain or other evidence of arthritis. At the outset, people were asked about how much exercise they did, including playing sports, walking and cycling. They recorded the average time spent exercising each week, and their activities were graded by their metabolic equivalent, or MET, scores, a standard way of classifying activities according to how much they raise a person’s metabolic rate. At the end of the studies, which lasted from five to 12 years, people were also asked if they had developed frequent knee pain or if arthritis had been diagnosed by a scan. The likelihood of developing arthritis didn’t correlate with activity levels, either by how much time people spent exercising each week or by their combined time and MET scores.
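The combined time-and-MET exposure measure described above can be sketched in a few lines of code. This is only an illustration of the general MET-minutes approach, not the study's actual coding scheme, and the MET values below are rough textbook figures chosen for the example:

```python
# Illustrative MET values (metabolic equivalents) for a few activities.
# These are approximate reference figures, not the values used in the study.
MET_VALUES = {"walking": 3.5, "cycling": 6.8, "running": 9.8}

def weekly_met_minutes(activities):
    """Sum MET-minutes per week: each activity's weekly duration in
    minutes weighted by its MET score."""
    return sum(MET_VALUES[name] * minutes for name, minutes in activities)

# e.g. 150 minutes of walking plus 60 minutes of cycling per week
total = weekly_met_minutes([("walking", 150), ("cycling", 60)])
```

Weighting time by MET score is what lets researchers compare, say, a long gentle walk with a short strenuous run on a single scale of energy expenditure.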

11-2-21 Vipers evolved either nose or eye horns depending on their habitat
The horns on vipers’ eyes could help camouflage their head in rocky areas or trees, while those with nose horns may blend in better on forest floors. The small horns that stud the heads of many viper species may play a role in camouflage, suggesting they evolved as a result of the varying environments the snakes inhabit. Theo Busschau and Stéphane Boissinot at New York University Abu Dhabi in the United Arab Emirates studied whether horn evolution in these reptiles could be tied to environmental factors by analysing the physical features of 263 viper species to determine their evolutionary relationships to each other. They also compared horn placement – over the eyes or on the nose – with the vipers’ habitat preferences. Eyebrow horns were associated with vipers that live in trees or open habitats, and nose horns were linked to those living on the forest floor. “The common factor between arboreal habitats and rocky or sparsely vegetated habitats is a lack of cover,” says Busschau. Eyebrow horns “could disrupt the outline of a viper’s head and possibly also conceal the eyes, allowing them to blend in better with their environment”, he says. On forest floors, nose horns could make viper heads harder to spot among leaves and twigs. The horns have independently evolved dozens of times in vipers across the world, suggesting that environmental pressures are pushing vipers to converge on the feature, the researchers argue. Ken Toyama at the University of Toronto in Canada wasn’t too surprised by the findings. Forest habitats, particularly in the tropics, “provide some of the most structurally diverse environments on Earth”, he says. “Complex habitats provide a good opportunity for [visual camouflage] to evolve.” Busschau says he plans to study the vipers’ genomes to find the DNA changes that underpin how the horns keep evolving.

11-1-21 Heart rates synchronise if two people get on well during first date
A study of young heterosexual people on blind dates found that those who instantly felt sparks developed synchronised patterns of heart rates and palm sweating. When people feel instant chemistry with each other on a first date, their hearts start to beat in tune, a new study shows. We often think we know what we are looking for in a partner, but research shows that the people we actually end up falling for often don’t match our ideal preferences. “While someone may seem a perfect match on Tinder, we may feel nothing when we meet the person in real life,” says Eliska Prochazkova at Leiden University in the Netherlands. This may be because attraction isn’t simply based on what someone “looks like on paper”, but also on a gut feeling we get when we are with them, she says. To study what happens at a physiological level when people instantly spark on a first date, Prochazkova and her colleagues set up “dating cabins” at three festivals – one for music, one for arts and one for science – in the Netherlands. They invited 142 single heterosexual males and females aged 18 to 38 to go on 4-minute blind dates in these cabins. The participants wore eye-tracking glasses, heart rate monitors and devices for monitoring the sweatiness of their palms. Some pairs reported becoming more attracted to each other as their dates progressed, while others failed to click. Of all the pairs that were matched up, 17 per cent expressed a mutual wish to go on another date. The pairs that wanted to see each other again and rated each other as attractive tended to be those who developed physiological synchrony. Their heart rates began to speed up and slow down at the same time and their palm sweatiness increased and decreased in tandem. It was common for pairs to also mirror each other’s smiles, laughs, head nods and hand gestures, but this type of synchrony didn’t predict mutual attraction. 
The results largely replicate those that the team found in an earlier version of the study, which they posted to a preprint server in 2019.
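The idea of physiological synchrony, two signals speeding up and slowing down together, can be illustrated with a simple correlation between heart-rate time series. The study itself used more elaborate windowed measures; this minimal sketch, with made-up heart-rate samples, only shows the basic notion of co-varying signals:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Two hearts accelerating and decelerating together give a
# correlation close to 1 (values here are invented for illustration).
hr_a = [62, 65, 70, 68, 64, 61]
hr_b = [60, 64, 69, 67, 63, 60]
r = pearson(hr_a, hr_b)
```

A pair whose heart rates rise and fall in tandem would score near 1; unrelated signals would hover near 0.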

11-1-21 Brain implants boost ability to think flexibly and shake anxiety
Electrically stimulating a region within the centre of the brain helped people with pre-existing brain implants to adapt to changing goals and improve their well-being, pointing to a method for personalised treatment. Electrical brain stimulation in people with pre-existing brain implants has allowed them to think more flexibly and clear anxious thoughts, suggesting it has the potential to treat conditions like depression. Alik Widge at the University of Minnesota and his colleagues found that applying an electric current within the centre of the brain boosted people’s ability to rapidly adapt to changing goals, known as cognitive control, and in some cases improved their feelings of well-being too. The inability to disengage from habitual ways of thinking is commonly seen in people with mood disorders, such as depression and obsessive-compulsive disorder. In these conditions, people are often unable to extricate themselves from thought processes triggered by habits or distress. Widge recruited 21 people who already had electrodes placed in their brains as a treatment for epilepsy. They did not have depression, but some had mood disorders associated with epilepsy. He and his team used these electrodes to provide small bursts of stimulation to the brain while participants performed a cognitive control task, in which they were shown a trio of numbers between one and three on a screen. Two of the numbers were always the same, and their task was to identify the odd number out and press the corresponding key on a keypad. In some tasks, the position of the unique number matched its physical position on the keypad; in other tasks, it didn’t. The task forces participants to use cognitive control to overcome the difference between where the number is on the screen and the keypad. When their brains were stimulated, participants were around 5 per cent faster at answering correctly.
It might not seem like much, says Widge, but you don’t need a lot of change in flexible thinking to help people make small tweaks in their life that can then accumulate over time and help change behaviours. Although the participants couldn’t tell when the stimulation was switched on and off, in trials where it was on some of them reported that their thoughts were more focused and their background anxiety was easier to ignore, suggesting the stimulation may have had an influence.
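The judgement at the heart of the task described above can be sketched as a small function. This is a simplified reconstruction of the odd-one-out logic, not the researchers' actual task code; the conflict between a digit's value and its on-screen position is what demands cognitive control:

```python
def odd_one_out(trio):
    """Return the 0-based position of the unique number in a trio
    in which two of the three numbers always match."""
    a, b, c = trio
    if a == b:
        return 2  # the third number is the odd one out
    if a == c:
        return 1  # the second number is the odd one out
    return 0      # b == c, so the first number is the odd one out

# A congruent trial: the unique digit (3) sits in position 3 on the keypad.
# An incongruent trial like (3, 1, 3) puts the unique digit (1) in
# position 2, so the prepotent spatial response must be overridden.
```

On incongruent trials, the mismatch between the digit's value and its position is what slows responses, and it is this interference cost that stimulation appeared to reduce.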

11-1-21 Are viruses alive, not alive or something in between? And why does it matter?
Villain. Killer. Menace. Since 2020, scientists and public officials have used these words to describe SARS-CoV-2, the virus that causes COVID-19. News articles, research papers and tweets repeatedly personify the virus as a bad guy intent on killing us. Simultaneously, we’re intent on killing it, with handwashing, antiseptic wipes, hand sanitizer, bleach, even robots zapping hospital rooms with ultraviolet light. Yet, according to most scientists, we’ve been working hard to kill something that isn’t alive. Scientists have argued for hundreds of years over how to classify viruses, says Luis Villarreal, professor emeritus at the University of California, Irvine, where he founded the Center for Virus Research. In the 1700s, viruses were believed to be poisons. In the 1800s, they were called biological particles. By the early 1900s, they’d been demoted to inert chemicals. Throughout, viruses have rarely been considered alive. More than 120 definitions of life exist today, and most require metabolism, a set of chemical reactions that produce energy. Viruses do not metabolize. They also don’t fit some other common criteria. They do not have cells. They cannot reproduce independently. Viruses are inert packages of DNA or RNA that cannot replicate without a host cell. A coronavirus, for example, is a nanoscale sphere made up of genes wrapped in a fatty coat and bedecked in spike proteins. Still, viruses have many traits of living things. They are made of the same building blocks. They replicate and evolve. Once inside a cell, viruses engineer their environment to suit their needs — constructing organelles and dictating which genes and proteins the cell makes. Recently discovered giant viruses — which rival the size of some bacteria — have been found to contain genes for proteins used in metabolism, raising the possibility that some viruses might metabolize.
