Blog

  • Cancer’s Devastating Effect on a Brain Circuit Could Destroy Will

    A cruel consequence of advanced cancer is the profound apathy many patients experience as they lose interest in once-cherished activities.

    This symptom is part of a syndrome called cachexia, which affects about 80% of late-stage cancer patients, leading to severe muscle wasting and weight loss that leave patients bone thin despite adequate nutrition.

    This loss of motivation doesn’t just deepen patients’ suffering; it isolates them from family and friends. Because patients struggle to engage with demanding therapies that require effort and persistence, it also strains families and complicates treatment.

    Doctors typically assume that when late-stage cancer patients withdraw from life, it is an inevitable psychological response to physical deterioration. But what if apathy isn’t just a byproduct of physical decline but an integral part of the disease itself?

    In our newly published research, my colleagues and I have discovered something remarkable: Cancer doesn’t simply waste the body – it hijacks a specific brain circuit that controls motivation.

    Our findings, published in the journal Science, challenge decades of assumptions and suggest it might be possible to restore what many cancer patients describe as most devastating to lose – their will to engage with life.

    Untangling fatigue from physical decline

    To unravel the puzzle of apathy in cancer cachexia, we needed to trace the exact path inflammation takes in the body and peer inside a living brain while the disease is progressing – something impossible in people. However, neuroscientists have advanced technologies that make this possible in mice.

    Modern neuroscience equips us with a powerful arsenal of tools to probe how disease changes brain activity in mice. Scientists can map entire brains at the cellular level, track neural activity during behavior, and precisely switch neurons on or off. We used these neuroscience tools in a mouse model of cancer cachexia to study the effects of the disease on the brain and motivation.

    We identified a small brain region called the area postrema that acts as the brain’s inflammation detector. As a tumor grows, it releases cytokines – molecules that trigger inflammation – into the bloodstream. The area postrema lacks the typical blood-brain barrier that keeps out toxins, pathogens and other molecules from the body, allowing it to directly sample circulating inflammatory signals.

    When the area postrema detects a rise in inflammatory molecules, it triggers a neural cascade across multiple brain regions, ultimately suppressing dopamine release in the brain’s motivation center – the nucleus accumbens.

    While commonly misconstrued as a “pleasure chemical,” dopamine is actually associated with drive, or the willingness to put in effort to gain rewards: It tips the internal cost-benefit scale toward action.

    We directly observed this shift using two quantitative tests designed with behavioral economics principles to measure effort. In the first, mice repeatedly poked their noses into a food port, with progressively more pokes required to earn each food pellet.

    In the second task, mice repeatedly crossed a bridge between two water ports, each gradually depleting with use and forcing the mice to switch sides to replenish the supply, similar to picking berries until a bush is empty.

    As cancer progressed, mice still pursued easy rewards but quickly abandoned tasks requiring greater effort. Meanwhile, we watched dopamine levels fall in real time, precisely mirroring the mice’s decreasing willingness to work for rewards.
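
    To build intuition for how a progressive-ratio task quantifies drive, here is a minimal toy model in Python – a hypothetical illustration rather than the study’s actual task or analysis code, with every parameter value invented for the example:

      # Toy model of a progressive-ratio task: each successive reward costs more
      # nose pokes, and the agent works only while the perceived benefit still
      # outweighs the effort. The `motivation` parameter loosely stands in for
      # dopamine tone; all numbers here are hypothetical.

      def progressive_ratio_breakpoint(motivation, reward_value=10.0, poke_increment=2):
          """Return how many rewards an agent earns before quitting."""
          rewards_earned = 0
          while True:
              cost = (rewards_earned + 1) * poke_increment  # pokes needed for next reward
              if motivation * reward_value < cost:          # cost-benefit scale tips to "quit"
                  return rewards_earned
              rewards_earned += 1

      print(progressive_ratio_breakpoint(motivation=1.0))  # higher-drive agent: 5 rewards
      print(progressive_ratio_breakpoint(motivation=0.4))  # low-dopamine agent: 2 rewards

    Lowering the motivation parameter reproduces the qualitative pattern described above: cheap rewards are still collected, but the quitting point arrives much sooner.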

    Our findings suggest that cancer isn’t just generally “wearing out” the brain – it sends targeted inflammatory signals that the brain detects. The brain then responds by rapidly reducing dopamine levels to dial down motivation. This matches what patients describe: “Everything feels too hard.”

    Restoring motivation in late-stage disease

    Perhaps most exciting, we found several ways to restore motivation in mice suffering from cancer cachexia – even when the cancer itself continued progressing.

    First, by genetically switching off the inflammation-sensing neurons in the area postrema, or by directly stimulating neurons to release dopamine, we were able to restore normal motivation in mice.

    Second, we found that giving mice a drug that blocks a particular cytokine – working similarly to existing FDA-approved arthritis treatments – also proved effective. While the drug did not reverse physical wasting, it restored the mice’s willingness to work for rewards.

    While these results are based on mouse models, they suggest a treatment possibility for people: Targeting this specific inflammation-dopamine circuit could improve quality of life for cancer patients, even when the disease remains incurable.

    The boundary between physical and psychological symptoms is an artificially drawn line. Cancer ignores this division, using inflammation to commandeer the very circuits that drive a patient’s will to act. But our findings suggest these messages can be intercepted and the circuits restored.

    Rethinking apathy in disease

    Our discovery has implications far beyond cancer. The inflammatory molecule driving loss of motivation in cancer is also involved in numerous other conditions – from autoimmune disorders such as rheumatoid arthritis to chronic infections and depression.

    This same brain circuit might explain the debilitating apathy that millions of people suffering from various chronic diseases experience.

    Apathy triggered by inflammation may have originally evolved as a protective mechanism. When early humans faced acute infections, dialing down motivation made sense – it conserved energy and directed resources toward recovery.

    But what once helped people survive short-term illnesses turns harmful when inflammation persists chronically, as it does in cancer and other diseases. Rather than aiding survival, prolonged apathy deepens suffering, worsening health outcomes and quality of life.

    While translating these findings into therapies for people requires more research, our discovery reveals a promising target for treatment. By intercepting inflammatory signals or modulating brain circuits, researchers may be able to restore a patient’s drive.

    For patients and families watching motivation slip away, that possibility offers something powerful: hope that even as disease progresses, the essence of who we are might be reclaimed.

  • Scientists Identified a Healthier Way to Cook Broccoli – But There’s a Catch

    In recent years, broccoli has gained a reputation as an excellent vegetable due to its high levels of a particularly beneficial compound called sulforaphane.

    With some studies showing how this compound plays a role in blood sugar control and potentially even has anti-cancer benefits, it’s no wonder that broccoli pills are on the rise.

    However, a previous study showed that eating the whole vegetable gets you more sulforaphane than taking a supplement – so a team of Chinese researchers decided to try and find the best way to cook broccoli.

    They arrived at a clear winner, publishing their results in 2018 in the Journal of Agricultural and Food Chemistry – but it’s a tough sell if you have better things to do with your time.

    There’s a method behind the madness, though. Sulforaphane doesn’t just sit there in the broccoli florets, ready to be consumed. Instead, the vegetable contains several compounds called glucosinolates.

    It also contains the enzyme myrosinase, which plants evolved to defend themselves against herbivores. Through what’s known as ‘myrosinase activity’, the glucosinolates get transformed into sulforaphane, which is what we want.

    To kick myrosinase activity into gear, you need to do damage to the broccoli, so you’d think cooking would do the trick.

    Unfortunately, studies have shown that common broccoli cooking methods, like boiling and microwaving, seriously reduce the amount of glucosinolates in the vegetable – even if you just zap it for a couple minutes. And myrosinase is super-sensitive to heat, too.

    Hence, by far the largest amount of sulforaphane you can get from broccoli is by munching on raw florets. Ugh.

    This got the team of researchers thinking about the results of stir-frying – the single most popular method for preparing vegetables in China.

    “Surprisingly, few methods have reported the sulforaphane concentrations in stir-fried broccoli, and to the best of our knowledge, no report has focused on sulforaphane stability in the stir-frying process,” the researchers note in their study.

    First, they basically pulverized the broccoli, chopping it into 2-millimeter pieces to get as much myrosinase activity going as possible (remember, the activity happens when broccoli is damaged).

    Then, they divided their samples into three groups – one was left raw, one was stir-fried for four minutes straight after chopping, and the third was chopped and then left alone for 90 minutes before being stir-fried for four minutes as well.

    The 90-minute waiting period was to see whether the broccoli would have more time to develop the beneficial compounds before being lightly cooked.

    And that’s exactly what the team found – the broccoli left to ‘develop’ for 90 minutes contained 2.8 times as much sulforaphane as the batch stir-fried straight after chopping.

    “Our results suggest that after cutting broccoli florets into small pieces, they should be left for about 90 minutes before cooking,” the team writes, adding that they didn’t test it but thought “30 minutes would also be helpful”.

    We’re not sure we’re willing to commit to all that effort, though. The team does say they’re looking into ways to reduce the chopping needed, so watch this space – or just eat some raw broccoli.

  • Scientists Have Pinpointed The Best Diets to Boost Healthy Aging

    Old age awaits everyone, but it hits some harder – and earlier – than others.

    The way we age hinges partly on factors beyond our control, like genes or exposure to pollutants. But research suggests we can still influence the outcome with key behaviors, especially in how we sleep, exercise, and eat.

    In a new 30-year study, researchers have taken an in-depth look at the links between eating habits and healthy aging, which they define as reaching age 70 without major chronic diseases or declines in cognitive, physical, or mental health.

    The study is among the first to analyze multiple dietary patterns in middle age in relation to overall healthy aging, explains co-author Frank Hu, epidemiologist at Harvard University.

    “Studies have previously investigated dietary patterns in the context of specific diseases or how long people live,” Hu says. “Ours takes a multifaceted view, asking, ‘How does diet impact people’s ability to live independently and enjoy a good quality of life as they age?’”

    Hu and his colleagues used longitudinal data from 105,000 adults between the ages of 39 and 69 (averaging 53 years old), collected between 1986 and 2016 as part of the Nurses’ Health Study and the Health Professionals Follow-Up Study.

    Subjects regularly completed dietary questionnaires over the 30-year period. The authors of the new study scored their eating habits by adherence to eight dietary patterns, seeking to learn which is most likely to promote healthy aging.

    The eight dietary patterns are: the Alternative Healthy Eating Index (AHEI), the Alternative Mediterranean Index (aMED), the Dietary Approaches to Stop Hypertension (DASH), the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND), the healthful plant-based diet (hPDI), the Planetary Health Diet Index (PHDI), the empirical dietary inflammatory pattern (EDIP), and the empirical dietary index for hyperinsulinemia (EDIH).

    All eight dietary patterns share some basic inclinations, the researchers note, such as a focus on whole, plant-based foods and healthy fats.

    In addition to those eight patterns, the study examined subjects’ consumption of ultra-processed foods – industrial creations often featuring many ingredients and excessive sugar, salt, and unhealthy fats.

    The study found 9,771 participants met their definition of healthy aging, representing 9.3 percent of the sample population.

    All eight dietary patterns were associated with healthy aging, the study found, so adhering to any one of them raised a person’s odds of staying spry at 70.

    This suggests the picture isn’t simple, the researchers note, and no single diet is best for everyone. That said, one contender did stand out.

    The best diet overall for healthy aging is the AHEI, according to the findings. It’s similar to the older Healthy Eating Index, the researchers explain, but more oriented toward preventing chronic disease.

    Subjects in the top quintile for the AHEI were 86 percent more likely to achieve healthy aging at 70 than those in the lowest quintile, the study found. When the cutoff age was raised to 75, the top quintile was 2.24 times likelier to age healthily.

    The AHEI emphasizes plant-based foods like fruits, vegetables, whole grains, nuts, and legumes, while minimizing red and processed meats, sugar-sweetened drinks, sodium, and refined grains.

    “Since staying active and independent is a priority for both individuals and public health, research on healthy aging is essential,” says co-author Marta Guasch-Ferré, Harvard nutritionist.

    “Our findings suggest that dietary patterns rich in plant-based foods, with moderate inclusion of healthy animal-based foods, may promote overall healthy aging and help shape future dietary guidelines,” she adds.

    While the AHEI showed the best results, it was closely followed by several others, including the aMED, DASH, PHDI, and MIND, the researchers report.

    Along with the strongest link to overall healthy aging, the AHEI had the strongest link with maintaining physical function and mental health. The PHDI showed the strongest link with maintaining cognitive health and surviving to age 70.

    Higher consumption of ultra-processed foods, on the other hand, was associated with a lower likelihood of healthy aging.

    “Our findings also show that there is no one-size-fits-all diet. Healthy diets can be adapted to fit individual needs and preferences,” says lead author Anne-Julie Tessier, nutritionist at the University of Montreal.

  • Hidden Abnormalities Discovered in The Brains of Elite Soldiers

    Repeated exposure to shock waves in the line of military duty can leave lasting marks on the brain and may affect how it functions, new research has discovered – even if the changes don’t appear in standard brain scans.

    Researchers from Harvard Medical School led an investigation of US special operations forces to get a better idea of how trauma from bomb blasts can increase the risk of traumatic brain injury (TBI) over the long term.

    Compared to healthy controls and those with low levels of blast exposure, service members with records of high blast exposure showed noticeable differences in functional connectivity – how different regions of the brain communicate and work together.

    Those differences in functional connectivity appeared in tandem with more severe symptoms on neuropsychological tests. The tests were configured to look for issues previously linked to TBIs in military personnel.

    “We found that service members with more blast exposure had more severe symptoms – including memory problems, emotional difficulties, and signs of post-traumatic stress disorder – and that their brains showed weaker connectivity in key areas,” says neuroradiologist Andrea Diociasi.

    “In short, repeated trauma seems to weaken the brain’s internal communication.”

    The study looked at 212 service members, both active and retired, who had a history of repetitive blast exposure. They were put through a range of brain imaging scans and psychological evaluations, tailored to assess the health of veterans.

    In particular, the researchers searched for evidence of ‘invisible’ injuries that don’t show up on normal magnetic resonance imaging (MRI) scans. These kinds of brain impacts are often overlooked, making it difficult for researchers to match physical alterations in the brain with mental health problems.

    To detect these issues, the MRI analysis was conducted at a higher level of detail and combined with statistical models. The findings were so clear that the team used them to develop a predictive model that could spot a brain exposed to high blast levels with 73 percent accuracy.
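
    As a sketch of what such a predictive model can look like in practice – a hypothetical illustration on synthetic data, not the team’s actual pipeline – one could train a simple classifier on MRI-derived connectivity features and score it with cross-validation:

      # Hypothetical sketch: classify high vs. low blast exposure from synthetic
      # "functional connectivity" features, then report cross-validated accuracy.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_subjects, n_features = 212, 50               # 212 matches the study's cohort size
      X = rng.normal(size=(n_subjects, n_features))  # stand-in connectivity values
      y = rng.integers(0, 2, size=n_subjects)        # stand-in exposure labels

      X[y == 1, :5] += 0.8  # inject a weak group difference so there is signal to find

      scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
      print(f"Mean cross-validated accuracy: {scores.mean():.2f}")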

    “We also noticed certain brain regions were actually larger in more-exposed individuals, which could reflect long-term tissue changes like scarring,” says Diociasi.

    “These aren’t injuries you can always see with the naked eye, but they are real – and now we can start measuring them.”

    The researchers are confident the approaches used in their study apply to other causes of brain injury, such as in contact sports or as the result of serious accidents at work.

    The study also succeeded in providing a more comprehensive map of how trauma leads to brain connectivity changes, and from there to clinical symptoms – potentially opening up routes for improved assessments and treatments.

    “The findings reveal that even when the brain looks normal, it might still be carrying hidden signs of trauma – and we now have tools to detect them,” says Diociasi.

    “That opens the door for earlier detection, better treatment, and a deeper understanding of how repeated trauma affects the brain over time.”

  • A Sprinkle of Artificial Sweetener Could Help Battle Drug-Resistant Bacteria

    While the health effects of artificial sweeteners are still up for debate, a new study suggests they might also be a surprising weapon in the fight against antibiotic resistance.

    Led by a team from Brunel University in the UK, the study used controlled lab tests to see how saccharin, a common artificial sweetener, interacted with bacteria. We know the chemical can influence gut bacteria, including those critical to our health, and the researchers wanted to take a closer look.

    The results, from a bacteria-bashing perspective, were impressive: saccharin caused serious disruption to the structures of several bacteria strains, inhibiting how easily they grew and multiplied.

    “We’ve identified a novel antimicrobial – saccharin,” says Ronan McCarthy, microbiologist from Brunel University. “Saccharin breaks the walls of bacterial pathogens, causing them to distort and eventually burst, killing the bacteria.”

    “Crucially, this damage lets antibiotics slip inside, overwhelming their resistance systems.”

    The team put saccharin up against some particularly nasty and drug-resistant bacteria, including Staphylococcus aureus and Escherichia coli. While the compound’s effectiveness varied between bacteria types, these early results are promising, and suggest saccharin could be effective against multiple bacteria strains with some tweaking.

    In further experiments into saccharin’s bug-beating qualities, the researchers also developed a surgical dressing from the artificial sweetener. In lab tests on pig skin, it proved to be more effective than standard wound dressing materials like silver at reducing bacteria levels. It turns out that saccharin is something of a superhero in tackling bugs.

    “This is very exciting,” says McCarthy. “Normally it takes billions of dollars and decades to develop a new antibiotic. But here we have a compound that’s already widely used, and it not only kills drug-resistant bacteria but also makes existing antibiotics more effective.”

    “We urgently need new drugs to treat resistant infections – and saccharin could represent a new therapeutic approach with exciting promise.”

    The stats on antibiotic resistance are pretty sobering. Bacteria that have evolved to be immune to the best drugs we have are on the rise, and are accounting for more and more deaths each year – with the annual figures now into the millions.

    While we are making some progress in fighting the threat of these superbugs – including finding new vulnerabilities that could be used to take down drug-resistant bacteria – the worry is that they’re the ones winning the race.

    Although it’s early days, saccharin could be a huge help in catching up – though we’ll need to see how it works in clinical treatments before that progress can be made. It’s also important to bear in mind the full range of effects that artificial sweeteners can have on the body, which may not all be positive.

    “Antibiotic resistance is one of the major threats to modern medicine,” says McCarthy.

    “Procedures such as tooth extractions and cancer treatment often rely on antibiotics to prevent or treat infection. But doctors are increasingly facing cases where the drugs no longer work.”

  • The Mere Thought of Being Hungry Could Alter Your Immune System

    Feeling hungry doesn’t just make you reach for a snack – it may also change your immune system.

    In a recent study in mice, we found that simply perceiving hunger can change the number of immune cells in the blood, even when the animals hadn’t actually fasted. This shows that even the brain’s interpretation of hunger can shape how the immune system adapts.

    Our new research published in Science Immunology challenges the long-standing idea that immunity is shaped primarily by real, physical changes in nutrition, such as changes in blood sugar or nutrient levels. Instead, it shows that perception alone (what the brain “thinks” is happening) can reshape immunity.

    We focused on two types of highly specialised brain cells (AgRP neurons and POMC neurons) that sense the body’s energy status and generate the feelings of hunger and fullness in response. AgRP neurons promote hunger when energy is low, while POMC neurons signal fullness after eating.

    Using genetic tools, we artificially activated the hunger neurons in mice that had already eaten plenty of food. Activating this small but powerful group of brain cells triggered an intense urge to seek food in the mice. This finding builds on what multiple previous studies have shown.

    To our surprise, though, this synthetic hunger state also led to a marked drop in specific immune cells in the blood, called monocytes. These cells are part of the immune system’s first line of defence and play a critical role in regulating inflammation.

    Conversely, when we activated the fullness neurons in fasted mice, the monocyte levels returned close to normal, even though the mice hadn’t eaten.

    These experiments showed us the brain’s perception of being hungry or fed was on its own enough to influence immune cell numbers in the blood.

    Why might this happen?

    Why would the brain do this? Although we haven’t formally tested this, we think one possibility is that this complex, multi-organ communication system evolved to help the body anticipate and respond to potential shortages. By fine-tuning energy use and immune readiness based on perceived needs, the brain would be able to coordinate an efficient whole-body response before a real crisis begins.

    If the brain senses that food might be limited (for example, by interpreting environmental cues previously associated with food scarcity) it may act early to conserve energy and adjust immune function in advance.

    If these findings are confirmed in humans, this new data could, in future, have real-world implications for diseases where the immune system becomes overactive – such as cardiovascular diseases, multiple sclerosis, and wasting syndrome in cancer patients.

    This is of further relevance for metabolic and eating disorders, such as obesity or anorexia. Not only are these disorders often accompanied by chronic inflammation or immune-related complications, they can also alter how hunger and fullness are computed in the brain.

    And, if the brain is able to help dial the immune system up or down, it may be possible to develop new brain-targeted approaches to aid current immuno-modulatory therapies.

    Still, there’s much we don’t know. We need more studies investigating how this mechanism works in humans. These studies could prove challenging, as it isn’t possible yet to selectively activate specific neurons in the human brain with the same precision we can in experimental models.

    Interestingly, more than a century ago a Soviet psychiatrist, A. Tapilsky, conducted an unusual experiment where he used hypnosis to suggest feelings of hunger or fullness to patients. Remarkably, immune cell counts increased when patients were told they were full and decreased when they were told they were hungry.

    These early observations hinted at a powerful connection between the mind and body, well ahead of today’s scientific understanding, and are eerily prescient of our current ability to use powerful genetic tools to artificially generate internal sensations like hunger or fullness in animal models.

    What’s clear is that the brain’s view of the body’s energy needs can shape the immune system – sometimes even before the body itself has caught up. This raises new questions about how conditions such as stress, eating disorders and even learned associations with food scarcity might drive inflammation and disease.

  • Bird Flu Is Evolving Dangerously, But We Can Prevent a Disaster

    Disease forecasts are like weather forecasts: We cannot predict the finer details of a particular outbreak or a particular storm, but we can often identify when these threats are emerging and prepare accordingly.

    The viruses that cause avian influenza are potential threats to global health. Recent animal outbreaks from a subtype called H5N1 have been especially troubling to scientists.

    Although human infections from H5N1 have been relatively rare – a little more than 900 known cases globally since 2003 – nearly 50 percent of those cases have been fatal, a mortality rate about 20 times higher than that of the 1918 flu pandemic. If such infections ever became common among people, the results could be devastating.
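
    The arithmetic behind that comparison is straightforward. Using the round figures above and a commonly cited case fatality rate of roughly 2.5 percent for the 1918 flu, here is the back-of-envelope check (an illustration, not a formal epidemiological calculation):

      # Back-of-envelope check on the "about 20 times higher" mortality comparison.
      h5n1_cases, h5n1_deaths = 900, 450   # ~900 known cases since 2003, ~50% fatal
      h5n1_cfr = h5n1_deaths / h5n1_cases  # = 0.50

      flu_1918_cfr = 0.025                 # commonly cited ~2.5% case fatality rate

      print(h5n1_cfr / flu_1918_cfr)       # 20.0 -> "about 20 times higher"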

    Approaching potential disease threats from an anthropological perspective, my colleagues and I recently published a book called “Emerging Infections: Three Epidemiological Transitions from Prehistory to the Present” to examine the ways human behaviors have shaped the evolution of infectious diseases, beginning with their first major emergence in the Neolithic period and continuing for 10,000 years to the present day.

    Viewed from this deep time perspective, it becomes evident that H5N1 is displaying a common pattern of stepwise invasion from animal to human populations. Like many emerging viruses, H5N1 is making incremental evolutionary changes that could allow it to transmit between people.

    The periods between these evolutionary steps present opportunities to slow this process and possibly avert a global disaster.

    Spillover and viral chatter

    When a disease-causing pathogen such as a flu virus is already adapted to infect a particular animal species, it may eventually evolve the ability to infect a new species, such as humans, through a process called spillover.

    Spillover is a tricky enterprise. To be successful, the pathogen must have the right set of molecular “keys” compatible with the host’s molecular “locks” so it can break in and out of host cells and hijack their replication machinery.

    Because these locks often vary between species, the pathogen may have to try many different keys before it can infect an entirely new host species.

    For instance, the keys a virus successfully uses to infect chickens and ducks may not work on cattle and humans. And because new keys can be made only through random mutation, the odds of obtaining all the right ones are very slim.

    Given these evolutionary challenges, it is not surprising that pathogens often get stuck partway into the spillover process. A new variant of the pathogen might be transmissible from an animal only to a person who is either more susceptible due to preexisting illness or more likely to be infected because of extended exposure to the pathogen.

    Even then, the pathogen might not be able to break out of its human host and transmit to another person. This is the current situation with H5N1.

    For the past year, there have been many outbreaks in a variety of wild and domestic animals, especially birds and cattle. But there have also been a small number of human cases, most of which have occurred among poultry and dairy workers who worked closely with large numbers of infected animals.

    Epidemiologists call this situation viral chatter: when human infections occur only in small, sporadic outbreaks that appear like the chattering signals of coded radio communications – tiny bursts of unclear information that may add up to a very ominous message. In the case of viral chatter, the message would be a human pandemic.

    Sporadic, individual cases of H5N1 among people suggest that human-to-human transmission is likely to occur at some point. But even so, no one knows how long or how many steps it would take for this to happen.

    Influenza viruses evolve rapidly. This is partly because two or more flu varieties can infect the same host simultaneously, allowing them to reshuffle their genetic material with one another to produce entirely new varieties.

    These reshuffling events are more likely to occur when there is a diverse range of host species. So it is particularly concerning that H5N1 is known to have infected at least 450 different animal species. It may not be long before the viral chatter gives way to larger human epidemics.

    Reshaping the trajectory

    The good news is that people can take basic measures to slow down the evolution of H5N1 and potentially reduce the lethality of avian influenza should it ever become a common human infection. But governments and businesses will need to act.

    People can start by taking better care of food animals. The total weight of the world’s poultry is greater than that of all wild bird species combined. So it is not surprising that the geography of most H5N1 outbreaks tracks more closely with large-scale housing and international transfers of live poultry than with the nesting and migration patterns of wild aquatic birds.

    Reducing these agricultural practices could help curb the evolution and spread of H5N1.

    People can also take better care of themselves. At the individual level, most people can get vaccinated against the common, seasonal influenza viruses that circulate every year.

    At first glance this practice may not seem connected to the emergence of avian influenza. But in addition to preventing seasonal illness, vaccination against common human varieties of the virus will reduce the odds of it mixing with avian varieties and giving them the traits they need for human-to-human transmission.

    At the population level, societies can work together to improve nutrition and sanitation in the world’s poorest populations. History has shown that better nutrition increases overall resistance to new infections, and better sanitation reduces how much and how often people are exposed to new pathogens. And in today’s interconnected world, the disease problems of any society will eventually spread to every society.

    For more than 10,000 years, human behaviors have shaped the evolutionary trajectories of infectious diseases. Knowing this, people can reshape these trajectories for the better.

  • Screen Time In Bed May Increase Insomnia Odds, Study Suggests

    If you’re reading this in bed on your phone, you’re not alone. Lots of people use their phones before and beyond bedtime, especially young adults and teens.

    Still, you might want to call it a night soon (after you finish reading this, of course). Extended screen time before bed – or in bed – is widely suspected to disrupt sleep, although key details about the dynamic remain unclear.

    In a new study, researchers tried to shed more light on the issue, using data from a large survey of 45,202 university students in Norway.

    Screen time in bed was associated with 59 percent higher odds of insomnia and 24 fewer minutes of total sleep per night, the study found.
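
    To unpack what “59 percent higher odds” means in absolute terms: odds ratios scale odds rather than probabilities, so the resulting risk depends on the baseline rate. A minimal sketch, assuming a purely illustrative 20 percent baseline insomnia rate (not a figure from the study):

      # Apply an odds ratio to a baseline probability: convert to odds, scale, convert back.
      def apply_odds_ratio(baseline_prob, odds_ratio):
          odds = baseline_prob / (1 - baseline_prob)
          new_odds = odds * odds_ratio
          return new_odds / (1 + new_odds)

      # With an assumed 20% baseline, 59% higher odds works out to roughly 28%.
      print(apply_odds_ratio(0.20, 1.59))  # ~0.284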

    But people use screens in many ways, some of which may affect sleep more than others. Would TV sabotage your slumber as much as social media?

    Some previous studies suggest social media is especially bad for sleep, even more than other types of screen time. Yet little research has directly compared various screen-based activities and their impact on sleep.

    Most studies that have done so focused on teenagers, the researchers note.

    The new study features a slightly older demographic, ranging in age from 18 to 28, and draws from vast data collected for the Students’ Health and Well-being Study 2022, a nationally representative study of Norwegian students.

    The survey contains demographic information about students as well as several health and lifestyle factors, including screen use and sleep.

    “Sleep problems are highly prevalent among students and have significant implications for mental health, academic performance, and overall well-being, but previous studies have primarily focused on adolescents,” says Gunnhild Johnsen Hjetland, clinical psychologist at the Norwegian Institute of Public Health.

    “Given the widespread use of screens in bed, we aimed to explore the relationship between different screen activities and sleep patterns,” she says. “We expected that social media use might be more strongly associated with poorer sleep, given its interactive nature and potential for emotional stimulation.”

    According to the findings, however, social media use was no more of a hindrance to sleep than other screen-based activities.

    “The type of screen activity does not appear to matter as much as the overall time spent using screens in bed,” Hjetland says.

    “We found no significant differences between social media and other screen activities, suggesting that screen use itself is the key factor in sleep disruption – likely due to time displacement, where screen use delays sleep by taking up time that would otherwise be spent resting.”

    Participants reported whether they used any electronic media in bed, and for how long. They specified if they were watching movies or TV, checking social media, browsing the internet, listening to audio, gaming, or reading study-related content.

    The researchers grouped these into three broader categories: social media only, other screen activities with no social media, or social media combined with other screen-based activities.

    In addition, participants reported their bedtimes and rising times, how long it took them to fall asleep, how often they struggled to fall or stay asleep, how often they felt sleepy during the day, and duration of their sleep troubles.

    Those reporting more post-bedtime screen time were much more likely to report symptoms of insomnia, the study found.

    The specific activity seemed to matter less than total screen time, suggesting screen use might curtail sleep by displacing rest rather than boosting wakefulness.

    There are some notable caveats. The sample is large, for example, yet it lacks the cultural diversity needed to make the findings broadly generalizable.

    The study also grouped many screen-based activities together, obscuring possible nuance in narrower categories.

    And while the study shows correlation, it can’t reveal causality. People checking social media actually reported better sleep overall, but the influence could go either way.

    “Another interpretation is that social media use is not the preferred activity for students who struggle the most with their sleep,” the researchers write.

    Some students use technology as a sleep aid, and may choose activities commonly considered more calming, like watching a movie or listening to music instead of doomscrolling.

    “If you struggle with sleep and suspect that screen time may be a factor, try to reduce screen use in bed, ideally stopping at least 30 to 60 minutes before sleep,” Hjetland says. “If you do use screens, consider disabling notifications to minimize disruptions during the night.”

  • Researchers Identify New Blood Group After 50-Year Mystery

    When a pregnant woman had her blood sampled back in 1972, doctors discovered it was mysteriously missing a surface molecule found on all other known red blood cells at the time.

    After 50 years, this strange molecular absence finally led to researchers from the UK and Israel describing a new blood group system in humans. In 2024, the team published their paper on the discovery.

    “It represents a huge achievement, and the culmination of a long team effort, to finally establish this new blood group system and be able to offer the best care to rare, but important, patients,” UK National Health Service hematologist Louise Tilley said last September, after nearly 20 years of personally researching this bloody quirk.

    While we’re all more familiar with the ABO blood group system and the Rh factor (that’s the plus or minus part), humans actually have many different blood group systems based on the wide variety of cell-surface proteins and sugars that coat our blood cells.

    Our bodies use these antigen molecules, amongst their other purposes, as identification markers to separate ‘self’ from potentially harmful not-selves.

    If these markers do not match up when a patient receives a blood transfusion, this life-saving treatment can cause reactions or even prove fatal.

    Most major blood groups were identified early in the 20th century. Many discovered since, like the Er blood system first described by researchers in 2022, only impact a small number of people. This is also the case for the new blood group.

    Previous research found more than 99.9 percent of people have the AnWj antigen that was missing from the 1972 patient’s blood. This antigen lives on a myelin and lymphocyte protein, leading the researchers to call the newly described system the MAL blood group.

    When someone has a mutated version of both copies of their MAL genes, they end up with an AnWj-negative blood type, like the pregnant patient. Tilley and team identified three patients with the rare blood type that didn’t have this mutation, suggesting that sometimes blood disorders can also cause the antigen to be suppressed.

    “MAL is a very small protein with some interesting properties which made it difficult to identify and meant we needed to pursue multiple lines of investigation to accumulate the proof we needed to establish this blood group system,” explained University of the West of England cell biologist Tim Satchwell.

    To determine they had the correct gene, after decades of research, the team inserted the normal MAL gene into blood cells that were AnWj-negative. This effectively delivered the AnWj antigen to those cells.

    The MAL protein is known to play a vital role in keeping cell membranes stable and aiding in cell transport. What’s more, previous research found that the AnWj antigen isn’t actually present in newborn babies but appears soon after birth.

    Interestingly, all the AnWj-negative patients included in the study shared the same mutation. However, no other cell abnormalities or diseases were found to be associated with this mutation.

    Now that the researchers have identified the genetic markers behind the MAL mutation, patients can be tested to see if their negative MAL blood type is inherited or due to suppression, which could be a sign of another underlying medical problem.

    These rare blood quirks can have devastating impacts on patients, so the more of them we can understand, the more lives can be saved.
