Author: admin

  • Hidden Abnormalities Discovered in The Brains of Elite Soldiers


    Repeated exposure to shock waves in the line of military duty can leave lasting impressions on the brain and may affect its functionality, new research has discovered, even if the changes don’t appear in standard brain scans.

    Researchers from Harvard Medical School led an investigation of US special operations forces to get a better idea of how trauma from bomb blasts can increase the risk of traumatic brain injury (TBI) over the long term.

    Compared to healthy controls and those with low levels of blast exposure, service members with records of high blast exposure showed noticeable differences in functional connectivity – how different regions of the brain communicate and work together.

    Those differences in functional connectivity appeared in tandem with symptoms of higher severity on neuropsychological tests. The tests were configured to look for issues previously linked to TBIs in military personnel.

    “We found that service members with more blast exposure had more severe symptoms – including memory problems, emotional difficulties, and signs of post-traumatic stress disorder – and that their brains showed weaker connectivity in key areas,” says neuroradiologist Andrea Diociasi.

    “In short, repeated trauma seems to weaken the brain’s internal communication.”

    The study looked at 212 service members, both active and retired, who had a history of repetitive blast exposure. They were put through a range of brain imaging scans and psychological evaluations, tailored to assess the health of veterans.

    In particular, the researchers searched for evidence of ‘invisible’ injuries that don’t show up on normal magnetic resonance imaging (MRI) scans. These kinds of brain impacts are often overlooked, making it difficult for researchers to match physical alterations in the brain with mental health problems.

    To detect these issues, the MRI analysis was conducted at a higher level of detail and combined with statistical models. The findings were so clear, the team used them to develop a predictive model that could spot a brain exposed to high blast levels with 73 percent accuracy.
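
    As a loose illustration of how a predictive model like this is evaluated (a hypothetical sketch: the connectivity feature, threshold, and data below are invented, not taken from the study), a classifier is scored by how often its predictions match the known exposure labels:

```python
# Hypothetical sketch: flagging "high blast exposure" from a single
# functional-connectivity score. All values here are invented for
# illustration; the study's actual model and features are not shown.

def classify(connectivity_score, threshold=0.45):
    """Flag a scan as high-exposure when connectivity in key networks
    falls below a threshold (exposed brains showed weaker connectivity)."""
    return connectivity_score < threshold

# Toy labelled scans: (mean connectivity in key networks, truly high-exposure?)
scans = [
    (0.30, True), (0.42, True), (0.50, False), (0.61, False),
    (0.38, True), (0.55, False), (0.47, True), (0.66, False),
]

correct = sum(classify(score) == label for score, label in scans)
accuracy = correct / len(scans)
print(f"accuracy: {accuracy:.2f}")
```

    A real model would be trained and validated on held-out data; the 73 percent figure reported by the team presumably comes from that kind of evaluation.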

    “We also noticed certain brain regions were actually larger in more-exposed individuals, which could reflect long-term tissue changes like scarring,” says Diociasi.

    “These aren’t injuries you can always see with the naked eye, but they are real – and now we can start measuring them.”

    The researchers are confident the approaches used in their study apply to other causes of brain injury, such as in contact sports or as the result of serious accidents at work.

    The study also succeeded in providing a more comprehensive map of how trauma leads to brain connectivity changes, and from there to clinical symptoms – potentially opening up routes for improved assessments and treatments.

    “The findings reveal that even when the brain looks normal, it might still be carrying hidden signs of trauma – and we now have tools to detect them,” says Diociasi.

    “That opens the door for earlier detection, better treatment, and a deeper understanding of how repeated trauma affects the brain over time.”

  • A Sprinkle of Artificial Sweetener Could Help Battle Drug-Resistant Bacteria


    While the health effects of artificial sweeteners are still up for debate, a new study suggests they might also be a surprising weapon in the fight against antibiotic resistance.

    The study, led by a team from Brunel University in the UK, used controlled lab tests to see how saccharin, a common artificial sweetener, interacts with bacteria. We know the chemical can influence gut bacteria, including those critical to our health, and the researchers wanted to take a closer look.

    The results, from a bacteria-bashing perspective, were impressive: saccharin caused serious disruption to the structures of several bacterial strains, inhibiting how easily they grew and multiplied.

    “We’ve identified a novel antimicrobial – saccharin,” says Ronan McCarthy, microbiologist from Brunel University. “Saccharin breaks the walls of bacterial pathogens, causing them to distort and eventually burst, killing the bacteria.”

    “Crucially, this damage lets antibiotics slip inside, overwhelming their resistance systems.”

    The team put saccharin up against some particularly nasty and drug-resistant bacteria, including Staphylococcus aureus and Escherichia coli. While the compound’s effectiveness varied between bacteria types, these early results are promising, and suggest saccharin could be effective against multiple bacteria strains with some tweaking.

    In further experiments into saccharin’s bug-beating qualities, the researchers also developed a surgical dressing from the artificial sweetener. In lab tests on pig skin, it proved to be more effective than standard wound dressing materials like silver at reducing bacteria levels. It turns out that saccharin is something of a superhero in tackling bugs.

    “This is very exciting,” says McCarthy. “Normally it takes billions of dollars and decades to develop a new antibiotic. But here we have a compound that’s already widely used, and it not only kills drug-resistant bacteria but also makes existing antibiotics more effective.”

    “We urgently need new drugs to treat resistant infections – and saccharin could represent a new therapeutic approach with exciting promise.”

    The stats on antibiotic resistance are pretty sobering. Bacteria that have evolved to be immune to the best drugs we have are on the rise, and are accounting for more and more deaths each year – with the annual figures now into the millions.

    While we are making some progress in fighting the threat of these superbugs – including finding new vulnerabilities that could be used to take down drug-resistant bacteria – the worry is that they’re the ones winning the race.

    Although it’s early days, saccharin could be a huge help in catching up, but we’ll need to see how it performs in clinical treatments before that progress can be made. It’s also important to bear in mind the full range of effects that artificial sweeteners can have on the body, which may not all be positive.

    “Antibiotic resistance is one of the major threats to modern medicine,” says McCarthy.

    “Procedures such as tooth extractions and cancer treatment often rely on antibiotics to prevent or treat infection. But doctors are increasingly facing cases where the drugs no longer work.”

  • The Mere Thought of Being Hungry Could Alter Your Immune System


    Feeling hungry doesn’t just make you reach for a snack – it may also change your immune system.

    In a recent study in mice, we found that simply perceiving hunger can change the number of immune cells in the blood, even when the animals hadn’t actually fasted. This shows that even the brain’s interpretation of hunger can shape how the immune system adapts.

    Our new research published in Science Immunology challenges the long-standing idea that immunity is shaped primarily by real, physical changes in nutrition, such as changes in blood sugar or nutrient levels. Instead, it shows that perception alone (what the brain “thinks” is happening) can reshape immunity.

    We focused on two types of highly specialised brain cells (AgRP neurons and POMC neurons) that sense the body’s energy status and generate the feelings of hunger and fullness in response. AgRP neurons promote hunger when energy is low, while POMC neurons signal fullness after eating.

    Using genetic tools, we artificially activated the hunger neurons in mice that had already eaten plenty of food. Activating this small but powerful group of brain cells triggered an intense urge to seek food in the mice. This finding builds on what multiple previous studies have shown.

    To our surprise, though, this synthetic hunger state also led to a marked drop in specific immune cells in the blood, called monocytes. These cells are part of the immune system’s first line of defence and play a critical role in regulating inflammation.

    Conversely, when we activated the fullness neurons in fasted mice, the monocyte levels returned close to normal, even though the mice hadn’t eaten.

    These experiments showed us that the brain’s perception of being hungry or fed was, on its own, enough to influence immune cell numbers in the blood.

    Why might this happen?

    Although we haven’t formally tested it, one possibility is that this complex, multi-organ communication system evolved to help the body anticipate and respond to potential shortages. By fine-tuning energy use and immune readiness based on perceived needs, the brain would be able to coordinate an efficient whole-body response before a real crisis begins.

    If the brain senses that food might be limited (for example, by interpreting environmental cues previously associated with food scarcity) it may act early to conserve energy and adjust immune function in advance.

    If these findings are confirmed in humans, this new data could, in future, have real-world implications for diseases where the immune system becomes overactive – such as cardiovascular diseases, multiple sclerosis, and wasting syndrome in cancer patients.

    This is of further relevance for metabolic and eating disorders, such as obesity or anorexia. Not only are these disorders often accompanied by chronic inflammation or immune-related complications, they can also alter how hunger and fullness are computed in the brain.

    And, if the brain is able to help dial the immune system up or down, it may be possible to develop new brain-targeted approaches to aid current immuno-modulatory therapies.

    Still, there’s much we don’t know. We need more studies investigating how this mechanism works in humans. These studies could prove challenging, as it isn’t possible yet to selectively activate specific neurons in the human brain with the same precision we can in experimental models.

    Interestingly, more than a century ago a Soviet psychiatrist, A. Tapilsky, conducted an unusual experiment where he used hypnosis to suggest feelings of hunger or fullness to patients. Remarkably, immune cell counts increased when patients were told they were full and decreased when they were told they were hungry.

    These early observations hinted at a powerful connection between the mind and body, well ahead of today’s scientific understanding, and they are eerily prescient of our current ability to use powerful genetic tools to artificially generate internal sensations like hunger or fullness in animal models.

    What’s clear is that the brain’s view of the body’s energy needs can shape the immune system – sometimes even before the body itself has caught up. This raises new questions about how conditions such as stress, eating disorders and even learned associations with food scarcity might drive inflammation and disease.

  • Bird Flu Is Evolving Dangerously, But We Can Prevent a Disaster


    Disease forecasts are like weather forecasts: We cannot predict the finer details of a particular outbreak or a particular storm, but we can often identify when these threats are emerging and prepare accordingly.

    The viruses that cause avian influenza are potential threats to global health. Recent animal outbreaks from a subtype called H5N1 have been especially troubling to scientists.

    Human infections with H5N1 have been relatively rare, with a little more than 900 known cases globally since 2003. But nearly 50 percent of those cases were fatal – a mortality rate about 20 times higher than that of the 1918 flu pandemic. If these rare infections ever became common among people, the results could be devastating.
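
    That mortality comparison is easy to sanity-check with back-of-envelope arithmetic (the roughly 2.5 percent case-fatality rate for the 1918 flu is a commonly cited estimate assumed here, not a figure from the text):

```python
# Back-of-envelope check of the H5N1 vs. 1918 flu mortality comparison.
# Assumes the widely cited ~2.5% case-fatality estimate for the 1918
# pandemic; all figures are approximate.

h5n1_cases = 900        # known human H5N1 cases since 2003 (approx.)
h5n1_cfr = 0.50         # ~50% of known cases were fatal
flu_1918_cfr = 0.025    # assumed case-fatality rate for the 1918 flu

ratio = h5n1_cfr / flu_1918_cfr
approx_deaths = round(h5n1_cases * h5n1_cfr)
print(f"H5N1 case fatality is ~{ratio:.0f}x the 1918 flu ({approx_deaths} deaths)")
```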

    Approaching potential disease threats from an anthropological perspective, my colleagues and I recently published a book called “Emerging Infections: Three Epidemiological Transitions from Prehistory to the Present” to examine the ways human behaviors have shaped the evolution of infectious diseases, beginning with their first major emergence in the Neolithic period and continuing for 10,000 years to the present day.

    Viewed from this deep time perspective, it becomes evident that H5N1 is displaying a common pattern of stepwise invasion from animal to human populations. Like many emerging viruses, H5N1 is making incremental evolutionary changes that could allow it to transmit between people.

    The periods between these evolutionary steps present opportunities to slow this process and possibly avert a global disaster.

    Spillover and viral chatter

    When a disease-causing pathogen such as a flu virus is already adapted to infect a particular animal species, it may eventually evolve the ability to infect a new species, such as humans, through a process called spillover.

    Spillover is a tricky enterprise. To be successful, the pathogen must have the right set of molecular “keys” compatible with the host’s molecular “locks” so it can break in and out of host cells and hijack their replication machinery.

    Because these locks often vary between species, the pathogen may have to try many different keys before it can infect an entirely new host species.

    For instance, the keys a virus successfully uses to infect chickens and ducks may not work on cattle and humans. And because new keys can be made only through random mutation, the odds of obtaining all the right ones are very slim.

    Given these evolutionary challenges, it is not surprising that pathogens often get stuck partway into the spillover process. A new variant of the pathogen might be transmissible from an animal only to a person who is either more susceptible due to preexisting illness or more likely to be infected because of extended exposure to the pathogen.

    Even then, the pathogen might not be able to break out of its human host and transmit to another person. This is the current situation with H5N1.

    For the past year, there have been many animal outbreaks in a variety of wild and domestic animals, especially among birds and cattle. But there have also been a small number of human cases, most of which have occurred among poultry and dairy workers who worked closely with large numbers of infected animals.

    Epidemiologists call this situation viral chatter: when human infections occur only in small, sporadic outbreaks that appear like the chattering signals of coded radio communications – tiny bursts of unclear information that may add up to a very ominous message. In the case of viral chatter, the message would be a human pandemic.

    Sporadic, individual cases of H5N1 among people suggest that human-to-human transmission may well occur at some point. But even so, no one knows how long, or how many evolutionary steps, it would take for this to happen.

    Influenza viruses evolve rapidly. This is partly because two or more flu varieties can infect the same host simultaneously, allowing them to reshuffle their genetic material with one another to produce entirely new varieties.

    These reshuffling events are more likely to occur when there is a diverse range of host species. So it is particularly concerning that H5N1 is known to have infected at least 450 different animal species. It may not be long before the viral chatter gives way to larger human epidemics.

    Reshaping the trajectory

    The good news is that people can take basic measures to slow down the evolution of H5N1 and potentially reduce the lethality of avian influenza should it ever become a common human infection. But governments and businesses will need to act.

    People can start by taking better care of food animals. The total weight of the world’s poultry is greater than that of all wild bird species combined. So it is not surprising that the geography of most H5N1 outbreaks tracks more closely with large-scale housing and international transfers of live poultry than with the nesting and migration patterns of wild aquatic birds.

    Reducing these agricultural practices could help curb the evolution and spread of H5N1.

    People can also take better care of themselves. At the individual level, most people can vaccinate against the common, seasonal influenza viruses that circulate every year.

    At first glance this practice may not seem connected to the emergence of avian influenza. But in addition to preventing seasonal illness, vaccination against common human varieties of the virus will reduce the odds of it mixing with avian varieties and giving them the traits they need for human-to-human transmission.

    At the population level, societies can work together to improve nutrition and sanitation in the world’s poorest populations. History has shown that better nutrition increases overall resistance to new infections, and better sanitation reduces how much and how often people are exposed to new pathogens. And in today’s interconnected world, the disease problems of any society will eventually spread to every society.

    For more than 10,000 years, human behaviors have shaped the evolutionary trajectories of infectious diseases. Knowing this, people can reshape these trajectories for the better.

  • Screen Time In Bed May Increase Insomnia Odds, Study Suggests


    If you’re reading this in bed on your phone, you’re not alone. Lots of people use their phones before and beyond bedtime, especially young adults and teens.

    Still, you might want to call it a night soon (after you finish reading this, of course). Extended screen time before bed – or in bed – is widely suspected to disrupt sleep, although key details about the dynamic remain unclear.

    In a new study, researchers tried to shed more light on the issue, using data from a large survey of 45,202 university students in Norway.

    Screen time in bed is associated with 59 percent higher odds of insomnia and with 24 fewer minutes of total sleep per night, the study found.
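
    Note that higher odds are not the same thing as a higher probability: turning a 59 percent increase in the odds of insomnia into before-and-after probabilities requires assuming a baseline rate, which the sketch below invents purely for illustration:

```python
# Convert an odds ratio into probabilities, given an assumed baseline.
# The 20% baseline insomnia prevalence is invented for illustration;
# only the 1.59 odds ratio comes from the reported findings.

def apply_odds_ratio(baseline_prob, odds_ratio):
    """Return the probability implied by scaling the baseline odds."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

baseline = 0.20
elevated = apply_odds_ratio(baseline, 1.59)
print(f"{baseline:.0%} baseline -> {elevated:.1%} with screen use in bed")
```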

    But people use screens in many ways, some of which may affect sleep more than others. Would TV sabotage your slumber as much as social media?

    Some previous studies suggest social media is especially bad for sleep, even more than other types of screen time. Yet little research has directly compared various screen-based activities and their impact on sleep.

    Most studies that have done so focused on teenagers, the researchers note.

    The new study features a slightly older demographic, ranging in age from 18 to 28, and draws from vast data collected for the Students’ Health and Well-being Study 2022, a nationally representative study of Norwegian students.

    The survey contains demographic information about students as well as several health and lifestyle factors, including screen use and sleep.

    “Sleep problems are highly prevalent among students and have significant implications for mental health, academic performance, and overall well-being, but previous studies have primarily focused on adolescents,” says Gunnhild Johnsen Hjetland, clinical psychologist at the Norwegian Institute of Public Health.

    “Given the widespread use of screens in bed, we aimed to explore the relationship between different screen activities and sleep patterns,” she says. “We expected that social media use might be more strongly associated with poorer sleep, given its interactive nature and potential for emotional stimulation.”

    According to the findings, however, social media use was no more of a hindrance to sleep than other screen-based activities.

    “The type of screen activity does not appear to matter as much as the overall time spent using screens in bed,” Hjetland says.

    “We found no significant differences between social media and other screen activities, suggesting that screen use itself is the key factor in sleep disruption – likely due to time displacement, where screen use delays sleep by taking up time that would otherwise be spent resting.”

    Participants reported whether they used any electronic media in bed, and for how long. They specified if they were watching movies or TV, checking social media, browsing the internet, listening to audio, gaming, or reading study-related content.

    The researchers grouped these into three broader categories: just social media, no social media, or social media plus other screen-based activities.

    In addition, participants reported their bedtimes and rising times, how long it took them to fall asleep, how often they struggled to fall or stay asleep, how often they felt sleepy during the day, and duration of their sleep troubles.

    Those reporting more post-bedtime screen time were much more likely to report symptoms of insomnia, the study found.

    The specific activity seemed to matter less than total screen time, suggesting screen use might curtail sleep by displacing rest rather than boosting wakefulness.

    There are some notable caveats. The sample size is large, for example, yet lacks the cultural diversity to make the findings broadly generalizable.

    The study also grouped many screen-based activities together, obscuring possible nuance in narrower categories.

    And while the study shows correlation, it can’t reveal causality. People checking social media actually reported better sleep overall, but the influence could go either way.

    “Another interpretation is that social media use is not the preferred activity for students who struggle the most with their sleep,” the researchers write.

    Some students use technology as a sleep aid, and may choose activities commonly considered more calming, like watching a movie or listening to music instead of doomscrolling.

    “If you struggle with sleep and suspect that screen time may be a factor, try to reduce screen use in bed, ideally stopping at least 30 to 60 minutes before sleep,” Hjetland says. “If you do use screens, consider disabling notifications to minimize disruptions during the night.”

  • Researchers Identify New Blood Group After 50-Year Mystery


    When a pregnant woman had her blood sampled back in 1972, doctors discovered it was mysteriously missing a surface molecule found on all other known red blood cells at the time.

    After 50 years, this strange molecular absence finally led to researchers from the UK and Israel describing a new blood group system in humans. In 2024, the team published their paper on the discovery.

    “It represents a huge achievement, and the culmination of a long team effort, to finally establish this new blood group system and be able to offer the best care to rare, but important, patients,” UK National Health Service hematologist Louise Tilley said last September, after nearly 20 years of personally researching this bloody quirk.

    While we’re all more familiar with the ABO blood group system and the Rh factor (that’s the plus or minus part), humans actually have many different blood group systems based on the wide variety of cell-surface proteins and sugars that coat our blood cells.

    Our bodies use these antigen molecules, amongst their other purposes, as identification markers to separate ‘self’ from potentially harmful not-selves.

    If these markers do not match up when receiving a blood transfusion, the otherwise life-saving procedure can trigger immune reactions that may even prove fatal.

    Most major blood groups were identified early in the 20th century. Many discovered since, like the Er blood system first described by researchers in 2022, only impact a small number of people. This is also the case for the new blood group.

    Previous research found more than 99.9 percent of people have the AnWj antigen that was missing from the 1972 patient’s blood. This antigen lives on a myelin and lymphocyte protein, leading the researchers to call the newly described system the MAL blood group.

    When someone has a mutated version of both copies of their MAL genes, they end up with an AnWj-negative blood type, like the pregnant patient. Tilley and team identified three patients with the rare blood type that didn’t have this mutation, suggesting that sometimes blood disorders can also cause the antigen to be suppressed.

    “MAL is a very small protein with some interesting properties which made it difficult to identify and meant we needed to pursue multiple lines of investigation to accumulate the proof we needed to establish this blood group system,” explained University of the West of England cell biologist Tim Satchwell.

    After decades of research, to confirm they had identified the correct gene, the team inserted the normal MAL gene into blood cells that were AnWj-negative. This effectively delivered the AnWj antigen to those cells.

    The MAL protein is known to play a vital role in keeping cell membranes stable and aiding in cell transport. What’s more, previous research found that the AnWj antigen isn’t actually present in newborn babies but appears soon after birth.

    Interestingly, all the AnWj-negative patients included in the study shared the same mutation. However, no other cell abnormalities or diseases were found to be associated with this mutation.

    Now that the researchers have identified the genetic markers behind the MAL mutation, patients can be tested to see if their negative MAL blood type is inherited or due to suppression, which could be a sign of another underlying medical problem.

    These rare blood quirks can have devastating impacts on patients, so the more of them we can understand, the more lives can be saved.
