A recent study suggests that people who donate blood regularly may carry genetic changes in their blood that could reduce their risk of developing cancer. Conducted by researchers at the Francis Crick Institute together with scientists from Heidelberg and the German Red Cross blood donation center, and published in the journal Blood, the study offers new insights into how and why blood cancers develop. Further research is still needed to confirm the findings.
The researchers examined the blood of two groups of healthy male donors in their 60s: one group of frequent donors and one of irregular donors.
The goal was to analyze genetic mutations in their blood and assess whether frequent donation had any impact on their genetic makeup.
As people age, their blood and other cells naturally accumulate mutations, some of which can increase the risk of cancer. When someone donates blood, the body compensates by producing new blood cells, which can influence the genetic diversity of stem cells in the bone marrow. The study found that both groups had a similar number of mutations overall: the frequent donors had 217, while the irregular donors had 212.
However, the nature of these mutations differed. In the frequent donors, 50% of the mutations were of a type not associated with a high risk of blood cancers, compared to only 30% in the irregular donors.
Further laboratory analysis showed that these specific mutations behaved differently from those linked to leukemia, a type of blood cancer. When human blood stem cells with these mutations were injected into mice, they were found to be highly effective at producing red blood cells, which is considered a positive outcome.
Dr. Hector Huerga Encabo, one of the study authors, emphasized that these mutations do not indicate an increased risk of leukemia. The findings suggest that regular blood donation may influence how stem cells evolve, but whether this translates into a lower cancer risk remains uncertain.
One notable limitation is the "healthy-donor effect": because blood donors are often healthier than the general population, their lower cancer risk could be unrelated to blood donation.
Dominique Bonnet, senior researcher and head of a stem-cell laboratory at the Francis Crick Institute, stressed the need for larger studies with female volunteers to confirm the findings.
Despite ongoing research into potential health benefits for donors, the primary goal of blood donation remains saving lives. NHS Blood and Transplant emphasized that while the study is interesting, further research is required to draw firm conclusions. The organization also noted that blood supplies are currently critically low and encouraged eligible individuals to donate.
Living under constant psychosocial stress can significantly raise the risk of developing dementia and stroke, a JAMA Network study suggests.
Psychosocial stress is a type of stress rooted in our relationships with others, usually arising from feeling judged, excluded, or inadequate in others' eyes. It can also put a person into fight-or-flight mode, causing both mental and physical symptoms.
According to Chinese researchers, people who experience this form of stress in childhood as well as adulthood face more than a threefold higher risk of developing dementia compared with those in other groups.
Similarly, people who experienced stressful situations in adulthood had a significantly higher risk of stroke than their counterparts.
Based on these results, the study highlights that early identification of psychosocial stressors, combined with effective mental health support and depression prevention, may reduce the long-term burden of neurodegenerative and cerebrovascular disease.
The scientists defined adverse childhood experiences (ACEs) as traumatic exposures occurring during childhood, typically grouped into three categories: household dysfunction, social dysfunction, and family death or disability.
Traumatic exposures occurring during adulthood, by contrast, were defined as adverse adult experiences (AAEs). These include events such as the death of a child, lifetime discrimination, ever being confined to bed, ever being hospitalized for a month or longer, and ever leaving a job due to health conditions.
In analyzing the participants' data, the researchers also found that depression partly explained these links: it accounted for more than one-third of the connection between childhood adversity and dementia, and about one-fifth of the link between adulthood adversity and both dementia and stroke.
These findings suggest that long-term psychological stress may lead to brain and blood vessel diseases by causing ongoing emotional distress, unhealthy behaviours, and biological changes like inflammation and abnormal stress responses.
Psychosocial stress can trigger physiological responses like increased heart rate, cortisol secretion, and inflammation, significantly increasing risks for hypertension, cardiovascular disease and mental health disorders.
This kind of stress can affect men, women, and people of all genders differently, but many of the symptoms are still the same. Common symptoms include:
These symptoms can be acute or chronic: for some people they resolve quickly, while for others they persist over a long period of time. Meeting with a therapist is often recommended for those living with chronic stress.
To help deal with psychosocial stress, experts typically suggest developing coping mechanisms such as building support networks, using relaxation techniques, and, in cases of severe mental impact, seeking professional support.
Cardiovascular disease remains the leading cause of death among women, claiming more lives each year than all cancers combined. Yet, it continues to be misunderstood, underdiagnosed, and often dismissed as a “male” health problem. In Ohio and across the US, women are now using a striking visual message to confront this reality head-on, by quite literally dressing for the cause.
February, observed as American Heart Month, marks a renewed push to educate communities about heart disease, especially among women. Health advocates stress that while the spotlight is brightest this month, the risks and responsibilities extend far beyond the calendar.
“It’s our Super Bowl,” said Lauren Thomas, development director for the American Heart Association of North Central West Virginia and Ohio Valley. “It’s about awareness. Heart health is not a one-month conversation. It has to be a year-round priority where people actively put their hearts first.”
Across Ohio, women are wearing red, a color long associated with love but also with danger. The message is deliberate. Red symbolizes the urgency of cardiovascular disease, a condition responsible for one in every three deaths among women.
“When we wear red, we start conversations that many people avoid,” said Melissa Pratt, a heart disease survivor, reported WBBJ News. “One in three women die from cardiovascular disease. Wearing red encourages women to get checked, understand their risks, and take their health seriously.”
From landmarks lit in red to workplaces, neighborhoods, and social media feeds filled with crimson outfits, the visual campaign is meant to disrupt complacency. It poses a confronting question: if heart disease is killing women at this scale, why is it still not treated like a crisis?
Coinciding with American Heart Month, the Ohio Valley Women of Impact campaign launches this Friday. Six local women are leading fundraising and awareness efforts aimed at improving women's heart health: Crissy Clutter, Jan Pattishall-Krupinski, Lacy Ferguson, Shelbie Smith, Jennifer Hall-Fawson, and Laurie Conway.
Their work focuses on education, early detection, and supporting research that better understands how heart disease presents differently in women. Symptoms in women can be subtle, ranging from fatigue and nausea to jaw or back pain, which often delays diagnosis and treatment.
To mark the start of the month, the American Heart Association hosted a National Wear Red Day breakfast on Friday morning at the LIFT Wellness Center in Jackson Walk Plaza. The event brought together survivors, advocates, and health professionals to reinforce a simple but powerful message. Awareness must lead to action.
Health experts continue to urge women to prioritize regular checkups, manage blood pressure, cholesterol, and stress, and recognize early warning signs. Lifestyle changes, timely screenings, and informed conversations can significantly reduce risk.
The women of Ohio are not wearing red for fashion. They are wearing it as a warning, a remembrance, and a call to action. In dressing themselves in the color of urgency, they are confronting a disease that has taken too many lives quietly. This February, their message is clear. Heart disease is not inevitable, but ignoring it can be deadly.
If your roots trace back to the Indian subcontinent, your risk of developing type 2 diabetes is significantly higher than that of Europeans. Research shows that Indians, Pakistanis, and Bangladeshis are up to six times more likely to develop the condition, often at a younger age and at lower body weights. For years, carbohydrate-heavy diets were blamed. But growing scientific evidence points to a far deeper and darker cause: repeated famines during British colonial rule that may have altered metabolic resilience across generations.
The idea that starvation can leave a genetic imprint may sound extreme, but science supports it. Prolonged nutrient deprivation can permanently affect how the body stores fat, processes glucose, and responds to food abundance later in life. Even a single famine can raise the risk of metabolic disorders such as diabetes in future generations.
This understanding forms the basis of the “thrifty genotype hypothesis,” a concept widely discussed in evolutionary biology.
The thrifty genotype hypothesis suggests that populations exposed to repeated famines develop genetic traits that help conserve energy. These traits are lifesaving during scarcity but become harmful in times of plenty, increasing the risk of obesity and diabetes.
Economic historian Mike Davis documents that India experienced 31 major famines during 190 years of British rule between 1757 and 1947, roughly one every six years. By contrast, only 17 famines occurred in the previous 2,000 years. Davis estimates that 29 million people died in the Victorian era alone. Economic anthropologist Jason Hickel places the death toll from colonial policies between 1880 and 1920 at around 100 million.
A study published in Frontiers in Public Health titled The Elevated Susceptibility to Diabetes in India: An Evolutionary Perspective argues that these famines reshaped metabolic traits. The researchers note that Indians tend to have a higher fat-to-lean mass ratio, lower average birth weight, and reduced ability to clear glucose. This combination increases metabolic stress and lowers resilience, explaining earlier onset of diabetes compared to Europeans.
Colonial-era famines also affected physical growth. Studies show that average Indian height declined by about 1.8 cm per century during British rule. Historical accounts describe ancient Indians as tall and robust, with even Greek chroniclers noting their stature during Alexander's invasion. By the 1960s, however, Indians were about 15 cm shorter than their Mesolithic ancestors.
Read: How Colonialism Continues To Bear An Impact On The South Asian Health Crisis
While the decline did not begin with the British, widespread impoverishment under colonial rule sharply accelerated the trend. Only in the past 50 years has average height begun to recover.
Mike Davis argues that colonial famines were driven not by food shortages but by policy. Grain continued to be exported even as millions starved. During the 1876 famine, Viceroy Robert Bulwer-Lytton refused to halt exports, and relief work was deliberately discouraged. Davis describes these deaths as the result of state policy, not natural disaster.
Medical journal The Lancet estimated that 19 million Indians died during famines in the 1890s alone.
India now faces the consequences. According to the Indian Council of Medical Research, over 101 million Indians live with diabetes today. Experts argue that prevention must begin early, with reduced sugar intake, low-glycaemic diets, healthier fats, and compulsory physical activity in schools. Education about famine-linked intergenerational health risks could also help younger generations make informed choices.
India has avoided famine since Independence in 1947. The next challenge is ensuring that history’s biological scars do not continue to shape its future.