The COVID-19 pandemic may be over, but our immune systems are still feeling the impact. After years of battling constant viral threats, from COVID-19 to seasonal flu and other infections, our body's defense system is exhausted. Many people continue to experience lingering inflammation, frequent illnesses, and slower recovery times. This extended state of immune stress has left us more vulnerable to chronic illness, including autoimmune diseases and even neurodegenerative diseases such as Parkinson's. So why is our immune system still in trouble? And how do we give it its power back? Understanding immune exhaustion is the first step toward rebuilding the body's natural immunity.
A weakened immune system makes people more susceptible to disease, mental illnesses, and even sleep disorders. Now, new research indicates that immune system depletion may play an important role in the onset of Parkinson's disease, a degenerative neurologic disorder that compromises movement and cognition.
A dysfunctional immune response is a leading cause of long-standing inflammation in the body, which has been found to contribute to a multitude of conditions, including cardiovascular disease, diabetes, depression, and neurodegenerative diseases such as Alzheimer's.
As people age, their immune system naturally becomes less effective. This deterioration, referred to as immune exhaustion, may be a key contributor to the onset and progression of Parkinson’s disease. Rebecca Wallings, a Parkinson’s Foundation Launch Award grant recipient and senior postdoctoral fellow at the University of Florida, believes that an accumulation of exhausted immune cells could be driving neurodegeneration in Parkinson’s patients.
Parkinson's disease is most commonly linked with the degeneration and loss of dopaminergic neurons, the nerve cells that produce dopamine, a neurotransmitter essential for movement. While researchers have long suspected inflammation is involved in this neurodegeneration, the mechanisms are not yet well understood.
Wallings' research focuses on immune cell exhaustion, a process in which aging immune cells lose the ability to mount effective immune responses. Her findings suggest that rather than simply dampening inflammation in Parkinson's patients, treatment should aim to rejuvenate the immune system so that it regains its functionality.
One of the major findings of Wallings' work is the function of mitochondrial impairment in immune cell exhaustion. Mitochondria are commonly called the powerhouses of cells, as they are vital for generating energy. As mitochondria age and become inefficient, immune cells fail to function well, potentially accelerating neurodegeneration in Parkinson's disease.
Wallings has found that mutations in the LRRK2 gene, a recognized genetic risk factor for Parkinson's disease, are linked with defective mitochondrial function and immune cell exhaustion. Her current work involves testing therapeutic approaches that restore mitochondrial function in immune cells, with the goal of strengthening the immune system and potentially preventing or treating Parkinson's disease.
For decades, the standard practice in treating Parkinson's has been to suppress brain inflammation. Yet Wallings' work indicates that instead of slowing down immune responses, restoring the immune system could be a more successful strategy. By addressing mitochondrial impairment and immune resilience, researchers can potentially reverse or slow down Parkinson's disease.
Wallings is now looking into how to rejuvenate immune cells by fixing mitochondria. She studies immune cells from patients with Parkinson's as well as from healthy subjects and performs experiments on animal models to determine if rejuvenation of the immune system could result in improved disease outcomes.
While there is no cure for Parkinson's disease, some lifestyle adjustments may decrease the chances of developing the illness. Since neurodegenerative diseases are associated with chronic inflammation and immune dysfunction, developing habits that enhance immune function might prove helpful.
Diet: There is evidence to suggest that eating in accordance with the Mediterranean or MIND diets, both high in antioxidants, healthy fats, and anti-inflammatory foods, can encourage brain wellness and reduce Parkinson's risk.
Avoiding Harmful Substances: Limiting alcohol and nicotine use can help maintain a robust immune system and keep inflammation in check.
Reducing Stress: Chronic stress weakens immune function, so methods such as meditation, exercise, and sufficient sleep can lead to improved overall well-being.
Living under constant psychosocial stress can significantly raise the risk of developing dementia and stroke, a JAMA Network study suggests.
Psychosocial stress is a type of stress related to our relationships with others, usually arising from feeling judged, excluded, or not good enough in others' eyes. It can also put a person in fight-or-flight mode, causing both mental and physical symptoms.
According to Chinese researchers, people who experience this form of stress in childhood as well as adulthood face more than a threefold higher risk of developing dementia compared with those in other groups.
Similarly, people who experienced stressful situations in adulthood had a significantly higher risk of stroke than their counterparts.
Based on these results, the study highlights that early identification of psychosocial stressors, combined with effective mental health support and depression prevention, may reduce the long-term burden of neurodegenerative and cerebrovascular disease.
The scientists defined adverse childhood experiences (ACEs) as traumatic exposures occurring during childhood, typically grouped into three categories: household dysfunction, social dysfunction, and family death or disability.
Traumatic exposures occurring during adulthood, by contrast, were defined as adverse adult experiences (AAEs), which include events such as the death of a child, lifetime discrimination, ever being confined to bed, ever being hospitalized for a month or longer, and ever leaving a job due to health conditions.
In analyzing the data collected from participants, the researchers also found that depression partly explained the links in all major relationships: it accounted for more than one-third of the connection between childhood adversity and dementia, and about one-fifth of the link between adulthood adversity and both dementia and stroke.
These findings suggest that long-term psychological stress may lead to brain and blood vessel diseases by causing ongoing emotional distress, unhealthy behaviours, and biological changes like inflammation and abnormal stress responses.
Psychosocial stress can trigger physiological responses like increased heart rate, cortisol secretion, and inflammation, significantly increasing risks for hypertension, cardiovascular disease and mental health disorders.
This kind of stress can affect men, women, and people of all genders differently, but many of the symptoms are the same across groups, spanning both mental and physical effects.
These symptoms can be acute or chronic, meaning for some people they go away, and for others, they persist over a long period of time. Meeting with a therapist is often recommended for those living with chronic stress.
Experts typically suggest coping mechanisms such as building support networks, using relaxation techniques, and, in cases of severe mental impact, seeking professional support to help deal with psychosocial stress.
Cardiovascular disease remains the leading cause of death among women, claiming more lives each year than all cancers combined. Yet, it continues to be misunderstood, underdiagnosed, and often dismissed as a “male” health problem. In Ohio and across the US, women are now using a striking visual message to confront this reality head-on, by quite literally dressing for the cause.
February, observed as American Heart Month, marks a renewed push to educate communities about heart disease, especially among women. Health advocates stress that while the spotlight is brightest this month, the risks and responsibilities extend far beyond the calendar.
“It’s our Super Bowl,” said Lauren Thomas, development director for the American Heart Association of North Central West Virginia and Ohio Valley. “It’s about awareness. Heart health is not a one-month conversation. It has to be a year-round priority where people actively put their hearts first.”
Across Ohio, women are wearing red, a color long associated with love but also with danger. The message is deliberate. Red symbolizes the urgency of cardiovascular disease, a condition responsible for one in every three deaths among women.
“When we wear red, we start conversations that many people avoid,” said Melissa Pratt, a heart disease survivor, reported WBBJ News. “One in three women die from cardiovascular disease. Wearing red encourages women to get checked, understand their risks, and take their health seriously.”
From landmarks lit in red to workplaces, neighborhoods, and social media feeds filled with crimson outfits, the visual campaign is meant to disrupt complacency. It poses an uncomfortable question: if heart disease is killing women at this scale, why is it still not treated like a crisis?
Coinciding with American Heart Month, the Ohio Valley Women of Impact campaign launches this Friday. Six local women, Crissy Clutter, Jan Pattishall-Krupinski, Lacy Ferguson, Shelbie Smith, Jennifer Hall-Fawson, and Laurie Conway, are leading fundraising and awareness efforts aimed at improving women’s heart health.
Their work focuses on education, early detection, and supporting research that better understands how heart disease presents differently in women. Symptoms in women can be subtle, ranging from fatigue and nausea to jaw or back pain, which often delays diagnosis and treatment.
To mark the start of the month, the American Heart Association hosted a National Wear Red Day breakfast on Friday morning at the LIFT Wellness Center in Jackson Walk Plaza. The event brought together survivors, advocates, and health professionals to reinforce a simple but powerful message. Awareness must lead to action.
Health experts continue to urge women to prioritize regular checkups, manage blood pressure, cholesterol, and stress, and recognize early warning signs. Lifestyle changes, timely screenings, and informed conversations can significantly reduce risk.
The women of Ohio are not wearing red for fashion. They are wearing it as a warning, a remembrance, and a call to action. In dressing themselves in the color of urgency, they are confronting a disease that has taken too many lives quietly. This February, their message is clear. Heart disease is not inevitable, but ignoring it can be deadly.
If your roots trace back to the Indian subcontinent, your risk of developing type 2 diabetes is significantly higher than that of Europeans. Research shows that Indians, Pakistanis, and Bangladeshis are up to six times more likely to develop the condition, often at a younger age and at lower body weights. For years, carbohydrate-heavy diets were blamed. But growing scientific evidence points to a far deeper and darker cause: repeated famines during British colonial rule that may have altered metabolic resilience across generations.
The idea that starvation can leave a genetic imprint may sound extreme, but science supports it. Prolonged nutrient deprivation can permanently affect how the body stores fat, processes glucose, and responds to food abundance later in life. Even a single famine can raise the risk of metabolic disorders such as diabetes in future generations.
This understanding forms the basis of the “thrifty genotype hypothesis,” a concept widely discussed in evolutionary biology.
The thrifty genotype hypothesis suggests that populations exposed to repeated famines develop genetic traits that help conserve energy. These traits are lifesaving during scarcity but become harmful in times of plenty, increasing the risk of obesity and diabetes.
Economic historian Mike Davis documents that India experienced 31 major famines during 190 years of British rule between 1757 and 1947, roughly one every six years. By contrast, only 17 famines occurred in the previous 2,000 years. Davis estimates that 29 million people died in the Victorian era alone. Economic anthropologist Jason Hickel places the death toll from colonial policies between 1880 and 1920 at around 100 million.
A study published in Frontiers in Public Health, titled "The Elevated Susceptibility to Diabetes in India: An Evolutionary Perspective," argues that these famines reshaped metabolic traits. The researchers note that Indians tend to have a higher fat-to-lean mass ratio, lower average birth weight, and a reduced ability to clear glucose. This combination increases metabolic stress and lowers resilience, which may explain the earlier onset of diabetes compared to Europeans.
Colonial-era famines also affected physical growth. Studies show that average Indian height declined by about 1.8 cm per century during British rule. Historical accounts describe ancient Indians as tall and robust, with even Greek chroniclers noting their stature during Alexander's invasion. By the 1960s, however, Indians were about 15 cm shorter than their Mesolithic ancestors.
While the earliest declines in stature predate British rule, widespread impoverishment under colonial rule sharply accelerated the trend. Only in the past 50 years has average height begun to recover.
Mike Davis argues that colonial famines were driven not by food shortages but by policy. Grain continued to be exported even as millions starved. During the 1876 famine, Viceroy Robert Bulwer-Lytton refused to halt exports, and relief work was deliberately discouraged. Davis describes these deaths as the result of state policy, not natural disaster.
Medical journal The Lancet estimated that 19 million Indians died during famines in the 1890s alone.
India now faces the consequences. According to the Indian Council of Medical Research, over 101 million Indians live with diabetes today. Experts argue that prevention must begin early, with reduced sugar intake, low-glycaemic diets, healthier fats, and compulsory physical activity in schools. Education about famine-linked intergenerational health risks could also help younger generations make informed choices.
India has avoided famine since Independence in 1947. The next challenge is ensuring that history’s biological scars do not continue to shape its future.