
Credit: Canva
With the busy schedules people have these days, fitness tracking apps have helped many people track their exercise, the calories they burn and the time taken to do so. These apps also remind us to drink water, stay within our calorie limits for the day, and keep exercising. However, sometimes these devices can have an adverse impact on our health. Instead of making us feel better, they can make us feel anxious or stressed. This generally happens when we become too focused on the numbers and start to worry if they aren't perfect. It's important to understand that while these trackers can be helpful, they shouldn't control our mood or make us feel bad about ourselves.
A recent study published in the Journal of the American Heart Association also found that people with heart problems who used trackers were more worried about their symptoms. With so many people using these devices, it's becoming a common problem: 20% of wearable users felt anxious and always contacted their doctors when they received an irregular rhythm notification. The researchers also found that these devices were linked to increased monitoring and worrying, and higher use of AFib (atrial fibrillation)-specific health care. These findings suggest that while wearables can help patients monitor their AFib, they may also lead to increased anxiety and health care use. More research is needed to fully understand the effects of these devices on patients, doctors, and the health care system.
We can become too focused on achieving certain goals, like hitting a set number of steps or a perfect sleep score. This can lead to stress and anxiety when we don't meet those goals, instead of simply using the information to support a healthy lifestyle.
If your happiness or sadness depends on the numbers your fitness tracker shows, it's a sign you might need a break. Getting a low score can make you feel like you've failed, even if you've been making healthy choices. It's normal to feel a little disappointed when you don't reach a goal, but your overall mood shouldn't be affected. If you notice that your mood changes a lot based on your tracker's data, it might be time to step away from it for a while.
If you find yourself constantly searching online to understand what your tracker's data means, it's a warning sign. Trying to interpret every number can lead to more worry and confusion. You might start to believe you have health problems that don't exist. It's important to remember that these trackers are tools, not medical professionals. Spending too much time trying to decode the data can increase your anxiety instead of helping you.
If you feel nervous or anxious when you forget your tracker or it's not working, you might be too dependent on it. You should be able to feel comfortable and relaxed without constant data. If you feel panicky when you can't see your numbers, it's a sign you need to learn to be okay without them. You should be able to trust your body's signals instead of relying only on the tracker.
If you rely only on your tracker and ignore what your body is telling you, it's a problem. Your body's signals are important. For example, you might feel well-rested, but if your tracker says your sleep quality is low, you might start to doubt yourself. It's important to listen to your body and not just the numbers. Your body knows when it's tired, hungry, or needs rest. The tracker is a tool to support your health, not replace your body’s signals.
Credit: Canva
Living under constant psychosocial stress can significantly raise the risk of developing dementia and stroke, a JAMA Network study suggests.
Psychosocial stress is a type of stress related to our relationships with others, usually arising from feeling judged, excluded, or not good enough in others' eyes. It can also put a person in fight-or-flight mode, causing both mental and physical symptoms.
According to Chinese researchers, people who experience this form of stress in childhood as well as adulthood face more than a threefold higher risk of developing dementia compared with those in other groups.
Similarly, people who experienced stressful situations in adulthood had a significantly higher risk of stroke than their counterparts.
Based on these results, the study highlights that early identification of psychosocial stressors, combined with effective mental health support and depression prevention, may reduce the long-term burden of neurodegenerative and cerebrovascular disease.
The scientists defined adverse childhood experiences (ACEs) as traumatic exposures occurring during childhood, typically grouped into 3 categories: household dysfunction, social dysfunction and family death or disability.
On the other hand, traumatic exposures occurring during adulthood were defined as adverse adult experiences (AAEs), which include events such as the death of a child, lifetime discrimination, ever being confined to bed, ever being hospitalized for a month or longer and ever leaving a job due to health conditions.
While analyzing the data collected from the participants, the researchers also found that depression partly explained these links: it accounted for more than one-third of the connection between childhood adversity and dementia, and about one-fifth of the link between adulthood adversity and both dementia and stroke.
These findings suggest that long-term psychological stress may lead to brain and blood vessel diseases by causing ongoing emotional distress, unhealthy behaviours, and biological changes like inflammation and abnormal stress responses.
Psychosocial stress can trigger physiological responses like increased heart rate, cortisol secretion, and inflammation, significantly increasing risks for hypertension, cardiovascular disease and mental health disorders.
This kind of stress can affect men, women, and people of all genders differently, but many of the symptoms are still the same. These symptoms can be acute or chronic, meaning for some people they go away, while for others they persist over a long period of time. Meeting with a therapist is often recommended for those living with chronic stress.
Experts typically suggest coping mechanisms such as building support networks, using relaxation techniques, and, in cases of severe mental impact, seeking professional support to help deal with psychosocial stress.
Credits: WBBJTV News
Cardiovascular disease remains the leading cause of death among women, claiming more lives each year than all cancers combined. Yet, it continues to be misunderstood, underdiagnosed, and often dismissed as a “male” health problem. In Ohio and across the US, women are now using a striking visual message to confront this reality head-on, by quite literally dressing for the cause.
February, observed as American Heart Month, marks a renewed push to educate communities about heart disease, especially among women. Health advocates stress that while the spotlight is brightest this month, the risks and responsibilities extend far beyond the calendar.
“It’s our Super Bowl,” said Lauren Thomas, development director for the American Heart Association of North Central West Virginia and Ohio Valley. “It’s about awareness. Heart health is not a one-month conversation. It has to be a year-round priority where people actively put their hearts first.”
Across Ohio, women are wearing red, a color long associated with love but also with danger. The message is deliberate. Red symbolizes the urgency of cardiovascular disease, a condition responsible for one in every three deaths among women.
“When we wear red, we start conversations that many people avoid,” said Melissa Pratt, a heart disease survivor, reported WBBJ News. “One in three women die from cardiovascular disease. Wearing red encourages women to get checked, understand their risks, and take their health seriously.”
From landmarks lit in red to workplaces, neighborhoods, and social media feeds filled with crimson outfits, the visual campaign is meant to disrupt complacency. It asks an uncomfortable question: if heart disease is killing women at this scale, why is it still not treated like a crisis?
Coinciding with American Heart Month, the Ohio Valley Women of Impact campaign launches this Friday. Six local women, Crissy Clutter, Jan Pattishall-Krupinski, Lacy Ferguson, Shelbie Smith, Jennifer Hall-Fawson, and Laurie Conway, are leading fundraising and awareness efforts aimed at improving women’s heart health.
Their work focuses on education, early detection, and supporting research that better understands how heart disease presents differently in women. Symptoms in women can be subtle, ranging from fatigue and nausea to jaw or back pain, which often delays diagnosis and treatment.
To mark the start of the month, the American Heart Association hosted a National Wear Red Day breakfast on Friday morning at the LIFT Wellness Center in Jackson Walk Plaza. The event brought together survivors, advocates, and health professionals to reinforce a simple but powerful message. Awareness must lead to action.
Health experts continue to urge women to prioritize regular checkups, manage blood pressure, cholesterol, and stress, and recognize early warning signs. Lifestyle changes, timely screenings, and informed conversations can significantly reduce risk.
The women of Ohio are not wearing red for fashion. They are wearing it as a warning, a remembrance, and a call to action. In dressing themselves in the color of urgency, they are confronting a disease that has taken too many lives quietly. This February, their message is clear. Heart disease is not inevitable, but ignoring it can be deadly.
Credits: South Magazine
If your roots trace back to the Indian subcontinent, your risk of developing type 2 diabetes is significantly higher than that of Europeans. Research shows that Indians, Pakistanis, and Bangladeshis are up to six times more likely to develop the condition, often at a younger age and at lower body weights. For years, carbohydrate-heavy diets were blamed. But growing scientific evidence points to a far deeper and darker cause: repeated famines during British colonial rule that may have altered metabolic resilience across generations.
The idea that starvation can leave a genetic imprint may sound extreme, but science supports it. Prolonged nutrient deprivation can permanently affect how the body stores fat, processes glucose, and responds to food abundance later in life. Even a single famine can raise the risk of metabolic disorders such as diabetes in future generations.
This understanding forms the basis of the “thrifty genotype hypothesis,” a concept widely discussed in evolutionary biology.
The thrifty genotype hypothesis suggests that populations exposed to repeated famines develop genetic traits that help conserve energy. These traits are lifesaving during scarcity but become harmful in times of plenty, increasing the risk of obesity and diabetes.
Economic historian Mike Davis documents that India experienced 31 major famines during 190 years of British rule between 1757 and 1947, roughly one every six years. By contrast, only 17 famines occurred in the previous 2,000 years. Davis estimates that 29 million people died in the Victorian era alone. Economic anthropologist Jason Hickel places the death toll from colonial policies between 1880 and 1920 at around 100 million.
A study published in Frontiers in Public Health titled The Elevated Susceptibility to Diabetes in India: An Evolutionary Perspective argues that these famines reshaped metabolic traits. The researchers note that Indians tend to have a higher fat-to-lean mass ratio, lower average birth weight, and reduced ability to clear glucose. This combination increases metabolic stress and lowers resilience, explaining earlier onset of diabetes compared to Europeans.
Colonial-era famines also affected physical growth. Studies show that average Indian height declined by about 1.8 cm per century during British rule. Historical accounts describe ancient Indians as tall and robust, with even Greek chroniclers noting their stature during Alexander's invasion. By the 1960s, however, Indians were about 15 cm shorter than their Mesolithic ancestors.
While the British did not cause the earlier decline, widespread impoverishment under colonial rule sharply accelerated the trend. Only in the past 50 years has average height begun to recover.
Mike Davis argues that colonial famines were driven not by food shortages but by policy. Grain continued to be exported even as millions starved. During the 1876 famine, Viceroy Robert Bulwer-Lytton refused to halt exports, and relief work was deliberately discouraged. Davis describes these deaths as the result of state policy, not natural disaster.
Medical journal The Lancet estimated that 19 million Indians died during famines in the 1890s alone.
India now faces the consequences. According to the Indian Council of Medical Research, over 101 million Indians live with diabetes today. Experts argue that prevention must begin early, with reduced sugar intake, low-glycaemic diets, healthier fats, and compulsory physical activity in schools. Education about famine-linked intergenerational health risks could also help younger generations make informed choices.
India has avoided famine since Independence in 1947. The next challenge is ensuring that history’s biological scars do not continue to shape its future.