I was interviewing a 72-year-old retired accountant who had unplugged his smart glucose monitor. He explained that he “didn’t know who was looking” at his blood sugar data.
This wasn’t a man unfamiliar with technology—he had successfully used computers for decades in his career. He was of sound mind. But when it came to his health device, he couldn’t find clear answers about where his data went, who could access it, or how to control it. The instructions were dense, and the privacy settings were buried in multiple menus. So, he made what seemed like the safest choice: he unplugged it. That decision meant giving up real-time glucose monitoring that his doctor had recommended.
The healthcare IoT (Internet of Things) market is projected to exceed $289 billion by 2028, with older adults representing a major share of users. These devices, from fall detectors and medication reminders to glucose monitors and heart-rate trackers, enable independent living. Yet there’s a widening gap between deployment and adoption. According to an AARP survey, 34% of adults over 50 list privacy as a primary barrier to adopting health technology. That represents millions of people who could benefit from monitoring tools but avoid them because they don’t feel safe.
In my study at the University of Denver’s Ritchie School of Engineering and Computer Science, I surveyed 22 older adults and conducted in-depth interviews with nine participants who use health-monitoring devices. The findings revealed a critical engineering failure: 82% understood security concepts like two-factor authentication and encryption, yet only 14% felt confident managing their privacy when using these devices. I also evaluated 28 healthcare apps designed for older adults and found that 79% lacked basic breach-notification protocols.
One participant told me, “I know there’s encryption, but I don’t know if it’s really enough to protect my data.” Another said, “The thought of my health data getting into the wrong hands is very concerning. I’m particularly worried about identity theft or my information being used for scams.”
This is not a user knowledge problem; it’s an engineering problem. We’ve built systems that demand technical expertise to operate safely, then handed them to people managing complex health needs while navigating age-related changes in vision, cognition, and dexterity.
Measuring the Gap
To quantify the issues with privacy-setting transparency, I developed the Privacy Risk Assessment Framework (PRAF), a tool that scores healthcare apps across five critical domains.
First, the regulatory compliance domain evaluates whether apps explicitly state adherence to the Health Insurance Portability and Accountability Act (HIPAA), the General Data Protection Regulation (GDPR), or other data protection standards. Just claiming to be compliant is not enough—they must provide verifiable evidence.
Second, the security-mechanisms domain assesses the implementation of encryption, access controls, and, most critically, breach-notification protocols that alert users when their data may have been compromised. Third, the usability and accessibility domain examines whether privacy interfaces are readable and navigable for people with age-related visual or cognitive changes. Fourth, the data-minimization domain evaluates whether apps collect only necessary information and clearly specify retention periods. Finally, the third-party sharing transparency domain measures whether users can easily understand who has access to their data and why.
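To make the scoring concrete, here is a minimal Python sketch of how a five-domain assessment like PRAF might be represented. The domain names come from the framework above; the 0-to-1 scale, the equal weighting, and the example values are illustrative assumptions, not the published rubric.

```python
from dataclasses import dataclass, fields

@dataclass
class PRAFScore:
    """One app's scores across the five PRAF domains (assumed 0.0-1.0)."""
    regulatory_compliance: float      # verifiable HIPAA/GDPR evidence
    security_mechanisms: float        # encryption, access control, breach alerts
    usability_accessibility: float    # readable, navigable privacy interfaces
    data_minimization: float          # only necessary data; clear retention
    third_party_transparency: float   # who has access to data, and why

    def composite(self) -> float:
        """Unweighted mean of the five domains (a simplifying assumption)."""
        values = [getattr(self, f.name) for f in fields(self)]
        return sum(values) / len(values)

# Hypothetical app: solid encryption, but no breach notification and a
# dense privacy policy, so usability and transparency scores suffer.
app = PRAFScore(0.4, 0.5, 0.2, 0.3, 0.25)
print(f"PRAF composite: {app.composite():.2f}")  # -> PRAF composite: 0.33
```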
When I applied PRAF to 28 healthcare apps commonly used by older adults, the results revealed systemic gaps. Only 25% explicitly stated HIPAA compliance, and just 18% mentioned GDPR compliance. Most alarmingly, 79% lacked breach-notification protocols, meaning users may never find out if their data is compromised. The average privacy policy was written at a 12th-grade reading level, even though research shows the average older adult reads at an 8th-grade level. Not a single app included accessibility accommodations in its privacy interface.
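The study doesn’t specify which readability metric produced those grade levels, but the Flesch-Kincaid Grade Level formula is a common choice. The sketch below uses invented counts for a dense, legalistic policy to show how such prose lands at grade 12.

```python
def fk_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid Grade Level: longer sentences and more syllables
    per word push the required reading grade higher."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Invented counts for a legalistic passage: 20-word sentences and
# polysyllabic terms like "processing purposes" land near grade 12.
print(round(fk_grade(words=200, sentences=10, syllables=335), 1))  # -> 12.0
```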
Consider what happens when an older adult opens a typical health app. They face a multi-page privacy policy full of legal terminology about “data controllers” and “processing purposes,” followed by settings scattered across multiple menus. One participant told me, “The instructions are hard to understand, the print is too small, and it’s overwhelming.” Another explained, “I don’t feel adequately informed about how my data is collected, stored, and shared. It seems like most of these companies are after profit, and they don’t make it easy for users to understand what’s happening with their data.”
When protection requires a manual people can’t read, two outcomes follow: they either skip security altogether, leaving themselves vulnerable, or abandon the technology entirely, forfeiting its health benefits.
Engineering for Privacy
We need to treat trust as an engineering specification, not a marketing promise. Based on my research findings and the specific barriers older adults face, three approaches address the root causes of distrust.
The first approach is adaptive security defaults. Rather than requiring users to navigate complex configuration menus, devices should ship with pre-configured best practices that automatically adjust to data sensitivity and device type. A fall detection system doesn’t need the same settings as a continuous glucose monitor. This approach draws from the principle of “security by default” in systems engineering.
Biometric or voice authentication can replace passwords that are easily forgotten or written down. The key is removing the burden of expertise while maintaining strong protection. As one participant put it: “Simplified security settings, better educational resources, and more intuitive user interfaces will be beneficial.”
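A minimal sketch of what these defaults could look like in practice follows. The device categories echo the examples above, while every specific setting, key name, and value is an assumption for illustration.

```python
# Pre-configured "security by default" profiles keyed by device type, so
# users never have to assemble their own settings. All values are assumed.
SECURITY_DEFAULTS = {
    "fall_detector": {
        "encryption_at_rest": True,
        "auth_method": "voice",        # nothing to remember or write down
        "data_retention_days": 30,     # short-lived event data
        "share_alerts_with_caregiver": True,
    },
    "glucose_monitor": {
        "encryption_at_rest": True,
        "auth_method": "biometric",
        "data_retention_days": 365,    # clinicians need long-term trends
        "share_alerts_with_caregiver": False,  # continuous readings are opt-in
    },
}

def provision(device_type: str) -> dict:
    """Apply the vetted defaults for a device; fail closed on unknown types."""
    if device_type not in SECURITY_DEFAULTS:
        raise ValueError(f"No vetted security profile for {device_type!r}")
    return dict(SECURITY_DEFAULTS[device_type])

print(provision("fall_detector")["auth_method"])  # -> voice
```

Failing closed matters here: a device type without a vetted profile should never ship with improvised settings.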
The second approach is real-time transparency. Users shouldn’t have to dig through settings to see where their data goes. Instead, notification systems should show each data access or sharing event in plain language. For example: “Your doctor accessed your heart-rate data at 2 p.m. to review for your upcoming appointment.” A single dashboard should summarize who has access and why.
This addresses a concern that came up repeatedly in my interviews: users want to know who is seeing their data and why. The engineering challenge here isn’t technical complexity; it’s designing interfaces that convey technical realities in language anyone can understand. Such systems already exist in other domains; banking apps, for instance, send immediate notifications for every transaction. The same principle applies to health data, where the stakes are arguably higher.
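As a sketch, translating a structured access-log event into that kind of sentence is straightforward. The event fields, role labels, and template wording below are all assumptions, not a specification from the study.

```python
from datetime import datetime

# Hypothetical mapping from accessor roles to plain-language subjects.
ROLE_LABELS = {"physician": "Your doctor", "caregiver": "Your caregiver"}

def plain_language_notice(event: dict) -> str:
    """Turn a structured data-access event into a sentence anyone can read."""
    who = ROLE_LABELS.get(event["accessor_role"], "Someone")
    hour = event["timestamp"].hour
    when = f"{hour % 12 or 12} {'p.m.' if hour >= 12 else 'a.m.'}"
    return (f"{who} accessed your {event['data_type']} data at {when} "
            f"to {event['purpose']}.")

event = {
    "accessor_role": "physician",
    "data_type": "heart-rate",
    "timestamp": datetime(2025, 3, 4, 14, 0),
    "purpose": "review it for your upcoming appointment",
}
print(plain_language_notice(event))
# -> Your doctor accessed your heart-rate data at 2 p.m. to review it
#    for your upcoming appointment.
```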
The third approach is invisible security updates. Manual patching creates vulnerability windows. Automatic, seamless updates should be standard for any device handling health data, paired with a simple status indicator so users can confirm protection at a glance. As one participant said, “The biggest issue that we as seniors have is the fact that we don’t remember our passwords… The new technology is surpassing the ability of seniors to keep up with it.” Automating updates removes a significant source of anxiety and risk.
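A rough sketch of the pattern, assuming a background thread and a placeholder for the update mechanism itself; a production device would verify cryptographically signed firmware and roll back failed updates.

```python
import threading
import time

class AutoUpdater:
    """Invisible updates plus an at-a-glance status indicator (a sketch;
    the class, interval, and update steps are assumptions)."""

    def __init__(self, check_interval_hours: float = 24.0):
        self.interval_s = check_interval_hours * 3600
        self.protected = True  # drives a simple green/red status light

    def _check_and_apply(self) -> None:
        try:
            # Placeholder: download signed update, verify signature,
            # apply atomically. No user action required at any step.
            self.protected = True
        except Exception:
            self.protected = False  # show red; never ask users to patch

    def start(self) -> None:
        def loop():
            while True:
                self._check_and_apply()
                time.sleep(self.interval_s)
        threading.Thread(target=loop, daemon=True).start()

updater = AutoUpdater()
updater.start()
print("Status light:", "green" if updater.protected else "red")
```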
What’s at Stake
We can keep building healthcare IoT the way we have: fast, feature-rich, and fundamentally untrustworthy. Or, we can engineer systems that are transparent, secure, and usable by design. Trust isn’t something you market through slogans or legal disclaimers. It’s something you engineer, line by line, into the code itself. For older adults relying on technology to maintain independence, that kind of engineering matters more than any new feature we could add. Every unplugged glucose monitor, every abandoned fall detector, every health app deleted out of confusion or fear represents not just a lost sale but a missed opportunity to support someone’s health and autonomy.
The challenge of privacy in healthcare IoT goes beyond fixing existing systems; it requires reimagining how we communicate privacy itself. My ongoing research builds on these findings through an AI-driven Data Helper, a system that uses large language models to translate dense legal privacy policies into short, accurate, and accessible summaries for older adults. By making data practices transparent and comprehension measurable, this approach aims to turn compliance into understanding and trust, advancing the next generation of trustworthy digital health systems.
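As a closing illustration, the core of such a helper can be stated in a few lines. The prompt wording and the `llm` callable below are hypothetical stand-ins, not the actual Data Helper implementation, which remains ongoing work.

```python
from typing import Callable

def summarize_policy(policy_text: str, llm: Callable[[str], str]) -> str:
    """Ask a language model to rewrite a dense policy in plain language
    (sketch only; the real system also measures summary comprehension)."""
    prompt = (
        "Rewrite this privacy policy for an older adult at an 8th-grade "
        "reading level. In five short bullets, say: what data is collected, "
        "who can see it, how long it is kept, how to opt out, and how the "
        "user will be notified of a breach.\n\n"
    )
    return llm(prompt + policy_text)
```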