
Looking For Ethical Wearables? Here Are 10 Things You Should Know



Wearables are everywhere. They count steps, score sleep, flag stress, promise peace. But I’m going to lead with the part that matters most to me: I’m a parent first.


Our daughter, Phoebe (23), is autistic, has ADHD, and lives with epilepsy. I’ve been in the ER with her after seizures. I’ve watched her do the long, exhausting work of recovering from two craniotomies. In those moments, you don’t want tech that simply tracks; you want tech that helps. Not a dashboard that dazzles, but a device that delivers.


That’s why I care about “ethical wearables.” Not as a trend, not as a market, not as a buzzword, but as a real-world family question: will this tool reduce risk, reduce burden, reduce panic? Or will it just collect more data, sell more data, create more noise? (Mittelstadt, 2017; UK Parliamentary Office of Science and Technology, 2023)


If you’re a parent reading this, consider this advice from one parent to another. If you’re a CEO reading this, consider it a parent-scientist’s product brief: build for care, not capture; support the person, not the platform; measure to help, not to surveil (Ruttenberg, 2025a).


Your 10-Point Ethical Wearables Checklist

1. Check Who Owns Your Data


This is the big one. Parent-to-parent, here’s the question I ask before anything touches our daughter’s wrist: who owns the story your data tells?


Ethical wearables give you full ownership and control of your data (Lanzing, 2016). That means you can download it, delete it, and decide who sees it. If the company treats your child’s health metrics like their property, that’s not a “feature,” that’s a flashing red light.


Quick test: Can you export your data in a usable format, without paying, begging, or jumping through hoops? If not, walk away.


[Image: A hand gently holding a glowing sphere with health data symbols, representing user control and ethical wearables data privacy.]

2. Understand What’s Being Collected


When your child has epilepsy (or any complex needs), “just data” becomes a map of their life: heart rate, sleep, location, movement patterns, even inferred mood. The problem is not that data exists, it’s that it expands, then it escapes.


Look for data minimization: collect what’s necessary, nothing extra (Mittelstadt, 2017). If a sleep tracker wants contacts, photos, or always-on mic access, something’s wrong.


Quick test: Can you list (in plain English) what it collects in under 60 seconds? If you can’t, the company isn’t being clear.

3. Demand Real Transparency


Transparency isn’t a buzzword, it’s a bedside manner for technology. If our daughter ends up in an ER, I can explain her meds, her history, her risks. A wearable company should be able to explain its data practices with the same clarity.


Ethical companies tell you exactly what they collect, how they use it, who they share it with, and why (Ruttenberg, 2025a). This matters most for mental health signals like anxiety, fatigue, and sensory sensitivity: the most sensitive data deserves the most straightforward handling.


Quick test: Is there a clear, plain-language explanation of data practices that a busy parent can understand at 11:00 p.m.? No jargon, no legalese, no “trust us.”


4. Look for Strong Security Without Obscurity


Your wearable collects intimate details about body and brain. For families like mine, that can include seizure patterns, sleep disruption, medication routines. That data needs protection: real protection, not vibes.


Look for end-to-end encryption, secure storage, regular security audits (Mittelstadt, 2017). And be wary of security-through-silence: if they won’t explain how they protect it, assume they don’t.


Quick test: Can the company clearly state what happens if there’s a breach, how you’ll be notified, and what support you’ll get? If it’s vague, it’s a risk.


[Image: A smartwatch on a wrist with a glowing shield above it, symbolizing strong security and data protection for ethical wearables.]

5. Verify Consent Is Actually Informed


Here’s a harsh truth: most “consent” in tech is a checkbox costume. Click, swipe, done, and suddenly you’re “agreeing” to things you never had time to understand.


Ethical wearables make consent meaningful: they explain, they offer options, they let you change your mind later (UK Parliamentary Office of Science and Technology, 2023). Consent should empower, not entrap. And if the user is a young adult, disabled adult, or someone under stress, the consent experience needs to be even more humane.


Quick test: Can you opt out of specific data collection without losing the core features you actually bought it for?

6. Assess the Sensory Experience


This one is personal for our family. If you live with autism, ADHD, anxiety, chronic fatigue, sensory sensitivity, the device itself matters as much as the software. A buzzing wrist can be “helpful” for one person and pure overload for another.


And sensory needs are not one-size-fits-all. In practice, you’ll often see three distinct profiles: HYPER (over-sensitive), HYPO (under-sensitive), and SENSORY SEEKING. HYPO isn’t the same as sensory seeking, though the two often overlap; under-sensitive children frequently seek extra input, which is why you’ll sometimes see the shorthand “The Under-Sensitive Child (Sensory Seekers).” Design has to respect all three profiles, not flatten them into one.


In my work on multi-sensory assistive wearable technology, I’ve seen how design choices can reduce or increase sensory burden (Ruttenberg, 2025b). The goal is not more signals, it’s better signals: adjustable haptics, customizable alerts, comfortable materials, predictable patterns.


Quick test: Can you control intensity, frequency, and timing of notifications? Does it feel comfortable after an hour, a day, a week?


7. Question the Algorithm’s Intentions


Here’s where it gets tricky. A lot of wearables don’t just record, they interpret. They label. They nudge. And if the algorithm is wrong, it can turn support into stress.


Algorithms can be biased, and bias in mental health tech can cause harm (Ruttenberg, 2020). Ethical wearables explain what the model is doing, what it is not doing, and how it was validated across different people. As a parent, I don’t need “AI magic.” I need accountable logic.


Quick test: Can the company explain, in plain language, why the device produced a recommendation, and what evidence supports it?


[Image: Profile of a human head with a glowing brain network, illustrating ethical wearable algorithms in decision-making and mental health.]

8. Consider the Environmental Footprint


Ethics isn’t just about data, it’s about the whole lifecycle. Parents know this: “cheap and fast” usually becomes “broken and replaced.” That costs money, time, stress, waste.


Look for sustainable sourcing, responsible manufacturing, repairability (Lanzing, 2016). A truly ethical wearable protects privacy and reduces throwaway churn.


Quick test: Is there a real recycling or trade-in program, and can you replace parts (band, battery) without replacing the whole device?

9. Watch for Workplace Monitoring Red Flags


If you’re a CEO reading this: please hear me. “Wellness” wearables can become workplace surveillance in a hoodie. If you’re a parent: this matters for your kid’s first job, internship, or supported employment placement.


Ethical wearables empower individuals rather than enabling discrimination or inappropriate oversight (Mittelstadt, 2017). Stress scores and anxiety signals are health information, not productivity metrics.


Quick test: If an employer provides the device, who has access to the data, for what purpose, for how long? Get it in writing. If it’s not written, it’s not real.


10. Trust the Trustworthy


Finally, do your homework on the company itself. Parents do this instinctively: we check reviews, ask other families, read between the lines. Do the same with wearables.


The market is flooded with devices designed to maximize extraction, not maximize wellbeing. But there are companies doing it right: companies that understand that innovation without consideration is just exploitation with better marketing.


Quick test: Look for third-party certifications, independent reviews, and specific, verifiable commitments (not just glossy values pages).


The Bottom Line

Finding ethical wearables isn’t impossible, but it does take a little detective work. The good news is that once you know what to look for, the red flags show up fast: vague policies, forced sharing, dark patterns, “trust us” language.


As Phoebe’s parent, I don’t want a device that just watches her. I want a device that helps her: supports recovery, reduces overload, respects her autonomy. Tracking isn’t caring. Measuring isn’t meaning. Data isn’t dignity, unless the design makes it so (Ruttenberg, 2025a; UK Parliamentary Office of Science and Technology, 2023).


Here’s the antithesis I come back to: care over capture, support over surveillance. And here’s the tricolon I wish every product team would tape to the wall: clear consent, clean data, real help.


So before you buy your next wearable, run it through this checklist. Ask the hard questions. Demand real answers. Protect the person, not just the platform. And if you’re a CEO building these products, build like your own family will wear it.


Want help designing ethical wearables that families can actually trust? Visit davidruttenberg.com to explore our Ethical AI and human-centered wearable consulting.

About the Author

Dr David Ruttenberg PhD, FRSA, FIoHE, AFHEA, HSRF is a neuroscientist, autism advocate, Fulbright Specialist Awardee, and Senior Research Fellow dedicated to advancing ethical artificial intelligence, neurodiversity accommodation, and transparent science communication. With a background spanning music production to cutting-edge wearable technology, Dr Ruttenberg combines science and compassion to empower individuals and communities to thrive. Inspired daily by their brilliant autistic daughter and family, Dr Ruttenberg strives to break barriers and foster a more inclusive, understanding world.

References

Lanzing, M. (2016). The transparent self: A normative investigation of changing selves and relationships in the age of the quantified self. Medicine, Health Care and Philosophy, 19(4), 545–556.

Mittelstadt, B. (2017). Ethics of the health-related internet of things: A narrative review. Ethics and Information Technology, 19(3), 157–175.

Ruttenberg, D. (2020). SensorAble: Sensory sensitivity, sound impairment, and machine learning approaches to assistive wearable technology. UCL Research Paper Series.

Ruttenberg, D. (2025a). Towards technologically enhanced mitigation of autistic adults’ sensory sensitivity experiences and attentional, and mental wellbeing disturbances [Doctoral thesis, University College London]. https://discovery.ucl.ac.uk/id/eprint/10210135/

Ruttenberg, D. (2025b). Multi-sensory assistive wearable technology [Patent application].

UK Parliamentary Office of Science and Technology. (2023). Invisible disabilities (POSTnote 689). UK Parliament.
