The Future of Wearables: From Step-Counting to Sensory-Aware Support
- David Ruttenberg
A decade ago, wearables mostly meant one thing: steps. Maybe a heart rate graph if you were feeling fancy. Now the space is exploding. MarketsandMarkets estimates the wearable technology market will grow from USD 84.53B in 2025 to USD 176.77B by 2030 (MarketsandMarkets, 2026). Fortune Business Insights similarly pegs 2025 at USD 86.78B with a projected rise to USD 191.58B by 2032 (Fortune Business Insights, 2025). That growth is real, and it is not slowing down.
But the bigger story is not "more sensors." It is better support. The next leap is moving from step-counting to sensory-aware tools that can help people navigate overload, avoid shutdowns, and recover faster, especially neurodivergent folks who have been treated like an edge case for far too long.
The step era: helpful, but shallow
Step counts were a great on-ramp. They gave people a simple metric and a gentle nudge. But if you have sensory sensitivities, your hardest moments are not always about movement. They are about the environment: noise, light flicker, crowd density, temperature, unpredictable touch, and the nonstop cognitive effort of "keeping it together."
That is why focusing on locomotion alone can feel like a mismatch. It is like measuring a person's wellbeing by counting how many times they blink.
Wearables already collect physiology that is indirectly tied to overwhelm (e.g., heart rate changes, stress proxies, sleep disruption). The missing piece is context (Ruttenberg, 2025).

Sensory-aware wearables: what "support" could look like
Sensory-aware support is not sci-fi. It is a design choice: measure sensory load, connect it to the body's response, and give the person options that reduce harm.
In practical terms, a sensory-aware wearable could:
- Estimate sensory load by combining physiology with environmental signals (sound level, abrupt changes in light, temperature swings).
- Detect early warning patterns unique to the user (e.g., rising arousal + increasing noise volatility) and prompt a micro-accommodation before the cliff edge (Ruttenberg, 2025).
- Recommend real-time accommodations like: "Switch to noise reduction," "Take a 3-minute vestibular break," "Move to a quieter route," or "Lower display brightness." These are small interventions that can prevent big consequences.
- Log what worked so accommodations get smarter over time, not more annoying.
This matters for neurodivergent people because "overwhelm" is often cumulative. The goal is not to label someone as stressed. The goal is to stop sensory load from stacking until it becomes shutdown, meltdown, panic, or exhaustion.
Why AI matters here (and why it can go wrong)
Sensory-aware wearables will likely use machine learning to personalize thresholds and forecast risk windows. But this is exactly where we need ethical guardrails.
If the model is trained mostly on neurotypical patterns, it can misread neurodivergent physiology and do the worst possible thing: blame the person, not the environment (Ruttenberg, 2025). The fix is not just "more data." It is better inclusion, transparent limitations, and user control over how predictions are made and used (NIST, 2023).
A non-negotiable: sensory wearables must avoid becoming compliance tools ("You are dysregulated, remove yourself"). The user should decide what support looks like. Full stop.

Form factors matter when sensory comfort is the point
A lot of neurodivergent people do not wear wearables because the device itself is a sensory irritant. That is not a personal failure; it is a design failure.
Rings, patches, and soft textiles can reduce tactile burden and make adoption realistic (Fortune Business Insights, 2025). The goal is a menu of options: different materials, weights, placements, vibration patterns, and alert styles, so the device can fit the person instead of forcing the person to fit the device.
Privacy: sensory data is not "just health data"
When you track sensory triggers, you are effectively mapping a person's vulnerability: where they struggle, when they shut down, what environments disable them. That is extremely sensitive information, and it should be treated as such.
A reasonable baseline for ethical sensory wearables includes: on-device processing when possible, minimal cloud dependence, clear retention policies, and genuine deletion (not "deactivation") (NIST, 2023). If a company cannot explain its data practices in plain English, it does not deserve your nervous system data.

What I want to see next (and what you can ask for now)
If you are shopping, building, or advising in this space, here are the questions that matter:
- Can I set my own sensory thresholds and change them easily?
- Does it support real accommodations, not just charts and scores?
- Is the system transparent about uncertainty and error? (NIST, 2023)
- Do I own my data, and can I delete it completely?
- Are neurodivergent users included as co-designers, not just "participants"? (World Health Organization, 2023)
Call to action: If you are a parent, educator, CXO, or agency leader exploring sensory-aware wearables (or trying to build them responsibly), reach out through my website. I am always up for a practical conversation about ethical AI, neurodiversity accommodation, and what "support" should actually mean.
References
Fortune Business Insights. (2025). Wearable technology market size, share & industry trends, 2034. https://www.fortunebusinessinsights.com/wearable-technology-market-106000
MarketsandMarkets. (2026). Wearable technology market worth $176.77 billion by 2030 [Press release]. https://www.marketsandmarkets.com/PressReleases/wearable-electronics.asp
National Institute of Standards and Technology. (2023). Artificial Intelligence Risk Management Framework (AI RMF 1.0). https://www.nist.gov/itl/ai-risk-management-framework
Ruttenberg, D. (2025). Sensory sensitivity and wearable technology: Toward sensory-aware support in everyday environments [Thesis context]. University of Pennsylvania.
World Health Organization. (2023). Autism. https://www.who.int/news-room/fact-sheets/detail/autism-spectrum-disorders
Dr David Ruttenberg PhD, FRSA, FIoHE, AFHEA, HSRF is a neuroscientist, autism advocate, Fulbright Specialist Awardee, and Senior Research Fellow dedicated to advancing ethical artificial intelligence, neurodiversity accommodation, and transparent science communication. With a background spanning music production to cutting-edge wearable technology, Dr Ruttenberg combines science and compassion to empower individuals and communities to thrive. Inspired daily by their brilliant autistic daughter and family, Dr Ruttenberg strives to break barriers and foster a more inclusive, understanding world.