polymer-based fibers, such as polymethyl methacrylate, into fabric structures to enable uniform side illumination.10 Light is transmitted through the fiber and emitted laterally through engineered surface modifications, allowing flexible, efficient, and interactive lighting within textiles.53 Tan et al.54 designed a gesture-controlled illuminated textile utilizing computer vision to recognize mid-air hand gestures, triggering corresponding color changes in the fabric. Prior research has primarily examined woven illuminative textiles designed for creating engaging sensory environments. However, recent innovations have begun exploring knitted textiles. Lam et al.55 investigated various knitted structures to optimize the illuminative effects of POFs, demonstrating the feasibility of integrating such technologies into wearable and interior applications. Knitted structures are favored for their superior flexibility and user comfort, which enhance interaction potential and application versatility, particularly in healthcare environments.55,56
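
For illustration, the gesture-to-illumination coupling described by Tan et al.,54 in which a recognized mid-air gesture triggers a color change in the fabric, can be sketched as a simple control mapping. The gesture labels, color palette, and set_fabric_color() driver below are hypothetical placeholders rather than the published implementation:

# Minimal sketch: mapping recognized mid-air gestures to fabric illumination colors.
# Gesture labels, colors, and the driver stub are illustrative assumptions.

GESTURE_TO_RGB = {
    "swipe_left":  (255, 0, 0),      # red
    "swipe_right": (0, 0, 255),      # blue
    "open_palm":   (255, 255, 255),  # white
}

def set_fabric_color(rgb):
    """Stub for a POF/LED driver; replace with real hardware I/O."""
    print(f"Driving side-emitting fibers with RGB {rgb}")

def on_gesture(label, confidence, threshold=0.8):
    """Update the textile only for confident, known gestures."""
    if confidence >= threshold and label in GESTURE_TO_RGB:
        set_fabric_color(GESTURE_TO_RGB[label])

if __name__ == "__main__":
    on_gesture("open_palm", 0.93)   # textile turns white
    on_gesture("swipe_left", 0.55)  # ignored: below confidence threshold
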
Data presented at the Hong Kong Geriatrics Society Annual Scientific Meeting showed that the use of knitted illuminative textiles, in the form of touch- and proximity-responsive cushions, substantially improved engagement in individuals with dementia.57 Notably, 90% of participants with late-stage dementia demonstrated active participation during sensory interventions, and all participants (100%) reported positive experiences, providing strong empirical support for prior anecdotal observations. These findings affirm the potential effectiveness of interactive illuminative textiles in supporting elderly sensory therapies.
Nevertheless, a notable research gap persists regarding the comprehensive integration of advanced AI capabilities into knitted textile systems for wider healthcare applications. To address this gap, future research could emphasize AI integration to enhance user interactions and usability, especially among elderly populations who may experience reluctance due to perceptions of complexity or internalized ageism. Continued collaborative co-design efforts involving multidisciplinary teams and end-user participation are critical to developing solutions that closely align with user needs and enhance overall acceptance and effectiveness in healthcare applications.
2.3. Gesture recognition

Recent developments in smart textile research have opened new possibilities for gesture recognition interfaces within healthcare. By embedding sensor networks or computer-vision modules directly into fabrics, researchers aim to create seamless, intuitive systems that detect hand and finger movements with minimal user discomfort.5 Early studies typically integrated wearable sensors – such as stretchable gloves or multiple inertial measurement units – for real-time motion capture58,59; however, the field still recognizes the potential of more ambient or contactless gesture solutions, particularly in clinical contexts. A key impetus for pursuing AI-based gesture recognition in textiles is its potential application in rehabilitation and elderly care. With aging populations growing globally, there is an urgent need for unobtrusive monitoring technologies and interactive support for seniors.60,61 Traditional camera-based systems may achieve high accuracy in controlled environments but often face issues related to occlusion, lighting conditions, and perceived intrusiveness among older adults.62 Consequently, intelligent textiles – integrated with conductive yarns, POFs, or advanced “smart glove” sensors – offer more user-friendly alternatives, minimizing external hardware and seamlessly blending into healthcare environments.4

2.3.1. Wearable and contactless frameworks

Early gesture-recognition textiles frequently relied on wearable forms, such as sensor-equipped gloves, to track finger angles or subtle hand movements.10,61 These gloves typically embed pressure sensors or strain gauges in conductive yarns along finger segments, capturing real-time flexion–extension data. Complementary approaches integrate surface electromyography (sEMG) signals or multiple inertial measurement units to enhance motion capture accuracy, especially for dynamic tasks such as stroke rehabilitation. Such gloves excel in precision and support personalized rehabilitation by offering biofeedback on gesture performance. However, some older patients or those with physical constraints may find gloves cumbersome or difficult to put on and remove daily, thus limiting practical adoption.
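
As a minimal sketch of how such glove data might be handled, the following Python snippet converts raw flex-sensor readings from one finger segment into approximate flexion angles and smooths them before further processing; the calibration constants and filter length are illustrative assumptions, not values reported for the cited gloves.

from collections import deque

# Hypothetical two-point calibration for one flex sensor (raw ADC readings).
ADC_STRAIGHT = 310   # assumed reading with the finger fully extended
ADC_BENT = 720       # assumed reading at roughly 90 degrees of flexion

def flexion_angle(adc_value):
    """Map a raw reading to degrees by linear interpolation between calibration points."""
    angle = 90.0 * (adc_value - ADC_STRAIGHT) / (ADC_BENT - ADC_STRAIGHT)
    return max(0.0, min(120.0, angle))  # clamp to a plausible anatomical range

class MovingAverage:
    """Short moving-average filter to suppress sensor noise before classification."""
    def __init__(self, size=5):
        self.buf = deque(maxlen=size)

    def update(self, value):
        self.buf.append(value)
        return sum(self.buf) / len(self.buf)

if __name__ == "__main__":
    smoother = MovingAverage()
    for raw in (312, 340, 415, 500, 610, 705):  # synthetic readings for one slow flexion
        print(round(smoother.update(flexion_angle(raw)), 1))

In practice, per-user calibration against measured joint angles would replace the fixed constants, and the smoothed angles would feed the downstream gesture classifier.
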
Recent studies acknowledge the potential of “contactless” gesture-detection textiles, where embedded optical or capacitive sensors track mid-air hand movements. Tan et al.54 propose an illuminative textile system using computer vision to detect mid-air number gestures without direct physical contact. This approach points toward fabric-based “wall panels” or “ambient curtains” capable of sensing gestures in healthcare environments where cleanliness and infection control are paramount. In addition, contactless designs can enhance accessibility for patients with limited dexterity or object-avoidance requirements.
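
A contactless number-gesture detector of this kind can be approximated, for illustration only, by counting extended fingers from 2D hand landmarks. The sketch below assumes the common 21-point hand model produced by off-the-shelf hand trackers; the extension heuristic is a simplification and is not the pipeline reported by Tan et al.54

import math

# Landmark indices of the common 21-point hand model (wrist = 0).
FINGERTIPS = (8, 12, 16, 20)   # index, middle, ring, little fingertips
PIP_JOINTS = (6, 10, 14, 18)   # corresponding proximal interphalangeal joints
WRIST = 0

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) tuples in image coordinates.

    A finger is treated as extended when its tip lies farther from the wrist
    than its PIP joint does; the thumb is ignored in this simplified heuristic.
    """
    wrist = landmarks[WRIST]
    count = 0
    for tip, pip in zip(FINGERTIPS, PIP_JOINTS):
        if math.dist(landmarks[tip], wrist) > math.dist(landmarks[pip], wrist):
            count += 1
    return count

if __name__ == "__main__":
    pts = [(0.5, 0.9)] * 21                          # start with every point at the wrist
    pts[6], pts[8] = (0.50, 0.60), (0.50, 0.40)      # index finger extended (tip beyond PIP)
    pts[10], pts[12] = (0.55, 0.60), (0.55, 0.75)    # middle finger curled (tip short of PIP)
    print(count_extended_fingers(pts))               # prints 1
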
2.3.2. AI and machine learning integration

Gesture recognition systems commonly integrate multiple stages: data collection from sensors or vision modules, preprocessing for noise reduction and feature extraction, classification using machine learning models, and real-time feedback or database storage. Databases containing gesture data – either from annotated video streams or sensor readings – are vital for training robust AI models.
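
The following Python sketch illustrates these stages, assuming NumPy and scikit-learn are available: windowing raw sensor streams, extracting simple statistical features, training a classifier, and classifying a new window as a stand-in for real-time feedback. The sensor channels, window length, and choice of a random-forest model are illustrative assumptions rather than a prescribed design.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 50  # samples per gesture window (e.g., about 1 s at 50 Hz)

def extract_features(window):
    """window: (WINDOW, n_channels) array -> per-channel mean, std, and range."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def synthetic_gesture(label, n_channels=4):
    """Crude stand-in for glove or textile sensor data: two gesture classes
    with different amplitudes plus noise."""
    base = 0.2 if label == 0 else 0.8
    return base + 0.05 * np.random.randn(WINDOW, n_channels)

if __name__ == "__main__":
    # Data collection and annotation (synthetic here).
    labels = [0, 1] * 40
    X = np.array([extract_features(synthetic_gesture(y)) for y in labels])
    y = np.array(labels)

    # Classification stage: train a model on the extracted features.
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    # "Real-time" step: classify one new window and emit feedback.
    new_window = synthetic_gesture(1)
    predicted = clf.predict([extract_features(new_window)])[0]
    print("recognized gesture class:", predicted)

In a deployed textile system, the synthetic generator would be replaced by buffered readings from the fabric's sensing channels, and predictions could be logged to the gesture database noted above to support retraining.
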