(vi) Question 6: How do you feel about the customized gesture design aligned with the center's slogan?
(vii) Question 7: In what ways did you feel engaged or connected to the community through the co-design process?
(viii) Question 8: Do you believe the panels will be adopted widely? Do you think elderly users will enjoy and regularly engage with them?
(ix) Question 9: Overall, how has participating in the co-design process influenced your sense of belonging to the center and the broader community?

Questions for the end-users focused on interaction experience, intuitiveness, comfort, and social aspects of engagement with the panels and included:
(i) Question 1: How was your overall experience interacting with the illuminative panels? Did you find the activity enjoyable and the instructions clear? Was the exercise easy or difficult to follow?
(ii) Question 2: Was interaction with the panels intuitive or challenging? How did your peers respond?
(iii) Question 3: Do you anticipate regularly using the panels and engaging with them socially alongside others?
(iv) Question 4: How do you feel about the color change feature that confirms your posture or gesture is correct?
(v) Question 5: What do you think of the visual esthetics of the panels, given that they were designed based on the features of the Wong Tai Sin District?
(vi) Question 6: Have you interacted with similar technologies or used AI-driven systems before?

3.2. Design and development of a textile-based gesture recognition system

3.2.1. Illuminative fabric development
Optical fiber made of polymethyl methacrylate, with a fiber diameter of 0.25 mm, was knitted together with yarns to create the illuminative fabrics. Wool yarn was chosen to give users a soft and comfortable hand feel when touching the fabrics. A transparent yarn made of nylon and polyester was added in the second version to improve the illumination effect. Table 1 shows the details of the POF and yarns used for the fabric development. The knitting process was conducted on a 14-gauge computerized v-bed knitting machine, an industrial machine capable of scalable production. To knit the design graphics, a double jacquard structure in three colors was used. The design graphics were separated into three panels, and three fabric panels were knitted, each 123 cm wide and 139 cm high. In the first version, the jacquard structure was knitted with wool yarns only; however, the illuminated area was not apparent, so the illuminative effect of the fabric panels was unsatisfactory. Therefore, a transparent yarn was added in the second version to highlight the illuminated area.

3.2.2. Integration of illuminative fabrics and a gesture recognition system
The gesture recognition system consists of several key components working in an integrated pipeline. An embedded camera captures real-time images of the user's hand, arm, or head in front of the textile panel. The visual input is processed by a single-board computer, which runs a deep learning model to detect 21 landmarks on the hand, as well as 33 landmarks on the shoulders and head. A self-developed algorithm interprets these landmark coordinates to classify specific gestures. The recognized gestures are then converted into encoded serial data, which is transmitted to a self-developed PCB incorporating an ESP32 microcontroller (Espressif Systems, China). The PCB decodes the serial signal and transforms it into a pulse width modulation (PWM) signal that controls the illumination of the RGB LEDs embedded in the illuminative textile panel. In addition, POFs are integrated into the fabric to emit the LED light, enabling gesture-driven color changes directly on the textile surface (simplified code sketches of this pipeline are given after Table 1).

To clarify and expand upon the implementation of the textile-based gesture recognition system, the system leveraged Google's MediaPipe framework, specifically the MediaPipe Hand landmarks detection and Pose landmarks detection models, which are pre-trained on over 30,000 real-world labeled images covering diverse hand and body postures. These models enabled the detection of 21 hand landmarks and
Table 1. Polymeric optical fiber and yarn used for the illuminative fabric development

Materials | Details
Polymethyl methacrylate polymeric optical fiber | Fiber diameter: 0.25 mm; transmission loss: 350 dB/km; temperature range: −55°C to 70°C
Wool yarn | Count: Nm 2/48; composition: 100% extra fine merino wool; care label: machine wash cold or 40°C; do not bleach; dry flat; iron at low heat
Transparent yarn | Count: Nm 1/80; composition: 55% nylon, 45% polyester; care label: machine wash cold or 30°C; do not bleach; do not tumble dry; iron at low heat
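
As an illustration of the pipeline described in Section 3.2.2, the following is a minimal Python sketch of the host-side processing on the single-board computer: MediaPipe hand and pose landmark detection, a simple gesture rule, and transmission of an encoded gesture byte over a serial link. The actual gesture set, classification logic, serial port, baud rate, and encoding used in the study are not reported, so those elements (the open-palm rule, /dev/ttyUSB0, 115200 baud, the one-byte codes) are illustrative assumptions only.

# Host-side sketch (single-board computer): MediaPipe landmark detection,
# a toy gesture rule, and serial transmission of an encoded gesture code.
# The gesture rule, serial port, baud rate, and codes are assumptions.
import cv2
import mediapipe as mp
import serial

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)
pose = mp.solutions.pose.Pose(min_detection_confidence=0.5)
link = serial.Serial("/dev/ttyUSB0", 115200)  # hypothetical link to the ESP32 PCB

def classify(hand_landmarks):
    # Toy rule: "open palm" if every fingertip lies above its PIP joint
    # (image y increases downward). Returns 1 for open palm, 0 otherwise.
    lm = hand_landmarks.landmark
    tips, pips = (8, 12, 16, 20), (6, 10, 14, 18)
    return 1 if all(lm[t].y < lm[p].y for t, p in zip(tips, pips)) else 0

cap = cv2.VideoCapture(0)  # embedded camera in front of the textile panel
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    hand_result = hands.process(rgb)   # up to 21 landmarks per detected hand
    pose.process(rgb)                  # 33 body landmarks (unused by the toy rule)
    if hand_result.multi_hand_landmarks:
        code = classify(hand_result.multi_hand_landmarks[0])
        link.write(bytes([code]))      # one-byte encoded gesture sent to the PCB

In the paper's system, the self-developed algorithm presumably combines the hand landmarks with the shoulder and head landmarks to verify the exercise postures; the single rule above only stands in for that classification step.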
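On the receiving side, Section 3.2.2 describes the ESP32-based PCB decoding the serial signal into a PWM signal for the RGB LEDs. The authors' firmware is not provided; the sketch below assumes a MicroPython-flashed ESP32, hypothetical UART and LED pin assignments, and a two-entry gesture-to-color table, purely to illustrate the serial-to-PWM step.

# ESP32-side sketch (MicroPython assumed): read a one-byte gesture code from
# UART and set RGB LED brightness via PWM. Pins, baud rate, and the color
# table are illustrative assumptions, not the authors' PCB design.
from machine import Pin, PWM, UART

uart = UART(1, baudrate=115200, tx=17, rx=16)  # serial link from the single-board computer

# One PWM channel per LED color; 1 kHz is a common LED PWM frequency.
red = PWM(Pin(25), freq=1000)
green = PWM(Pin(26), freq=1000)
blue = PWM(Pin(27), freq=1000)

# Gesture code -> (R, G, B) duty cycles, 0-1023 on the ESP32 port of MicroPython.
COLORS = {
    0: (1023, 0, 0),  # gesture/posture not matched: red
    1: (0, 1023, 0),  # gesture/posture confirmed: green
}

while True:
    if uart.any():
        code = uart.read(1)[0]
        r, g, b = COLORS.get(code, (0, 0, 1023))  # unknown code: blue
        red.duty(r)
        green.duty(g)
        blue.duty(b)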

