



falls and seizures as follows: Adult-sized and infant-sized mannequins were employed to represent a range of patient demographics, ensuring that the collected movement data spanned various relevant physiologies for our algorithm's predictive capabilities. The use of such mannequins allows for consistent and repeatable movement simulations, which are crucial for machine learning applications. The rationale behind selecting the PyCaret machine learning library is twofold. First, PyCaret's low-code environment significantly streamlines the development process, thereby facilitating a more efficient exploration of different predictive models. Second, it offers a comprehensive suite of evaluation metrics and algorithms suitable for both binary and multiclass classification problems, making it particularly well-suited for the complex task of classifying the nuanced movements indicative of potential falls or seizures. This adaptability and ease of use render PyCaret highly suitable for health-care settings, where rapid and accurate decision-making is paramount for patient safety.
This section also provides detailed descriptions of the approaches used for data collection, preprocessing, and the setup of the machine learning model as follows:

(i) Data collection: This segment describes the use of Xsens DOT sensors, which are capable of detecting Euler angles in the X-, Y-, and Z-axes, to gather real-time motion data reflecting 3D orientation in space. It discusses the placement of the sensor and the mechanics of its securement to the mannequin's chest. The methodology elucidates the specific movements imitated by the mannequins (e.g., breathing, seizures, rolls, and falls) to collect diverse movement data while distinguishing between the use of adult and infant mannequins for different movements.

(ii) Data preprocessing: This section outlines the process of managing raw datasets, which involves the segregation of collected data into subsets correlating to specific movements of interest. Focus is given to the significance of the Euler angle points in the X-axis and the quantification method for capturing distinct movement data, including simulated rolling and falling off a bed by a mannequin.
(iii) Machine learning model setup: Details are provided on the utilization of PyCaret, the supervised machine learning library used for the study, emphasizing its streamlined workflow and five key steps: setup, compare models, analyze model, save model, and prediction (an illustrative sketch of this workflow follows the list). The process of setting up PyCaret, providing data, labeling the target, and ensuring reproducibility through session IDs is described. Detailed information about the dataset, including data shape before and after transformations and division into training and test sets, is provided.

(iv) Evaluation metrics: PyCaret-generated metrics such as accuracy, area under the curve (AUC), recall, precision, F1 score, and others are introduced. The definitions and significance of these metrics for model evaluation are articulated, including the intricacies of how performance metrics, such as accuracy, precision, recall, and F1 score, are calculated. Other important metrics, such as the receiver operating characteristic (ROC) curve, AUC, Cohen's kappa, Matthews correlation coefficient (MCC), and training time (TT), are discussed, explaining each metric's value range and its implications for the model's predictive performance.
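As a point of reference for the workflow summarized in items (iii) and (iv), the following minimal sketch shows how a PyCaret multiclass classification experiment of this kind is typically assembled. The file name (movement_data.csv), target column (movement), and session ID are illustrative placeholders, not values reported in the study.

    # Minimal sketch of the five-step PyCaret workflow described above.
    # Assumption: the preprocessed Euler-angle data sit in a CSV file with a
    # categorical "movement" column as the target (hypothetical names).
    import pandas as pd
    from pycaret.classification import (
        setup, compare_models, evaluate_model, save_model, predict_model
    )

    data = pd.read_csv("movement_data.csv")

    # Step 1: setup - declare the data, the target label, and a fixed
    # session_id so the train/test split and results are reproducible.
    exp = setup(data=data, target="movement", session_id=123)

    # Step 2: compare models - trains candidate classifiers and tabulates
    # Accuracy, AUC, Recall, Precision, F1, Kappa, MCC, and training time (TT).
    best_model = compare_models()

    # Step 3: analyze the selected model (confusion matrix, ROC curves, etc.).
    evaluate_model(best_model)

    # Step 4: persist the fitted pipeline for later reuse.
    save_model(best_model, "movement_classifier")

    # Step 5: prediction on the held-out test split.
    holdout_predictions = predict_model(best_model)

As a general rule, the classification scores in this table range from 0 to 1, whereas Cohen's kappa and MCC range from -1 to 1, with values near 1 indicating agreement well beyond chance.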
2.1. Data collection

The data were collected using the Xsens DOT sensors, which are capable of capturing Euler angles in the X-, Y-, and Z-axes, also known as roll, pitch, and yaw, respectively, to depict real-time 3D orientation in space. In addition, these sensors have demonstrated efficacy in capturing distinctive data related to various patient movements.15 As illustrated in Figure 1, one sensor was placed on the mannequin's chest, specifically at the center of the sternum, to evaluate the aforementioned six movements of interest. The sensor was securely affixed to the mannequin using duct tape arranged in a cross-shaped configuration. Subsequently, it was connected wirelessly via Bluetooth to the Movella DOT app, which allows for continuous streaming and data collection once initiated by the user. The sensors were activated and deactivated for each overall movement of interest, with the collected data immediately transferred to the device connected to the sensor. To collect the data on breathing and seizures, an adult mannequin was used. Conversely, an infant mannequin was used to collect data on the remaining four movements. Data pertaining to breathing involved the mannequin performing one full cycle of tidal volume inhalations and exhalations continuously for 3 min. Seizure data were collected by inducing a seizure in the mannequin for 10 min. For the collection of data on rolling to the side,

Figure 1. Depiction of the sensor placement and the respective angular motions. Image created using Inkscape.
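To make the data-collection and preprocessing summary above more concrete, the short sketch below shows one way the exported orientation recordings could be assembled into a single labeled table before modeling. The folder layout, file names, and column names (Euler_X, Euler_Y, Euler_Z) are assumptions for illustration; they are not taken from the Movella DOT export format used in the study, and only the movements named on this page are listed.

    # Illustrative sketch: merge per-movement Euler-angle recordings into one
    # labeled dataset. Column names Euler_X/Euler_Y/Euler_Z and the file layout
    # are hypothetical; adjust them to the actual CSV export of the recordings.
    from pathlib import Path
    import pandas as pd

    # One exported CSV per recorded movement (partial, illustrative list).
    recordings = {
        "breathing": "exports/breathing_adult.csv",
        "seizure": "exports/seizure_adult.csv",
        "roll_to_side": "exports/roll_infant.csv",
        "fall_off_bed": "exports/fall_infant.csv",
    }

    frames = []
    for movement, csv_path in recordings.items():
        df = pd.read_csv(Path(csv_path), usecols=["Euler_X", "Euler_Y", "Euler_Z"])
        df["movement"] = movement  # target label later passed to PyCaret's setup()
        frames.append(df)

    # Each row holds one roll/pitch/yaw sample tagged with its movement of interest.
    movement_data = pd.concat(frames, ignore_index=True)
    movement_data.to_csv("movement_data.csv", index=False)

This mirrors the segregation of the collected data into per-movement subsets described in the preprocessing overview.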

