
Advances in Radiotherapy & Nuclear Medicine
Image fusion's significance in medicine



Section 2 mainly introduces the application of traditional image fusion methods in the field of NMMI. The studies reporting on the application of traditional image fusion methods in NMMI are provided in Table 1, and a comparison of different image fusion methods is provided in Table S1.

2.1. Spatial fusion

Spatial fusion methods mainly rely on pixel-level fusion, directly merging pixels from different images. These methods include simple average, weighted average, maximum value, minimum value, and high-pass filtering [8]. Although these methods are simple and easy to implement, they usually require pre-processing and post-processing and can easily introduce spatial distortion into the fused image.

Liu et al. [9] proposed a new image fusion method based on a multi-resolution and non-parametric density model. They employed two different fusion rules, based on a non-parametric density model and variable weighting theory, for the fusion of the low-frequency and high-frequency coefficients; the fused image is then constructed by applying the inverse non-subsampled contourlet transform to all composite coefficients. Haddadpour et al. [10] used MRI and PET as input images and fused them with a combination of the two-dimensional Hilbert transform (2-D HT) and the intensity-hue-saturation (IHS) method, which preserves both the spatial and spectral features of the input images. Stokking et al. [11] proposed a hue-saturation-value (HSV) model for fusing the anatomical and functional information obtained from MRI and SPECT modalities using a color-coding scheme. This model outperforms the RGB model and allows for quick, simple, and intuitive retrospective determination of the color coding in the fused image. Chen [12]
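The basic pixel-level rules listed under Section 2.1 (simple average, weighted average, and maximum value) can be sketched in a few lines of NumPy. This is a minimal illustration assuming two co-registered, equally sized input images; the variable names and the example weight of 0.7 are arbitrary choices, not taken from any of the cited studies:

```python
import numpy as np

def fuse_average(a, b):
    # Simple average: each output pixel is the mean of the two inputs.
    return (a + b) / 2.0

def fuse_weighted(a, b, w=0.7):
    # Weighted average: w controls the contribution of the first image.
    return w * a + (1.0 - w) * b

def fuse_maximum(a, b):
    # Maximum rule: keep the brighter pixel from either modality.
    return np.maximum(a, b)

# Toy 2x2 "images" standing in for co-registered PET and MRI slices,
# with intensities normalized to [0, 1].
pet = np.array([[0.2, 0.8], [0.4, 0.6]])
mri = np.array([[0.6, 0.2], [0.8, 0.4]])
fused_avg = fuse_average(pet, mri)   # element-wise mean
fused_max = fuse_maximum(pet, mri)   # element-wise maximum
```

These rules require no transform step, which is why spatial methods are simple to implement, but averaging can wash out contrast and the maximum rule can propagate noise, consistent with the distortion issues noted above.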
Table 1. Traditional image fusion methods in NMMI

| Authors | Year | Fusion method | Multimodal images | Fusion technique | Organ |
|---|---|---|---|---|---|
| Liu et al. [9] | 2019 | Spatial fusion | PET-MRI | Non-parametric density model and variable weighting theory | Brain |
| Haddadpour et al. [10] | 2017 | Spatial fusion | PET-MRI | Two-dimensional Hilbert transform (2-D HT) and intensity-hue-saturation (IHS) method | Brain |
| Chen [12] | 2017 | Spatial fusion | PET-MRI | Combined the IHS model with Log-Gabor transform | Brain |
| He et al. [41] | 2010 | Spatial fusion | PET-MRI | IHS and principal component analysis (PCA) | Brain |
| Liu et al. [14] | 2010 | Frequency fusion | PET-CT | Multiwavelet transform | Lung |
| Xiong et al. [42] | 2017 | Frequency fusion | PET-CT | Shift-invariant shearlet transform (SIST) and adaptive pulse-coupled neural network (PCNN) | Brain |
| Bhavana and Krishnappa [43] | 2015 | Frequency fusion | PET-MRI | Discrete wavelet transform (DWT) | Brain |
| Wang et al. [44] | 2006 | Frequency fusion | PET-MRI | DWT | Brain |
| Du et al. [45] | 2018 | Frequency fusion | PET-CT | Parallel significant features | Brain |
| Shabanzade et al. [18] | 2019 | Decision-level fusion | PET-MRI | Non-parametric Bayesian technique | Brain |
| Zong and Qiu [20] | 2017 | Sparse representation fusion | SPECT-MRI, PET-CT, MRI-CT | Sparse representation of classified image patches | Brain, lung |
| Shahdoosti and Mehrabi [46] | 2018 | Sparse representation fusion | SPECT-MRI, CT-MRI, PET-MRI | Tetrolet transform | Brain |
| Zhu et al. [21] | 2018 | Hybrid fusion | PET-MRI, SPECT-MRI | Spatial method for cartoon component and sparse representation method for texture components | Brain |
| Chaitanya et al. [47] | 2017 | Hybrid fusion | PET-MRI | Shearlet transform and discrete cosine transform | Brain |
| Daneshvar and Ghassemian [22] | 2010 | Hybrid fusion | PET-MRI | IHS and the retina-inspired model (RIM) fusion technique | Brain |

Abbreviations: CT: Computed tomography; MRI: Magnetic resonance imaging; PET: Positron emission tomography; SPECT: Single-photon emission computed tomography.
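The HSV color-coding idea behind the approach of Stokking et al. [11] can be illustrated with a short sketch: the functional image drives hue and the anatomical image drives brightness (value), with full saturation. This is a simplified reading of the general technique, not the authors' exact scheme; the hue range and function names are assumptions made for the example:

```python
import colorsys
import numpy as np

def hsv_color_fusion(anatomical, functional):
    """Fuse a co-registered pair of 2-D images normalized to [0, 1]:
    functional intensity (e.g., SPECT uptake) -> hue,
    anatomical intensity (e.g., MRI) -> value (brightness)."""
    h, w = anatomical.shape
    fused = np.zeros((h, w, 3))
    for i in range(h):
        for j in range(w):
            # Restrict hue to the red-to-blue range (0 to 2/3) so that
            # minimum and maximum uptake do not wrap to the same color.
            hue = (2.0 / 3.0) * functional[i, j]
            fused[i, j] = colorsys.hsv_to_rgb(hue, 1.0, anatomical[i, j])
    return fused
```

Because the anatomical image controls only brightness, structural detail is preserved while the color channel carries the functional signal, which is the intuition behind preferring HSV over direct RGB mixing.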


            Volume 1 Issue 2 (2023)                         3                       https://doi.org/10.36922/arnm.0870
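The frequency-fusion entries in Table 1 share a common pattern: decompose each image into low- and high-frequency sub-bands, fuse each sub-band with its own rule, and invert the transform. The sketch below illustrates this with a one-level 2-D Haar transform, averaging the low-frequency band and keeping the larger-magnitude coefficient in each detail band; this is a generic illustration of the pattern under those assumed rules, not the exact pipeline of any cited method:

```python
import numpy as np

def haar2d(x):
    # One-level 2-D Haar decomposition (even-sized input assumed).
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    ll = (a + b + c + d) / 4.0   # low-frequency approximation
    lh = (a + b - c - d) / 4.0   # horizontal detail
    hl = (a - b + c - d) / 4.0   # vertical detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    # Exact inverse of haar2d.
    h, w = ll.shape
    x = np.zeros((2 * h, 2 * w))
    x[0::2, 0::2] = ll + lh + hl + hh
    x[0::2, 1::2] = ll + lh - hl - hh
    x[1::2, 0::2] = ll - lh + hl - hh
    x[1::2, 1::2] = ll - lh - hl + hh
    return x

def wavelet_fuse(img1, img2):
    # Low-frequency rule: average (preserves overall intensity).
    # High-frequency rule: maximum absolute value (preserves edges).
    c1, c2 = haar2d(img1), haar2d(img2)
    ll = (c1[0] + c2[0]) / 2.0
    details = [np.where(np.abs(d1) >= np.abs(d2), d1, d2)
               for d1, d2 in zip(c1[1:], c2[1:])]
    return ihaar2d(ll, *details)
```

Multi-level decompositions (as in the DWT and multiwavelet studies above) apply the same per-band rules recursively to the approximation band.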