International Journal of Bioprinting: Bioprinting with machine learning
The convolutional neural network is a kind of feedforward neural network that contains convolution and pooling computation as well as a depth structure. It is one of the classical algorithms of deep learning. The convolutional neural network framework is generally composed of the input layer, convolutional layer, pooling layer, fully connected layer, and output layer [46]. The depth of the network depends on the number of convolutional, pooling, and fully connected layers, and the order of the pooling and convolutional layers can be changed. In a traditional neural network, the connections between neurons at each layer are fully connected, while in a convolutional neural network, the neurons on each feature map are connected only to neurons in a small region of the previous layer. The hidden layers are composed of alternating convolutional and pooling layers. Features are extracted through the convolution operation, and more abstract features are then obtained through the pooling operation. Finally, the resulting feature map is fed into the fully connected layer, and the result of the last fully connected layer is passed to the output layer [47].

3.3.1. Convolution layer
The convolutional layer is a basic component of the convolutional neural network architecture. It mainly performs shallow feature extraction on the data transmitted to the network, capturing features such as image edges, texture, and shape. Feature extraction usually refers to the combination of linear and nonlinear calculations, i.e., the convolution operation and the activation function. The convolutional layer has two important properties: local connection and weight sharing. The neurons of a convolutional layer are connected to a local receptive field of the previous layer. Compared with a fully connected network, the local connection mode can greatly reduce the number of network training parameters and speed up training. Local connection means that each neuron is associated with only a small number of pixels in the input image. For image data, the correlation between adjacent pixels is greater than that between pixels far apart, i.e., the image is locally correlated. A local connection exploits this property to extract local features of the image, which are combined into global features in the deeper levels of the network. Although a network using local connections has fewer parameters, the number of parameters is still large, so the concept of weight sharing was proposed to reduce the training parameters further [48].

3.3.2. Pooling layer
The pooling layer is associated with the convolutional layer, and the feature maps output by the convolutional layer need to be processed by pooling. The pooling function carries out statistical selection and information filtering on the input features to adjust the output data. Pooling layers select a pooling area over which the pooling operation is applied. Common operations include mean pooling, maximum pooling, and mixed pooling. Pooling reduces the size of the feature data, as well as the size of the input to the next layer, improves the efficiency of data statistics, and reduces the required storage space [49].

3.3.3. Fully connected layer
Fully connected layers in convolutional neural networks act as “classifiers,” mapping the learned features and distributed representations to the label space. This can be simply understood as combining the features extracted by the previous layers into a single output value, which reduces the influence of feature location on classification. Convolutional neural networks connect the data to one or more fully connected layers after passing through several convolutional and pooling layers. Each neuron in a fully connected layer is connected to all neurons in the previous layer. The local, class-distinguishing information in the convolutional and pooling layers is integrated by the fully connected layers, and the output value of the last fully connected layer is the corresponding class probability [50].

3.4. Long short-term memory
A long short-term memory network is a modified recurrent neural network that can remember both long- and short-term information. It can not only handle the long-distance dependence problem that the plain recurrent neural network cannot manage, but also mitigate common issues such as gradient explosion and gradient vanishing [51]. It therefore performs very well on sequence data, and is suitable for processing and evaluating critical information separated by long distances and delays in a time series. A long short-term memory network is a variant of the recurrent neural network whose core concepts are the cell state and the gate mechanism [52] (Figure 4).

3.4.1. Cell state
The cell state corresponds to the way information is conveyed, so that information can be transported along the sequence. It can be regarded as the memory of the network. In theory, the cell state can carry relevant knowledge throughout the processing of the sequence. As a result, information from earlier time steps can be carried into the cells at later time steps, which overcomes the impact of short-term memory [53].
The cell state of the previous layer is multiplied by the forgetting vector point by point. If it is multiplied by a
Volume 9 Issue 4 (2023) 339 https://doi.org/10.18063/ijb.739
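The local connection, weight sharing, and pooling described in sections 3.3.1 and 3.3.2 can be sketched in a few lines of NumPy. This is an illustrative example, not code from the paper: the 6 × 6 input, the 3 × 3 kernel values, and the 2 × 2 pooling window are arbitrary choices. The single shared kernel (9 weights) is slid over every local receptive field, in contrast to a fully connected mapping from 36 inputs to 16 outputs, which would need 576 weights.

```python
# Illustrative sketch (not from the paper): one convolution + max-pooling step.
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution: each output neuron sees only a local patch."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Weight sharing: the same kernel is applied at every position.
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: keep the strongest response per region."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)      # toy 6x6 "image"
edge_kernel = np.array([[1., 0., -1.]] * 3)           # crude vertical-edge kernel
features = np.maximum(conv2d(image, edge_kernel), 0.0)  # convolution + ReLU
pooled = max_pool(features)                           # 4x4 feature map -> 2x2
print(features.shape, pooled.shape)
```

Note how pooling halves each spatial dimension, which is exactly the reduction in the next layer's input size that the text describes.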

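The "classifier" role of the fully connected layer in section 3.3.3 can likewise be sketched with NumPy. This is a hypothetical illustration: the 16-dimensional flattened feature vector, the 3 classes, and the random weights are all assumptions, and a softmax is used here as the common way to turn the last layer's outputs into the class probabilities the text mentions.

```python
# Illustrative sketch (not from the paper): a fully connected layer mapping
# flattened feature maps to class probabilities.
import numpy as np

rng = np.random.default_rng(0)

def fully_connected(features, weights, bias):
    """Every output neuron is connected to every input feature."""
    return features @ weights + bias

def softmax(logits):
    """Map the last layer's outputs to probabilities over the label space."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

features = rng.normal(size=16)        # e.g., a flattened 4x4 feature map
weights = rng.normal(size=(16, 3))    # 3 hypothetical classes
bias = np.zeros(3)
probs = softmax(fully_connected(features, weights, bias))
print(probs)                          # one probability per class
```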

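The point-by-point multiplication of the previous cell state by the forgetting vector, described in section 3.4.1, can be sketched as a single LSTM time step. This is a minimal illustration of the standard cell-state update, not code from the paper; the gate logits and candidate values are arbitrary, chosen so the forget gate erases the second entry while keeping the others.

```python
# Illustrative sketch (not from the paper): the cell-state update at one LSTM
# time step. The previous cell state is multiplied element-wise ("point by
# point") by the forget gate's vector; a gate value near 0 erases that memory
# slot, a value near 1 keeps it, and the input gate admits new information.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cell_state_update(c_prev, forget_logits, input_logits, candidate):
    f = sigmoid(forget_logits)    # forget gate: what to erase from memory
    i = sigmoid(input_logits)     # input gate: how much new info to admit
    g = np.tanh(candidate)        # candidate values for the new memory
    return f * c_prev + i * g     # element-wise: keep old + add new

c_prev = np.array([1.0, -2.0, 0.5])
# Large positive logit -> gate ~1 (keep); large negative -> gate ~0 (forget).
c_new = cell_state_update(
    c_prev,
    forget_logits=np.array([10.0, -10.0, 10.0]),
    input_logits=np.array([-10.0, -10.0, -10.0]),  # admit almost nothing new
    candidate=np.zeros(3),
)
print(np.round(c_new, 3))         # first and third entries survive
```

Because the update is additive rather than a repeated matrix multiplication, gradients can flow along the cell state with little attenuation, which is how the long short-term memory network mitigates the vanishing-gradient problem noted above.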