Artificial Intelligence in Health Federated learning health stack against pandemics
batch. Nevertheless, the proposed framework exhibited lower training time across all scenarios. This improvement is attributed to the hierarchical structure: whereas FedAvg requires all clients to transmit model updates to the central server concurrently, the hierarchical approach organizes clients into clusters. Each client communicates only with its respective local server, and the aggregated updates are subsequently forwarded to the central server, thereby reducing communication overhead. Given its time efficiency, the hierarchical model is better suited for global deployment and offers greater scalability than FedAvg, particularly in addressing future pandemics. As the hierarchical model outperforms standard FL protocols on complex data types such as images, it is anticipated to perform effectively on other data types as well, including categorical, numerical, and time-series data, such as cancer classifications, biophysical parameters, and electrocardiography, respectively.
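The communication saving of the hierarchy can be made concrete with a per-round message count. The sketch below is illustrative only: the cluster count n, the cluster size p, and the assumption that every update counts as one bidirectional exchange are ours, not values from the paper.

```python
# Illustrative per-round exchange counts (n and p are assumed example values).
def fedavg_exchanges(num_clients: int) -> int:
    """Flat FedAvg: every client exchanges updates directly with the central server."""
    return num_clients

def hierarchical_exchanges(n_servers: int, p_clients: int) -> int:
    """Hierarchical FL: p exchanges inside each of n clusters, plus n
    local-server-to-central exchanges."""
    return n_servers * p_clients + n_servers

n, p = 5, 10
print(fedavg_exchanges(n * p))       # 50 concurrent links terminate at the central server
print(hierarchical_exchanges(n, p))  # 55 links in total, but only 5 reach the central server
```

The total number of exchanges grows slightly, but the concurrent load on the central server drops from n·p to n, which is what drives the lower training time reported above.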
3.2. Communication cost analysis

In the proposed framework, communication occurs at two levels: from client to local server and from local server to central server. All communication is bidirectional, meaning both entities exchange messages during each round. The communication cost at each level was analyzed as follows.
3.2.1. Client-to-local server communication

Each client C_ij communicated with its assigned local server L_i by transmitting encrypted model gradients [[G_r^ij]] and its associated public key pk_r^ij during each round r. The communication cost per client is represented using big-O notation:

Cost_client-to-local = O(|G_r^ij| + |pk_r^ij|) (XV)

where |G_r^ij| is the size of the encrypted gradient and |pk_r^ij| is the size of the encryption key for p clients under a single local server. The parameter p was included in the calculation, as it was assumed to represent the maximum number of clients under a local server. The total communication cost is represented as:

Cost_total-client-to-local = O(p·(|G_r^ij| + |pk_r^ij|)) (XVI)
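Equations XV and XVI can be turned into a small cost model. The byte sizes below are placeholder assumptions for illustration; the paper does not report concrete payload sizes.

```python
# Hypothetical payload sizes (assumptions, not values from the paper).
GRAD_BYTES = 1_048_576  # |G_r^ij|: encrypted gradient upload per client (1 MiB assumed)
KEY_BYTES = 4_096       # |pk_r^ij|: per-client public key (4 KiB assumed)

def cost_client_to_local() -> int:
    """Equation XV: per-client cost per round, |G_r^ij| + |pk_r^ij|."""
    return GRAD_BYTES + KEY_BYTES

def cost_total_client_to_local(p: int) -> int:
    """Equation XVI: p clients under one local server, p·(|G_r^ij| + |pk_r^ij|)."""
    return p * cost_client_to_local()

print(cost_total_client_to_local(p=10))  # 10 * 1,052,672 = 10,526,720 bytes per round
```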
3.2.2. Local server-to-central server communication

Each local server L_i aggregated encrypted gradients from its clients and transmitted the aggregated model updates [[G_r^i]] to the central server. The communication cost per local server is represented as:

Cost_local-to-central = O(|G_r^i| + |pk_r^i|) (XVII)

where |G_r^i| is the size of the aggregated gradient and |pk_r^i| is the size of the encryption key for the local server. For n local servers, the total communication cost is represented as:

Cost_total-local-to-central = O(n·(|G_r^i| + |pk_r^i|)) (XVIII)
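Equations XVII and XVIII admit the same kind of sketch. The byte sizes remain illustrative assumptions, and the expansion |G_r^i| = p·|G_r^ij| follows the relationship stated in Section 3.2.3.

```python
# Hypothetical sizes (assumptions, not values from the paper).
GRAD_BYTES = 1_048_576  # |G_r^ij|: one client's encrypted gradient (1 MiB assumed)
KEY_BYTES = 4_096       # |pk_r^i|: the local server's key material (4 KiB assumed)

def cost_local_to_central(p: int) -> int:
    """Equation XVII: |G_r^i| + |pk_r^i|, with |G_r^i| = p·|G_r^ij| (Section 3.2.3)."""
    return p * GRAD_BYTES + KEY_BYTES

def cost_total_local_to_central(n: int, p: int) -> int:
    """Equation XVIII: n·(|G_r^i| + |pk_r^i|) across all n local servers."""
    return n * cost_local_to_central(p)

print(cost_total_local_to_central(n=5, p=10))  # 5 * 10,489,856 = 52,449,280 bytes per round
```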
3.2.3. Total communication cost

Combining the costs for all client-to-local server and local server-to-central server communications, the overall communication cost for the framework is represented as:

Cost_total = O(p·|G_r^ij| + n·|G_r^i| + p·|pk_r^ij| + n·|pk_r^i|) (XIX)

The parameter size at local server L_i, denoted as |G_r^i|, includes the contributions of its p clients, i.e., |G_r^i| = p·|G_r^ij|, where |G_r^ij| represents the parameter size from an individual client j. Therefore, to reflect the total cost across n local servers, n·|G_r^i| is used instead of n·p·|G_r^ij|.

3.3. Computation cost analysis

The computation cost of the proposed framework was analyzed by evaluating the operations performed at each level: clients, local servers, and the central server. This section provides a detailed computation cost analysis.

3.3.1. Client-side computation

Each client C_ij performed local model training on its secured data. The training cost was proportional to the local dataset D_ij and the complexity of the model M. The client then encrypted the model gradients G_r^ij using the encryption key pk_r^ij. Similarly, it also decrypted the aggregated gradients received from the central server at the end of each round. The computation cost per client is represented as:

Cost_client = O(|D_ij|·M + Enc(|G_r^ij|) + Dec(|G_r^ij|)) (XX)

where Enc(|G_r^ij|) is the encryption cost and Dec(|G_r^ij|) is the decryption cost for the gradients. Suppose p clients were present under a local server; the total client-side computation cost is represented as:

Cost_total-client = O(p·(|D_ij|·M + Enc(|G_r^ij|) + Dec(|G_r^ij|))) (XXI)

3.3.2. Local server-side computation

Each local server L_i performed the aggregation of encrypted gradients received from all clients in its cluster using homomorphic addition and multiplication, ensuring that the aggregation was performed without decrypting the gradients. If the number of clients assigned to L_i was p, then p additions and a single multiplication were required for aggregation. For simplicity, the time taken for multiplication was assumed to be unity. The computation cost per local server is represented as:

Cost_local-server = O(p·T_add) (XXII)

where T_add represents the time taken for homomorphic addition. For n local servers in the framework, the total local server-side computation cost is represented as:
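The aggregation step described above can be sketched with a toy additively homomorphic cryptosystem. The paper does not specify its encryption scheme here, so the Paillier-style construction, the tiny (insecure) demo primes, and the integer-scaled gradients below are all illustrative assumptions.

```python
import math
import random

# Toy Paillier-style keypair (insecure demo primes; real systems use ~1024-bit primes).
p_prime, q_prime = 1789, 2003
n = p_prime * q_prime
n2 = n * n
lam = math.lcm(p_prime - 1, q_prime - 1)
mu = pow(lam, -1, n)  # valid because we take the generator g = n + 1

def encrypt(m: int) -> int:
    """Client side: encrypt an integer-scaled gradient value."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Central server side: recover the aggregated plaintext."""
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Local server: p homomorphic additions (ciphertext multiplications mod n^2),
# i.e., the O(p·T_add) of Equation XXII — no gradient is ever decrypted here.
client_grads = [12, 7, 30, 5]
aggregate = 1
for c in (encrypt(g) for g in client_grads):
    aggregate = (aggregate * c) % n2

print(decrypt(aggregate))  # 54 == sum(client_grads)
```

The local server only multiplies ciphertexts, so it performs the p additions of Equation XXII without ever holding a decryption key; only the central server recovers the sum.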
Volume 2 Issue 4 (2025) 85 doi: 10.36922/AIH025080013

