3rd Cognitive Models and Artificial Intelligence Conference, AICCONF 2025, Prague, Czech Republic, 13 - 14 June 2025 (Full Text Paper)
Federated learning enables collaborative model training across decentralized computational nodes while preserving data privacy, since sensitive information never leaves the node on which it resides. This paradigm allows robust machine learning models to be built from heterogeneous datasets spread across many client devices, without requiring a centralized data repository. Federated learning systems employ a range of aggregation algorithms, including weighted averaging (FedAvg), proximal optimization (FedProx), and adaptive optimization (FedOpt), to combine locally computed model parameters into a convergent global model while maintaining data locality. Fuzzy logic provides a computational framework for approximate reasoning, modeling imprecise and uncertain information through fuzzy sets and linguistic variables. This study develops FedFZY, a novel fuzzy logic-based aggregation mechanism within a federated learning architecture. The proposed method avoids computationally intensive mathematical operations and derivative calculations.
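To make the aggregation step concrete, the sketch below shows standard FedAvg-style weighted averaging alongside a hypothetical fuzzy-weighted variant. The abstract does not describe FedFZY's actual membership functions or rules, so the names fedavg_aggregate, fuzzy_membership, and fuzzy_aggregate, as well as the distance-to-median membership heuristic, are illustrative assumptions rather than the paper's implementation; they only illustrate how a fuzzy weighting scheme can replace gradient- or derivative-based server-side computation.

```python
# Minimal sketch: FedAvg-style weighted averaging and a hypothetical
# fuzzy-weighted aggregation.  All function names and the membership
# heuristic are assumptions for illustration, not the paper's FedFZY.
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """Weighted average of client parameter vectors, with weights
    proportional to local dataset sizes (standard FedAvg aggregation)."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    stacked = np.stack(client_params)          # shape: (num_clients, dim)
    return np.average(stacked, axis=0, weights=weights)

def fuzzy_membership(distance, spread=1.0):
    """Triangular-style membership degree: updates close to the reference
    point get a value near 1, distant updates fall toward 0."""
    return np.clip(1.0 - distance / spread, 0.0, 1.0)

def fuzzy_aggregate(client_params, spread=1.0):
    """Hypothetical fuzzy-weighted aggregation: each client is weighted by
    a membership degree derived from its distance to the element-wise
    median update, using no gradients or derivatives on the server."""
    stacked = np.stack(client_params)
    median = np.median(stacked, axis=0)
    distances = np.linalg.norm(stacked - median, axis=1)
    memberships = fuzzy_membership(distances, spread=spread)
    if memberships.sum() == 0.0:               # degenerate case: plain mean
        return stacked.mean(axis=0)
    return np.average(stacked, axis=0, weights=memberships)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clients = [rng.normal(loc=0.0, scale=0.1, size=5) for _ in range(4)]
    clients.append(rng.normal(loc=5.0, scale=0.1, size=5))  # one divergent client
    print("FedAvg :", fedavg_aggregate(clients, [100, 120, 90, 110, 100]))
    print("Fuzzy  :", fuzzy_aggregate(clients, spread=5.0))
```

In this toy example the fuzzy weighting down-weights the single divergent client, whereas plain size-weighted averaging is pulled toward it; the actual FedFZY design may differ.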