Continual federated machine learning under concept drift
Federated learning (FL) is a machine learning paradigm that enables models to be trained collaboratively across multiple devices without compromising user privacy. In this PhD thesis, we propose new FL strategies that, while preserving the advantages this technology already provides, also handle continual scenarios with non-stationary data and concept drift. We formulate two complementary strategies: CDA-FedAvg, which enables the training of a global deep neural network, and ECFL, which represents the model as an ensemble of local learners. We evaluated our solutions in several use cases, including activity recognition on smartphones and assistance for robotic wheelchair users. The results highlight the relevance of continual FL and, in particular, the advantages and impact of our contributions.
Keywords: Federated learning, Continual learning, Smartphones, Smart wheelchairs, Distributed systems