- https://github.com/Rohit-Kundu/Two-Step-Feature-Enhancement
- https://www.tensorflow.org/federated/get_started
- https://github.com/OpenMined/PySyft/tree/syft_0.2.x/examples/tutorials
- https://flower.dev/docs/index.html
- https://github.com/WwZzz/easyFL/blob/main/algorithm/README.md#refer-anchor-1
- https://github.com/Yangfan-Jiang/Federated-Learning-with-Differential-Privacy
- https://github.com/AshwinRJ/Federated-Learning-PyTorch
- https://blog.openmined.org/federated-learning-additive-secret-sharing-pysyft/
- https://towardsdatascience.com/federated-learning-3097547f8ca3
- https://developers.sherpa.ai/tutorials/federated-learning-paradigms/ftl
- https://towardsdatascience.com/preserving-data-privacy-in-deep-learning-part-1-a04894f78029
- https://learnopencv.com/federated-learning-using-pytorch-and-pysyft/
- https://github.com/cantonioupao/cervical_cancer_detection
- https://github.com/Koukyosyumei/AIJack
- https://medium.com/geekculture/why-i-stopped-using-jupyter-notebook-and-why-you-should-too-f5a3b00e90a
- https://github.com/ArturoDiez/FederatedLearning-Cancer-Image-Classification
- top7-open-source-frameworks-for-federated-learning
- EasyFL [Github] [Paper]
- PySyft [Github]
- A Generic Framework for Privacy Preserving Deep Learning [Paper]
- TensorFlow Federated [Web]
- FATE [Github]
- FedLearner [Github] ByteDance
- Baidu PaddleFL [Github]
- Nvidia Clara SDK [Web]
- Flower.dev
- OpenFL
- FEDn [Github]
- A modular and model agnostic framework for hierarchical federated machine learning [Paper]
- The Future of Digital Health with Federated Learning [Paper]
- General guide for FL in healthcare. Nicely written paper.
- HHHFL: Hierarchical Heterogeneous Horizontal Federated Learning for Electroencephalography [Paper] [NIPS 2019 Workshop]
- Federated learning in medicine: facilitating multi-institutional collaborations without sharing patient data [Paper - Nature Scientific Reports 2020] [News]
- Learn Electronic Health Records by Fully Decentralized Federated Learning [Paper] [NIPS 2019 Workshop]
- Patient Clustering Improves Efficiency of Federated Machine Learning to predict mortality and hospital stay time using distributed Electronic Medical Records [Paper] [News]
- MIT CSAI, Harvard Medical School, Tsinghua University
- Federated learning of predictive models from federated Electronic Health Records. [Paper]
- Boston University, Massachusetts General Hospital
- FedHealth: A Federated Transfer Learning Framework for Wearable Healthcare [Paper] [IJCAI'19 workshop]
- Microsoft Research Asia
- NVIDIA Clara Federated Learning to Deliver AI to Hospitals While Protecting Patient Data [Blog]
- Nvidia
- What is Federated Learning [Blog]
- Nvidia
- Split learning for health: Distributed deep learning without sharing raw patient data [Paper]
- Two-stage Federated Phenotyping and Patient Representation Learning [Paper] [ACL 2019]
- Federated Tensor Factorization for Computational Phenotyping [Paper] SIGKDD 2017
- Multi-Institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation [Paper] [MICCAI'18 Workshop] [Intel]
- Federated Patient Hashing [Paper] [AAAI'20]
- Federated Learning in Distributed Medical Databases: Meta-Analysis of Large-Scale Subcortical Brain Data [Paper]
- Confederated Machine Learning on Horizontally and Vertically Separated Medical Data for Large-Scale Health System Intelligence [Paper]
- Privacy-Preserving Deep Learning Computation for Geo-Distributed Medical Big-Data Platform [Paper]
- Institutionally Distributed Deep Learning Networks [Paper]
- Federated semi-supervised learning for COVID region segmentation in chest CT using multi-national data from China, Italy, Japan [Paper]
- Privacy-Preserving Technology to Help Millions of People: Federated Prediction Model for Stroke Prevention
- A Federated Learning Framework for Healthcare IoT devices (Keywords: Split Learning + Sparsification)
- Federated Transfer Learning for EEG Signal Classification
- Anonymizing Data for Privacy-Preserving Federated Learning. ECAI 2020.
- Federated machine learning with Anonymous Random Hybridization (FeARH) on medical records
- Stratified cross-validation for unbiased and privacy-preserving federated learning
- Multi-site fMRI Analysis Using Privacy-preserving Federated Learning and Domain Adaptation: ABIDE Results
- Preserving Patient Privacy while Training a Predictive Model of In-hospital Mortality
- Federated Learning for Healthcare Informatics
- Federated and Differentially Private Learning for Electronic Health Records
- A blockchain-orchestrated Federated Learning architecture for healthcare consortia
- Federated Uncertainty-Aware Learning for Distributed Hospital EHR Data
- Stochastic Channel-Based Federated Learning for Medical Data Privacy Preserving
- Differential Privacy-enabled Federated Learning for Sensitive Health Data
- LoAdaBoost: Loss-based AdaBoost federated machine learning with reduced computational complexity on IID and non-IID intensive care data
- Privacy Preserving Stochastic Channel-Based Federated Learning with Neural Network Pruning
- Privacy-preserving Federated Brain Tumour Segmentation
- LoAdaBoost: Loss-Based AdaBoost Federated Machine Learning on medical Data
- FADL: Federated-Autonomous Deep Learning for Distributed Electronic Health Record
- federated-learning-meets-blockchain
- Blockchain-based federated learning methodologies in smart environments
- A Method of Federated Learning Based on Blockchain
- BVFLEMR: an integrated federated learning and blockchain technology for cloud-based medical records recommendation system
- Blockchain for federated learning toward secure distributed machine learning systems: a systemic survey
- Blockchain-based Federated Learning: A Comprehensive Survey
- Using Blockchain Technologies to Improve Security in Federated Learning Systems
- Terms: Independent and Identically Distributed (IID)
- Introduction to Neural Networks - The Nature of Code - video tutorials
- FL videos
- Federated Learning in Healthcare Use Cases | Intel Software
- Work on the algorithm
- Introduction to Neural Networks - The Nature of Code - web tutorials
- enumerate-in-python
- top-resources-to-learn-about-federated-learning
- How to get started with FL
- first FL implementation to follow
- dl-fl-with-differential-privacy
Federated learning aims to train a single model from multiple data sources, under the constraint that data stays at its source: it is never exchanged between the data sources (a.k.a. nodes, clients, or workers) or with the central server orchestrating training, if one is present.
In a typical federated learning scheme, a central server sends model parameters to a population of nodes (also known as clients or workers). The nodes train the model for some number of updates on their local data and send the newly trained weights back to the central server, which averages the new model parameters, often weighting by the amount of training performed on each node. In this scenario the data at any one node is never directly seen by the central server or by the other nodes, and additional techniques, such as secure aggregation, can further enhance privacy.
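The whole round can be condensed into a few lines of code. Below is a minimal sketch in plain NumPy rather than any FL framework; the names (`local_train`, `federated_round`) and the toy linear-regression task are invented purely for illustration.

```python
# Minimal sketch of federated averaging on a toy linear-regression task.
# No FL framework is used; all names here are illustrative only.
import numpy as np

def local_train(global_weights, X, y, lr=0.1, epochs=5):
    """A few full-batch gradient steps on one client's private data."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """Broadcast weights, train locally, then average weighted by data size."""
    updates = [local_train(global_weights, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    shares = sizes / sizes.sum()           # each client's share of the data
    return sum(s * u for s, u in zip(shares, updates))

# Three clients holding private datasets of unequal size; raw data never moves.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 120, 80):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(20):                        # communication rounds
    w = federated_round(w, clients)
print(w)                                   # converges toward [2, -1]
```

Only `w` and the per-client updates ever cross the network; the arrays `X` and `y` stay on their clients throughout.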
Imagine having access to text messages, emails, WhatsApp chats, and Facebook and LinkedIn messages from millions of distinct accounts across the world in a bid to build a keypad next-word predictor. Or having unrestricted access to billions of medical records across continents while predicting the chance of diabetes in a patient. These hypothetical scenarios underscore what data quantity and quality mean in machine learning, but they are nowhere near realisable in today's world, thanks to the tough data protection laws now in place nearly worldwide.
Quality data exists like islands across edge devices around the world, but harnessing it into one piece to leverage its predictive power without contravening privacy laws is a herculean task. This challenge is what necessitated FL. FL provides a clever way to connect machine learning models to the data required to train them effectively. So how does FL achieve this without breaching data protection laws? Read on as I take you through one of the hottest subjects in machine learning (ML) today.
The FL architecture in its basic form consists of a curator or server that sits at the centre and coordinates the training activities. Clients are mainly edge devices, which can run into millions in number. These devices communicate with the server at least twice per training iteration: each client receives the current global model's weights from the server, trains the model on its local data to generate updated parameters, and uploads those parameters back to the server for aggregation. This cycle of communication persists until a preset number of epochs or an accuracy condition is reached. In the Federated Averaging algorithm, aggregation simply means an averaging operation, weighted by how much data each client trained on; a sketch of this step follows below. That is all there is to training an FL model. I hope you caught the most salient point in the process: rather than moving raw data around, we now communicate model weights.
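To make that averaging operation concrete, here is a hedged sketch of the server-side Federated Averaging step for PyTorch models. The function name `fedavg` and its arguments are invented for this example, and it assumes every client returns a state dict for the same model architecture.

```python
# Illustrative server-side aggregation for Federated Averaging (a sketch,
# not any framework's API). Assumes all clients share one architecture.
import torch

def fedavg(state_dicts, num_samples):
    """Average client parameters, weighted by each client's dataset size."""
    total = float(sum(num_samples))
    avg = {}
    for key in state_dicts[0]:
        avg[key] = sum(sd[key].float() * (n / total)
                       for sd, n in zip(state_dicts, num_samples))
    return avg

# Usage (hypothetical names): the server collects the clients' trained models
# and dataset sizes, then loads the averaged weights into the global model.
# global_model.load_state_dict(fedavg([m.state_dict() for m in client_models],
#                                     [len(d) for d in client_datasets]))
```

Weighting by dataset size means a client that trained on 120 samples pulls the average harder than one that trained on 50, which is the "with respect to the amount of training" qualifier mentioned earlier.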