MC23 INCYBER Scalable federated learning and multi-party computation for preserving privacy
In medical research, training statistical models often requires more data than is available in any single hospital, yet transferring data to a central location increases data protection risks. The TRUMPET and FLUTE projects aim to develop a platform for scalable, secure, privacy-preserving federated learning between known owners of medical data. This presentation outlines the setting and challenges of these projects and discusses some of our key contributions in terms of scalability and the privacy-utility trade-off.