

On the Robustness and Privacy of Distributed Machine Learning


Metadata (Field: Value [Language])

dc.contributor.advisor: Shu, Tao
dc.contributor.author: Liu, Tian
dc.date.accessioned: 2022-07-26T20:45:55Z
dc.date.available: 2022-07-26T20:45:55Z
dc.date.issued: 2022-07-26
dc.identifier.uri: https://etd.auburn.edu//handle/10415/8336
dc.description.abstract: Machine learning has recently gained tremendous interest due to its ability to produce predictive models for a wide variety of applications, such as object detection and recommendation services. Meanwhile, the development of the Internet of Things (IoT), which brings Internet connectivity and computation capability to a wide range of devices, makes it possible for machine learning algorithms to gain insights from an aggregation of physically separated devices. However, due to this distributed nature, one cannot guarantee the legitimacy of the received data or parameters, which opens an avenue for new attacks. It is therefore necessary to better understand the vulnerabilities and identify potential threats, so that countermeasures can be proposed to eliminate the impact of such threats before applications are put into use. This dissertation focuses on improving the robustness and privacy of distributed learning algorithms and covers both traditional distributed learning systems, in which a central server collects the data and performs the training, and the modern federated learning (FL) scheme, in which the training is performed on individual devices. Against the background of the transition from the traditional power grid to the smart grid, the first proposed research studies the robustness of the artificial neural network (ANN) based state estimator against adversarial false data injection attacks, showing that the grid's state estimation can be misled by injecting noise-like data into a small portion of electricity meters. Focusing on the FL scheme, the second proposed research overcomes the ineffectiveness of backdoor attacks on FL caused by the dilution effect of normal users by exploiting information leakage from the shared model. The third proposed research provides a high-accuracy, low-cost solution for privacy preservation in mobile edge computing (MEC) systems, in which the key challenges stem from computation and power constraints. This dissertation can help readers better understand these vulnerabilities and design safer and more efficient distributed learning systems. [en_US]
dc.subject: Computer Science and Software Engineering [en_US]
dc.title: On the Robustness and Privacy of Distributed Machine Learning [en_US]
dc.type: PhD Dissertation [en_US]
dc.embargo.status: NOT_EMBARGOED [en_US]
dc.embargo.enddate: 2022-07-26 [en_US]

