Speaker: Thomas Sherson
Affiliation: Delft University of Technology
Abstract: Over the last few decades, methods of parallel and distributed computation have become essential tools in a wide range of applications such as machine learning, wireless sensor network processing and big data signal processing. In this presentation, we demonstrate how a number of distributed optimisation algorithms from the literature can be derived via their relation to monotone operator theory. These include the proximal gradient algorithm, the alternating direction method of multipliers (ADMM) and more recent approaches such as the primal dual method of multipliers (PDMM). Finally, we highlight some of the practical benefits of these methods (resilience to packet loss, asynchronous operation, the effect of quantisation on convergence, etc.) as well as some open questions for future research.
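As one concrete illustration of the algorithms named in the abstract, the sketch below shows the proximal gradient method applied to a LASSO-type problem. This example is an illustration only, not taken from the talk; the problem data, step size and regularisation weight are all assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (the "backward" step)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iter=500):
    # Minimise 0.5 * ||A x - b||^2 + lam * ||x||_1 via
    # forward (gradient) / backward (prox) splitting.
    t = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1/L, L = ||A||^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                    # forward step on smooth part
        x = soft_threshold(x - t * grad, t * lam)   # backward step on l1 part
    return x

# Usage: recover a sparse vector from noiseless measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

In the monotone operator view discussed in the talk, this iteration is a fixed-point scheme for the zeros of a sum of two monotone operators (the gradient of the smooth term and the subdifferential of the l1 term); ADMM and PDMM arise from different operator splittings of the same kind of inclusion.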
Biography: Thomas Sherson was born in the town of Petersfield, in Hampshire, England. He received his Bachelor of Engineering with First Class Honours, majoring in Electrical and Computer Systems Engineering, from Victoria University of Wellington in New Zealand in 2015. He was also awarded the Victoria University Medal of Academic Excellence in the same year. Following his graduation, he joined the Department of Microelectronics at Delft University of Technology to continue his studies towards a Doctor of Philosophy (PhD) in the field of Electrical Engineering. His general interests include signal processing in wireless sensor networks, distributed/decentralised optimisation, monotone operator theory and audio signal processing. Additionally, he is an avid outdoorsman with a passion for nature and a love for music.
Date(s) - Apr 26, 2018
3:00 pm - 5:00 pm
E-IV Tesla Room #53-125
420 Westwood Plaza - 5th Flr., Los Angeles CA 90095