BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//5.95//EN
X-WR-TIMEZONE:America/Los_Angeles
BEGIN:VEVENT
UID:600@ee.ucla.edu
DTSTART;TZID=America/Los_Angeles:20190617T100000
DTEND;TZID=America/Los_Angeles:20190617T120000
DTSTAMP:20190612T161007Z
URL:https://www.ee.ucla.edu/events/exact-diffusion-learning-over-networks/
SUMMARY:Exact Diffusion Learning over Networks
DESCRIPTION:Abstract: In this dissertation\, we study optimization\, adapta
tion\, and learning problems over connected networks. In these problems\,
each agent collects and learns from its own local data and is able to comm
unicate with its local neighbors. While each single node in the network ma
y not be capable of sophisticated behavior on its own\, the agents collabo
rate to solve large-scale and challenging learning problems.\nDifferent ap
proaches have been proposed in the literature to boost the learning capabi
lities of networked agents. Among these approaches\, the class of diffusio
n strategies has been shown to be particularly well suited due to their en
hanced stability range over other methods and improved performance in adap
tive scenarios. However\, diffusion implementations suffer from a small in
herent bias in the iterates. When a constant step-size is employed to solv
e deterministic optimization problems\, the iterates generated by the diff
usion strategy will converge to a small neighborhood around the desired gl
obal solution but not to the exact solution itself. This bias is not due t
o any gradient noise arising from stochastic approximation\; it is instead
due to the update structure in diffusion implementations. The existence o
f the bias leads to three questions: (1) What is the origin of this inhere
nt bias?\; (2) Can we design new approaches that eliminate the bias?\; and
(3) Does the correction of the bias bring benefits to distributed optimiz
ation\, distributed adaptation\, or distributed learning?\nThis dissertati
on provides affirmative solutions to these questions. Specifically\, we de
sign a new exact diffusion approach that eliminates the inherent bi
as in diffusion. Exact diffusion has almost the same structure as diffusio
n\, with the addition of a “correction” step between the adaptation an
d combination steps. Next\, this dissertation studies the performance of e
xact diffusion for the scenarios of distributed optimization\, distributed
adaptation\, and distributed learning\, respectively. For distributed opt
imization\, exact diffusion is proven to converge exponentially fast to th
e exact global solution under proper conditions. For distributed ad
aptation\, exact diffusion is proven to have better steady-state mean-squa
re-error than diffusion\, and this superiority is analytically shown to be
more evident for sparsely connected networks such as line\, cycle\, grid\
, and other topologies. In distributed learning\, exact diffusion can be i
ntegrated with the amortized variance-reduced gradient method (AVRG) so th
at it converges exponentially fast to the exact global solution while empl
oying stochastic gradients per iteration. This dissertation also compares
exact diffusion with other state-of-the-art methods in the literature. Ex
tensive numerical simulations are provided to illustrate the theoretical result
s derived in the dissertation.\nBiography: Kun Yuan is a Ph.D. candidate i
n the Department of Electrical and Computer Engineering\, UCLA\, supervise
d by Prof. Ali H. Sayed. He received his M.S. degree from the Universit
y of Science and Technology of China (USTC) in 2014\, and B.E. degree from
Xidian University in 2011. His research focuses on designing efficient pa
rallel and distributed algorithms for large-scale optimization\, adaptat
ion\, and machine learning problems. He is the recipient of the 2017 IEEE
Signal Processing Society Young Author Best Paper Award.\n\nFor more infor
mation\, contact Prof. Ali H. Sayed (sayed@ee.ucla.edu)
ATTACH;FMTTYPE=image/jpeg:https://www.ee.ucla.edu/wp-content/uploads/ee/ku
n.jpg
CATEGORIES:seminar
LOCATION:E-IV Elliott Room #53-135\, 420 Westwood Plaza - 5th Flr.\, Los An
geles\, CA\, 90095\, United States
GEO:34.0687701;-118.443962
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=420 Westwood Plaza - 5th Fl
r.\, Los Angeles\, CA\, 90095\, United States;X-APPLE-RADIUS=100;X-TITLE=E
-IV Elliott Room #53-135:geo:34.0687701,-118.443962
END:VEVENT
BEGIN:VEVENT
UID:601@ee.ucla.edu
DTSTART;TZID=America/Los_Angeles:20190617T160000
DTEND;TZID=America/Los_Angeles:20190617T180000
DTSTAMP:20190612T222313Z
URL:https://www.ee.ucla.edu/events/distributed-stochastic-optimization-in-
non-differentiable-and-non-convex-environments/
SUMMARY:Distributed Stochastic Optimization in Non-Differentiable and Non-C
onvex Environments
DESCRIPTION:Abstract: The first part of this dissertation considers distrib
uted learning problems over networked agents. The general objective of dis
tributed adaptation and learning is the solution of global\, stochastic op
timization problems through localized interactions and without information
about the statistical properties of the data. Regularization is a usefu
l technique to encourage or enforce structural properties of the resulting
solution\, such as sparsity or constraints. A substantial number of regul
arizers are inherently non-smooth\, while many cost functions are differen
tiable. We propose distributed and adaptive strategies that are able to m
inimize aggregate sums of objectives. In doing so\, we exploit the struct
ure of the individual objectives as sums of differentiable costs and non-d
ifferentiable regularizers. Resulting algorithms are adaptive in nature an
d able to continuously track drifts in the problem\; their recursions\, ho
wever\, are subject to persistent perturbations arising from the stochasti
c nature of the gradient approximations\, and from disagreement across age
nts in the network. The presence of non-smooth\, and potentially unbounded
\, regularizers enriches the dynamics of these recursions. We quantify the
impact of this interplay and draw implications for steady-state performan
ce as well as algorithm design and present applications in distributed mac
hine learning and image reconstruction.\nDriven by the need to solve incre
asingly complex optimization problems in signal processing and machine lea
rning\, there has also been increasing interest in understanding the behav
ior of gradient-descent algorithms in non-convex environments. Most availa
ble works on distributed non-convex optimization problems focus on the det
erministic setting where exact gradients are available at each agent. In t
his work\, we consider stochastic cost functions\, where exact gradients a
re replaced by stochastic approximations and the resulting gradient noise
persistently seeps into the dynamics of the algorithm. We establish that t
he diffusion learning algorithm continues to yield meaningful estimates in
these more challenging\, non-convex environments\, in the sense that (a)
despite the distributed implementation\, individual agents cluster in a sm
all region around the network centroid\, and (b) the network centroid inhe
rits many properties of the centralized\, stochastic gradient descent recu
rsion\, including the escape from strict saddle points in O(1/μ) iteratio
ns and the return of approximately second-order stationary points in a polynom
ial number of iterations.\nIn the second part of this thesis\, we consider
centralized learning problems over networked feature spaces. Rapidly grow
ing capabilities to observe\, collect\, and process ever-increasing quant
ities of information necessitate methods for identifying and exploiting str
ucture in high-dimensional feature spaces. Networks\, frequently referred
to as graphs in this context\, have emerged as a useful tool for modeling
interrelations among different parts of a data set. We consider graph si
gnals that evolve dynamically according to a heat diffusion process and ar
e subject to persistent perturbations. The model is not limited to heat di
ffusion but can be applied to modeling other processes such as the evoluti
on of interest over social networks and the movement of people in cities.
We develop an online algorithm that is able to learn the underlying graph
structure from observations of the signal evolution and derive expressions
for its performance. The algorithm is adaptive in nature and in particula
r able to respond to changes in the graph structure and the perturbation s
tatistics.\nIn order to incorporate prior graphical knowledge to improve c
lassification performance\, we propose a BRAIN strategy for learning\, whi
ch enhances the performance of traditional algorithms\, such as logistic r
egression and SVM learners\, by incorporating a graphical layer that track
s and learns in real-time the underlying correlation structure among featu
re subspaces. In this way\, the algorithm is able to identify salient su
bspaces and their correlations\, while simultaneously dampening the effect
of irrelevant features.\nBiography: Stefan Vlaski is a Ph.D. candidate in
the UCLA Electrical and Computer Engineering Department under the supervi
sion of Professor Ali H. Sayed. He received his B.S. degree from Technic
al University Darmstadt in 2013 and his M.S. degree from UCLA in 2014\, a
nd has interned at Apple Inc. as well as Amazon Lab126. His research interes
ts focus on the development and study of learning algorithms with a partic
ular emphasis on adaptive and distributed solutions.\nFor more information
\, contact Prof. Ali H. Sayed (sayed@ee.ucla.edu)
ATTACH;FMTTYPE=image/jpeg:https://www.ee.ucla.edu/wp-content/uploads/ee/st
efan-1.jpg
CATEGORIES:seminar
LOCATION:E-IV Faraday Room #67-124\, 420 Westwood Plaza - 6th Flr.\, Los An
geles\, CA\, 90095\, United States
GEO:34.0687701;-118.443962
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=420 Westwood Plaza - 6th Fl
r.\, Los Angeles\, CA\, 90095\, United States;X-APPLE-RADIUS=100;X-TITLE=E
-IV Faraday Room #67-124:geo:34.0687701,-118.443962
END:VEVENT
BEGIN:VEVENT
UID:556@ee.ucla.edu
DTSTART;VALUE=DATE:20190915
DTEND;VALUE=DATE:20190921
DTSTAMP:20190306T171818Z
URL:https://www.ee.ucla.edu/events/itqw-2019-infrared-terahertz-quantum-wo
rkshop/
SUMMARY:ITQW 2019 Infrared Terahertz Quantum Workshop
DESCRIPTION:Welcome to ITQW 2019\nITQW is a workshop-style conference that
aims to bring together academic\, government\, and industry scientists in
an intimate venue to encourage close interaction and collaboration. The co
nference will feature a mixture of oral presentations\, poster sessions\,
invited and tutorial presentations.\n\nThe theme of the conference is broa
dly defined as the exploration of novel physical phenomena in quantum- and
electromagnetically-engineered photonic materials in the infrared and ter
ahertz frequency range and the exploitation of these phenomena to create n
ovel optoelectronic devices and applications. The infrared and terahertz f
requency range is particularly interesting for realizing practical devices
based on these design principles owing to relaxed fabrication tolerances\
, low loss of metals\, controllable plasmonic and nonlinear optical proper
ties of semiconductors and 2D materials\, and our ability to engineer inte
rsubband transitions in semiconductor heterostructures.\n\nITQW was former
ly known as Intersubband Transitions in Quantum Wells\, and has been held
every two years since 1991. This year we have renamed it to better reflec
t the evolving scope of topics that fall under the central theme.\n\nFocus
ed Session on Polaritonics and Strong Coupling Phenomena\nOn one day of th
e conference\, we will have a half-to-full day focused session on Polarito
nics and Strong-Coupling Phenomena in the infrared and THz range.\n\nConf
erence Scope\nPhysics of intersubband rela
ted low-dimensional systems\n\n Physics of intersubband transitions and s
cattering mechanisms\n Numerical modeling of intersubband devices\n\nMid-
infrared and THz Sources and Detectors\n\n Quantum cascade lasers\, inter
band cascade lasers\, diode lasers\, nonlinear generation\, frequency comb
s in the mid-infrared and THz spectral regions\n Quantum well infrared ph
otodetectors (QWIPs)\, quantum-dot infrared photodetectors (QDIPs)\, quant
um cascade detectors\, type-II superlattice detectors\, phase sensitive de
tection\, single photon devices\, other novel detectors operating in the m
id-infrared and THz spectral regions\n Heterodyne detection\, novel detec
tion schemes (plasmonic\, 2D materials\, antenna-coupled\, nanodetectors)\
n Sources and detectors employing novel principles (metamaterial lasers\,
parity-time symmetric laser systems\, and others)\n\nCollective effects i
n 2D systems: Strong and Ultra-strong Coupling\n\n Intersubband polaritons
\, ultrastrong light-matter coupling in intersubband systems\, polariton l
asers and condensates\, intersubband plasmons.\n Mid-infrared and THz pla
smonics\, metamaterials\, metasurfaces\, and engineered electromagnetic st
ructures\, particularly as coupled to low-dimensional quantum systems.\n
 Superradiant emission\, Landau quantization\, high B-field physics.\n\nM
id-infrared and THz Materials\n\n III-V semiconductors including III-Nitr
ides and III-Sb\n Two-dimensional materials (graphene\, MoS2\, Black Phos
phorus\, etc.)\n Novel material systems (wide bandgap\, SiGe\, II-VI\,
Si/Ge photonics for QC devices)\n Zero-D and 1-D low dimensional structur
es (quantum dots\, boxes\, nanowires)\n Photonic topological insulators\,
metamaterials and metasurfaces\, and other mesoscopic mid-infrared and TH
z photonic materials (both active and passive).\n\nMid-infrared and THz Ap
plications\n\n Spectroscopy (sensing\, heterodyne receivers\, time resolv
ed spectroscopy with QCLs)\n Imaging (detector arrays\, infrared and THz
imaging\, QCL-based active imaging\, novel mechanisms\, sub-wavelength ima
ging)\n Communications (free-space laser communication\, THz communicatio
ns)\n Frequency metrology (QCL frequency combs\, laser stabilization\, ul
trafast laser characterization)\n\nwww.itqw2019.com
END:VEVENT
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
X-LIC-LOCATION:America/Los_Angeles
BEGIN:DAYLIGHT
DTSTART:20190310T020000
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20191103T020000
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR