document.write("<style type=\"text/css\" media=\"screen\"> #lw { width:100%;background-color:white;font-family:arial;overflow:hidden;} #lw ul {list-style:none;} #lw li.lwe {padding:10px;border-bottom: 1px dotted #ccc;} .lwn {font-weight:bold;color:#222;font-size:16px;line-height:19px;overflow:hidden;} .lwn0 {font-weight:normal;float:right;font-size:14px;color:#444;} .lwd {padding-top: 5px;color:#222;font-size:12px;line-height:15px;overflow:hidden;} .lwi0 {float:left;padding:2px 6px 0px 0px;margin:0;} .lwi1 {float:left;padding:0px 6px 0px 0px;margin:0;} .lwl{ font-weight:bold;padding-top:5px;color:#222;font-size:12px;line-height:14px;overflow:hidden;clear:both;} .lwl0 {font-weight:normal;padding-right:3px;} .lwi { padding:2px; border: 1px solid #ccc; } </style> <div id=lw> <div class=lw> <ul id=lwe> <li class=lwe> <div class=lwn> <span class=lwn0> Tuesday, April 20, 2021 1pm </span> <a href=\"https://calendar.mit.edu/event/thesis_defense_-_alexey_balitskiy?utm_campaign=widget&utm_medium=widget&utm_source=MIT+Events+\">Thesis Defense - Alexey Balitskiy</a> </div> <div class=lwd><div class=\"description\"><p><b>Speaker: </b>Alexey Balitskiy</p> <p><b>Title:</b> Bounds on Urysohn width</p> <p><b>Zoom Link:</b> <a href=\"https://mit.zoom.us/j/4206650769\">https://mit.zoom.us/j/4206650769</a></p> </div></div> </li> <li class=lwe> <div class=lwn> <span class=lwn0> Wednesday, April 21, 2021 4:30pm </span> <a href=\"https://calendar.mit.edu/event/mit_lie_groups_seminar_20210421?utm_campaign=widget&utm_medium=widget&utm_source=MIT+Events+\">MIT Lie Groups Seminar</a> </div> <div class=lwd><div class=\"description\"><p><span style=\"font-size:12pt\"><span><span><b>Speaker:</b> <a href=\"http://math.umn.edu/directory/tsao-hsien-chen-0\">Tsao-Hsien Chen</a>, <a href=\"http://cse.umn.edu/\">University of Minnesota</a></span></span></span></p> <p><span style=\"font-size:12pt\"><span><span><b>Title: </b>Hitchin fibration and commuting schemes</span></span></span></p> <p><span style=\"font-size:12pt\"><span><span><b>Abstract:</b> The commuting scheme has always been of great interest in invariant theory but it was only recent that it appears as a primordial object in the study of the Hitchin fibration for higher dimensional varieties. I will explain how the invariant theory for the commuting scheme, in particular the Chevalley restriction theorem for the commuting scheme, is used in the study of Hitchin fibration and the proof of the Chevalley restriction theorem in the case of symplectic Lie algebras. The talk is based on joint work with Ngo Bao Chau.</span></span></span></p> </div></div> </li> <li class=lwe> <div class=lwn> <span class=lwn0> Wednesday, April 21, 2021 4:30pm </span> <a href=\"https://calendar.mit.edu/event/seminar_numerical_methods_for_partial_differential_equations?utm_campaign=widget&utm_medium=widget&utm_source=MIT+Events+\">Seminar: Numerical Methods for Partial Differential Equations</a> </div> <div class=lwd><div class=\"description\"><p><b>Name</b>:&nbsp; Ben Adcock&nbsp; (Simon Fraser University)</p> <p><b>Title</b>:&nbsp; Approximation of high-dimensional functions via sparse polynomials and deep neural networks, with application to parametric PDEs</p> <p><b>Abstract</b>:&nbsp;</p> <p>Driven by its various applications in scientific computing &ndash; in particular, the solution of parametric and stochastic DEs arising in uncertainty quantification &ndash; the approximation of smooth, high-dimensional functions via sparse polynomial expansions has received significant attention in the last decade. In the first part of this talk I will give a brief survey of recent progress in this area. In particular, I will show how the proper use of compressed sensing tools leads to algorithms for high-dimensional approximation which, unlike other approaches, provably possess near-optimal error bounds and moderate sample complexities. In particular, these techniques mitigate the curse of dimensionality to a substantial degree. The second part of the talk is devoted to most recent approaches based on deep neural networks and deep learning. Such tools are currently garnering substantial attention in the scientific computing community. Nonetheless, I will show evidence of a gap between current theory and practice. I will then discuss recent theoretical contributions showing that deep neural networks matching the performance of best-in-class schemes can be computed. This highlights the potential of deep neural networks, and sheds light on achieving robust, reliable and overall improved practical performance.</p> <p style=\"text-align:center\"><span style=\"font-size:12pt\"><span><a href=\"https://mit.zoom.us/j/93669870341\" style=\"text-decoration:underline\"><b>https://mit.zoom.us/j/93669870341</b></a></span></span></p> <p style=\"text-align:center\"><span style=\"font-size:12pt\"><span><b>Meeting ID: 936 6987 0341</b></span></span></p> </div></div> </li> <li class=lwe> <div class=lwn> <span class=lwn0> Thursday, April 22, 2021 10am </span> <a href=\"https://calendar.mit.edu/event/thesis_defense_-_paxton_turner?utm_campaign=widget&utm_medium=widget&utm_source=MIT+Events+\">Thesis Defense - Paxton Turner</a> </div> <div class=lwd><div class=\"description\"><p><b>Speaker: </b>Paxton Turner</p> <p><b>Title:</b>&nbsp;Combinatorial methods in statistics</p> <p><b>Zoom Link:&nbsp;</b><a href=\"https://mit.zoom.us/j/99191136897\">https://mit.zoom.us/j/99191136897</a></p> </div></div> </li> <li class=lwe> <div class=lwn> <span class=lwn0> Thursday, April 22, 2021 2pm </span> <a href=\"https://calendar.mit.edu/event/thesis_defense_-_vishal_patil?utm_campaign=widget&utm_medium=widget&utm_source=MIT+Events+\">Thesis Defense - Vishal Patil</a> </div> <div class=lwd><div class=\"description\"><p><b>Speaker:</b>&nbsp;&nbsp;Vishal Patil</p> <p><b>Title:&nbsp;</b>Geometry, topology and mechanics of twisted elastic fibers</p> <p><b>Zoom Link:</b>&nbsp;&nbsp;<a href=\"https://mit.zoom.us/j/99259649777\">https://mit.zoom.us/j/99259649777</a></p> </div></div> </li> </ul> </div> </div>");
