When: Wednesday, July 24th, 10:00am *Updated
Where: HDSI 123 *Updated
Zoom Info: https://ucsd.zoom.us/j/99334315002 *Updated
Title: What Kinds of Functions Do Neural Networks Learn? Theory and Practical Applications
Abstract: This talk presents a theory characterizing the types of functions neural networks learn from data. Specifically, the function space generated by deep ReLU networks consists of compositions of functions from the Banach space of second-order bounded variation in the Radon transform domain. This Banach space includes functions with smooth projections in most directions. A representer theorem associated with this space demonstrates that finite-width neural networks suffice for fitting finite datasets. The theory has several practical applications. First, it provides a simple and theoretically grounded method for network compression. Second, it shows that multi-task training can yield solutions significantly different from those of single-task training, and that multi-task solutions can be related to kernel ridge regressions. Third, the theory has implications for improving implicit neural representations, where multi-layer neural networks are used to represent continuous signals, images, or 3D scenes. This exploration bridges theoretical insights with practical advancements, offering a new perspective on neural network capabilities and future research directions.
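To make the representer-theorem claim concrete, here is a minimal sketch of the shallow (single-hidden-layer) case in the variational setup associated with the second-order bounded-variation space; the loss \(\ell\), the regularization weight \(\lambda\), and the norm notation below are illustrative assumptions, not the talk's exact statement. Training on data \((x_i, y_i)_{i=1}^{N}\) is posed as

\[
\min_{f \in \mathrm{RBV}^2(\mathbb{R}^d)} \;\; \sum_{i=1}^{N} \ell\bigl(y_i, f(x_i)\bigr) \;+\; \lambda\,\|f\|_{\mathrm{RBV}^2},
\]

and a representer theorem of this kind guarantees a solution that is a finite-width ReLU network plus an affine term,

\[
f(x) \;=\; \sum_{k=1}^{K} v_k\,\rho\bigl(w_k^{\top} x - b_k\bigr) \;+\; c^{\top} x + c_0,
\qquad \rho(t) = \max\{0, t\},
\]

with width \(K\) bounded by the number of data points \(N\). This is the sense in which finite-width networks suffice for fitting finite datasets; the deep case composes functions of this type.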
Bio: Robert Nowak is the Grace Wahba Professor of Data Science and Keith and Jane Nosbusch Professor in Electrical and Computer Engineering at the University of Wisconsin-Madison. His research focuses on machine learning, optimization, and signal processing. He serves on the editorial boards of the SIAM Journal on the Mathematics of Data Science and the IEEE Journal on Selected Areas in Information Theory.