Seminar - Dr. Narayana Santhanam | Electrical & Computer Engineering

Wednesday, April 28 at 4:30pm

Virtual Event

Title: Data-derived formulations: regularization vs. learning

Dr. Narayana Santhanam, University of Hawaii

Abstract: Regularization is often used to match available training sample sizes to model complexity. As training sample sizes increase, regularization constraints are usually relaxed when choosing the model. A natural question then arises: as the constraints relax, does the selected model keep varying, or is the procedure stable in the sense that, at some point, no further relaxation of the constraints changes the selected model substantially? To understand this, we develop a statistical framework of eventually-almost-sure prediction. Using only samples from a probabilistic model, we predict properties of the model and of future observations. The prediction game continues in an online fashion as the sample size grows with new observations. After each prediction, the predictor incurs a binary (0-1) loss. The probability model underlying a sample is otherwise unknown, except that it belongs to a known class of models. The goal is to make only finitely many errors (i.e., a loss of 1) with probability 1 under the generating model, no matter which member of the known model class it may be.
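For readers who prefer a formal statement, the finitely-many-errors goal described in the abstract can be written compactly as follows. The symbols below (the model class and the per-round loss) are our own shorthand for quantities the abstract describes in words, not notation from the talk.

```latex
% Sketch of the eventually-almost-sure criterion described above.
% \mathcal{P}: the known model class; \ell_n \in \{0,1\}: the loss incurred at round n.
% (Our shorthand, not the speaker's notation.)
\[
  \text{for every } p \in \mathcal{P}: \qquad
  p\!\left( \sum_{n \ge 1} \ell_n < \infty \right) = 1 ,
\]
```

that is, under every model in the class, the predictor makes only finitely many errors, with probability one.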

We characterize the problems that can be predicted with finitely many errors. The characterization is through regularization and answers precisely when regularization eventually settles on a model and when it does not. We also characterize when a universal stopping rule can identify, to any given confidence, the point beyond which no further errors will be made. We specialize these general results to a number of problems, including online classification, entropy prediction, Markov processes, and risk management; this talk will focus on online classification.
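The online game the abstract describes (observe the sample so far, predict, incur a 0-1 loss, repeat) can be sketched in a few lines of code. The predictor and data source below are toy stand-ins of our own and do not achieve the finitely-many-errors guarantee the talk is about; the sketch only illustrates the mechanics of the game.

```python
import random

def online_prediction_game(sample_stream, predictor, rounds=1000):
    """Toy version of the online game in the abstract: after seeing the sample
    so far, the predictor guesses the next observation and incurs a 0-1 loss.
    Returns the cumulative number of errors (losses of 1)."""
    history = []
    errors = 0
    for _ in range(rounds):
        guess = predictor(history)           # prediction based on samples seen so far
        observation = next(sample_stream)    # new sample from the unknown model
        errors += int(guess != observation)  # binary (0-1) loss
        history.append(observation)
    return errors

# A hypothetical predictor: guess the majority symbol seen so far (ties -> 0).
def majority_predictor(history):
    return int(sum(history) > len(history) / 2)

# A hypothetical unknown model: i.i.d. Bernoulli(0.7) bits.
def bernoulli_stream(p=0.7):
    while True:
        yield int(random.random() < p)

if __name__ == "__main__":
    total_errors = online_prediction_game(bernoulli_stream(), majority_predictor)
    print("errors incurred:", total_errors)
```

The question the talk addresses is for which model classes some predictor can take the place of the toy rule above and keep the total error count finite with probability one.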

Bio: Narayana Santhanam is an Associate Professor at the University of Hawaii, with research interests at the intersection of learning theory, statistics, and information theory, and their applications. He obtained his PhD from the University of California, San Diego and held a postdoctoral position at the University of California, Berkeley, before taking up a faculty position at the University of Hawaii. He is currently an Associate Editor of the IEEE Transactions on Information Theory, a member of the Center for Science of Information (an NSF Science and Technology Center), and a recipient of the IEEE IT Society Best Paper Award. Among his current pedagogical priorities is developing a robust data science curriculum, grounded in engineering fundamentals and aimed at electrical engineering students.

Register here: https://lehigh.zoom.us/meeting/register/tJcoce-tpj8sHNSPtEVXU2aMnKT-VIwSVZa0

Topic

Academics & Colleges

Target Audience

Prospective Students, Undergrad Students, Graduate Students, Faculty, Alumni

Department
Electrical and Computer Engineering Department, P.C. Rossin College of Engineering and Applied Science
Contact Information

Jessica Berton Jeb717@lehigh.edu


This event requires registration.