Teaching

I teach classes in ECE, Computer Science, and the McCombs School of Business.

Undergraduate Teaching in ECE

Data Science Lab

Most recently taught

Course Description: The emerging field of data analytics, and more broadly data science, is transforming engineering, healthcare, scientific discovery, and many industries ranging from agriculture to telecommunications. In this class we discuss how to use data to build models for prediction and inference. Topics include: predictive modeling; regression and classification; data cleaning and preprocessing; feature engineering; unsupervised methods; principal component analysis; data clustering; model selection and feature selection; entropy and information theory. We spend substantial time on neural networks and deep learning. Time permitting, we also cover machine learning for signals and time-series data.

The main feature of this class, as the name suggests, is its hands-on approach. We spend significant time working with real, large-scale data sets, competing on Kaggle, and so on. Though there are labs (homework assignments), a midterm, and a Kaggle competition, the main deliverable for this class is a group final project.
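
To give a flavor of the hands-on workflow, here is a minimal sketch in Python: load a tabular data set, split it, fit a model, and evaluate. The data set and model choice are illustrative assumptions, not the course's actual materials.

    # Illustrative only: a typical lab-style workflow, not course code.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Load a small tabular data set and hold out 20% for testing.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    # Fit a baseline classifier and report held-out accuracy.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))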

Data Science Lab vs. Data Science Principles: This class necessarily has significant overlap with the Data Science Principles course, since we have worked hard to ensure that the two can be taken in either order. Both classes cover topics like regression, classification, decision trees, bagging, and boosting. This class focuses more on hands-on issues and challenges, whereas the Principles course is more focused on analysis and derivations. This course also has a significantly stronger focus on neural networks.

Some of the amazing final projects students have done in the past can be found here.

If you are an undergraduate interested in ML, and possibly in ML research, I encourage you to check out the Machine Learning and Data Science Club (MLDS) at UT.


Graduate Teaching in ECE

Optimization I

Most recently taught

Optimization II: Algorithms for Large Scale Convex Optimization

Most recently taught

Course Description: This is intended to be a second course in optimization, pitched at advanced graduate students who have already taken a first course (covering topics like duality and formulations) and are interested in using optimization in their research. The course focuses on the details of algorithms, and their analysis, for many different problems in convex optimization. The main topics covered are listed below; note that lecture time and emphasis are not divided evenly across them:

  1. Convex sets, convex functions, and basic definitions. Optimality conditions for constrained, possibly non-differentiable convex problems.
  2. Gradient and subgradient descent. Convergence rates for convex functions, for smooth convex functions, and for smooth, strongly convex functions.
  3. Oracle lower bounds and accelerated methods.
  4. Proximal gradient. ISTA and FISTA (see the sketch after this list).
  5. Mirror descent.
  6. Frank-Wolfe and conditional gradient.
  7. Stochastic methods. SVRG.
  8. Newton and quasi-Newton methods.
  9. Interior point methods.
  10. Legendre-Fenchel duality.
  11. Dual decomposition algorithms: the proximal point algorithm, proximal gradient in the dual, and the augmented Lagrangian method.
  12. Monotone, contractive, non-expansive, and firmly non-expansive operators.
  13. Operator splitting, Douglas-Rachford, and ADMM.
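
As a taste of item 4, here is a minimal sketch of proximal gradient descent (ISTA) applied to the lasso. The problem instance and step-size choice below are illustrative assumptions, not the course's own code.

    # Illustrative sketch of ISTA for the lasso:
    #   minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1
    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(A, b, lam, n_iters=500):
        x = np.zeros(A.shape[1])
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, with L = ||A||_2^2
        for _ in range(n_iters):
            grad = A.T @ (A @ x - b)            # gradient of the smooth part
            x = soft_threshold(x - step * grad, step * lam)  # prox step
        return x

    # Recover a sparse vector from noisy linear measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    print(np.round(ista(A, b, lam=0.1)[:8], 3))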

You can find a full record of this class at this YouTube link.

Combinatorial Optimization

Most recently taught

Course Description: This course is intended to be an advanced graduate course for students who have significant mathematical maturity and an interest in optimization. Though we develop LP duality from the beginning, prior exposure to linear programming and convex optimization is greatly beneficial.

The focus is on some classical problems in polyhedral combinatorial optimization, as well as basic results in linear programming: the geometry of polytopes, LP duality, the primal-dual framework, and the ellipsoid algorithm. After that, we focus on matching, matroids, and submodular optimization, with a variety of other problems and results thrown in along the way. We conclude with a series of lectures on extension complexity and lower bounds coming from communication complexity.
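
For quick reference, the primal-dual pair at the heart of the LP portion of the course can be written (in one standard form; the course's own notation may differ) as:

    \begin{aligned}
    &\text{(primal)} && \min_{x}\ c^\top x \quad \text{s.t. } Ax \ge b,\ x \ge 0,\\
    &\text{(dual)}   && \max_{y}\ b^\top y \quad \text{s.t. } A^\top y \le c,\ y \ge 0.
    \end{aligned}

Weak duality, b^T y <= c^T x for any feasible pair, follows by chaining b^T y <= (Ax)^T y = x^T (A^T y) <= c^T x; strong duality says the two optima coincide whenever either problem is feasible and bounded.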

The course is intended to be self-contained, and there is no required textbook. The material is drawn from many sources, but primarily from the following four textbooks:

  1. Combinatorial Optimization, by Papadimitriou and Steiglitz,
  2. Geometric Algorithms and Combinatorial Optimization, by Grötschel, Lovász, and Schrijver,
  3. Theory of Linear and Integer Programming, by Schrijver,
  4. Combinatorial Optimization, by Schrijver.

You can find a full record of this class at this YouTube link.


Graduate Teaching in CS

I teach two courses in the online MS program in CS and AI. These are aligned with the Optimization I and Optimization II courses described above.


Teaching in McCombs: Business Data Science and Introduction to Deep Learning

I teach two courses as part of the Master’s in IT and Management (MSITM) program at the McCombs School of Business.

The first course, Business Data Science, teaches fundamental concepts in machine learning and data science, with a view towards applications in business, blending intuition, applicability and business impact, and technical rigor.

The second course, Introduction to Deep Learning, builds on the first semester. We focus on the modern tools and applications of machine learning, developing a working knowledge of neural networks using PyTorch, including convolutional neural networks and transformers. We also spend considerable time discussing foundation models, their use, and their potential impact.
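
To give a sense of the level, here is a minimal sketch of the kind of PyTorch model the course builds up to: a small convolutional network for 28x28 grayscale images. The architecture is an illustrative assumption, not the course's actual code.

    # Illustrative only: a tiny CNN in PyTorch, not course code.
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        def __init__(self, n_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),  # 28x28 -> 14x14
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),  # 14x14 -> 7x7
            )
            self.classifier = nn.Linear(32 * 7 * 7, n_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = SmallCNN()
    logits = model(torch.randn(8, 1, 28, 28))  # a batch of 8 fake images
    print(logits.shape)                        # torch.Size([8, 10])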