Most recently taught
Course Description: The emerging field of data analytics and, more broadly, data science is transforming engineering, healthcare, scientific discovery, and many industries ranging from agriculture to telecommunications. In this class we discuss how to use data to build models for prediction and inference. Topics: predictive modeling; regression and classification; data cleaning and preprocessing; feature engineering; unsupervised methods; principal component analysis; data clustering; model selection and feature selection; entropy and information theory. We spend substantial time on neural networks and deep learning. Time permitting, we also cover machine learning for signals and time-series data.
The main feature of this class, as the name suggests, is its hands-on approach. We spend significant time working with real, large-scale data sets, competing on Kaggle, and so on. Though there are labs (homeworks), a midterm, and a Kaggle competition, the main deliverable for this class is a group final project.
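To give a taste of that hands-on flavor, here is a minimal sketch of two of the topics listed above, principal component analysis and clustering. This is purely illustrative (scikit-learn on synthetic stand-in data), not actual course material:

```python
# Illustrative only: a quick PCA-then-cluster pipeline of the kind
# the hands-on sessions revolve around (synthetic stand-in data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))   # stand-in for a real data set

X_low = PCA(n_components=2).fit_transform(X)                  # reduce to 2 dimensions
labels = KMeans(n_clusters=3, n_init=10).fit_predict(X_low)   # cluster in the low-dim space
print(labels[:10])
```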
Data Science Lab vs. Data Science Principles: This class necessarily has significant overlap with the Data Science Principles course, since we have worked hard to ensure that the two can be taken in either order. Both classes cover regression, classification, decision trees, bagging, and boosting. This class focuses more on hands-on issues and challenges, whereas the Principles course is more focused on analysis and derivations. This course also has a significantly more pronounced focus on neural networks.
Some of the amazing final projects students have done in the past can be found here.
If you are an undergraduate interested in ML and possibly ML research, I encourage you to check out the Machine Learning and Data Science Club (MLDS) at UT.
Most recently taught
Course Description: This is intended to be a second course in optimization, pitched at advanced graduate students who have already taken a first course (covering topics such as duality and formulations) and who are interested in using optimization in their research. The course focuses on the details of algorithms and their analysis for many different problems in convex optimization. The main topics covered are as follows; note that lecture time and emphasis are not divided evenly among these topics:
You can find a full record of this class at this YouTube link.
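To convey the flavor of the course, here is a minimal sketch of the canonical warm-up in this area: gradient descent with step size 1/L on a smooth, strongly convex quadratic. This is my own illustration of the kind of algorithm such a course analyzes, not course code:

```python
# Illustrative only: gradient descent on f(x) = 0.5 x^T A x - b^T x,
# the standard warm-up example when analyzing first-order methods.
import numpy as np

rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 5))
A = Q @ Q.T + np.eye(5)           # symmetric positive definite
b = rng.normal(size=5)

L = np.linalg.eigvalsh(A).max()   # smoothness constant (largest eigenvalue)
x = np.zeros(5)
for _ in range(200):
    x -= (1.0 / L) * (A @ x - b)  # step size 1/L guarantees convergence

print(np.linalg.norm(A @ x - b))  # gradient norm near zero: x ≈ A^{-1} b
```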
Most recently taught
Course Description: This course is intended to be an advanced graduate course for students who have significant mathematical maturity and an interest in optimization. Though we go through LP duality from the beginning, prior exposure to linear programming and convex optimization is greatly beneficial.
The focus is on classical problems in polyhedral combinatorial optimization, as well as basic results in linear programming: the basic geometry of polytopes, LP duality, the primal-dual framework, and the ellipsoid algorithm. After that, we focus on matching, matroids, and submodular optimization, with a variety of other problems and results along the way. We conclude with a series of lectures on extension complexity and lower bounds arising from communication complexity.
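As one concrete taste of the later material, here is the classic greedy algorithm for monotone submodular maximization, instantiated as max coverage, where it achieves the well-known (1 - 1/e) approximation guarantee. This is my own illustrative sketch, not course code:

```python
# Illustrative only: greedy selection for max coverage, a canonical
# example of monotone submodular maximization.

def greedy_max_coverage(sets, k):
    """Pick k sets greedily, each time maximizing the marginal gain."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(sets, key=lambda s: len(s - covered))  # largest marginal gain
        chosen.append(best)
        covered |= best
    return chosen, covered

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 7}]
chosen, covered = greedy_max_coverage(sets, k=2)
print(len(covered))  # 7: greedy picks {4, 5, 6, 7}, then {1, 2, 3}
```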
The course is intended to be self-contained, and there is no required textbook. The material for the course is drawn from many sources, but chief among them are the following four textbooks:
You can find a full record of this class at this YouTube link.
I teach two courses in the online MS program in CS and AI. These are aligned with the courses Optimization I and Optimization II described above.
I teach two courses as part of the Master’s in IT and Management program (MSITM) at McCombs Business School.
This class teaches fundamental concepts in machine learning and data science, with a view towards applications in business, blending intuition, business applicability and impact, and technical rigor.
This class builds on the first semester. We focus on the modern tools and applications of machine learning, developing a working knowledge of neural networks using PyTorch, including convolutional neural networks and transformers. We also spend considerable time discussing foundation models, their use, and their potential impact.
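As an illustration of the level of working PyTorch fluency the course targets, here is a minimal convolutional network. This is a sketch of mine, not actual course material:

```python
# Illustrative only: a tiny CNN for 28x28 grayscale images.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
        )
        self.head = nn.Linear(16 * 14 * 14, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = TinyCNN()
out = model(torch.randn(8, 1, 28, 28))  # a batch of 8 fake images
print(out.shape)                        # torch.Size([8, 10])
```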