Lecture Materials
This book was developed from two courses; lecture materials from recent versions of each are provided here.
Foundations of Data Analysis
This course (UofU Fall 2020) covers most of Chapters 1, 2, 3, 5, 6, 7, 8, and 9 and is taught mainly to college sophomores and juniors. The Lecture Video Playlist is on YouTube; the course was taught over Zoom. The following has links to slides written on iPad during the lectures, and sometimes Colab Python code.
- Lecture 1 : Class Overview w/ Video
- Lecture 2 : Probability Review : Sample Space, Random Variables, Independence (Chap 1-1.2) w/ Video and Code
- Lecture 3 : Probability Review : PDFs, CDFs, Expectation, Variance, Joint and Marginal Distributions (Chap 1.3-1.6) w/ Video
- Lecture 4 : Bayes’ Rule : Maximum Likelihood (Chap 1.7) w/ Video
- Lecture 5 : Bayesian Inference (Chap 1.8) w/ Video
- Lecture 6 : Convergence : Central Limit Theorem and Estimation (Chap 2.1-2.2) w/ Video and Code
- Lecture 7 : Convergence : PAC Algorithms and Concentration of Measure (Chap 2.3) w/ Video
- Lecture 8 : Linear Algebra Review : Vectors, Matrices, Multiplication and Scaling (Chap 3.1-3.2) w/ Video
- Lecture 9 : Linear Algebra Review : Norms, Linear Independence, Rank (Chap 3.3-3.5) w/ Video and Code
- Lecture 10 : Linear Algebra Review : Inverse, Orthogonality (Chap 3.6-3.8) w/ Video
- Lecture 11 : Linear Regression : explanatory & dependent variables (Chap 5.1) w/ Video and Code
- Lecture 12 : Linear Regression : multiple regression, polynomial regression (Chap 5.2-5.3) w/ Video and Code for multiple regression and for polynomial regression
- Lecture 13 : Linear Regression : overfitting and cross-validation (Chap 5.4) w/ Video and Code
- Lecture 14 : Review session
- Lecture 15 : Gradient Descent : functions, minimum, maximum, convexity & gradients (Chap 6.1-6.2) w/ Video
- Lecture 16 : Gradient Descent : algorithms & convergence (Chap 6.3) w/ Video and Code
- Lecture 17 : Gradient Descent : fitting models to data and stochastic gradient descent (Chap 6.4) w/ Video
- Lecture 18 : Dimensionality Reduction : SVD (Chap 7.1-7.2) w/ Video
- Lecture 19 : Dimensionality Reduction : rank-k approximation and eigenvalues (Chap 7.2-7.3) w/ Video and Code
- Lecture 20 : Dimensionality Reduction : power method (Chap 7.4) w/ Video and Code
- Lecture 21 : Dimensionality Reduction : PCA, centering, and MDS (Chap 7.5-7.6) w/ Video and Code for centering and for MDS
- Lecture 22 : Clustering : Voronoi Diagrams + Assignment-based Clustering (Chap 8.1) w/ Video and Code
- Lecture 23 : Clustering : k-means (Chap 8.3) w/ Video
- Lecture 24 : Clustering : EM, Mixture of Gaussians, Mean-Shift (Chap 8.4,8.7) w/ Video
- Lecture 25 : Classification : Linear prediction (Chap 9.1) w/ Video
- Lecture 26 : Classification : Perceptron Algorithm (Chap 9.2) w/ Video
- Lecture 27 : Classification : Kernels and SVMs (Chap 9.3) w/ Video
- Lecture 28 : Classification : KNN, Decision Trees, Neural Nets (Chap 9.5,9.6,9.7) w/ Video
Data Mining
This course (mostly from UofU Spring 2020) covers most of Chapters 4, 5, 7, 8, 10, and 11 and is taught to college seniors and first-year graduate students. It mostly follows this Lecture Playlist on YouTube; some lectures were taught over Zoom. The following has links to slides written on iPad during the lectures.