Covers vector properties, magnitudes, and algebraic operations, including addition and scalar multiplication. Introduces matrix representations, matrix arithmetic, and computational techniques for solving linear systems.
A grading rubric for the "Data Containers" project. It evaluates students on conceptual abstraction of n-dimensional vectors, accuracy in modeling linear combinations, computational precision, and the depth of their critical reflection on linear algebra's role in data science.
A student worksheet for an undergraduate linear algebra lesson focusing on vectors as data containers. It includes a spreadsheet hook, video-guided notes on n-dimensional vectors, and a "Nutrition Ledger" project where students apply linear combinations to real-world data.
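The "linear combinations on real-world data" idea behind the Nutrition Ledger can be sketched as follows; the foods, nutrient columns, and serving counts below are illustrative stand-ins, not the worksheet's actual data.

```python
import numpy as np

# Hypothetical per-serving nutrient vectors: (calories, protein g, carbs g).
oats   = np.array([150.0, 5.0, 27.0])
milk   = np.array([103.0, 8.0, 12.0])
banana = np.array([105.0, 1.3, 27.0])

# Servings act as scalar weights; the meal total is a linear combination.
servings = np.array([1.0, 2.0, 0.5])
meal_total = servings[0] * oats + servings[1] * milk + servings[2] * banana

# Equivalently, stack the food vectors as matrix columns and multiply.
pantry = np.column_stack([oats, milk, banana])
assert np.allclose(pantry @ servings, meal_total)
```

The matrix-vector form makes the "data container" point concrete: each column is one food, and matrix multiplication is exactly the weighted sum of columns.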
A conceptual exit ticket for graduate students to synthesize their learning across the entire sequence, focusing on the differences between analytical and stochastic optimization and the geometry of high-dimensional landscapes.
Seminar-style discussion cards for graduate students to explore the nuances of Stochastic Gradient Descent, the role of noise, and the geometric challenges of high-dimensional optimization.
A slide deck exploring Stochastic Gradient Descent (SGD) and its role in navigating high-dimensional, non-convex landscapes in machine learning. Covers mini-batching, noise, and convergence.
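A minimal sketch of the mini-batch SGD loop the slide deck describes, on a synthetic least-squares problem; the data, hyperparameters, and function names are illustrative assumptions, not taken from the deck.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = X @ w_true + small noise.
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=n)

def sgd(X, y, lr=0.1, batch_size=20, epochs=50):
    """Mini-batch SGD on mean squared error.

    The gradient noise the deck discusses comes from estimating the full
    gradient on a random batch rather than the whole dataset.
    """
    w = np.zeros(X.shape[1])
    idx = np.arange(len(y))
    for _ in range(epochs):
        rng.shuffle(idx)
        for start in range(0, len(y), batch_size):
            b = idx[start:start + batch_size]
            grad = (2.0 / len(b)) * X[b].T @ (X[b] @ w - y[b])
            w -= lr * grad
    return w

w_hat = sgd(X, y)
```

On this convex toy problem the noisy iterates still settle near `w_true`; the non-convex, high-dimensional behavior in the deck uses the same loop with a harder loss.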
A teacher-facing guide for the lesson on Lagrange multipliers, including instructional goals, full solutions for the resource allocation problem, and notes on the interpretation of shadow prices.
A student worksheet for practicing constrained optimization using the method of Lagrange multipliers. Features a Cobb-Douglas production problem and questions on the interpretation of the multiplier.
A student face-compression project for Lesson 5, focusing on eigenvalue interpretation, scree plots, and a synthesis of dimensionality reduction.
A slide deck explaining constrained optimization using Lagrange multipliers. Covers the geometric interpretation of aligned gradients, the Lagrangian function, and the meaning of the multiplier lambda.
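The aligned-gradients condition and the shadow-price reading of the multiplier can be checked numerically on a toy problem; the objective and constraint below are illustrative choices, not the worksheet's Cobb-Douglas problem.

```python
# Toy constrained problem: maximize f(x, y) = x*y subject to x + y = 10.
f = lambda x, y: x * y
g = lambda x, y: x + y - 10

# Setting grad f = lam * grad g gives (y, x) = lam * (1, 1), so x = y = lam;
# the constraint then forces x = y = 5 and lam = 5.
x_star, y_star, lam = 5.0, 5.0, 5.0

# Shadow price: relaxing the constraint to x + y = 10 + eps raises the
# optimal value by approximately lam * eps.
eps = 1e-4
relaxed_opt = ((10 + eps) / 2) ** 2   # optimum of x*y on x + y = 10 + eps
assert abs((relaxed_opt - f(x_star, y_star)) / eps - lam) < 1e-3
```

The finite-difference check at the end is exactly the "meaning of lambda" interpretation: the multiplier is the rate at which the optimal value responds to loosening the constraint.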
Slide deck for Lesson 5 introducing eigenvectors, linear transformations, and Principal Component Analysis (PCA) for dimensionality reduction.
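The eigenvector-to-PCA connection in the Lesson 5 materials can be sketched in a few lines; the synthetic correlated data and the 0.95 variance threshold below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated 2-D data: most variance lies along one direction.
t = rng.normal(size=300)
data = np.column_stack([t, 0.5 * t + 0.05 * rng.normal(size=300)])

# PCA via the eigendecomposition of the sample covariance matrix.
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (len(data) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
top_component = eigvecs[:, -1]           # direction of maximum variance

# The scree-plot story: the leading eigenvalue dominates, so one
# component captures nearly all the variance.
explained = eigvals[-1] / eigvals.sum()
assert explained > 0.95
```

Projecting the data onto `top_component` is the one-dimensional compression step; the same recipe with image rows as data points underlies the face-compression project.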
Teacher facilitation guide for Lesson 5, focusing on the instructional narrative of eigenvectors, linear transformations, and Principal Component Analysis (PCA).
A comprehensive reference sheet for graduate students summarizing key optimization algorithms, including Vanilla Gradient Descent, Momentum, Nesterov Accelerated Gradient, and AdaGrad. Includes a troubleshooting table for common convergence issues.
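The update rules the reference sheet summarizes can be written out side by side; the hyperparameters and the 1-D quadratic test function here are illustrative defaults, not the sheet's.

```python
import numpy as np

# One step of each rule on f(w) = 0.5 * w**2, whose gradient is w.
grad = lambda w: w
w0, lr = 4.0, 0.1

# Vanilla gradient descent: step against the gradient.
w_gd = w0 - lr * grad(w0)

# Momentum: a velocity term accumulates a decaying average of gradients.
v, mu = 0.0, 0.9
v = mu * v + grad(w0)
w_mom = w0 - lr * v

# AdaGrad: a per-parameter cache of squared gradients shrinks the
# effective step size on frequently-updated coordinates.
cache, eps = 0.0, 1e-8
cache += grad(w0) ** 2
w_ada = w0 - lr * grad(w0) / (np.sqrt(cache) + eps)
```

Nesterov Accelerated Gradient follows the same pattern as momentum but evaluates the gradient at the "look-ahead" point `w0 - lr * mu * v` before updating the velocity.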
Student signal decomposition worksheet for Lesson 4, involving manual Gram-Schmidt calculations in R^3 and critical thinking on linear independence.
An implementation-focused worksheet where students write gradient descent pseudocode and manually trace the algorithm's path for a simple quadratic function to observe convergence behavior.
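A minimal version of the trace exercise, assuming the quadratic is a simple one-dimensional `x**2` (the worksheet's actual function may differ):

```python
# Tracing gradient descent on f(x) = x**2; the gradient is 2x, so each
# step multiplies the iterate by (1 - 2*lr), contracting toward x = 0.
def grad_descent(x0, lr, steps):
    path = [x0]
    x = x0
    for _ in range(steps):
        x = x - lr * 2 * x
        path.append(x)
    return path

path = grad_descent(x0=10.0, lr=0.1, steps=5)
# With lr = 0.1 each step shrinks x by a factor of 0.8:
# 10.0, 8.0, 6.4, 5.12, 4.096, 3.2768
```

Rerunning with `lr = 1.0` or larger makes the factor `1 - 2*lr` exceed 1 in magnitude and the trace diverges, which is the convergence-behavior observation the worksheet targets.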
Slide deck for Lesson 4 introducing orthogonal projections, the Gram-Schmidt algorithm, and the significance of orthonormal bases in numerical computing.
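The manual R^3 calculations in the worksheet follow the same steps as this sketch of classical Gram-Schmidt; the three input vectors are a hand-checkable example, not necessarily the worksheet's.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: subtract each vector's projections onto the
    basis vectors found so far, then normalize the remainder.
    Assumes the inputs are linearly independent."""
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])

assert np.allclose(Q @ Q.T, np.eye(3))   # rows form an orthonormal basis
```

The final assertion is the payoff of an orthonormal basis: `Q @ Q.T` is the identity, so coordinates in that basis are read off by dot products alone.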
A slide deck introducing gradient descent, learning rate tuning, momentum, and convergence criteria. Designed for graduate students focusing on numerical optimization.
Teacher facilitation guide for Lesson 4, focusing on the instructional narrative of orthogonal projections, the Gram-Schmidt process, and orthonormal bases.
An inquiry-based student activity for Lesson 3, focusing on volume ratios, the interpretation of distance distributions, and high-dimensional orthogonality.
A teacher's solution key for the Point Classification Lab, including full derivations of eigenvalues, critical point classifications, and a high-dimensional statistical discussion for graduate students.
Slide deck for Lesson 3 introducing the counter-intuitive geometric properties of high-dimensional spaces, including volume concentration and distance indistinguishability.
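The distance-indistinguishability phenomenon in the Lesson 3 materials can be demonstrated with a short Monte Carlo experiment; the sample sizes, dimensions, and the spread statistic below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def relative_distance_spread(dim, n_points=2000):
    """Spread of distances from the origin, (max - min) / mean, for points
    drawn uniformly from [0, 1]^dim. It shrinks as the dimension grows."""
    pts = rng.random((n_points, dim))
    d = np.linalg.norm(pts, axis=1)
    return (d.max() - d.min()) / d.mean()

low_dim = relative_distance_spread(2)
high_dim = relative_distance_spread(1000)

# In high dimensions, distances concentrate around their mean, so
# "nearest" and "farthest" points become nearly indistinguishable.
assert high_dim < low_dim
```

The same concentration effect drives the volume results in the deck: almost all of a high-dimensional cube's volume sits near its corners, far from the inscribed sphere.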