Vector properties, magnitudes, and algebraic operations including addition and scalar multiplication. Introduces matrix representations, arithmetic, and computational techniques for solving linear systems.
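These operations can be sketched in a few lines of Python; the helper names and the 2×2 system below are illustrative, not from the course materials:

```python
# Illustrative helpers for basic vector algebra and a small linear solve.
def vec_add(u, v):
    return [a + b for a, b in zip(u, v)]

def scalar_mul(c, v):
    return [c * a for a in v]

def solve_2x2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ [x, y] = [e, f] by Cramer's rule."""
    det = a * d - b * c
    return ((e * d - b * f) / det, (a * f - e * c) / det)

print(vec_add([1, 2], [3, 4]))       # [4, 6]
print(scalar_mul(2, [1, 2, 3]))      # [2, 4, 6]
print(solve_2x2(2, 1, 1, 3, 5, 10))  # (1.0, 3.0)
```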
An undergraduate-level introduction to visualizing vectors as multi-dimensional data containers. Students explore real-world applications from catering costs to nutritional data, shifting the perspective of vectors from geometric arrows to abstract data structures.
Exploring Stochastic Gradient Descent (SGD) and its role in navigating high-dimensional, non-convex landscapes in machine learning.
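A minimal SGD sketch on a one-parameter least-squares fit; the data, learning rate, and step count are illustrative choices, not prescriptions from the course:

```python
import random

# Fit y = w*x by least squares, updating on one random sample per step.
random.seed(0)
data = [(x, 3.0 * x) for x in range(1, 6)]  # true slope is 3

w = 0.0
lr = 0.02
for step in range(2000):
    x, y = random.choice(data)      # stochastic: a single sample
    grad = 2 * (w * x - y) * x      # d/dw of (w*x - y)^2
    w -= lr * grad

print(round(w, 3))  # converges near 3.0
```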
Solving optimization problems under constraints using the method of Lagrange multipliers, focusing on the alignment of gradient vectors.
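A tiny worked instance of the gradient-alignment condition; the objective and constraint here are illustrative: maximize f(x, y) = xy subject to x + y = 10, whose stationary point is x = y = 5 with multiplier λ = 5.

```python
# Lagrange condition: grad f = lam * grad g at the constrained optimum.
# Here grad f = (y, x) and grad g = (1, 1), so y = x = lam, and the
# constraint x + y = 10 gives x = y = lam = 5.
def grad_f(x, y):
    return (y, x)

def grad_g(x, y):
    return (1.0, 1.0)

x, y, lam = 5.0, 5.0, 5.0
gf, gg = grad_f(x, y), grad_g(x, y)
aligned = gf == (lam * gg[0], lam * gg[1])
print(aligned, x + y - 10 == 0)  # True True
```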
Implementation of iterative numerical methods, focusing on the geometry of convergence, learning rates, and momentum in gradient descent.
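A bare-bones sketch of gradient descent with momentum on a 1-D quadratic; the learning rate and momentum coefficient are illustrative values:

```python
# Gradient descent with momentum on f(x) = (x - 4)^2.
def grad(x):
    return 2 * (x - 4)

x, v = 0.0, 0.0
lr, beta = 0.1, 0.9
for _ in range(1000):
    v = beta * v - lr * grad(x)   # momentum accumulates past gradients
    x = x + v

print(round(x, 4))  # converges to the minimizer, 4.0
```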
An examination of second-order derivatives via the Hessian matrix to understand surface curvature and classify critical points using eigenvalues.
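For a 2×2 Hessian the eigenvalue classification can be done in closed form; a sketch, with f(x, y) = x² − y² at the origin as the illustrative test case:

```python
import math

# Classify a critical point from the eigenvalues of a symmetric 2x2 Hessian.
def hessian_eigs(a, b, c):
    """Eigenvalues of [[a, b], [b, c]]."""
    mean = (a + c) / 2
    r = math.sqrt(((a - c) / 2) ** 2 + b * b)
    return mean - r, mean + r

def classify(a, b, c):
    lo, hi = hessian_eigs(a, b, c)
    if lo > 0:
        return "local minimum"
    if hi < 0:
        return "local maximum"
    if lo < 0 < hi:
        return "saddle point"
    return "degenerate"

# f(x, y) = x^2 - y^2 at the origin has Hessian [[2, 0], [0, -2]].
print(classify(2, 0, -2))  # saddle point
```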
Students analyze the gradient vector as a directional quantity, establishing its geometric relationship with level sets and proving that it points in the direction of steepest ascent.
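The steepest-ascent claim can be checked numerically: sweep unit directions and compare the directional derivatives against the one along the gradient. The function f(x, y) = x² + 3y² and the sample point are illustrative:

```python
import math

# Finite-difference check that the gradient direction maximizes the
# directional derivative of f(x, y) = x^2 + 3*y^2 at a sample point.
def f(x, y):
    return x * x + 3 * y * y

def directional_deriv(x, y, ux, uy, h=1e-6):
    return (f(x + h * ux, y + h * uy) - f(x, y)) / h

x0, y0 = 1.0, 1.0
gx, gy = 2 * x0, 6 * y0               # analytic gradient: (2, 6)
gnorm = math.hypot(gx, gy)

best = max(
    directional_deriv(x0, y0, math.cos(t), math.sin(t))
    for t in [i * 2 * math.pi / 360 for i in range(360)]
)
along_grad = directional_deriv(x0, y0, gx / gnorm, gy / gnorm)
print(abs(best - along_grad) < 1e-3)  # True: no sampled direction beats it
```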
Identify eigenvectors as the invariant directions of linear operators. Students apply these concepts to Principal Component Analysis (PCA) to extract the most significant 'directions of variance' in high-dimensional datasets.
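A PCA sketch without any library dependencies: power iteration on the 2×2 covariance matrix of synthetic points scattered along the line y = 2x. The dataset and iteration count are illustrative:

```python
import math, random

# Find the leading 'direction of variance' of 2-D points by power
# iteration on their covariance matrix.
random.seed(1)
pts = [(t + random.gauss(0, 0.1), 2 * t + random.gauss(0, 0.1))
       for t in [i / 50 for i in range(-50, 51)]]

n = len(pts)
mx = sum(p[0] for p in pts) / n
my = sum(p[1] for p in pts) / n
# Entries of the 2x2 covariance matrix.
cxx = sum((p[0] - mx) ** 2 for p in pts) / n
cyy = sum((p[1] - my) ** 2 for p in pts) / n
cxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / n

vx, vy = 1.0, 0.0
for _ in range(100):                  # power iteration
    wx, wy = cxx * vx + cxy * vy, cxy * vx + cyy * vy
    s = math.hypot(wx, wy)
    vx, vy = wx / s, wy / s

print(round(vy / vx, 1))  # ≈ 2.0: the data varies mostly along (1, 2)
```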
Master the decomposition of vectors into orthogonal components. Students implement the Gram-Schmidt process to build orthonormal bases, providing the foundation for numerical stability and signal processing.
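A compact sketch of classical Gram-Schmidt on a pair of illustrative 3-D vectors, verifying orthonormality of the result:

```python
import math

# Classical Gram-Schmidt: orthonormalize a list of vectors.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:                        # subtract projections
            c = dot(w, q)
            w = [wi - c * qi for wi, qi in zip(w, q)]
        norm = math.sqrt(dot(w, w))
        if norm > 1e-12:                       # skip dependent vectors
            basis.append([wi / norm for wi in w])
    return basis

q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
print(round(abs(dot(q[0], q[1])), 10), round(dot(q[0], q[0]), 10))  # 0.0 1.0
```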
A deep dive into the counter-intuitive geometric properties of R^n as n grows large. Students explore the concentration of measure, the 'thin skin' of high-dimensional spheres, and why random vectors tend toward orthogonality.
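The trend toward orthogonality is easy to observe empirically; the dimensions sampled below are illustrative:

```python
import math, random

# The cosine between two random unit vectors in R^n concentrates
# near 0 as n grows (on the order of 1/sqrt(n)).
def random_unit(n, rng):
    v = [rng.gauss(0, 1) for _ in range(n)]
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

rng = random.Random(0)
cosines = {}
for n in (3, 30, 3000):
    u, v = random_unit(n, rng), random_unit(n, rng)
    cosines[n] = abs(sum(a * b for a, b in zip(u, v)))
    print(n, round(cosines[n], 3))
```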
Analyze vector direction independently of magnitude using cosine similarity. Students explore semantic similarity in high-dimensional spaces, differentiating between Euclidean distance and angular separation in document clustering and text analysis.
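A small sketch of the distinction: a term-count vector and its doubled copy (the same document, twice as long) have cosine similarity 1 yet a large Euclidean distance. The counts are illustrative:

```python
import math

# Cosine similarity compares direction only; Euclidean distance does not.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

doc = [3, 0, 1, 2]       # illustrative term counts
doc2x = [6, 0, 2, 4]     # same proportions, twice the length
dist = math.dist(doc, doc2x)

print(round(cosine(doc, doc2x), 6), round(dist, 3))  # 1.0 3.742
```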
Investigate the geometry of vector magnitudes through various Lp norms (L1, L2, L∞). Students explore how the choice of norm defines 'size' and 'distance' in computational contexts like signal processing and regularization.
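The same vector has a different 'size' under each norm; a sketch with an illustrative vector:

```python
# Compare Lp norms of one vector.
def lp_norm(v, p):
    if p == float("inf"):
        return max(abs(x) for x in v)
    return sum(abs(x) ** p for x in v) ** (1 / p)

v = [3.0, -4.0, 0.0]
print(lp_norm(v, 1))             # 7.0  (sum of absolute values)
print(lp_norm(v, 2))             # 5.0  (Euclidean length)
print(lp_norm(v, float("inf")))  # 4.0  (largest coordinate)
```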
The sequence culminates in the application of higher-order vector quantities (tensors) to describe internal forces. Students analyze the Cauchy stress tensor and solve equilibrium equations in continuous media.
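The core computation with the Cauchy stress tensor is contracting it with a surface normal, t_i = σ_ij n_j, to get the traction on a cut plane; the stress values below are illustrative:

```python
# Traction vector from the Cauchy stress tensor: t_i = sigma_ij n_j.
sigma = [[10.0, 2.0, 0.0],
         [2.0, 5.0, 0.0],
         [0.0, 0.0, 1.0]]   # illustrative symmetric stress state
n = [1.0, 0.0, 0.0]         # unit normal of the cut plane

t = [sum(sigma[i][j] * n[j] for j in range(3)) for i in range(3)]
print(t)  # [10.0, 2.0, 0.0]
```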
Students generalize standard vector calculus operations using the covariant derivative. They derive the expressions for gradient, divergence, and curl in arbitrary coordinate systems using Christoffel symbols.
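The Christoffel symbols that appear in these expressions can be computed directly from the metric; a numerical sketch for the polar-coordinate metric g = diag(1, r²), using Γ^k_ij = ½ g^kl (∂_i g_jl + ∂_j g_il − ∂_l g_ij) with finite-difference metric derivatives:

```python
# Christoffel symbols of g = diag(1, r^2) in polar coordinates (r, theta).
def metric(r, th):
    return [[1.0, 0.0], [0.0, r * r]]

def inv_metric(r, th):
    return [[1.0, 0.0], [0.0, 1.0 / (r * r)]]

def d_metric(coords, k, h=1e-5):
    """Central-difference partial of the metric w.r.t. coordinate k."""
    lo, hi = list(coords), list(coords)
    lo[k] -= h
    hi[k] += h
    glo, ghi = metric(*lo), metric(*hi)
    return [[(ghi[i][j] - glo[i][j]) / (2 * h) for j in range(2)]
            for i in range(2)]

def christoffel(coords):
    ginv = inv_metric(*coords)
    dg = [d_metric(coords, k) for k in range(2)]  # dg[k][i][j] = d_k g_ij
    G = [[[0.0, 0.0] for _ in range(2)] for _ in range(2)]
    for k in range(2):
        for i in range(2):
            for j in range(2):
                G[k][i][j] = 0.5 * sum(
                    ginv[k][l] * (dg[i][j][l] + dg[j][i][l] - dg[l][i][j])
                    for l in range(2))
    return G

G = christoffel((2.0, 0.3))
# Known results: Gamma^r_{theta theta} = -r, Gamma^theta_{r theta} = 1/r.
print(round(G[0][1][1], 4), round(G[1][0][1], 4))  # -2.0 0.5
```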
Students utilize the metric tensor to raise and lower indices, effectively mapping between vector and covector quantities. The lesson focuses on calculating arc lengths and angles in curved spaces.
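Index lowering is a contraction with g_ij; a sketch in polar coordinates on the plane, where g = diag(1, r²) and the sample components are illustrative:

```python
# Lower an index with the metric and compute a squared length g_ij v^i v^j.
r = 2.0
g = [[1.0, 0.0], [0.0, r * r]]   # metric in (r, theta) coordinates
v = [3.0, 0.5]                   # contravariant components v^i

v_low = [sum(g[i][j] * v[j] for j in range(2)) for i in range(2)]  # v_i
length_sq = sum(v[i] * v_low[i] for i in range(2))                 # g_ij v^i v^j
print(v_low, length_sq)  # [3.0, 2.0] 10.0
```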
This lesson introduces the rigorous definitions of covariant (dual) and contravariant vectors based on transformation properties. Students practice index notation and Einstein summation convention.
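The Einstein summation convention simply suppresses the sum over a repeated index; writing A^i_j v^j as explicit loops makes that visible (the matrix and vector are illustrative):

```python
# A^i_j v^j with the sum over the repeated index j written out explicitly.
A = [[1.0, 2.0],
     [3.0, 4.0]]
v = [5.0, 6.0]

w = [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
print(w)  # [17.0, 39.0]
```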
Students analyze vector quantities in non-Cartesian systems (cylindrical, spherical, and general curvilinear). They distinguish between holonomic and non-holonomic bases and calculate local basis vectors.
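A sketch of the local (holonomic) basis for cylindrical coordinates at an illustrative point, showing that e_r and e_θ are orthogonal but that |e_θ| = r, so the coordinate basis is not unit length:

```python
import math

# Coordinate basis vectors e_r = dx/dr and e_theta = dx/dtheta for
# cylindrical coordinates, evaluated at a sample point.
r, th = 2.0, math.pi / 3
e_r = (math.cos(th), math.sin(th), 0.0)
e_th = (-r * math.sin(th), r * math.cos(th), 0.0)

def norm(v):
    return math.sqrt(sum(x * x for x in v))

dot_rt = sum(a * b for a, b in zip(e_r, e_th))
print(round(norm(e_r), 6), round(norm(e_th), 6), round(abs(dot_rt), 6))  # 1.0 2.0 0.0
```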
Analyze complete normed and inner product spaces (Banach and Hilbert) and their applications in physics.
Investigate dual spaces, linear functionals, and the Riesz Representation Theorem.
Explore inner product spaces, orthogonality, and the Gram-Schmidt process in general spaces.
Introduce Normed Linear Spaces to define magnitude, comparing L-p norms and analyzing convergence.