Fundamental concepts of limits, derivatives, and integrals for modeling change and motion. Examines techniques for differentiation and integration alongside applications in optimization, area calculation, and differential equations.
This mathematical physics sequence explores the coordinate systems necessary for solving problems involving complex shapes, moving beyond Cartesian coordinates to general curvilinear systems. Students derive scale factors, volume elements, and differential operators, culminating in solving Laplace's equation and understanding metric tensors.
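For reference, the scale factors mentioned here enter the differential operators in a standard way; in orthogonal curvilinear coordinates $(q_1, q_2, q_3)$ with scale factors $h_i$, the gradient and volume element take the form

```latex
\nabla f = \sum_{i=1}^{3} \frac{1}{h_i}\,\frac{\partial f}{\partial q_i}\,\hat{\mathbf{e}}_i,
\qquad
dV = h_1 h_2 h_3 \, dq_1 \, dq_2 \, dq_3 .
```

For example, in spherical coordinates $(r, \theta, \varphi)$ the scale factors are $h_r = 1$, $h_\theta = r$, $h_\varphi = r\sin\theta$, recovering the familiar volume element $r^2 \sin\theta \, dr \, d\theta \, d\varphi$.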
A graduate-level exploration of the Calculus of Variations, focusing on optimizing functionals. Students derive the Euler-Lagrange equation and apply it to classical problems in physics and geometry, such as the brachistochrone and the isoperimetric problem.
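The Euler-Lagrange equation derived in this sequence can be stated compactly: for a functional

```latex
J[y] = \int_a^b F\bigl(x, y(x), y'(x)\bigr)\, dx,
```

a smooth extremal $y(x)$ must satisfy

```latex
\frac{\partial F}{\partial y} - \frac{d}{dx}\!\left(\frac{\partial F}{\partial y'}\right) = 0 .
```

In the brachistochrone problem, $F$ is the travel time integrand and this condition yields the cycloid as the curve of fastest descent.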
This sequence establishes the rigorous mathematical underpinnings necessary for advanced optimization work, moving beyond procedural calculus to analysis-based proofs. Students explore the intersection of topology, set theory, and multivariate calculus to determine the existence and uniqueness of optimal solutions.
A comprehensive graduate-level exploration of series solutions for differential equations with variable coefficients, focusing on power series, the Method of Frobenius, and the properties of Bessel and Legendre functions within the framework of Sturm-Liouville theory.
A graduate-level exploration of dynamical systems, focusing on the qualitative analysis of stability, phase portraits, and topological changes in nonlinear differential equations. Students move from linear classification to advanced stability proofs using Lyapunov functions and bifurcation theory.
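A one-line example illustrates the Lyapunov method named above. For the scalar system $\dot{x} = -x^3$, the candidate function $V(x) = x^2$ gives

```latex
\dot{V}(x) = 2x\,\dot{x} = -2x^4 \le 0,
```

with equality only at $x = 0$, so the origin is asymptotically stable even though the linearization ($\dot{x} \approx 0$) is inconclusive.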
A rigorous graduate-level sequence exploring the existence, uniqueness, and stability of solutions to ordinary differential equations using functional analysis and metric space theory.
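The existence proofs in such a sequence typically rest on Picard iteration (the Banach fixed-point argument). As an illustrative sketch, not part of the course description, the hypothetical helper below runs Picard iteration for the IVP y' = y, y(0) = 1, where each iterate is a polynomial and the iterates converge to the Taylor series of e^x:

```python
def picard_step(coeffs):
    """One Picard iteration for the IVP y' = y, y(0) = 1:
        y_{n+1}(x) = 1 + integral_0^x y_n(t) dt.
    Polynomials are coefficient lists [c0, c1, ...] for c0 + c1*x + ...
    Integrating shifts each coefficient up one degree and divides by k+1."""
    return [1.0] + [c / (k + 1) for k, c in enumerate(coeffs)]

y = [1.0]  # initial guess y_0(x) = 1
for _ in range(10):
    y = picard_step(y)
# y now holds the degree-10 Taylor coefficients of e^x, i.e. 1/k!
```

Each iteration extends the polynomial by one degree, mirroring how the contraction-mapping proof builds the unique solution as a limit of successive approximations.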
An advanced graduate sequence exploring vector calculus from 3D fields to differential forms on manifolds, focusing on fluid dynamics and electromagnetic theory. It moves from parameterizing static fields to understanding global topological constraints on curved surfaces.
An advanced graduate-level exploration of stochastic processes, covering discrete and continuous-time Markov chains, Poisson processes, and queueing theory. The sequence bridges theoretical rigor with computational application through simulations and real-world modeling.
A graduate-level sequence exploring continuous-time stochastic processes through the lens of computational simulation. Students transition from discrete-time to continuous-time models, focusing on Poisson processes, CTMCs, and queueing theory with a strong emphasis on empirical validation and theoretical rigor.
An undergraduate-level sequence exploring Poisson processes as continuous-time counting models, covering derivations, inter-arrival times, superposition, order statistics, and non-homogeneous variations.
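The inter-arrival characterization covered here also gives the standard way to simulate the process: arrival times are cumulative sums of i.i.d. Exponential(rate) gaps. A minimal sketch (the function name is illustrative, not from the course):

```python
import random

def poisson_arrivals(rate, horizon, rng=None):
    """Arrival times of a homogeneous Poisson process on [0, horizon],
    built by summing i.i.d. Exponential(rate) inter-arrival times."""
    rng = rng or random.Random()
    arrivals, t = [], rng.expovariate(rate)
    while t <= horizon:
        arrivals.append(t)
        t += rng.expovariate(rate)
    return arrivals

# The count N(horizon) is Poisson(rate * horizon), so over many runs the
# average number of arrivals should be near 2.0 * 10.0 = 20.
counts = [len(poisson_arrivals(2.0, 10.0, random.Random(seed)))
          for seed in range(500)]
mean_count = sum(counts) / len(counts)
```

Averaging the counts over independent runs is exactly the kind of empirical check against the theoretical mean that the derivations in the course justify.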
A graduate-level exploration of expected value applications in finance, covering utility theory, portfolio optimization, risk-neutral pricing, and tail risk metrics. Students transition from theoretical foundations to computational implementation using Monte Carlo methods.
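The risk-neutral pricing and Monte Carlo topics combine naturally: a derivative's price is the discounted expected payoff under the risk-neutral measure, estimated by averaging simulated outcomes. A sketch for a European call under geometric Brownian motion (the function name and parameters are illustrative, not from the course):

```python
import math
import random

def mc_call_price(s0, strike, rate, sigma, maturity, n_paths=100_000, seed=0):
    """Monte Carlo risk-neutral price of a European call: discount the
    average payoff max(S_T - K, 0) over simulated lognormal terminal prices
    S_T = s0 * exp((r - sigma^2/2) T + sigma sqrt(T) Z), Z ~ N(0, 1)."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * sigma**2) * maturity
    vol = sigma * math.sqrt(maturity)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp(drift + vol * z)
        total += max(s_t - strike, 0.0)
    return math.exp(-rate * maturity) * (total / n_paths)

price = mc_call_price(100.0, 100.0, 0.05, 0.2, 1.0)
```

With these parameters the Black-Scholes closed form gives roughly 10.45, so the Monte Carlo estimate provides a direct check of the computational implementation against theory.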
A graduate-level sequence on constrained optimization, covering Lagrange Multipliers, KKT conditions, and sensitivity analysis for economics and engineering applications.
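A textbook-style instance of the Lagrange multiplier condition: maximize $f(x, y) = xy$ subject to $g(x, y) = x + y - 10 = 0$. Setting $\nabla f = \lambda \nabla g$ gives

```latex
y = \lambda, \qquad x = \lambda \;\;\Longrightarrow\;\; x = y = 5, \quad f(5, 5) = 25,
```

and the multiplier $\lambda = 5$ is itself the sensitivity of the optimal value to the constraint level, the link this sequence develops through sensitivity analysis.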
A comprehensive graduate-level exploration of numerical optimization algorithms, moving from first-order gradient descent to second-order Newton methods and computationally efficient Quasi-Newton approaches. Students analyze convergence rates, stability, and strategies for navigating complex, non-convex landscapes.
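The first-order versus second-order contrast described here can be seen on a small strictly convex problem. A sketch (function and step-size choices are illustrative, not from the course): minimize $f(x) = x^2 + e^x$, whose unique minimizer solves $f'(x) = 2x + e^x = 0$, near $x \approx -0.3517$.

```python
import math

def grad(x):
    return 2.0 * x + math.exp(x)   # f'(x) for f(x) = x^2 + exp(x)

def hess(x):
    return 2.0 + math.exp(x)       # f''(x) > 0, so f is strictly convex

def gradient_descent(x, lr=0.1, steps=200):
    """First-order method: fixed step along the negative gradient."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def newton(x, steps=10):
    """Second-order method: rescale the step by the curvature."""
    for _ in range(steps):
        x -= grad(x) / hess(x)     # quadratic local convergence
    return x

x_gd, x_nt = gradient_descent(0.0), newton(0.0)
```

Both methods reach the minimizer, but Newton's method needs far fewer iterations because it uses curvature information, the convergence-rate trade-off the sequence analyzes (and that Quasi-Newton methods approximate more cheaply).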
A comprehensive calculus sequence for undergraduate students focused on the rigorous application of derivatives to industrial, geometric, and economic optimization problems. Students progress from basic modeling to multi-constraint capstone analysis.
A graduate-level sequence exploring the gradient vector as the foundational tool for modern optimization. Students move from the geometric interpretation of multivariate derivatives to the implementation of stochastic algorithms used in machine learning.
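A minimal sketch of the stochastic-gradient idea the sequence builds toward, fitting a 1-D least-squares line by following the gradient of one sample's squared error at a time (the helper name and hyperparameters are illustrative, not from the course):

```python
import random

def sgd_linear_fit(data, lr=0.05, epochs=100, seed=0):
    """Stochastic gradient descent for y ~ w*x + b on (x, y) pairs.
    Each update steps along the gradient of a single sample's
    squared error (w*x + b - y)**2 / 2."""
    rng = random.Random(seed)
    data = list(data)              # avoid shuffling the caller's list
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)          # fresh sample order each epoch
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x      # d/dw of the per-sample loss
            b -= lr * err          # d/db of the per-sample loss
    return w, b

# Noise-free data from y = 2x + 1; SGD should recover the coefficients.
pts = [(x / 10.0, 2.0 * (x / 10.0) + 1.0) for x in range(-10, 11)]
w, b = sgd_linear_fit(pts)
```

Per-sample updates are noisy estimates of the full gradient, which is exactly why the geometric view of the gradient developed earlier in the sequence matters for understanding why the iterates still converge.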