Convex Optimization I

Stanford University

This course, Convex Optimization I, focuses on identifying and addressing convex optimization problems prevalent in engineering disciplines. Key topics include:

  • Convex sets and functions
  • Optimization problems
  • Basics of convex analysis
  • Least-squares, linear, and quadratic programming
  • Semidefinite programming and minimax problems
  • Optimality conditions and duality theory
  • Interior-point methods and various applications

Prerequisites include a solid grounding in linear algebra. Prior experience with numerical computing or optimization is helpful but not required; the engineering applications are kept basic and simple.

Course Lectures
  • This module introduces the foundational concepts of convex optimization, providing examples and techniques for solving optimization problems. Key areas of focus include:

    • Defining convex optimization
    • Understanding least-squares problems
    • Linear programming techniques
    • Identifying convex optimization criteria
    • Establishing course goals and expectations
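
Least-squares is the simplest of these problem classes and has a closed-form solution. A minimal NumPy sketch (with made-up data) of minimizing ||Ax − b||₂:

```python
import numpy as np

# Least-squares: minimize ||Ax - b||_2 over x, solved in closed form by
# numpy.linalg.lstsq. A, x_true, and b are made-up illustrative data.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true  # noiseless data, so the optimal residual is ~0

x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = np.linalg.norm(A @ x_ls - b)
```
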
  • This module features a guest lecture by Jacob Mattingley, who discusses various concepts of convex sets and operations preserving convexity. Topics include:

    • The nature of convex sets and cones
    • Polyhedra and positive semidefinite cones
    • Affine functions and generalized inequalities
    • Minimum and minimal elements
    • Supporting hyperplane theorem
  • Logistics
    Stephen Boyd

    This module covers course logistics and key properties of convex functions. It includes:

    • Understanding convex functions through examples
    • Analyzing first-order conditions
    • Exploring epigraphs and sublevel sets
    • Applying Jensen's inequality
    • Operations that maintain convexity
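
Both Jensen's inequality and the first-order condition can be checked numerically for a simple convex function such as exp; the points and weights below are arbitrary choices:

```python
import numpy as np

# Numerical check of two facts about the convex function f(x) = exp(x):
# Jensen's inequality  f(t x + (1-t) y) <= t f(x) + (1-t) f(y), and the
# first-order condition  f(y) >= f(x) + f'(x)(y - x).
f = np.exp
fprime = np.exp  # derivative of exp is exp
x, y = -1.0, 2.0

for t in np.linspace(0.0, 1.0, 11):
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12

assert f(y) >= f(x) + fprime(x) * (y - x)
```
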
  • Vector Composition
    Stephen Boyd

    This module delves into vector composition and its implications in convex optimization. The focus areas include:

    • The concept of perspective in vector composition
    • Understanding the conjugate function
    • Exploring quasiconvex functions and their properties
    • Investigating log-concave and log-convex functions
    • Examples showcasing applications of these concepts
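
The conjugate function f*(y) = sup_x (yx − f(x)) can be approximated on a grid; for f(x) = x² the sup is attained at x = y/2, giving f*(y) = y²/4. A small sketch of this (the grid range is an arbitrary choice):

```python
import numpy as np

# Conjugate of f(x) = x^2: f*(y) = sup_x (y x - x^2) = y^2 / 4,
# approximated here by maximizing over a fine grid of x values.
xs = np.linspace(-10.0, 10.0, 200001)

def conj(y):
    return np.max(y * xs - xs**2)
```
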
  • This module focuses on optimal and locally optimal points within convex optimization problems. Key topics include:

    • The feasibility problem in optimization
    • Distinguishing between local and global optima
    • Optimality criteria for differentiable functions
    • Examining equivalent convex problems
    • Exploring quasiconvex optimization and problem families
  • This module introduces generalized linear-fractional programming along with related concepts. Topics covered include:

    • Understanding generalized linear-fractional programs
    • Quadratic programs (QP) and their characteristics
    • Quadratically constrained quadratic programs (QCQP)
    • Second-order cone programming and robust linear programming
    • Geometric programming with practical examples
  • This module covers generalized inequality constraints in optimization problems. It focuses on:

    • Understanding semidefinite programming (SDP)
    • Expressing LP and SOCP as special cases of SDP
    • Eigenvalue minimization techniques
    • Vector optimization and Pareto optimal points
    • Multicriterion optimization and risk-return trade-off in portfolio optimization
  • Lagrangian
    Stephen Boyd

    This module introduces the Lagrangian method and its significance in convex optimization. Key topics include:

    • Understanding the Lagrange dual function
    • Finding least-norm solutions to linear equations
    • Setting up standard form linear programming
    • Exploring dual problems and duality concepts
    • Complementary slackness and its applications
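
For the least-norm problem (minimize ||x||₂ subject to Ax = b, with A wide and full row rank), Lagrange duality recovers the closed form x* = Aᵀ(AAᵀ)⁻¹b. A quick check with made-up data:

```python
import numpy as np

# Least-norm solution of Ax = b: x* = A'(AA')^{-1} b, which also equals
# the pseudoinverse solution. A and b are illustrative data.
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, -1.0]])
b = np.array([3.0, 1.0])

x_star = A.T @ np.linalg.solve(A @ A.T, b)
x_pinv = np.linalg.pinv(A) @ b  # same point via the pseudoinverse
```

Any other solution of Ax = b differs from x* by a null-space vector orthogonal to x*, so x* has the smallest norm.
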
  • This module elaborates on complementary slackness and Karush-Kuhn-Tucker (KKT) conditions relevant to convex problems. It includes:

    • Understanding complementary slackness in optimization
    • Examining KKT conditions and their implications
    • Application of perturbation and sensitivity analysis
    • Exploring duality and problem reformulations
    • Introduction of new variables and equality constraints
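
As a concrete instance of the KKT conditions, an equality-constrained QP's optimality conditions reduce to a single linear system; the problem data below is illustrative:

```python
import numpy as np

# KKT conditions for the equality-constrained QP
#   minimize (1/2) x'Px + q'x   subject to   Ax = b
# reduce to one linear system in (x, nu):
#   [P  A'] [x ]   [-q]
#   [A  0 ] [nu] = [ b]
P = np.array([[4.0, 1.0],
              [1.0, 3.0]])
q = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

n, m = P.shape[0], A.shape[0]
KKT = np.block([[P, A.T],
                [A, np.zeros((m, m))]])
sol = np.linalg.solve(KKT, np.concatenate([-q, b]))
x, nu = sol[:n], sol[n:]

# Stationarity: Px + q + A'nu = 0
assert np.allclose(P @ x + q + A.T @ nu, 0)
```
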
  • This module focuses on the applications of convex optimization in various fields. It covers:

    • Norm approximation techniques
    • Penalty function approximation
    • Least-norm problems and their implications
    • Regularized approximation methods
    • Robust approximation and stochastic robust least squares
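
Regularized (Tikhonov/ridge) approximation also has a closed form: minimizing ||Ax − b||₂² + γ||x||₂² gives x* = (AᵀA + γI)⁻¹Aᵀb. A sketch with made-up data and an ad hoc γ:

```python
import numpy as np

# Regularized approximation: minimize ||Ax - b||_2^2 + gamma ||x||_2^2,
# closed form x* = (A'A + gamma I)^{-1} A'b. Data and gamma are illustrative.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)
gamma = 0.1

n = A.shape[1]
x_reg = np.linalg.solve(A.T @ A + gamma * np.eye(n), A.T @ b)
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
# Regularization trades a larger residual for a smaller solution norm.
```
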
  • Statistical Estimation
    Stephen Boyd

    This module dives into statistical estimation, covering essential concepts and methodologies. Key topics include:

    • Maximum likelihood estimation techniques
    • Real-world examples illustrating concepts
    • Logistic regression techniques
    • Binary hypothesis testing fundamentals
    • Experiment design, including D-optimal design
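
Logistic regression is a maximum-likelihood problem with a concave log-likelihood, so Newton's method (IRLS) fits it reliably. A sketch on synthetic data; the true weights, sample size, and iteration count are arbitrary choices:

```python
import numpy as np

# Maximum-likelihood logistic regression via Newton's method (IRLS).
# Synthetic data: labels drawn from the logistic model with w_true.
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 2))
w_true = np.array([2.0, -1.0])
p = 1.0 / (1.0 + np.exp(-X @ w_true))
y = (rng.random(200) < p).astype(float)

w = np.zeros(2)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ (y - mu)                      # gradient of log-likelihood
    H = X.T @ (X * (mu * (1 - mu))[:, None])   # negative Hessian
    w += np.linalg.solve(H, grad)
```
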
  • This module continues the discussion on experiment design with a focus on geometric problems. It addresses:

    • Minimum volume ellipsoid surrounding a set
    • Maximum volume inscribed ellipsoid concepts
    • Efficiency of ellipsoidal approximations
    • Centering and analytic center of inequalities
    • Linear discrimination techniques
  • This module continues the examination of linear discrimination, emphasizing robust techniques and applications. It includes:

    • Robust linear discrimination methods
    • Approximate linear separation of non-separable sets
    • Support vector classifiers and their use
    • Nonlinear discrimination techniques
    • Applications in placement and facility location
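
Basic linear discrimination can be posed as an LP feasibility problem: find (a, b) with aᵀxᵢ − b ≥ 1 on one class and aᵀyⱼ − b ≤ −1 on the other. A sketch using `scipy.optimize.linprog` on synthetic, separable-by-construction point clouds:

```python
import numpy as np
from scipy.optimize import linprog

# Linear discrimination as LP feasibility. Two synthetic clouds centered
# at (3, 3) and (-3, -3); variables z = (a1, a2, b), constraints A_ub z <= b_ub.
rng = np.random.default_rng(4)
X1 = rng.standard_normal((30, 2)) * 0.5 + np.array([3.0, 3.0])
X2 = rng.standard_normal((30, 2)) * 0.5 + np.array([-3.0, -3.0])

# a'x - b >= 1  becomes  -a'x + b <= -1;  a'y - b <= -1 stays as is.
A_ub = np.vstack([np.hstack([-X1, np.ones((30, 1))]),
                  np.hstack([X2, -np.ones((30, 1))])])
b_ub = -np.ones(60)
res = linprog(c=np.zeros(3), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * 3)  # variables must be unbounded
a, bias = res.x[:2], res.x[2]
```

Note the explicit `bounds`: `linprog` defaults to nonnegative variables, which would be wrong here.
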
  • This module continues the discussion on LU factorization, exploring advanced concepts and methods. Key topics include:

    • Understanding sparse LU factorization techniques
    • Cholesky factorization and its applications
    • LDLT factorization concepts
    • Dealing with equations containing structured sub-blocks
    • Evaluating dominant terms in flop count
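
For a symmetric positive definite system, Cholesky factorization A = LLᵀ costs about (1/3)n³ flops, roughly half a general LU. A sketch with an SPD matrix built for the purpose:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Solve Ax = b with A symmetric positive definite via Cholesky.
# A is made SPD by construction; data is illustrative.
rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)   # SPD by construction
b = rng.standard_normal(5)

c, low = cho_factor(A)        # factor once...
x = cho_solve((c, low), b)    # ...then solve by two triangular solves
```

Factoring once and reusing the factor is what makes solving with many right-hand sides cheap.
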
  • This module explores the algorithmic aspects of convex optimization, focusing on unconstrained minimization techniques. It covers:

    • Initial points and sublevel sets
    • Understanding strong convexity and its implications
    • Descent methods including gradient descent
    • Steepest descent method applications
    • Newton's method and classical convergence analysis
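
Gradient descent with backtracking line search can be sketched on a smooth convex function; the log-sum-exp objective and the parameters α, β below are illustrative choices (its minimizer is the origin, with optimal value log 3):

```python
import numpy as np

# Gradient descent with backtracking (Armijo) line search on
# f(x) = log(exp(x1 + x2) + exp(-x1) + exp(-x2)), minimized at (0, 0).
def f(x):
    return np.log(np.exp(x[0] + x[1]) + np.exp(-x[0]) + np.exp(-x[1]))

def grad(x):
    e = np.array([np.exp(x[0] + x[1]), np.exp(-x[0]), np.exp(-x[1])])
    s = e.sum()
    return np.array([(e[0] - e[1]) / s, (e[0] - e[2]) / s])

x = np.array([1.0, 1.0])
alpha, beta = 0.25, 0.5          # standard backtracking parameters
for _ in range(100):
    g = grad(x)
    t = 1.0
    while f(x - t * g) > f(x) - alpha * t * (g @ g):
        t *= beta                # shrink step until Armijo condition holds
    x = x - t * g
```
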
  • This module continues the exploration of unconstrained minimization, introducing self-concordance and its significance. Key points include:

    • Understanding self-concordance and its impact on convergence
    • Convergence analysis for self-concordant functions
    • Implementation techniques and challenges
    • Examining examples of dense Newton systems
    • Equality constrained minimization and its methods
  • This module continues the discussion on Newton's method, focusing on its applications in convex optimization. Topics include:

    • Newton step at infeasible points
    • Solving KKT systems effectively
    • Equality constrained analytic centering techniques
    • Evaluating the complexity of various methods
    • Network flow optimization techniques
  • Logarithmic Barrier
    Stephen Boyd

    This module focuses on logarithmic barriers and their implications for convex optimization. Key areas include:

    • Understanding the central path in optimization
    • Dual points on the central path
    • Interpreting KKT conditions in this context
    • Barrier methods and their convergence analysis
    • Feasibility and phase I methods
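
The central path and the m/t duality-gap bound can be seen in closed form on a tiny LP; the 1-D instance below is an illustrative choice, not from the lectures:

```python
import numpy as np

# Central path for: minimize x subject to -1 <= x <= 1 (p* = -1).
# The barrier subproblem  minimize t*x - log(1 - x) - log(1 + x)
# has minimizer x(t) = (1 - sqrt(1 + t^2)) / t, and the standard bound
# gives f0(x(t)) - p* <= m/t with m = 2 inequality constraints.
def x_central(t):
    return (1.0 - np.sqrt(1.0 + t * t)) / t

p_star = -1.0
gaps = {t: x_central(t) - p_star for t in [1.0, 10.0, 100.0, 1000.0]}
```

As t grows, x(t) slides along the central path toward the optimum at −1, with the gap shrinking like 2/t.
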
  • This module concludes the course with a review of interior-point methods and their complexities. It includes:

    • Examples illustrating the barrier method
    • Complexity analysis through self-concordance
    • Evaluating the total number of Newton iterations
    • Understanding generalized inequalities
    • Final remarks on the course and future topics