Bartolomeo Stellato

ORF522: Linear and Nonlinear Optimization

Previous years: 2020 2021 2022

Description

This course introduces analytical and computational tools for linear, convex, nonconvex, and stochastic optimization. Topics include linear optimization modeling, duality, the simplex method, degeneracy, and sensitivity analysis; nonlinear optimality conditions and the KKT conditions; first-order and operator splitting methods for large-scale convex optimization; nonconvex optimization algorithms; and stochastic optimization. A broad spectrum of applications in engineering, finance, and statistics is presented.

Learning objectives

This course introduces analytical and computational tools for mathematical optimization.
Upon successful completion of this course you should be able to:

  • Model decision-making problems across different disciplines as mathematical optimization problems.

  • Apply the most appropriate optimization tools when faced with a concrete problem.

  • Implement optimization algorithms and prove their convergence.

Office hours

Instructor

Name: Bartolomeo Stellato
Location: Sherrerd 123
Time: Thu 2:00pm – 4:00pm EST
Email: bstellato@princeton.edu

Assistants in instruction

Name: Yanjun Liu
Location: Sherrerd 003
Time: Mon 4:30pm – 6:30pm EST
Email: yanjun.liu@princeton.edu

Name: Jake Freeman
Location: Sherrerd 123
Time: Wed 2:00pm – 4:00pm EST
Email: jake.freeman@princeton.edu

Schedule

All lectures will take place on Tuesdays and Thursdays 11:00am - 12:20pm in Friend 008.

Linear optimization

#   Date   Topic                                       HW     References
1   09/03  Introduction
2   09/05  Linear optimization                                [Ch 1, LO]
3   09/10  Geometry and polyhedra                             [Ch 2, LO]
4   09/12  Simplex method I                            1 Out  [Ch 3, LO]
5   09/17  Simplex method II                                  [Ch 3, LO]
6   09/19  Linear algebra and simplex implementation          [Ch 3, LO] [Ch 13, NO] [Ch 8, LP]
7   09/24  Duality I                                          [Ch 4, LO] [Ch 5, LP]
8   09/26  Duality II                                  2 Out  [Ch 4, LO] [Ch 11, LP] [Ch 5, LO]

Large-scale convex optimization

#   Date   Topic                                            HW     References
9   10/01  Introduction                                             [Ch 2 to 4 and 6, CO] [Ch 1, LSMO] [Ch A and B, FCA]
10  10/03  Optimality conditions                                    [Ch 2 and 12, NO] [Ch 4 and 5, CO]
11  10/08  Gradient descent                                         [Ch 1 and 2, ILCO] [Ch 9, CO] [Ch 5, FMO]
    10/10  Midterm
12  10/22  Subgradients                                             [Ch 1, LSMO] [Ch 3 and 8, FMO] [ee364b] [Ch 3, ILCO]
13  10/24  Subgradient method and proximal algorithms       3 Out   [Ch 3 and 6, FMO] [PA]
14  10/29  Operator theory I                                        [Ch 4, FMO] [PA] [LSMO]
15  10/31  Operator theory II                                       [Ch 4, FMO] [PA] [LSMO]
16  11/05  Operator splitting algorithms                            [PA] [LSMO]
17  11/07  Alternating Direction Method of Multipliers      4 Out   [PA] [LSMO] [ADMM]
18  11/12  Acceleration schemes                                     [Ch 1, FMO] [Ch 2, ILCO] [Ch 3, COAC]
19  11/14  Computer-aided analysis of first-order methods           Tutorials, PEPit toolbox

Nonconvex and stochastic optimization

#   Date   Topic                                  HW     References
20  11/19  Sequential convex programming                 [Ch 4 and 17, NO] [ee364b]
21  11/21  Branch and bound algorithms            5 Out  [ee364b] [MINLO]
22  12/03  Robust optimization
23  12/05  Distributionally robust optimization
    12/18  Final (12/18 - 12/20)

Material

The code 💻 used in the lectures is available in the GitHub companion repo.

The lecture notes are available from the course website and are intended to be self-contained. The following books and monographs are useful as reference texts.
They are either free or digitally available via the Princeton University Library:

  • [LP] R. J. Vanderbei: Linear Programming: Foundations & Extensions (available on SpringerLink)

  • [LO] D. Bertsimas, J. Tsitsiklis: Introduction to Linear Optimization (available via Princeton Controlled Digital Lending)

  • [NO] J. Nocedal, S. J. Wright: Numerical Optimization (available on SpringerLink)

  • [CO] S. Boyd, L. Vandenberghe: Convex Optimization (available for free)

  • [FMO] A. Beck: First-order methods in optimization (available on SIAM)

  • [LSMO] E. K. Ryu and W. Yin: Large-Scale Convex Optimization via Monotone Operators (available for free)

  • [FCA] J. B. Hiriart-Urruty, C. Lemaréchal: Fundamentals of Convex Analysis (available on SpringerLink)

  • [ILCO] Y. Nesterov: Introductory Lectures on Convex Optimization (available on SpringerLink)

  • [ee364b] S. Boyd: Convex Optimization II Lecture Notes (available online)

  • [PA] N. Parikh, S. Boyd: Proximal Algorithms (available for free)

  • [ADMM] S. Boyd, N. Parikh, B. Peleato, J. Eckstein: Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers (available for free)

  • [COAC] S. Bubeck: Convex Optimization: Algorithms and Complexity (available for free)

  • [MINLO] P. Belotti, C. Kirches, S. Leyffer, J. Linderoth, J. Luedtke, A. Mahajan: Mixed-integer nonlinear optimization (available online)

In this course we strictly follow the Crimes Against Matrices laws!

Prerequisites

  • Good knowledge of linear algebra and calculus. For a refresher, we recommend reading Appendices A and C of Convex Optimization by S. Boyd and L. Vandenberghe (available online).
  • Familiarity with Python.

Software

Students will use the Python-based modeling software CVXPY (cvxpy.org) to solve optimization problems arising in applications across operations research, finance, machine learning, and engineering.
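To give a flavor of the modeling style, here is a minimal CVXPY sketch of a small linear program; the problem data are made up purely for illustration and are not taken from any course assignment:

    import cvxpy as cp
    import numpy as np

    # Hypothetical data: maximize 3*x1 + 5*x2 over a small polyhedron.
    c = np.array([3.0, 5.0])

    x = cp.Variable(2, nonneg=True)  # decision variables, x >= 0
    constraints = [
        x[0] + 2 * x[1] <= 14,       # resource budget
        3 * x[0] - x[1] >= 0,        # mix requirement
        x[0] - x[1] <= 2,            # balance constraint
    ]
    problem = cp.Problem(cp.Maximize(c @ x), constraints)
    problem.solve()                  # CVXPY selects a suitable LP solver

    print("optimal value:", problem.value)
    print("optimal x:", x.value)

CVXPY translates this declarative description into a solver's standard form, so the code stays close to the mathematical formulation you will write on paper.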

The assignments will use Jupyter notebooks running in the JupyterLab environment.

Follow the instructions in this repository to complete the environment setup required to run and export all your notebooks.

Grading

All submissions should take place on Gradescope (accessible from the Canvas website).

  • 30% Homeworks. Five biweekly homeworks, almost all of which include a computational component. Homeworks are due at 11:59pm EST on the date specified. Requests for extensions will not be granted except for compelling reasons. Homeworks must always be submitted as a single PDF file that includes your written exercises (typed or handwritten) and code (PDF-exported Jupyter notebook). To export your notebooks to PDF from JupyterLab, go to: "File" → "Save and Export Notebook As..." → "PDF".
  • 30% Midterm. 80-minute written exam. No coding required.
  • 40% Final. Take-home final exam with written and computational questions.

Questions and discussions

Students are encouraged to discuss and ask questions on Ed (link accessible via Canvas).
Please assign each question to the corresponding category, making clear whether it concerns general course information, the Lectures, or the Homeworks.

Collaboration policy

  • Homeworks. Students are allowed, and even encouraged, to collaborate on homeworks. When submitting your homework, you are required to list the names of the students you worked with. Please also list the textbooks, notes, or websites that were helpful to you.

  • Midterm and final. No collaborations allowed.

Honor code and generative AI

All work in this course must uphold the University’s commitment to academic integrity. This includes the Honor Code (for written examinations, tests, and quizzes) and guidelines for the submission of original work on all other assignments. More guidance can be found in Rights, Rules, and Responsibilities as well as the handbook Academic Integrity at Princeton.

The use of AI-based tools, like ChatGPT and GitHub Copilot, is allowed but discouraged, as they often produce incorrect answers, including flawed proofs and invalid counterexamples. While these tools can sometimes offer new ideas or accelerate workflows, they frequently make subtle mathematical and logical errors that could hinder your learning. If you use an AI tool, you must declare it on your solution sheet and are responsible for verifying its accuracy. Remember, your primary goal is to learn, and practicing skills independently is crucial. We reserve the right to adjust this policy as we better understand these tools' impacts on learning.

Attendance

Students are expected to attend each scheduled class on time and ready to participate fully. An excused absence will only be granted in the case of a religious observance, an ODS-approved accommodation, or a serious illness or exceptional circumstance verified by your residential college.