Nonlinear and Stochastic Optimization
Computational optimization is essential to many subdomains of engineering, science, and business, such as design, control, operations, supply chains, game theory, data science/analytics, and machine learning.
This course provides a practical introduction to models, algorithms, and modern software for large-scale numerical optimization. Topics include (nonconvex) nonlinear programming, deterministic global optimization, integer programming, dynamic optimization, and stochastic programming. Multi-objective optimization and mathematical programs with complementarity constraints may be covered depending on time and student interest. The class is designed for advanced undergraduate/graduate students from engineering, science, and mathematics who wish to incorporate optimization methods into their research. The course begins with an introduction to modeling and the Python-based Pyomo computational environment. Optimization theory and algorithms are emphasized.
What am I going to get out of this class?
At the end of the semester, you should be able to…
Mathematically formulate optimization problems relevant to your discipline (and research!)
Program optimization models in Pyomo and compute numerical solutions with state-of-the-art (e.g., commercial) solvers (a minimal modeling sketch appears after this list)
Explain the main theory for nonlinear constrained and unconstrained optimization
Describe basic algorithm elements in pseudocode and implement them in Python
Analyze results from an optimization problem and communicate key findings in a presentation
Write and debug 200 lines of Python code using best practices (e.g., publication-quality figures, docstrings)
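To give a flavor of the Pyomo workflow referenced above, here is a minimal sketch (assuming Pyomo and the IPOPT solver are installed; the toy model itself is illustrative, not one of the course examples): declare variables, an objective, and constraints, then call a solver and inspect the solution.

```python
import pyomo.environ as pyo

# Build a small nonlinear program with two bounded decision variables
m = pyo.ConcreteModel()
m.x = pyo.Var(bounds=(0, 10), initialize=1.0)
m.y = pyo.Var(bounds=(0, 10), initialize=1.0)

# Smooth quadratic objective to minimize
m.obj = pyo.Objective(expr=(m.x - 1) ** 2 + (m.y - 2.5) ** 2, sense=pyo.minimize)

# Nonlinear equality constraint coupling the two variables
m.con = pyo.Constraint(expr=m.x * m.y == 2.0)

# Solve with IPOPT; any installed NLP solver name can be substituted here
solver = pyo.SolverFactory("ipopt")
solver.solve(m, tee=False)

# Report the optimal point and objective value
print(pyo.value(m.x), pyo.value(m.y), pyo.value(m.obj))
```

The same pattern (model object, variables, objective, constraints, `SolverFactory`) scales to the larger nonlinear and stochastic programs studied later in the course.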
Content
Organization
Optimization Modeling in Pyomo
Algorithms and Theory
- 6. Unconstrained Nonlinear Optimization
- 7. Constrained Nonlinear Optimization
- 7.1. Convexity Revisited
- 7.2. Local Optimality Conditions
- 7.3. Analysis of KKT Conditions
- 7.4. Constraint Qualifications
- 7.5. Second Order Optimality Conditions
- 7.6. NLP Diagnostics with Degeneracy Hunter
- 7.7. Simple Newton Method for Equality Constrained NLPs
- 7.8. Inertia-Corrected Newton Method for Equality Constrained NLPs
- 8. Special Topics
Student Contributions
- Derivative-Free Optimization
- Stochastic Gradient Descent
- Stochastic Gradient Descent Tutorial 1
- Stochastic Gradient Descent Tutorial 2
- Functions and Utilities
- Implementation of the Algorithm
- Regression Example: Fitting a Quadratic Function to a 2D Data Set
- Binary Classification: Logistic Regression Example
- Conclusions
- References
- Stochastic Gradient Descent Tutorial 3
- Machine Learning and Applied Statistics