CBE60499

Chapter 1.0 Getting Started with Pyomo

  • 1.0.1 Local Installation
    • 1.0.1.1 Install anaconda
    • 1.0.1.2 Create a new conda environment
    • 1.0.1.3 Install IDAES-PSE
    • 1.0.1.4 Install Ipopt
    • 1.0.1.5 Install Additional Solvers
    • 1.0.1.6 Install Notebook Spellchecker
  • 1.0.2 Cloud Computing with Google Colab
  • 1.0.3 Your First Optimization Problem
    • 1.0.3.1 Mathematical Model
    • 1.0.3.2 Define the Model in Pyomo
    • 1.0.3.3 Solve using Ipopt
    • 1.0.3.4 Inspect the Solution
    • 1.0.3.5 Visualize the Solution

1.1 60 Minutes to Pyomo: An Energy Storage Model Predictive Control Example

  • 1.1.1 Problem Setup
    • 1.1.1.1 Background
    • 1.1.1.2 Pandas and Energy Prices
    • 1.1.1.3 Optimization Mathematical Model
      • 1.1.1.3.1 Sets
      • 1.1.1.3.2 Variables
      • 1.1.1.3.3 Parameters
      • 1.1.1.3.4 Objective and Constraints
    • 1.1.1.4 Degree of Freedom Analysis
  • 1.1.2 Pyomo Modeling Components
    • 1.1.2.1 Create ConcreteModel
    • 1.1.2.2 Sets
    • 1.1.2.3 Variables
    • 1.1.2.4 Parameters (Constants / Data)
    • 1.1.2.5 Objectives
    • 1.1.2.6 Constraints
    • 1.1.2.7 Printing the Model
    • 1.1.2.8 Another Approach: Build the Model in a Function
  • 1.1.3 Calling Optimization Solver
    • 1.1.3.1 SolverFactory and Solver Options
    • 1.1.3.2 Interpreting Ipopt Output - Verifying Degree of Freedom Analysis
    • 1.1.3.3 Try Another Solver
  • 1.1.4 Inspecting the Solution
    • 1.1.4.1 Extracting Solution from Pyomo
    • 1.1.4.2 Visualizing the Solution
    • 1.1.4.3 Accessing Dual Variables
  • 1.1.5 References

1.2 Pyomo Mini-Project: Receding Horizon Stochastic Control

  • 1.2.1 Assignment Goals
  • 1.2.2 Model Refinement and Code Reorganization
    • 1.2.2.1 Revised Model
    • 1.2.2.2 Degree of Freedom Analysis
    • 1.2.2.3 Python Code
    • 1.2.2.4 Optimal Control Over 3 Day Horizon
      • 1.2.2.4.1 Original Model
      • 1.2.2.4.2 With Constraint Enforcing No Simultaneous Charging or Discharging
      • 1.2.2.4.3 Comparison
  • 1.2.3 Receding Horizon Control
    • 1.2.3.1 Write Pseudocode
    • 1.2.3.2 Python Code
    • 1.2.3.3 Simulate Performance Over 3 Days
    • 1.2.3.4 Impact of Horizon Length
    • 1.2.3.5 Impact of Periodic Boundary Condition
  • 1.2.4 Forecast Uncertainty (required for CBE 60499, optional for CBE 40499)

Chapter 2.0 Optimization Modeling with Applications

  • 2.0.1 Recommended Reading
  • 2.0.2 Taxonomy of Optimization Problems

2.1 Continuous Optimization

  • 2.1.1 Linear Programs: Student Diet Example
    • 2.1.1.1 Propose an Optimization Model
    • 2.1.1.2 Solve in Pyomo
    • 2.1.1.3 Analyze Results
  • 2.1.2 Nonlinear Programs: Circle Packing Example
    • 2.1.2.1 Propose an Optimization Model
    • 2.1.2.2 Implement in Pyomo
    • 2.1.2.3 Visualize Initial Point
    • 2.1.2.4 Solve and Inspect the Solution
    • 2.1.2.5 Reinitialize and Resolve
  • 2.1.3 Take Away Messages

2.2 Integer Programs

  • 2.2.1 Optimizing Across Process Alternatives
    • 2.2.1.1 Develop the optimization model
    • 2.2.1.2 Solve with Continuous Cost Model in Pyomo
    • 2.2.1.3 Initialize to Favor Reaction 1 and Solve
    • 2.2.1.4 Initialize to Favor Reaction 2 and Solve
    • 2.2.1.5 Discrete Cost Model
    • 2.2.1.6 Enumerate the solutions
    • 2.2.1.7 Solve with Pyomo
  • 2.2.2 Is rounding good enough?
    • 2.2.2.1 Linear Program (Relaxation)
    • 2.2.2.2 Rounding
    • 2.2.2.3 Integer Program
    • 2.2.2.4 Why rounding does not always work

2.3 Logical Modeling and Generalized Disjunctive Programs

  • 2.3.1 Logical Modeling
  • 2.3.2 Pyomo.GDP: Strip Packing Problem
    • 2.3.2.1 Define model in Pyomo with GDP
    • 2.3.2.2 Transform and Solve with Big M Relaxation
    • 2.3.2.3 Transform and Solve with Convex Hull Relaxation

2.4 Dynamic Optimization: Differential Algebraic Equations (DAEs)

  • 2.4.1 Dynamic Optimization Overview
  • 2.4.2 DAE Index Reduction
  • 2.4.3 DAE Formulations for Simple Pendulum Example
    • 2.4.3.1 Formulation 1: Index-3 DAE
    • 2.4.3.2 Formulation 2: Pure ODE Model
    • 2.4.3.3 Formulation 3: Index-1 DAE Model
    • 2.4.3.4 Formulation 4: Index-1 DAE Model
  • 2.4.4 Take Away Messages

2.5 Numeric Integration for DAEs

  • 2.5.1 Single-Step Runge-Kutta Methods
    • 2.5.1.1 General Form: Index 0 DAE
    • 2.5.1.2 Explicit (Forward) Euler
    • 2.5.1.3 Implicit (Backward) Euler
    • 2.5.1.4 Comparison
    • 2.5.1.5 Stability
    • 2.5.1.6 Error Analysis
    • 2.5.1.7 Extensions to DAEs
  • 2.5.2 Quadrature Rules
    • 2.5.2.1 Main Idea
    • 2.5.2.2 Degree and order of a polynomial
    • 2.5.2.3 Gauss-Legendre Quadrature
    • 2.5.2.4 Code
    • 2.5.2.5 Visualize Weights
    • 2.5.2.6 Why $2L-1$?
    • 2.5.2.7 How to determine the nodes and weights?
    • 2.5.2.8 Polynomial Example
    • 2.5.2.9 Generalization
    • 2.5.2.10 Another Example
    • 2.5.2.11 A More Complicated Example
    • 2.5.2.12 Radau Quadrature
    • 2.5.2.13 Differential Equations

2.6 Dynamic Optimization with Pyomo.DAE

  • 2.6.1 Car Example
    • 2.6.1.1 Orthogonal Collocation on Finite Elements: Manual Approach
    • 2.6.1.2 Orthogonal Collocation on Finite Elements: Pyomo.dae
      • 2.6.1.2.1 Discretize/Transcribe and Solve
      • 2.6.1.2.2 Plot Results

2.7 Stochastic Programming

  • 2.7.1 Key Concepts
    • 2.7.1.1 Infinite Dimensional Formulation
    • 2.7.1.2 Discrete (Finite Dimensional) Approximations
    • 2.7.1.3 Sample Average Approximation
    • 2.7.1.4 Sparse Grids (Worst Case)
  • 2.7.2 Farmers Example
  • 2.7.3 PID Controller Tuning Example

2.8 Parameter estimation with parmest

  • 2.8.1 What is parameter estimation?
  • 2.8.2 What is parmest?
  • 2.8.3 Example: Reaction Kinetics
    • 2.8.3.1 Batch Reactor
    • 2.8.3.2 Experimental Data
    • 2.8.3.3 Pyomo model
  • 2.8.4 Parameter estimation with a single dataset
  • 2.8.5 Parameter estimation with multiple datasets
    • 2.8.5.1 Generate list of datasets
    • 2.8.5.2 Parameter estimation with parmest
    • 2.8.5.3 Plotting fitted model simulation with 'experimental' data
  • 2.8.6 Using parmest with pyomo.dae
    • 2.8.6.1 Parameter estimation with parmest
    • 2.8.6.2 Plotting fitted model simulation with 'experimental' data
  • 2.8.7 Local uncertainty analysis
    • 2.8.7.1 Covariance matrix
    • 2.8.7.2 Parameter identifiability
  • 2.8.8 Bootstrap resampling
  • 2.8.9 Nonlinear confidence regions

2.9 Supplementary material: data for parmest tutorial

  • 2.9.1 Reaction Kinetics Example
    • 2.9.1.1 Simulating experimental data
    • 2.9.1.2 Simulate data for multiple experiments

2.10 Pyomo Homework 1

  • 2.10.1 Pyomo Fundamentals
    • 2.10.1.1 Knapsack example
    • 2.10.1.2 Knapsack example with improved printing
    • 2.10.1.3 Changing data
    • 2.10.1.4 Loading data from Excel
    • 2.10.1.5 NLP vs. MIP
  • 2.10.2 More Pyomo Fundamentals
    • 2.10.2.1 Knapsack problem with rules
    • 2.10.2.2 Integer formulation of the knapsack problem

2.11 Pyomo Homework 2

  • 2.11.1 Some Advanced Pyomo Tricks
    • 2.11.1.1 Using the decorator notation for rules
    • 2.11.1.2 Changing parameter values
    • 2.11.1.3 Integer cuts
    • 2.11.1.4 Putting it all together: Lot sizing example (Hart et al., 2017)
  • 2.11.2 Nonlinear programs: initialization and problem formulation are very important!
    • 2.11.2.1 Alternative initialization
    • 2.11.2.2 Evaluation errors
    • 2.11.2.3 Alternative formulations
      • 2.11.2.3.1 Formulation 1
      • 2.11.2.3.2 Formulation 2
      • 2.11.2.3.3 Formulation 3
    • 2.11.2.4 Reactor design problem (Hart et al., 2017; Bequette, 2003)

2.12 Pyomo Homework 3

  • 2.12.1 Pyomo.DAE: Reaction Kinetics
    • 2.12.1.1 Index analysis
    • 2.12.1.2 Model reformulation
    • 2.12.1.3 Implement index 1 model in Pyomo
      • 2.12.1.3.1 Create model and set initial conditions
    • 2.12.1.4 Simulate, discretize, and initialize collocation model
    • 2.12.1.5 Plot results
    • 2.12.1.6 Simulate and solve Pyomo model with initialization
    • 2.12.1.7 Simulate and solve Pyomo model without initialization
    • 2.12.1.8 Discussion: Does initialization matter?
    • 2.12.1.9 Degree of Freedom Analysis

Chapter 3.0 Unconstrained Nonlinear Optimization: Theory and Algorithms

3.1 Linear Algebra Review and SciPy Basics

  • 3.1.1 Notation (for textbook)
  • 3.1.2 Determinant
  • 3.1.3 Rank
  • 3.1.4 Inverse
  • 3.1.5 Solving Linear Systems
    • 3.1.5.1 Explicit Inverse
    • 3.1.5.2 LU Decomposition
      • 3.1.5.2.1 Is P orthogonal?
      • 3.1.5.2.2 MATLAB
      • 3.1.5.2.3 SciPy
      • 3.1.5.2.4 Verify our answer with linalg.solve
  • 3.1.6 Invertible Matrix Theorem
  • 3.1.7 Eigenvectors and Eigenvalues
  • 3.1.8 Singular Value Decomposition
  • 3.1.9 Vector and Matrix Norms
  • 3.1.10 Condition Number

3.2 Mathematics Primer

  • 3.2.1 Eigenvalues and Quadratic Programs
    • 3.2.1.1 Analysis Algorithm
    • 3.2.1.2 Exercise 2.8 in Biegler (2010)
    • 3.2.1.3 Activity 1
    • 3.2.1.4 Activity 2
  • 3.2.2 Classifying Functions (Key Concepts from Real Analysis)
  • 3.2.3 Taylor Series Approximation
  • 3.2.4 Finite Difference Approximation
    • 3.2.4.1 Forward Finite Difference
    • 3.2.4.2 Backward Finite Difference
    • 3.2.4.3 Central Finite Difference
    • 3.2.4.4 Activity
      • 3.2.4.4.1 Original: $f(x) = e^{x}$ at $x=1$
      • 3.2.4.4.2 Variant: $f(x) = e^{x}$ at $x=10$
      • 3.2.4.4.3 Your own test function
      • 3.2.4.4.4 Discussion

3.3 Unconstrained Optimality Conditions

  • 3.3.1 Local and Global Solutions
  • 3.3.2 Necessary Conditions for Optimality
  • 3.3.3 Sufficient Conditions for Optimality
  • 3.3.4 Example 2.19 in Biegler (2010)
    • 3.3.4.1 The Test Function
    • 3.3.4.2 Finite Difference Gradient and Hessian
    • 3.3.4.3 Analysis of Optimality Conditions
  • 3.3.5 Continuous Optimization Algorithms

3.4 Newton-type Methods for Unconstrained Optimization

  • 3.4.1 Test Problem: Example 2.19
  • 3.4.2 Helper Functions
  • 3.4.3 Algorithm 2.1: Basic Newton Method
  • 3.4.4 Starting Point Near Optimal Solution
  • 3.4.5 Activity: A Different Starting Point
  • 3.4.6 Activity: Let's break it
  • 3.4.7 Activity: Use $I$ in place of Hessian
  • 3.4.8 Adjust Hessian with Levenberg-Marquardt Correction

3.5 Quasi-Newton Methods for Unconstrained Optimization

  • 3.5.1 Unconstrained Optimization with Approximate Hessian
    • 3.5.1.1 Library of helper functions
    • 3.5.1.2 Symmetric Rank 1 (SR1) Update
  • 3.5.2 Test Case: Simple quadratic program
    • 3.5.2.1 Near solution
    • 3.5.2.2 Far from solution
    • 3.5.2.3 Activity/Discussion
  • 3.5.3 Test Case: Example 2.19
    • 3.5.3.1 $x_0$ somewhat near solution
    • 3.5.3.2 $x_0$ far from solution
    • 3.5.3.3 Activity/Discussion
  • 3.5.4 Broyden update with Cholesky factorization

3.6 Descent and Globalization

  • 3.6.1 Define Test Function and Derivatives
  • 3.6.2 Geometric Insights into Newton Steps
    • 3.6.2.1 Motivation
    • 3.6.2.2 Compute and Plot Steps
    • 3.6.2.3 Consider $x_0 = -3$
    • 3.6.2.4 Consider $x_0 = 0$
    • 3.6.2.5 Descent Properties
  • 3.6.3 Line Search
    • 3.6.3.1 Visualization Code
    • 3.6.3.2 Newton Step, $x^k = -3$
    • 3.6.3.3 Steepest Descent Step, $x^k = -3$
    • 3.6.3.4 Newton Step, $x^k = 0$
    • 3.6.3.5 Steepest Descent Step, $x^k = 0$
  • 3.6.4 Trust Regions
    • 3.6.4.1 Main Idea and General Algorithm
    • 3.6.4.2 Trust Region Variations
      • 3.6.4.2.1 Levenberg-Marquardt
      • 3.6.4.2.2 Powell Dogleg

3.7 Algorithms Homework 1

  • 3.7.1 Linear Algebra Review
    • 3.7.1.1 Exact solution
    • 3.7.1.2 Solve using the inverse
    • 3.7.1.3 Solve using LU decomposition
    • 3.7.1.4 Solve using linalg.solve
  • 3.7.2 Eigenvalues
    • 3.7.2.1 Calculate eigenvalues by hand (on paper)
    • 3.7.2.2 Calculate eigenvalues using linalg.eig
    • 3.7.2.3 Definiteness
    • 3.7.2.4 Singular Value Decomposition
    • 3.7.2.5 SVD calculation using linalg.svd
    • 3.7.2.6 Condition number
    • 3.7.2.7 Linear system
    • 3.7.2.8 Make it singular
  • 3.7.3 Convexity
    • 3.7.3.1 Determine if the following functions are convex
    • 3.7.3.2 Prove the following properties
      • 3.7.3.2.1 PSD implies Convexity
      • 3.7.3.2.2 Convexity implies PSD
      • 3.7.3.2.3 PD implies Strictly Convex

3.8 Algorithms Homework 2

  • 3.8.1 Finite Difference Approximations
    • 3.8.1.1 Finite difference order
    • 3.8.1.2 Provided Codes
      • 3.8.1.2.1 Finite Difference Code
      • 3.8.1.2.2 Analytic Gradient
      • 3.8.1.2.3 Analytic Hessian
    • 3.8.1.3 Gradient Finite Difference Comparison
    • 3.8.1.4 Hessian Finite Difference using Approximate Gradient
    • 3.8.1.5 Hessian Finite Difference using Exact Gradient
    • 3.8.1.6 Final Answers
    • 3.8.1.7 Discussion
  • 3.8.2 Analysis of possible optimization solutions
    • 3.8.2.1 Point 1
    • 3.8.2.2 Point 2
    • 3.8.2.3 Point 3
    • 3.8.2.4 Point 4
    • 3.8.2.5 Visualize in 3D
    • 3.8.2.6 Convexity
  • 3.8.3 Multivariable Taylor Series
    • 3.8.3.1 Create a function to plot the first order Taylor series using $\nabla f$
    • 3.8.3.2 Taylor Series using my_grad_approx
    • 3.8.3.3 Taylor Series using my_grad_exact
    • 3.8.3.4 Create a function to plot the second order Taylor series using $\nabla f$ and $\nabla^2 f$
    • 3.8.3.5 Taylor series using my_grad_approx and my_hes_approx
    • 3.8.3.6 Taylor series using my_grad_exact and my_hes_exact
    • 3.8.3.7 Discussion

3.9 Algorithms Homework 3

  • 3.9.1 Pseudocode
  • 3.9.2 Unconstrained NLP Algorithm in Python
    • 3.9.2.1 Library of Helper Functions
    • 3.9.2.2 Main Function
    • 3.9.2.3 Feature Status
  • 3.9.3 Benchmark Tests
    • 3.9.3.1 Quadratic Test Problem
      • 3.9.3.1.1 Benchmark Cases
      • 3.9.3.1.2 Discussion and Analysis
    • 3.9.3.2 One-Dimensional Example
      • 3.9.3.2.1 Benchmark Cases with $x_0 = -3$
      • 3.9.3.2.2 Discussion
    • 3.9.3.3 Benchmark Case with $x_0 = 0$
      • 3.9.3.3.1 Discussion
    • 3.9.3.4 Return of Example 2.19
      • 3.9.3.4.1 Benchmark with $x_0$ Near Solution
      • 3.9.3.4.2 Discussion
      • 3.9.3.4.3 Benchmark with $x_0$ Far From the Solution
      • 3.9.3.4.4 Discussion

Chapter 4.0 Constrained Nonlinear Optimization: Theory and Applications

4.1 Convexity Revisited

  • 4.1.1 Background
    • 4.1.1.1 Canonical Nonlinear Program
    • 4.1.1.2 Types of Constrained Optimal Solutions
    • 4.1.1.3 Key Questions
  • 4.1.2 Convexity for Constrained Optimization
  • 4.1.3 Circle Packing Example
    • 4.1.3.1 Optimization Model and Pyomo Implementation
    • 4.1.3.2 Visualize Initial Point
    • 4.1.3.3 Solve and Inspect the Solution
    • 4.1.3.4 Reinitialize and Resolve
  • 4.1.4 Take Away Messages

4.2 Local Optimality Conditions

  • 4.2.1 Unconstrained Optimality Conditions
  • 4.2.2 Karush-Kuhn-Tucker (KKT) Necessary Conditions
  • 4.2.3 Kinematic Interpretation via Example
    • 4.2.3.1 Define Function for Visualization
    • 4.2.3.2 Define function to solve optimization problem with Pyomo
    • 4.2.3.3 Take 1: Unconstrained
    • 4.2.3.4 Take 2: With $g(x) \leq 0$
    • 4.2.3.5 Take 3: With $g(x) \leq 0$ and $h(x) = 0$
    • 4.2.3.6 Discussion
    • 4.2.3.7 Analysis without Constraints
    • 4.2.3.8 Analysis with Constraints

4.3 Analysis of KKT Conditions

  • 4.3.1 Active Sets
  • 4.3.2 Sensitivity Analysis
  • 4.3.3 Multipliers in Pyomo
    • 4.3.3.1 Solve without warm starting
    • 4.3.3.2 Solve with warm starting

4.4 Constraint Qualifications

  • 4.4.1 Feasible Sequences and Limiting Directions
    • 4.4.1.1 Concepts
    • 4.4.1.2 Linearly Constrained Optimization Problems
  • 4.4.2 Constraint Qualifications
    • 4.4.2.1 Nonlinear Constrained Optimization Problems
    • 4.4.2.2 Linearly Independent Constraint Qualification (LICQ)
    • 4.4.2.3 Mangasarian-Fromovitz Constraint Qualification (MFCQ)
  • 4.4.3 Example
    • 4.4.3.1 Visualize Feasible Set
    • 4.4.3.2 Solve with Pyomo

4.5 Second Order Optimality Conditions

  • 4.5.1 Helpful Cones
  • 4.5.2 Second Order Necessary Conditions
  • 4.5.3 Second Order Sufficient Conditions
  • 4.5.4 Reduced Hessian
  • 4.5.5 Example
    • 4.5.5.1 Calculation with numpy
    • 4.5.5.2 Calculate with Pyomo
      • 4.5.5.2.1 Define and solve the model
      • 4.5.5.2.2 Extract the dual variable for the constraint
      • 4.5.5.2.3 Extract the reduced Hessian

4.6 NLP Diagnostics with Degeneracy Hunter

  • 4.6.1 Setup
  • 4.6.2 Example 1: Well-Behaved Nonlinear Program
    • 4.6.2.1 Define the model in Pyomo
    • 4.6.2.2 Evaluate the initial point
    • 4.6.2.3 Identify the constraint residuals larger than 0.1
    • 4.6.2.4 Identify all variables within 1 of their bounds
    • 4.6.2.5 Solve the optimization problem
    • 4.6.2.6 Check if any constraint residuals are larger than $10^{-14}$
    • 4.6.2.7 Identify all variables within $10^{-5}$ of their bounds
    • 4.6.2.8 Check the rank of the constraint Jacobian at the solution
    • 4.6.2.9 Try Degeneracy Hunter
  • 4.6.3 Example 2: Linear Program with Redundant Equality Constraints
    • 4.6.3.1 Define the model in Pyomo
    • 4.6.3.2 Evaluate the initial point
    • 4.6.3.3 Identify constraints with residuals greater than 0.1 at the initial point
    • 4.6.3.4 Solve the optimization problem and extract the solution
    • 4.6.3.5 Check the rank of the Jacobian of the equality constraints
    • 4.6.3.6 Identify candidate degenerate constraints
    • 4.6.3.7 Find irreducible degenerate sets (IDS)
    • 4.6.3.8 Reformulate Example 2
    • 4.6.3.9 Solve the reformulated model

4.7 Simple Newton Method for Equality Constrained NLPs

  • 4.7.1 Helper Functions
  • 4.7.2 Algorithm 5.1
  • 4.7.3 Example Problem 1
    • 4.7.3.1 Test Finite Difference Approximations
    • 4.7.3.2 Test Algorithm 5.1
  • 4.7.4 Example Problem 2
    • 4.7.4.1 Trying Algorithm 5.1 again without the redundant constraint
  • 4.7.5 Example Problem 3
    • 4.7.5.1 Visualize
    • 4.7.5.2 Starting Point Near Global Min ($\theta_0 = 1.0$)
    • 4.7.5.3 Starting Point Near Local Min ($\theta_0 = \pi$)
    • 4.7.5.4 Starting Point Near Global Max ($\theta_0 = 5.5$)

4.8 Inertia-Corrected Newton Method for Equality Constrained NLPs

  • 4.8.1 Helper Functions
  • 4.8.2 Algorithm 5.2
  • 4.8.3 Example Problem 2
    • 4.8.3.1 Test Algorithm 5.2 without redundant constraints.
    • 4.8.3.2 Discussion
  • 4.8.4 Example Problem 3
    • 4.8.4.1 Starting Point Near Global Min ($\theta_0 = 1.0$)
    • 4.8.4.2 Starting Point Near Local Min ($\theta_0 = \pi$)
    • 4.8.4.3 Starting Point Near Global Max ($\theta_0 = 5.5$)

4.9 Algorithms Homework 4: Interior Point Methods

  • 4.9.1 Tips and Tricks
    • 4.9.1.1 Background
    • 4.9.1.2 Problem Formulation
    • 4.9.1.3 Reformulation Example
    • 4.9.1.4 Primal Dual Optimality Conditions
  • 4.9.2 Basic Interior Point Method for Inequality and Equality Constrained NLPs
    • 4.9.2.1 Pseudocode
    • 4.9.2.2 Python Implementation
  • 4.9.3 Test Problems
    • 4.9.3.1 Problem 1: Convex
    • 4.9.3.2 Problem 2: Convex
    • 4.9.3.3 Problem 3: Nonconvex

Chapter 5.0 Special Topics

5.1 Integer Programming with Simple Branch and Bound

  • 5.1.1 Node 1 (Root)
  • 5.1.2 Node 2
  • 5.1.3 Node 3
  • 5.1.4 Node 4
  • 5.1.5 Node 5
  • 5.1.6 Node 6
  • 5.1.7 Node 7
  • 5.1.8 Node 8
  • 5.1.9 Node 9

5.2 MINLP Algorithms

5.3 Deterministic Global Optimization

5.4 Bayesian Optimization