The simplex algorithm
The simplex algorithm (or “method”) is an iterative technique for finding an optimal solution of the constrained linear optimization problem introduced earlier. The details of the algorithm are somewhat involved, but the basic idea is to start at a vertex of the feasible solution space \(S\) and traverse an edge of the polytope that leads to another vertex with a greater value of \(f\). This process repeats until there is no neighboring vertex with a greater value of \(f\), at which point the solution is guaranteed to be optimal.
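Those equations are not reproduced here, but in the notation used below the problem amounts to maximizing a linear objective subject to linear inequality constraints and bounds on the variables:
\[
\max_{\mathbf{x}}\; f(\mathbf{x}) = \mathbf{c} \cdot \mathbf{x}
\quad \text{subject to} \quad
A \mathbf{x} \le \mathbf{a}, \qquad
\mathbf{l} \le \mathbf{x} \le \mathbf{u}.
\]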
Rather than present the details of the algorithm, we choose to show an example using Python. There have been some improvements on the original algorithm that have been implemented in many standard software packages, including Python’s scipy package (Virtanen et al. 2019), in the module scipy.optimize.
Maximize the objective function
\[
f(\mathbf{x}) = 5 x_1 + 2 x_2
\]
for \(\mathbf{x} \in \mathbb{R}^2\) and subject to the constraints
\[
4 x_1 + x_2 \le 40, \qquad
x_1 + 3 x_2 \le 35, \qquad
-8 x_1 - x_2 \ge -75,
\]
with bounds \(0 \le x_1 \le 10\) and \(-5 \le x_2 \le 15\).
First, load some Python packages.
from scipy.optimize import linprog
import numpy as np
import matplotlib as mpl
import matplotlib.pyplot as plt
Encoding the Problem
Before we can use linprog, we must first encode the objective function and constraints above into a form linprog will recognize. We begin with \(f\), which we can write as \(\mathbf{c} \cdot \mathbf{x}\) with the coefficients of \(\mathbf{c}\) as follows.
c = [-5, -2]  # negative to find max
We’ve negated each constant because linprog minimizes \(f\) and we want to maximize \(f\). Now let’s encode the inequality constraints. We will write the left-hand-side coefficients in the matrix \(A\) and the right-hand-side values in the vector \(\mathbf{a}\) such that \(A \mathbf{x} \le \mathbf{a}\). Notice that one of our constraint inequalities is \(\ge\) instead of \(\le\). We can flip this by multiplying the inequality by \(-1\). We use simple lists to encode \(A\) and \(\mathbf{a}\).
A = [
    [4, 1],
    [1, 3],
    [8, 1]
]
a = [40, 35, 75]
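The \(\ge\) constraint from the problem statement becomes the third row of \(A\) and \(\mathbf{a}\) after multiplying through by \(-1\):
\[
-8 x_1 - x_2 \ge -75
\quad\Longrightarrow\quad
8 x_1 + x_2 \le 75.
\]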
Now we need to define the lower (\(l\)) and upper (\(u\)) bounds of \(\mathbf{x}\). The function linprog expects these to be in a single list of lower and upper bounds of each \(x_i\).
lu = [
    (0, 10),
    (-5, 15)
]
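If bounds are not given, linprog assumes each variable is non-negative with no upper bound (i.e. bounds of (0, None)), so we must supply them explicitly here because \(x_2\) is allowed to be negative.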
We want to keep track of each step linprog takes. We can access these by defining a function callback, to be passed to linprog.
x = []  # for storing the steps

def callback(res):  # called at each step
    global x
    print(f"nit = {res.nit}, x_k = {res.x}")
    x.append(res.x.copy())  # store
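The res object passed to callback is an OptimizeResult; among its fields are the current solution vector x and the iteration number nit, which we print and store at each step.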
Now we need to call linprog. We don’t have any equality constraints, so we need only use the keyword arguments A_ub=A and b_ub=a. For demonstration purposes, we tell it to use the 'simplex' method, which is not as good as its other methods, which use better algorithms based on the simplex.
res = linprog(
    c,
    A_ub=A,
    b_ub=a,
    bounds=lu,
    method='simplex',
    callback=callback
)
x = np.array(x)
nit = 0, x_k = [ 0. -5.]
nit = 1, x_k = [10. -5.]
nit = 2, x_k = [8.75 5. ]
nit = 3, x_k = [7.72727273 9.09090909]
nit = 4, x_k = [7.72727273 9.09090909]
nit = 5, x_k = [7.72727273 9.09090909]
nit = 5, x_k = [7.72727273 9.09090909]
So the optimal solution (x,f(x)) is as follows.
print(f"optimum x: {res.x}")
print(f"optimum f(x): {-res.fun}")
optimum x: [7.72727273 9.09090909]
optimum f(x): 56.81818181818182
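As a quick check, \(f(\mathbf{x}) = 5(7.7273) + 2(9.0909) \approx 56.82\), which matches the reported optimum (we print -res.fun because of the sign change in \(\mathbf{c}\)).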
The last point was repeated:
- once because there was no adjacent vertex with greater \(f(\mathbf{x})\), and
- twice because the algorithm calls callback twice on the last step.
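As an aside, the 'simplex' method has since been deprecated (and removed in newer scipy releases). A minimal sketch of the same call with the HiGHS-based solver, which at the time of writing does not support the callback argument (so the per-step history would not be recorded), is the following.

res_highs = linprog(c, A_ub=A, b_ub=a, bounds=lu, method='highs')
print(res_highs.x, -res_highs.fun)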
Plotting
When the solution space is in \(\mathbb{R}^2\), it is helpful to graphically represent the solution space, constraints, and the progression of the algorithm. We begin by defining the inequality lines from \(A\) and \(\mathbf{a}\) over the bounds of \(x_1\).
n = len(c)  # number of variables
m = np.shape(A)[0]  # number of inequality constraints
x2 = np.empty([m, 2])
for i in range(0, m):
    x2[i, :] = -A[i][0]/A[i][1]*np.array(lu[0]) + a[i]/A[i][1]
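Each line is the boundary of an inequality constraint \(A_{i1} x_1 + A_{i2} x_2 \le a_i\), obtained by solving the corresponding equality for \(x_2\),
\[
x_2 = -\frac{A_{i1}}{A_{i2}} x_1 + \frac{a_i}{A_{i2}},
\]
evaluated at the two endpoints of the \(x_1\) bounds.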
Now we plot a contour plot of \(f\) over the bounds of \(x_1\) and \(x_2\) and overlay the inequality constraints and the steps of the algorithm stored in x.
lu_array = np.array(lu)
fig, ax = plt.subplots()
mpl.rcParams['lines.linewidth'] = 3
# contour plot
X1 = np.linspace(*lu_array[0], 100)
X2 = np.linspace(*lu_array[1], 100)
X1, X2 = np.meshgrid(X1, X2)
F2 = -c[0]*X1 + -c[1]*X2  # negative because max hack
con = ax.contourf(X1, X2, F2)
cbar = fig.colorbar(con, ax=ax)
cbar.ax.set_ylabel('objective function')
# bounds on x
un = np.array([1, 1])
opts = {'c': 'w', 'label': None, 'linewidth': 6}
plt.plot(lu_array[0], lu_array[1, 0]*un, **opts)
plt.plot(lu_array[0], lu_array[1, 1]*un, **opts)
plt.plot(lu_array[0, 0]*un, lu_array[1], **opts)
plt.plot(lu_array[0, 1]*un, lu_array[1], **opts)
# inequality constraints
for i in range(0, m):
    p, = plt.plot(lu[0], x2[i, :], c='w')
    p.set_label('constraint')
# steps
plt.plot(
    x[:, 0], x[:, 1], '-o', c='r',
    clip_on=False, zorder=20, label='simplex'
)
plt.ylim(lu_array[1])
plt.xlabel('$x_1$')
plt.ylabel('$x_2$')
plt.legend()
plt.show()
Online Resources for Section 8.3
No online resources.