-
Norm and Penalty Approximation
-
In the first example we solve the norm approximation problems

    minimize  ||A*x - b||_oo,        minimize  ||A*x - b||_1,

and the penalty approximation problem

    minimize  sum_k phi((A*x - b)_k),    phi(u) = max{0, |u| - 0.75, 2*|u| - 2.25}.

We use randomly generated data.
The code uses the Matplotlib package for plotting the histograms of the
residual vectors for the two solutions. It generates the figure shown below.
from cvxopt import normal
from cvxopt.modeling import variable, op, max, sum
import pylab
m, n = 500, 100
A = normal(m,n)
b = normal(m)
x1 = variable(n)
op(max(abs(A*x1-b))).solve()
x2 = variable(n)
op(sum(abs(A*x2-b))).solve()
x3 = variable(n)
op(sum(max(0, abs(A*x3-b)-0.75, 2*abs(A*x3-b)-2.25))).solve()
pylab.subplot(311)
pylab.hist(A*x1.value-b, m//5)
pylab.subplot(312)
pylab.hist(A*x2.value-b, m//5)
pylab.subplot(313)
pylab.hist(A*x3.value-b, m//5)
pylab.show()
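The penalty used in the third problem is the piecewise-linear function phi(u) = max{0, |u| - 0.75, 2|u| - 2.25}: it is zero on the dead zone |u| <= 0.75, has slope 1 for 0.75 <= |u| <= 1.5, and slope 2 beyond. A minimal pure-Python sketch of this penalty (independent of CVXOPT):

```python
def phi(u):
    """Piecewise-linear penalty: zero for |u| <= 0.75,
    slope 1 on 0.75 <= |u| <= 1.5, slope 2 for |u| >= 1.5."""
    return max(0.0, abs(u) - 0.75, 2.0 * abs(u) - 2.25)

print(phi(0.5))   # 0.0   (inside the dead zone)
print(phi(1.0))   # 0.25  (slope-1 region)
print(phi(2.0))   # 1.75  (slope-2 region)
```

The two linear pieces meet at |u| = 1.5, where both evaluate to 0.75, so the penalty is continuous.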
Equivalently, we can formulate and solve the problems as LPs.
t = variable()
x1 = variable(n)
op(t, [-t <= A*x1-b, A*x1-b<=t]).solve()
u = variable(m)
x2 = variable(n)
op(sum(u), [-u <= A*x2-b, A*x2-b <= u]).solve()
v = variable(m)
x3 = variable(n)
op(sum(v), [v >= 0, v >= A*x3-b-0.75, v >= -(A*x3-b)-0.75, v >= 2*(A*x3-b)-2.25, v >= -2*(A*x3-b)-2.25]).solve()
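As a cross-check of the epigraph reformulation used for the oo-norm problem, here is a small sketch that solves the same LP with NumPy and SciPy's linprog (assumed available) instead of CVXOPT; the data here is random and illustrative:

```python
import numpy as np
from scipy.optimize import linprog

# Small random instance of: minimize ||A*x - b||_oo.
rng = np.random.default_rng(0)
m, n = 20, 5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Epigraph LP in z = (x, t): minimize t subject to -t <= A*x - b <= t.
c = np.r_[np.zeros(n), 1.0]
ones = np.ones((m, 1))
A_ub = np.block([[A, -ones], [-A, -ones]])   # A*x - b <= t,  b - A*x <= t
b_ub = np.r_[b, -b]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(None, None))
x, t = res.x[:n], res.x[-1]

# At the optimum, t equals the largest absolute residual.
assert abs(np.abs(A @ x - b).max() - t) < 1e-6
```

Note the explicit bounds=(None, None): linprog defaults to nonnegative variables, which would silently change the problem.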
-
Robust Linear Programming
-
The robust LP

    minimize    c'*x
    subject to  sup_{||v||_oo <= 1} (a_i + v)'*x <= b_i,    i = 1, ..., m

is equivalent to the problem

    minimize    c'*x
    subject to  a_i'*x + ||x||_1 <= b_i,    i = 1, ..., m.

The following code computes the solution of this problem and the solution of the
equivalent LP

    minimize    c'*x
    subject to  a_i'*x + 1'*y <= b_i,    i = 1, ..., m
                -y <= x <= y

for randomly generated data.
from cvxopt import normal, uniform
from cvxopt.modeling import variable, dot, op, sum
m, n = 500, 100
A = normal(m,n)
b = uniform(m)
c = normal(n)
x = variable(n)
op(dot(c,x), A*x+sum(abs(x)) <= b).solve()
x2 = variable(n)
y = variable(n)
op(dot(c,x2), [A*x2+sum(y) <= b, -y <= x2, x2 <= y]).solve()
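The reformulation rests on the identity sup_{||v||_oo <= 1} v'*x = ||x||_1, attained at v = sign(x). A small NumPy sketch of this worst-case calculation (the data below is random and illustrative, not tied to the example above):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal(8)   # one row of A
x = rng.standard_normal(8)

# The worst perturbation in the box ||v||_oo <= 1 is v = sign(x),
# so sup_v (a + v)'*x = a'*x + ||x||_1.
v_worst = np.sign(x)
lhs = (a + v_worst) @ x
rhs = a @ x + np.abs(x).sum()
assert abs(lhs - rhs) < 1e-12

# Any other admissible perturbation does no worse.
v = rng.uniform(-1.0, 1.0, size=8)
assert (a + v) @ x <= rhs + 1e-12
```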
-
1-Norm Support Vector Classifier
-
The following problem arises in classification:

    minimize    ||x||_1 + 1'*u
    subject to  A*x >= 1 - u
                u >= 0.

It can be solved as follows.
x = variable(A.size[1],'x')
u = variable(A.size[0],'u')
op(sum(abs(x)) + sum(u), [A*x >= 1-u, u >= 0]).solve()
An equivalent unconstrained formulation is
x = variable(A.size[1],'x')
op(sum(abs(x)) + sum(max(0,1-A*x))).solve()
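The two formulations agree because, with x fixed, the smallest feasible slack is u_k = max{0, 1 - (A*x)_k}, i.e. the hinge loss. A small NumPy check of this optimal-slack argument (random illustrative data, unrelated to the A above):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((10, 3))
x = rng.standard_normal(3)

# Elementwise hinge loss: the smallest u with A*x >= 1 - u and u >= 0.
u_opt = np.maximum(0.0, 1.0 - A @ x)
assert np.all(A @ x >= 1.0 - u_opt - 1e-12) and np.all(u_opt >= 0)

# Any other feasible slack is elementwise at least as large,
# so it cannot give a smaller objective sum(u).
u_other = u_opt + rng.uniform(0.0, 1.0, size=10)
assert u_other.sum() >= u_opt.sum()
```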