Disclaimer

This notebook only works with the following versions:

  • JuMP 0.19 (unreleased, but currently in master)

  • MathOptInterface 0.4.1

  • GLPK 0.6.0

Description: This notebook is an introduction to JuMP. The topics described are as follows:

  • Installing Julia and JuMP
  • Representing vectors in Julia
  • Structure of a JuMP model
  • Solving a general purpose linear programming problem
  • Solving a general purpose integer programming problem

Author: Shuvomoy Das Gupta

License: Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Using Julia+JuMP for optimization - getting started


Julia

What is Julia?

Julia is a high-level, high-performance, open-source programming language for technical computing.

Why Julia?

  • Free and open-source
  • Syntax similar to MATLAB, speed similar to C

JuMP

What is JuMP?

JuMP is a modelling language for mathematical optimization [1]. It is embedded in Julia.

Why JuMP?

  • Very user-friendly
  • Speed similar to special-purpose commercial modelling languages such as AMPL
  • Solver-independent code: the same code runs with both commercial and open-source solvers
  • Very easy to implement solver callbacks and problem modifications
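
As a sketch of this solver independence (using the JuMP 0.19-dev / MathOptInterface API assumed throughout this notebook), switching solvers only means changing the optimizer object passed to the Model constructor; the optimizer names below are the ones listed later in this notebook:

```julia
using JuMP, GLPK

# The model-building code stays the same for every solver;
# only the optimizer object in the Model constructor changes.
model = Model(optimizer = GLPK.GLPKOptimizerLP())      # open-source GLPK
# model = Model(optimizer = Clp.ClpOptimizer())        # open-source Clp
# model = Model(optimizer = Gurobi.GurobiOptimizer())  # commercial Gurobi (licence required)
```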

Installation

Installing Julia and IJulia

Installing Julia

Go to the Julia download page, download the appropriate installer, and run it. You can then start an interactive Julia session.

Installing IJulia

IJulia lets us create powerful graphical notebooks, which is very convenient for this tutorial: we can mix code, text, mathematical formulas, multimedia etc. in the same notebook. To install IJulia:

  • Install Anaconda from http://continuum.io/downloads. If you are on Windows, then while running the Anaconda installer check the options Add Anaconda to the System Path and Register Anaconda as the default Python version of the system.

  • Now start an interactive Julia session and run Pkg.add("IJulia").

  • To open a new notebook

    • In the Julia interactive session, run using IJulia and then notebook(). This will open the IJulia home page in your web browser.

    • The current directory can be checked with the command pwd(). If you want to change it, then before running using IJulia and notebook(), run cd("path to your preferred directory"), e.g., cd("E:\\Dropbox\\Julia_Workspaces").

    • Click on New Notebook in the top right corner. In the new notebook, you can execute any Julia command by pressing SHIFT+ENTER.
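
The steps above can be condensed into a few lines at the Julia prompt (Julia 0.6-era package manager, as assumed throughout this notebook):

```julia
Pkg.add("IJulia")                      # install IJulia (one-time)
cd("E:\\Dropbox\\Julia_Workspaces")    # optional: choose the notebook directory
using IJulia
notebook()                             # opens the IJulia home page in your browser
```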

Installing JuMP

First, add the JuMP package by running the following code in the notebook:

In [1]:
#Pkg.add("JuMP")

We also need to install a solver package. Let's install the open-source solvers GLPK, Cbc and Clp by running Pkg.add("GLPKMathProgInterface"), Pkg.add("Cbc") and Pkg.add("Clp") respectively. To use CPLEX, add its associated Julia package with Pkg.add("CPLEX"). Other solver packages include "Gurobi", "Xpress" and "MOSEK".

Note that using commercial solvers such as CPLEX, Gurobi, Xpress and Mosek in JuMP requires a working installation of each with an appropriate licence. Both Gurobi and Mosek are free for academic use. CPLEX is free for faculty members and graduate teaching assistants.

In [2]:
#Pkg.add("GLPKMathProgInterface")
In [3]:
#Pkg.add("Cbc")
In [4]:
#Pkg.add("Clp")
In [5]:
#Pkg.add("CPLEX") # Working installation of CPLEX is needed in advance
In [6]:
#Pkg.add("Gurobi") # Working installation of Gurobi is needed in advance
In [7]:
#Pkg.add("Xpress") # Working installation of Xpress is needed in advance

If you have not updated your Julia packages in a while, it might be a good idea to update them.

In [8]:
#Pkg.update()
In [9]:
println("Hello World!")
Hello World!
In [10]:
using JuMP  # Needed whenever we use JuMP
using MathOptInterface
# shortcuts
const MOI = MathOptInterface
const MOIU = MathOptInterface.Utilities

using GLPK # Loading the GLPK module for using its solver

The very first example

First, let us solve a very simple, trivial optimization problem using JuMP to check that everything is working properly.

\begin{align} \text{minimize} \qquad & x+y \\ \text{subject to} \quad \quad & x+y \leq 1 \\ \qquad \qquad & x \geq 0, y \geq 0 \\ \qquad \qquad & x,y \in \mathbb{R} \end{align}

Here is the JuMP code to solve this problem:

In [11]:
#MODEL CONSTRUCTION
#-------------------- 
myModel = Model(optimizer = GLPK.GLPKOptimizerLP()) 
# Name of the model object. All constraints and variables of an optimization problem are associated 
# with a particular model object. The name does not have to be myModel; it can be yourModel too!
# This creates an empty model with the solver attached.

#VARIABLES
#---------

# A variable is modelled using @variable(model object name, variable name and bound, variable type)
# The bound can be a lower bound, an upper bound or both. If no variable type is given, the variable
# is treated as real. For a binary variable write Bin, and for an integer variable use Int.

@variable(myModel, x >= 0) # Models x >=0

# Some possible variations:
# @variable(myModel, x, Bin) # No bound on x present, but x is a binary variable now
# @variable(myModel, x <= 10) # This one defines a variable with upper bound x <= 10
# @variable(myModel, 0 <= x <= 10, Int) # This one has both lower and upper bound, and x is an integer

@variable(myModel, y >= 0) # Models y >= 0

#OBJECTIVE
#---------

@objective(myModel, Min, x + y) # Sets the objective to be minimized. For maximization use Max

#CONSTRAINTS
#-----------

@constraint(myModel, x + y <= 1) # Adds the constraint x + y <= 1

#THE MODEL IN A HUMAN-READABLE FORMAT (TODO)
#------------------------------------
#println("The optimization problem to be solved is:")
#print(myModel) # Shows the model constructed in a human-readable form

#SOLVE IT
#--------
JuMP.optimize(myModel) # solves the model

# TEST SOLVER STATUSES
#---------------------
@show JuMP.hasresultvalues(myModel)
@show JuMP.terminationstatus(myModel) == MOI.Success
@show JuMP.primalstatus(myModel) == MOI.FeasiblePoint

# DISPLAY THE RESULTS
#--------------------
println("Objective value: ", JuMP.objectivevalue(myModel)) # JuMP.objectivevalue(model_name) gives the optimum objective value
println("x = ", JuMP.resultvalue(x)) # JuMP.resultvalue(decision_variable) will give the optimum value of the associated decision variable
println("y = ", JuMP.resultvalue(y))
JuMP.hasresultvalues(myModel) = true
JuMP.terminationstatus(myModel) == MOI.Success = true
JuMP.primalstatus(myModel) == MOI.FeasiblePoint = true
Objective value: 0.0
x = 0.0
y = 0.0

This was certainly not the most exciting optimization problem to solve; it was for testing purposes only. Before going into the structure of a JuMP model, let us learn how to represent vectors in Julia.

Representing vectors in Julia

  • A column vector, $y=(y_1, y_2, \ldots, y_n)= \begin{pmatrix} y_1 \\ y_2 \\ . \\ . \\ y_n \end{pmatrix} \in \mathbb{R}^n$ will be written in Julia as [y[1];y[2];...;y[n]].

    For example, to create the column vector $\begin{pmatrix} 3 \\ 2.4 \\ 9.1 \\ \end{pmatrix}$ use: [3; 2.4; 9.1].

In [12]:
[3; 2.4; 9.1] # Column vector
Out[12]:
3-element Array{Float64,1}:
 3.0
 2.4
 9.1
  • A row vector, $z=(z_1 \; z_2 \; \ldots \; z_n) \in \mathbb{R}^{1 \times n}$ will be written in Julia as [z[1] z[2] ... z[n]].

    For example, to create the row vector $(1.2 \; 3.5 \; 8.21)$ use: [1.2 3.5 8.21].

In [13]:
[1.2 3.5 8.21] # Row vector
Out[13]:
1×3 Array{Float64,2}:
 1.2  3.5  8.21
  • To create an $m \times n$ matrix
$$ A = \begin{pmatrix} A_{11} & A_{12} & A_{13} & \ldots &A_{1n} \\ \ldots & \ldots & \ldots & \ldots & \ldots \\ A_{m1} & A_{m2} & A_{m3} & \ldots & A_{mn} \end{pmatrix} $$

write:

[A[1,1] A[1,2] A[1,3]... A[1,n]; ... ; A[m,1] A[m,2] ... A[m,n]].

So the matrix

$$ A = \begin{pmatrix} 1 & 1 & 9 & 5 \\ 3 & 5 & 0 & 8 \\ 2 & 0 & 6 & 13 \end{pmatrix} $$

is represented in Julia as:

A= [ 1 1 9 5; 3 5 0 8; 2 0 6 13 ]

In [14]:
# Generating a matrix
A= [
     1 1 9 5;
     3 5 0 8;
     2 0 6 13
    ]
Out[14]:
3×4 Array{Int64,2}:
 1  1  9   5
 3  5  0   8
 2  0  6  13

$A_{ij}$ can be accessed by A[i,j], the $i$th row of the matrix A is represented by A[i,:], and the $j$th column of the matrix A is represented by A[:,j].

The size of a matrix $A$ can be determined by running the command size(A). If we write numRows, numCols = size(A), then numRows and numCols will contain the total number of rows and columns of A respectively.

In [15]:
numRows, numCols = size(A)
println(
"A has ", numRows, " rows and ", numCols, " columns \n",
"A[3,3] is ", A[3,3], "\n",
"The 3rd column of A is ", A[:,3], "\n",
"The 2nd row of A is ", A[2,:]
)
A has 3 rows and 4 columns 
A[3,3] is 6
The 3rd column of A is [9, 0, 6]
The 2nd row of A is [3, 5, 0, 8]

Suppose $x,y \in \mathbb{R}^n$. Then $x^T y =\sum_{i=1}^{n} {x_i y_i}$ is written as dot(x,y).

In [16]:
y=[1; 2; 3; 4]
x=[5; 6; 7; 8]
xTy=dot(x,y)
Out[16]:
70

Structure of a JuMP model

Any JuMP model that describes an optimization problem must have the following parts:

  • Model Object,
  • Optimizer Object,
  • Variables,
  • Objective,
  • Constraints.

Model

Any instance of an optimization problem corresponds to a model object, which is associated with all the variables, constraints and objective of the instance. It is constructed using modelName = Model(). At this point a solver/optimizer may or may not be specified.

Optimizer/Solver

We can use open source solvers such as:

  • Linear Programming Solvers: ClpOptimizer(), GLPKOptimizerLP()
  • Mixed Integer Programming Solvers: GLPKOptimizerMIP(), CbcOptimizer()

Or commercial solvers such as:

  • LP and MIP: XpressOptimizer(), GurobiOptimizer(), CPLEXOptimizer()

There are a few options to handle the solver object:

Automatic with optimizer

This is the easiest method to use a solver in JuMP. In order to do so, we simply set the solver inside the Model constructor:

Model(optimizer = GLPK.GLPKOptimizerLP())

or, equivalently, with the mode made explicit:

Model(mode = JuMP.Automatic, optimizer = GLPK.GLPKOptimizerLP())

Automatic with NO optimizer

It is also possible to create a JuMP model with no optimizer attached.

After the model object is initialized empty (Model()) and all its variables, constraints and objective are set, we can attach the solver in two steps:

MOIU.resetoptimizer!(myModel, GLPK.GLPKOptimizerLP())

MOIU.attachoptimizer!(myModel)

Direct model with a non default backend

Some solvers are able to handle the problem data directly. This is common for LP/MIP solvers but not very common for open-source conic solvers.

In this case we do not set an optimizer; we set a backend, which is more generic and can hold the problem data rather than only solve a model.

Model(mode = JuMP.Direct, backend = GLPK.GLPKOptimizerLP())

Manual model with an optimizer

Similar to Automatic with optimizer, but there are fewer protections against the user getting errors from the solver API. On the other hand, nothing happens silently, which may give the user more control.

Model(mode = JuMP.Manual, optimizer = GLPK.GLPKOptimizerLP())

This mode requires attaching the solver before the solve step.
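
As a minimal sketch (using this notebook's JuMP 0.19-dev API), the Manual mode workflow looks like this:

```julia
using JuMP, MathOptInterface, GLPK
const MOIU = MathOptInterface.Utilities

m = Model(mode = JuMP.Manual, optimizer = GLPK.GLPKOptimizerLP())
@variable(m, x >= 0)
@objective(m, Min, x)

MOIU.attachoptimizer!(m)  # required in Manual mode before calling optimize
JuMP.optimize(m)
```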

Variables

Variables are defined using the @variable macro, which takes up to three arguments. The first argument is the model object. The second argument contains the variable name and, if present, a bound on the variable. The third argument is not needed if the variable is real; when the variable is binary or integer, Bin or Int, respectively, is used as the third argument.

Examples of Variables

Suppose the model object is myModel.

In [17]:
myModel = Model(optimizer = GLPK.GLPKOptimizerLP())
Out[17]:
A JuMP Model
  • To describe a variable $z \in \mathbb{R}$ such that $0 \leq z \leq 10$ write
In [18]:
@variable(myModel, 0 <= z <= 10)
Out[18]:
$$ z $$
  • Now consider a decision variable $x \in \mathbb{R}^n$ with bounds $l \preceq x \preceq u$, where naturally $l, u \in \mathbb{R}^n$. For that we write
In [19]:
# INPUT DATA, CHANGE THEM TO YOUR REQUIREMENT
#-------------------------------------------
n = 10
l = [1; 2; 3; 4; 5; 6; 7; 8; 9; 10]
u = [10; 11; 12; 13; 14; 15; 16; 17; 18; 19]
Out[19]:
10-element Array{Int64,1}:
 10
 11
 12
 13
 14
 15
 16
 17
 18
 19
In [20]:
# VARIABLE DEFINITION
# ------------------- 
@variable(myModel, l[i] <= x[i=1:n] <= u[i])
Out[20]:
10-element Array{JuMP.VariableRef,1}:
 x[1] 
 x[2] 
 x[3] 
 x[4] 
 x[5] 
 x[6] 
 x[7] 
 x[8] 
 x[9] 
 x[10]
  • Suppose we have decision variables $x \in \mathbb{R}^n$, $y \in \mathbb{Z}^m$ and $z \in \{0,1\}^p$ such that $x \succeq 0$ and $a \preceq y \preceq b$, where $a, b \in \mathbb{Z}^m$. To express this in JuMP we write
In [21]:
# INPUT DATA, CHANGE THEM TO YOUR REQUIREMENT
#-------------------------------------------
n = 4 # dimension of x
m = 3 # dimension of y
p = 2 # dimension of z
a = [0; 1; 2]
b = [3; 4; 7]
Out[21]:
3-element Array{Int64,1}:
 3
 4
 7
In [22]:
# VARIABLE DEFINITION
# -------------------
@variable(myModel, xx[i=1:n] >= 0)
Out[22]:
4-element Array{JuMP.VariableRef,1}:
 xx[1]
 xx[2]
 xx[3]
 xx[4]
In [23]:
@variable(myModel, a[i] <= yy[i=1:m] <= b[i], Int)
Out[23]:
3-element Array{JuMP.VariableRef,1}:
 yy[1]
 yy[2]
 yy[3]
In [24]:
@variable(myModel, zz[i=1:p], Bin)
Out[24]:
2-element Array{JuMP.VariableRef,1}:
 zz[1]
 zz[2]

Constraints

Constraints are added using the @constraint macro. The first argument is the model object the constraint is associated with, the second argument is a reference to the constraint, and the third argument is the constraint description. The constraint reference comes in handy when we want to manipulate the constraint later or access the dual variables associated with it. If no constraint reference is needed, the second argument is simply the constraint description.

Examples of Constraints

Let's give some examples of writing constraints in JuMP. Suppose the model object is yourModel.

In [25]:
yourModel = Model()
Out[25]:
A JuMP Model

1 Simple constraints

Consider variables $x, y \in \mathbb{R}$ which are coupled by the constraint $5x + 3y \leq 5$. We write this as
@constraint(yourModel, 5*x + 3*y <= 5)
Naturally, x and y have to be defined first using the @variable macro.

In [26]:
@variable(yourModel, x)
@variable(yourModel, y)
@constraint(yourModel, 5*x + 3*y <= 5)
Out[26]:
JuMP.ConstraintRef{JuMP.Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}}}(A JuMP Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}}(1))

2 References

Above, no constraint reference was given. Now suppose we want to get the dual value of some constraint after solving the problem; then we need to assign a reference to the constraint first. Let's call the constraint reference conRef1 (it could be any valid name). The same constraint then has to be written as:
conRef1 = @constraint(yourModel, 6*x + 4*y >= 5)
When we need the dual value after solving the problem, we just write println(JuMP.resultdual(conRef1)).

In [27]:
conRef1 = @constraint(yourModel, 6*x + 4*y >= 5)
Out[27]:
JuMP.ConstraintRef{JuMP.Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}}(A JuMP Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(2))
In [28]:
ret = @constraint(yourModel, conRef2, 6*x + 4*y >= 5)
ret==conRef2
Out[28]:
true
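
As a hedged sketch (using this notebook's API, and assuming the model has been fully specified), querying the dual of conRef1 after attaching a solver and solving would look like:

```julia
# Sketch: attach a solver to yourModel, solve, then query the dual of conRef1.
MOIU.resetoptimizer!(yourModel, GLPK.GLPKOptimizerLP())
MOIU.attachoptimizer!(yourModel)
JuMP.optimize(yourModel)
println(JuMP.resultdual(conRef1))  # dual value of the constraint 6*x + 4*y >= 5
```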

3 Sums

Consider a variable $w \in \mathbb{R}^4$ and a coefficient vector $a=(1, -3, 5, 7)$. We want to write a constraint of the form $\sum_{i=1}^4{a_i w_i} \leq 3$. In JuMP we write:

In [29]:
a = [1; -3; 5; 7] 
@variable(yourModel, w[1:4])
@constraint(yourModel, sum(a[i]*w[i] for i in 1:4) <= 3)
Out[29]:
JuMP.ConstraintRef{JuMP.Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}}}(A JuMP Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.LessThan{Float64}}(4))

Add constraints in a loop

In [30]:
@constraint(yourModel, conRef3[i in 1:3], 6*x + 4*y >= 5*i)
Out[30]:
3-element Array{JuMP.ConstraintRef{JuMP.Model,C} where C,1}:
 JuMP.ConstraintRef{JuMP.Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}}(A JuMP Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(5))
 JuMP.ConstraintRef{JuMP.Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}}(A JuMP Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(6))
 JuMP.ConstraintRef{JuMP.Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}}(A JuMP Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}(7))

Or just use a regular loop

In [31]:
for i in 1:3
    @constraint(yourModel, 6*x + 4*y >= 5*i)
end

Objective

The objective is set using the @objective macro, which takes three arguments. The first argument is, as usual, the model object. The second is either Max, if we want to maximize the objective function, or Min, if we want to minimize it. The last argument is the objective expression, whose syntax is similar to that of a constraint definition.

Example of objective

For the previous model, consider the decision variable $w \in \mathbb{R}^4$ and cost vector $c = (2, 3, 4, 5)$. We want to minimize $c^T w$. In JuMP we would write:

In [32]:
c = [2; 3; 4; 5] 
@objective(yourModel, Min, sum(c[i]*w[i] for i in 1:4))

which could also be a maximization

In [33]:
@objective(yourModel, Max, sum(c[i]*w[i] for i in 1:4))

Solving a standard form Linear Programming problem

Let us try to write the JuMP code for the following standard form optimization problem:

$$ \begin{align} & \text{minimize} && c^T x \\ & \text{subject to} && A x = b \\ & && x \succeq 0 \\ & && x \in \mathbb{R}^n \end{align} $$

where, $n = 4$, $c=(1, 3, 5, 2)$, $A = \begin{pmatrix} 1 & 1 & 9 & 5 \\ 3 & 5 & 0 & 8 \\ 2 & 0 & 6 & 13 \end{pmatrix}$ and $b=(7, 3, 5)$. The symbol $\succeq$ ($\preceq$) stands for element-wise greater (less) than or equal to.

Entering different parts of the code one by one

Let us input the different parts of the JuMP code one by one and check the corresponding outputs to see that everything is okay. Of course, we could also input the whole code at once.

In [34]:
#MODEL CONSTRUCTION
#------------------

sfLpModel = Model(optimizer = GLPK.GLPKOptimizerLP()) # Name of the model object
Out[34]:
A JuMP Model
In [35]:
#INPUT DATA
#----------

c = [1; 3; 5; 2] 

A= [
     1 1 9 5;
     3 5 0 8;
     2 0 6 13
    ]

b = [7; 3; 5] 

m, n = size(A) # m = number of rows of A, n = number of columns of A
Out[35]:
(3, 4)
In [36]:
#VARIABLES
#---------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0
Out[36]:
4-element Array{JuMP.VariableRef,1}:
 x[1]
 x[2]
 x[3]
 x[4]
In [37]:
#CONSTRAINTS
#-----------

for i in 1:m # for all rows do the following
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i]) # the ith row 
    # of A*x is equal to the ith component of b
end # end of the for loop
In [38]:
#OBJECTIVE
#---------

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x
In [39]:
#THE MODEL IN A HUMAN-READABLE FORMAT (TODO)
#------------------------------------

println("The optimization problem to be solved is:")
print(sfLpModel) # Shows the model constructed in a human-readable form
The optimization problem to be solved is:
A JuMP Model
In [40]:
# finally optimize the model
JuMP.optimize(sfLpModel) # solves the model

# TEST SOLVER STATUSES
#---------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Out[40]:
true
In [41]:
#SOLVE IT AND DISPLAY THE RESULTS
#--------------------------------

println("Objective value: ", JuMP.objectivevalue(sfLpModel)) # JuMP.objectivevalue(model_name) gives the optimum objective value

println("Optimal solution is x = \n", JuMP.resultvalue.(x)) # JuMP.resultvalue.(decision_variable) will give the optimum value 
                                                   # of the associated decision variable
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

The whole code

In [42]:
#INPUT DATA
#----------

c = [1; 3; 5; 2] 

A= [
     1 1 9 5;
     3 5 0 8;
     2 0 6 13
    ]

b = [7; 3; 5] 

m, n = size(A) # m = number of rows of A, n = number of columns of A
Out[42]:
(3, 4)
In [43]:
#MODEL CONSTRUCTION
#------------------

sfLpModel = Model(optimizer = GLPK.GLPKOptimizerLP())

#VARIABLES
#---------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0

#CONSTRAINTS
#-----------

for i in 1:m # for all rows do the following
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i]) # the ith row 
    # of A*x is equal to the ith component of b
end # end of the for loop

#OBJECTIVE
#---------

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x 


#SOLVE IT
#--------

# finally optimize the model
@time begin
status = JuMP.optimize(sfLpModel) # solves the model
end

# TEST SOLVER STATUSES
#---------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint

# DISPLAY THE RESULTS
#-------------------------------- 

println("Objective value: ", JuMP.objectivevalue(sfLpModel)) # JuMP.objectivevalue(model_name) gives the optimum objective value

println("Optimal solution is x = \n", JuMP.resultvalue.(x)) # JuMP.resultvalue(decision_variable) will give the optimum value 
                                                   # of the associated decision variable
  0.000537 seconds (1.00 k allocations: 62.982 KiB)
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

Modes

Auto + Optimizer

In [44]:
#MODEL INITIALIZATION
#--------------------

sfLpModel = Model(optimizer = GLPK.GLPKOptimizerLP())

#Problem construction
#--------------------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0
for i in 1:m
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i])
end

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x 

#SOLVE IT
#--------

# finally optimize the model
@time begin
status = JuMP.optimize(sfLpModel) # solves the model
end

# TEST SOLVER STATUSES and DISPLAY THE RESULTS
#---------------------------------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint

println("Objective value: ", JuMP.objectivevalue(sfLpModel))
println("Optimal solution is x = \n", JuMP.resultvalue.(x))
  0.000516 seconds (923 allocations: 56.656 KiB)
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

Enforced auto + optimizer

In [45]:
#MODEL INITIALIZATION
#--------------------

sfLpModel = Model(mode = JuMP.Automatic, optimizer = GLPK.GLPKOptimizerLP())

#Problem construction
#--------------------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0
for i in 1:m
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i])
end

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x 

#SOLVE IT
#--------

# finally optimize the model
@time begin
status = JuMP.optimize(sfLpModel) # solves the model
end

# TEST SOLVER STATUSES and DISPLAY THE RESULTS
#---------------------------------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint

println("Objective value: ", JuMP.objectivevalue(sfLpModel))
println("Optimal solution is x = \n", JuMP.resultvalue.(x))
  0.000515 seconds (923 allocations: 56.656 KiB)
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

No Optimizer (at first)

In [46]:
#MODEL INITIALIZATION
#--------------------

sfLpModel = Model()

#Problem construction
#--------------------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0
for i in 1:m
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i])
end

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x 

#SOLVE IT
#--------
# the first step towards solving a model is to initialize a (empty) solver:
solver = GLPK.GLPKOptimizerLP()

# then the solver is linked to the model by:
MOIU.resetoptimizer!(sfLpModel, solver)

# to push data into the solver:
MOIU.attachoptimizer!(sfLpModel)

# finally optimize the model
@time begin
status = JuMP.optimize(sfLpModel) # solves the model
end

# TEST SOLVER STATUSES and DISPLAY THE RESULTS
#---------------------------------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint

println("Objective value: ", JuMP.objectivevalue(sfLpModel))
println("Optimal solution is x = \n", JuMP.resultvalue.(x))
  0.000083 seconds (39 allocations: 1.016 KiB)
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

Direct + backend

In [47]:
#MODEL INITIALIZATION
#--------------------

sfLpModel = Model(mode = JuMP.Direct, backend = GLPK.GLPKOptimizerLP())

#Problem construction
#--------------------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0
for i in 1:m
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i])
end

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x 

#SOLVE IT
#--------

# finally optimize the model
@time begin
status = JuMP.optimize(sfLpModel) # solves the model
end

# TEST SOLVER STATUSES and DISPLAY THE RESULTS
#---------------------------------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint

println("Objective value: ", JuMP.objectivevalue(sfLpModel))
println("Optimal solution is x = \n", JuMP.resultvalue.(x))
  0.000086 seconds (39 allocations: 1.016 KiB)
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

Manual + optimizer

Note: in Manual mode we need to attach the optimizer before optimizing.

In [48]:
#MODEL INITIALIZATION
#--------------------

sfLpModel = Model(mode = JuMP.Manual, optimizer = GLPK.GLPKOptimizerLP())

#Problem construction
#--------------------

@variable(sfLpModel, x[1:n] >= 0) # Models x >=0
for i in 1:m
    @constraint(sfLpModel, sum(A[i,j]*x[j] for j in 1:n) == b[i])
end

@objective(sfLpModel, Min, sum(c[j]*x[j] for j in 1:n)) # minimize c'x 

#SOLVE IT
#--------

# Attention!
MOIU.attachoptimizer!(sfLpModel)

# finally optimize the model
@time begin
status = JuMP.optimize(sfLpModel) # solves the model
end

# TEST SOLVER STATUSES and DISPLAY THE RESULTS
#---------------------------------------------
@show JuMP.hasresultvalues(sfLpModel)
@show JuMP.terminationstatus(sfLpModel) == MOI.Success
@show JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint

println("Objective value: ", JuMP.objectivevalue(sfLpModel))
println("Optimal solution is x = \n", JuMP.resultvalue.(x))
  0.000059 seconds (39 allocations: 1.016 KiB)
JuMP.hasresultvalues(sfLpModel) = true
JuMP.terminationstatus(sfLpModel) == MOI.Success = true
JuMP.primalstatus(sfLpModel) == MOI.FeasiblePoint = true
Objective value: 4.923076923076923
Optimal solution is x = 
[0.423077, 0.346154, 0.692308, 0.0]

Solving a standard form Mixed Integer Programming Problem

Let us try to write the JuMP code for the following standard form optimization problem:

$$ \begin{align} & \text{minimize} && c^T x + d^T y\\ & \text{subject to} && A x + B y= f \\ & && x \succeq 0, y \succeq 0 \\ & && x \in \mathbb{R}^n, y \in \mathbb{Z}^p \end{align} $$

Here, $A \in \mathbb{R}^{m \times n}, B \in \mathbb{R}^{m \times p}, c \in \mathbb{R}^n, d \in \mathbb{R}^p, f \in \mathbb{R}^m$. The data were randomly generated. The symbol $\succeq$ ($\preceq$) stands for element-wise greater (less) than or equal to.

In [49]:
n = 5
p = 4
m = 3
A=
[0.7511 -0.1357   0.7955  -0.4567 0.1356
-0.6670 -0.3326   0.1657  -0.5519 -0.9367
 1.5894 -0.1302  -0.4313  -0.4875  0.4179]

B=
[-0.09520 -0.28056 -1.33978 0.6506
 -0.8581  -0.3518   1.2788  1.5114
 -0.5925  1.3477    0.1589  0.03495]

c=[0.3468,0.8687,0.1200,0.5024,0.2884]

d=[0.2017,0.2712,0.4997,0.9238]

f = [0.1716,0.3610,0.0705]
Out[49]:
3-element Array{Float64,1}:
 0.1716
 0.361 
 0.0705
In [50]:
sfMipModel = Model()

@variable(sfMipModel, x[1:n] >=0)
@variable(sfMipModel, y[1:p] >= 0, Int)

@objective(sfMipModel, Min, sum(c[i] * x[i] for i in 1:n)+sum(d[i]*y[i] for i in 1:p))

for i in 1:m
    @constraint(sfMipModel, sum(A[i,j]*x[j] for j in 1:n)+ sum(B[i,j]*y[j] for j in 1:p) == f[i])
end

print(sfMipModel, "\n")

solver = GLPK.GLPKOptimizerMIP()
MOIU.resetoptimizer!(sfMipModel, solver)
MOIU.attachoptimizer!(sfMipModel)
JuMP.optimize(sfMipModel) # solves the model
                
t_status = JuMP.terminationstatus(sfMipModel)
p_status = JuMP.primalstatus(sfMipModel)
print("Termination status of the problem is ", t_status," and primal status is ", p_status,"\n")

if JuMP.hasresultvalues(sfMipModel) && JuMP.terminationstatus(sfMipModel) == MOI.Success && JuMP.primalstatus(sfMipModel) == MOI.FeasiblePoint
    print("Optimal objective value = ", JuMP.objectivevalue(sfMipModel), "\nOptimal x = ", JuMP.resultvalue.(x), "\nOptimal y = ", JuMP.resultvalue.(y))
end
A JuMP Model
Termination status of the problem is Success and primal status is FeasiblePoint
Optimal objective value = 1.070277955983598
Optimal x = [0.0654907, 0.0, 1.62986, 0.0, 1.22151]
Optimal y = [0.0, 0.0, 1.0, 0.0]

Reference

[1] M. Lubin and I. Dunning, “Computing in Operations Research Using Julia”, INFORMS Journal on Computing, 27(2):238-248, 2015. arXiv:1312.1431