
# dirkschumacher/ompr_milp.md Created Nov 29, 2017

This vignette describes the modelling techniques available in ompr to make your life easier when developing a mixed integer programming model.

## A MILP Model

You can think of a MIP model as a big constraint matrix and a set of vectors. But you can also think of it as a set of decision variables, an objective function and a number of constraints formulated as equations/inequalities. ompr implements the latter approach.

For example, Wikipedia describes the Knapsack problem like this: $$\begin{equation*} \begin{array}{ll@{}ll} \text{max} & \displaystyle\sum\limits_{i=1}^{n} v_{i}x_{i} & &\\ \text{subject to}& \displaystyle\sum\limits_{i=1}^{n} w_{i}x_{i} \leq W, & &\\ & x_{i} \in \{0,1\}, &i=1 ,\ldots, n& \end{array} \end{equation*}$$

This is the ompr equivalent:

```r
n <- 10; W <- 2
v <- runif(n); w <- runif(n)
model <- MILPModel() %>%
  add_variable(x[i], i = 1:n, type = "binary") %>%
  set_objective(sum_expr(colwise(v[i]) * x[i], i = 1:n)) %>%
  add_constraint(sum_expr(colwise(w[i]) * x[i], i = 1:n) <= W)
```


The overall idea is to use modern R idioms to construct models like the one above as readably as possible directly in R. ompr does the heavy lifting, transforming everything into matrices/vectors and passing it to your favorite solver.
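If you are curious what ompr generates under the hood, the model object can be inspected directly. The sketch below uses `extract_constraints()` and `objective_function()` from ompr's exported API; treat it as an illustration of the matrix view, not a stable contract:

```r
library(ompr)
library(magrittr)

n <- 10; W <- 2
v <- runif(n); w <- runif(n)
model <- MILPModel() %>%
  add_variable(x[i], i = 1:n, type = "binary") %>%
  set_objective(sum_expr(colwise(v[i]) * x[i], i = 1:n)) %>%
  add_constraint(sum_expr(colwise(w[i]) * x[i], i = 1:n) <= W)

# the constraint matrix, senses and right-hand side ompr hands to the solver
constraints <- extract_constraints(model)
constraints$matrix  # a 1 x n sparse matrix holding the weights w
constraints$sense   # "<="
constraints$rhs     # W

# the objective coefficients (the values v)
objective_function(model)
```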

## Vectorized semantics

ompr supports different backends. A backend is the empty model to which you add variables, constraints etc. Currently two backends exist: MIPModel and MILPModel. This vignette describes the latter, as the former will be deprecated.

Compared to the old MIPModel backend, MILPModel has vectorized semantics, meaning that model variables accept and expect vectors. This enables a speedup by a factor of 1000 or more. More details can be found at the end of this document.
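As a quick sketch of what this difference looks like in code (both snippets build the same model): with scalar semantics each coefficient is evaluated per index, while with vectorized semantics a coefficient vector is wrapped in `colwise` so it is paired element-wise with the variable vector:

```r
library(ompr)
library(magrittr)

n <- 3
cost <- c(1, 2, 3)

# MIPModel: scalar semantics, cost[i] is evaluated one index at a time
MIPModel() %>%
  add_variable(x[i], i = 1:n) %>%
  set_objective(sum_expr(cost[i] * x[i], i = 1:n))

# MILPModel: vectorized semantics, cost[i] must be marked as colwise
MILPModel() %>%
  add_variable(x[i], i = 1:n) %>%
  set_objective(sum_expr(colwise(cost[i]) * x[i], i = 1:n))
```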

## Pipes

Each function in ompr creates immutable copies of the models. In addition the function interface has been designed to work with magrittr pipes. You always start with an empty model and add components to it.

```r
MIPModel() %>%
  add_variable(x) %>%
  set_objective(x) %>%
  add_constraint(x <= 1)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 1
##   Integer: 0
##   Binary: 0
## Model sense: maximize
## Constraints: 1
```


## Variable types

Variables can be of type continuous, integer or binary.

```r
MIPModel() %>%
  add_variable(x, type = "continuous") %>%
  add_variable(y, type = "integer") %>%
  add_variable(z, type = "binary")
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 1
##   Integer: 1
##   Binary: 1
## No objective function.
## Constraints: 0
```


## Variable bounds

Variables can have lower and upper bounds.

```r
MIPModel() %>%
  add_variable(x, lb = 10) %>%
  add_variable(y, lb = 5, ub = 10)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 2
##   Integer: 0
##   Binary: 0
## No objective function.
## Constraints: 0
```


## Indexed variables

Often when you develop a complex model you work with indexed variables. This is an important concept ompr supports.

```r
MILPModel() %>%
  add_variable(x[i], i = 1:10) %>%  # creates 10 decision variables
  set_objective(x[5]) %>%
  add_constraint(x[5] <= 10)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 10
##   Integer: 0
##   Binary: 0
## Model sense: maximize
## Constraints: 1
```


## Summation over variables

If you have indexed variables then you often want to sum over a subset of variables.

The following code creates a model with three decision variables $x_1, x_2, x_3$, an objective function $\sum_i x_i$ and one constraint $\sum_i x_i \leq 10$.

```r
MILPModel() %>%
  add_variable(x[i], i = 1:3) %>%
  set_objective(sum_expr(x[i], i = 1:3)) %>%
  add_constraint(sum_expr(x[i], i = 1:3) <= 10)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 3
##   Integer: 0
##   Binary: 0
## Model sense: maximize
## Constraints: 1
```


## Quantifiers

add_variable, add_constraint, set_bounds and sum_expr all support a common quantifier interface that also supports filter expressions. A more complex example will show what that means.

```r
MILPModel() %>%

  # create x_{i, j} variables for all combinations of i and j where
  # i = 1:10 and j = 1:10
  add_variable(x[i, j], type = "binary", i = 1:10, j = 1:10) %>%

  # add a y_i variable for all i between 1 and 10 with i mod 2 = 0
  add_variable(y[i], type = "binary", i = 1:10, i %% 2 == 0) %>%

  # we maximize all x_{i,j} where i = j + 1
  set_objective(sum_expr(x[i, j], i = 1:10, j = 1:10, i == j + 1)) %>%

  # for each i between 1 and 10 with i mod 2 = 0
  # we add a constraint \sum_j x_{i,j} <= 1
  add_constraint(sum_expr(x[i, j], j = 1:10) <= 1, i = 1:10, i %% 2 == 0) %>%

  # of course you can leave out filters or add more than one
  add_constraint(sum_expr(x[i, j], j = 1:10) <= 2, i = 1:10)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 0
##   Integer: 0
##   Binary: 105
## Model sense: maximize
## Constraints: 15
```


## Special bounds on a subset of variables

Imagine you want to model a matching problem with a single binary decision variable $x_{i,j}$ that is 1 iff object $i$ is matched to object $j$. One constraint would be to allow matches only if $i \neq j$. This can be modelled by a constraint or by selectively changing bounds on variables. The latter approach can be used by solvers to improve the solution process.

```r
MILPModel() %>%
  add_variable(x[i, j], i = 1:10, j = 1:10,
               type = "integer", lb = 0, ub = 1) %>%
  set_objective(sum_expr(x[i, j], i = 1:10, j = 1:10)) %>%
  add_constraint(x[i, i] == 0, i = 1:10) %>%

  # this sets the ub to 0 without adding new constraints
  set_bounds(x[i, i], ub = 0, i = 1:10)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 0
##   Integer: 100
##   Binary: 0
## Model sense: maximize
## Constraints: 10
```


## External model parameters

Of course you will need external parameters for your models. You can reuse any variable defined in your R environment within the MIP Model.

```r
n <- 5 # number of our variables
costs <- rpois(n, lambda = 3) # a cost vector
max_elements <- 3
MILPModel() %>%
  add_variable(x[i], type = "binary", i = 1:n) %>%
  set_objective(sum_expr(colwise(costs[i]) * x[i], i = 1:n)) %>%
  add_constraint(sum_expr(x[i], i = 1:n) <= max_elements)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 0
##   Integer: 0
##   Binary: 5
## Model sense: maximize
## Constraints: 1
```


## Extract model solutions

Once you have a model, you pass it to a solver and get back a solution. The main interface to extract variable values from a solution is the function get_solution. It returns a data.frame for indexed variables and thus makes it easy to subsequently use the values.

We use ROI and GLPK to solve it.

```r
library(ROI)
```

```
## ROI.plugin.lpsolve: R Optimization Infrastructure
## Registered solver plugins: nlminb, cbc, glpk, lpsolve.
## Default solver: auto.
```

```r
library(ROI.plugin.glpk)
library(ompr.roi)

set.seed(1)
n <- 5
weights <- matrix(rpois(n * n, 5), ncol = n, nrow = n)
# construct a function that is vectorized
w <- function(i, j) {
  vapply(seq_along(i), function(k) weights[i[k], j[k]], numeric(1L))
}
result <- MILPModel() %>%
  add_variable(x[i, j], i = 1:n, j = 1:n, type = "binary") %>%
  set_objective(sum_expr(colwise(w(i, j)) * x[i, j], i = 1:n, j = 1:n)) %>%
  add_constraint(sum_expr(x[i, j], j = 1:n) == 1, i = 1:n) %>%
  solve_model(with_ROI("glpk", verbose = TRUE))
```

```
## <SOLVER MSG>  ----
## GLPK Simplex Optimizer, v4.63
## 5 rows, 25 columns, 5 non-zeros
##       0: obj =  -0.000000000e+00 inf =   5.000e+00 (5)
##       5: obj =   2.700000000e+01 inf =   0.000e+00 (0)
## *    25: obj =   1.360000000e+02 inf =   0.000e+00 (0)
## OPTIMAL LP SOLUTION FOUND
## GLPK Integer Optimizer, v4.63
## 5 rows, 25 columns, 5 non-zeros
## 25 integer variables, all of which are binary
## Integer optimization begins...
## +    25: mip =     not found yet <=              +inf        (1; 0)
## +    25: >>>>>   1.360000000e+02 <=   1.360000000e+02   0.0% (1; 0)
## +    25: mip =   1.360000000e+02 <=     tree is empty   0.0% (0; 1)
## INTEGER OPTIMAL SOLUTION FOUND
## <!SOLVER MSG> ----
```

```r
get_solution(result, x[i, j]) %>%
  dplyr::filter(value == 1)
```

```
##    variable i j value
## 1         x 1 1     1
## 2         x 2 1     1
## 3         x 3 1     1
## 4         x 4 1     1
## 5         x 5 1     1
## 6         x 1 2     1
## 7         x 2 2     1
## 8         x 3 2     1
## 9         x 4 2     1
## 10        x 5 2     1
## 11        x 1 3     1
## 12        x 2 3     1
## 13        x 3 3     1
## 14        x 4 3     1
## 15        x 5 3     1
## 16        x 1 4     1
## 17        x 2 4     1
## 18        x 3 4     1
## 19        x 4 4     1
## 20        x 5 4     1
## 21        x 1 5     1
## 22        x 2 5     1
## 23        x 3 5     1
## 24        x 4 5     1
## 25        x 5 5     1
```


You can also fix certain indexes.

```r
get_solution(result, x[2, j])
```

```
##   variable j value
## 1        x 1     1
## 2        x 2     1
## 3        x 3     1
## 4        x 4     1
## 5        x 5     1
```
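Besides variable values you can also query the solution object itself. The sketch below uses objective_value and solver_status from ompr's API, applied to the result from above:

```r
objective_value(result)  # optimal objective value of the solved model
solver_status(result)    # solver status, e.g. "optimal"
```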


## Vectorized semantics revisited

Each variable accepts vectors. The following code snippets show the behaviour by example:

Instead of passing index variables through quantifiers, you can also use vectors directly. Each element of a vector creates a new row for that variable. The two constraint groups below are equivalent.

```r
n <- 10L
MILPModel() %>%
  add_variable(x[i, j], i = 1:n, j = 1:n) %>%
  add_constraint(x[i, j] == 1, i = 1:n, j = 1:n, i == j) %>%
  add_constraint(x[1:n, 1:n] == 1) # this is equivalent
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 100
##   Integer: 0
##   Binary: 0
## No objective function.
## Constraints: 20
```


You can also add vectors columnwise using the function colwise or as_colwise:

```r
MILPModel() %>%
  add_variable(x[i, j], i = 1:n, j = 1:n) %>%
  add_constraint(sum_expr(x[i, j], i = 1:n) == 1, j = 1:n) %>%
  add_constraint(x[1:n, colwise(1:n)] == 1) # this is equivalent
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 100
##   Integer: 0
##   Binary: 0
## No objective function.
## Constraints: 20
```


Another example:

Say you want to express the following three expressions, one per row:

```r
x[1, 1]
x[1, 1] + x[1, 2]
x[1, 1] + x[1, 2] + x[1, 3]
```


With vectorized semantics you can do the following:

```r
x[1, colwise(1, 1:2, 1:3)]
```


Or with the support of the ompr function sum_expr:

```r
MILPModel() %>%
  add_variable(x[i, j], i = 1, j = 1:n) %>%
  add_constraint(sum_expr(x[1, j], j = colwise(1, 1:2, 1:3)) == 1)
```

```
## Mixed linear integer optimization problem
## Variables:
##   Continuous: 10
##   Integer: 0
##   Binary: 0
## No objective function.
## Constraints: 3
```


## Feedback

Do you have any questions, ideas or comments? Or did you find a mistake? Let's discuss on GitHub.