The economics department has a site-license to the Artelys Knitro optimization library.
The network license can be used by all faculty and students from the UBC wired or wireless network, and through the VPN from outside. Faculty who need an offline license (e.g. to run on airplanes or when the VPN isn't feasible) should email Jesse.
This is a commercial optimizer with especially strong support for
- Nonlinear optimization, including large sparse systems (e.g. with thousands of variables) and mixed-integer problems
- Complementarity constraints, which arise in many structural estimation and game-theoretic models
- Constrained nonlinear least squares
- Systems of nonlinear equations, either by setting up an objective with equality constraints or by using the nonlinear least squares interface
- Many options, such as parallel multi-start, which may not yet be fully supported in every language interface
- The Tuner, which tries out a range of optimization algorithms and settings and reports the best option for your problem
The Knitro license is available for use within Julia, Matlab, Python, C/C++, and even Fortran. See the documentation for the different languages.
- Download Windows Installer
- Click on the installer and execute it (ignoring security warnings). When it asks for a license file, hit next to avoid choosing one
- After the installation is complete, open up a powershell or cmd terminal and run the following:
setx ARTELYS_LICENSE_NETWORK_ADDR "137.82.185.3:8349"
If you have trouble, you can check `KNITRO.has_knitro()` to see whether it finds the Knitro binaries; if it does not, run `] build KNITRO` and restart Julia.
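The check above can be run from the Julia REPL. This is a minimal sketch, assuming the `KNITRO.jl` package is already installed:

```julia
using KNITRO

# has_knitro() reports whether the wrapper located the Knitro binaries
if !KNITRO.has_knitro()
    @warn "Knitro binaries not found; try `] build KNITRO` and restart Julia"
end
```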
- Navigate in a terminal to where you would want to install the software
- Download the binary from here and unpack, or just execute
wget -qO- https://vse-public-files.s3.us-west-2.amazonaws.com/knitro/knitro-13.1.0-MacOS-64.tar.gz | tar -xzv
- Open your `~/.bash_profile` file (if it doesn't exist, run `cd` and then `touch .bash_profile` to create it). Inside, add:
export KNITRODIR="$HOME/path/to/knitro/directory"
export DYLD_LIBRARY_PATH="$KNITRODIR/lib"
export ARTELYS_LICENSE_NETWORK_ADDR="turtle.econ.ubc.ca:8349"
For Apple silicon, you can use https://vse-public-files.s3.us-west-2.amazonaws.com/knitro/knitro-13.1.0-MacOS-M1.tar.gz instead.
Note: If you already have something in `DYLD_LIBRARY_PATH`, you will need to append this folder; e.g., `export DYLD_LIBRARY_PATH="$KNITRODIR/lib:$DYLD_LIBRARY_PATH"`. But most likely that variable will be empty.
- Navigate in a terminal to where you would want to install the software
- Download the binary from here and unpack, or just execute
wget -qO- https://vse-public-files.s3.us-west-2.amazonaws.com/knitro/knitro-13.1.0-Linux-64.tar.gz | tar -xzv
- Open your `~/.bash_profile` file (if it doesn't exist, run `cd` and then `touch .bash_profile` to create it). Inside, add:
export KNITRODIR="$HOME/path/to/knitro/directory"
export ARTELYS_LICENSE_NETWORK_ADDR="turtle.econ.ubc.ca:8349"
export LD_LIBRARY_PATH="$KNITRODIR/lib"
Note: If you already have something in `LD_LIBRARY_PATH`, you will need to append this folder; e.g., `export LD_LIBRARY_PATH="$KNITRODIR/lib:$LD_LIBRARY_PATH"`. But most likely that variable will be empty.
- After installing the Knitro binaries, the remaining setup depends on the particular programming language. For Julia, just open a Julia terminal and run
] add KNITRO JuMP
For a simple test of the setup, run the following in a new Jupyter notebook
using JuMP, KNITRO
m = Model(optimizer_with_attributes(KNITRO.Optimizer)) # settings for the solver
@variable(m, x)
@variable(m, y)
@NLobjective(m, Min, (1-x)^2 + 100(y-x^2)^2)
optimize!(m)
println("x = ", value(x), " y = ", value(y))
The JuMP.jl interface is not the only way to access Knitro, but it provides a good baseline for exploring its features.
You can switch between linear and nonlinear forms with `@objective` vs. `@NLobjective`, or `@constraint` vs. `@NLconstraint`.
For example, using a linear objective and nonlinear constraints
using JuMP, KNITRO
# solve
# max( x[1] + x[2] )
# st sqrt(x[1]^2 + x[2]^2) <= 1
m = Model(optimizer_with_attributes(KNITRO.Optimizer))
@variable(m, x[1:2])
@objective(m, Max, sum(x))
@NLconstraint(m, sqrt(x[1]^2+x[2]^2) <= 1)
@show optimize!(m)
- Note that `x` is defined as a vector of length 2
- Also, note that Knitro is providing a bunch of output on the identification of the problem and the choice of algorithms
  - e.g. `algorithm` from AUTO to 1, and `bar_murule` from AUTO to 4
- This is the result of heuristics that Knitro uses to figure out the best options for the algorithm choice. You can manually change these settings, as well as use the tuner to explore variations on them.
The following code snippet is the same example as above, but using more of KNITRO's functionality.
using JuMP, KNITRO
# m = Model(optimizer_with_attributes(KNITRO.Optimizer, "honorbnds" => 1, "outlev" => 1, "algorithm" => 4, "ms_enable" => 1)) # (1) with multistart
m = Model(optimizer_with_attributes(KNITRO.Optimizer, "honorbnds" => 1, "outlev" => 1, "algorithm" => 4)) # (1)
@variable(m, x, start = 1.2) # (2)
@variable(m, y)
@variable(m, z)
@variable(m, v >= 0.0) # (3)
@variable(m, -4.0 <= u <= 4.0) # (4)
mysquare(x) = x^2
register(m, :mysquare, 1, mysquare, autodiff = true) # (5)
@NLobjective(m, Min, mysquare(1 - x) + 100(y-x^2)^2 + u)
@constraint(m, z == x + y) # (6)
@constraint(m, v == 5.0) # (7)
optimize!(m)
(value(x), value(y), value(z), value(v), value(u), objective_value(m), termination_status(m)) # (8)
- In (1), we see a few options:
  - `ms_enable = 1` turns on multistart, which starts the solver from a number of different initial conditions and takes the minimum over all runs. This may require `KNITRO#master` if it fails.
  - `outlev = 1` does a little printing. Other options are `outlev = 0, 1, 2, 3, 4, 5, 6`. See here for what each does.
  - `honorbnds = 1` forces intermediate iterates to satisfy the box bounds (not just the initial point and solution). This helps if the bounds define a feasible space, outside of which the problem is ill-behaved. Note that this does not apply to the nonlinear constraints, and probably doesn't apply to linear constraints.
  - `algorithm = 4` chooses the SQP algorithm instead of letting Knitro determine the auto choice. See here for algorithm choices.
- (2) sets the initial condition for a variable.
- (3) and (4) set a lower box-bound on the `v` value, where the upper bound is assumed to be infinite, and a two-sided box bound on `u`.
- In (5), we register our own function `mysquare` for auto-differentiation for use in the model.
  - Otherwise, `@NLobjective` and `@NLconstraint` can only handle some basic built-in functions such as powers, sums, etc.
  - The arguments are `(model, :name_in_the_model, number_of_arguments, name_in_julia, autodiff = true)`
  - If you have parameters you don't want to pass to the solver, create a closure (e.g. `f_fixed(x) = f(x, p)`, where `p` is a bunch of parameters) and then register `f_fixed` instead.
- (6) provides a linear constraint. Note that we use `@constraint` and not `@NLconstraint`.
- (7) is a linear equality constraint fixing `v` at 5.
  - If you look at the output, it says `Knitro presolve eliminated 1 variable and 1 constraint`
  - This means that Knitro figured out that the variable didn't need to be solved for, and eliminated the constraint and variable
  - While this is trivial to see here, sometimes this can help dramatically when the fixed constraints and values are less obvious
  - A lesson is that you shouldn't worry about putting on lots of linear constraints.
- (8) accesses results from the optimized model. For more on the outputs and getter functions, see here.
For the full list of KNITRO solver options, see here.
As with most solvers, there are a number of variations on the parameters and algorithms, which can make a difference in the speed of convergence. It won't matter for a problem this small and simple, but for real problems it can be dramatic.
The `tuner = 1` option enables the KNITRO solver parameter tuner, which will try out various algorithm and setting choices. You generally would run this only once, and then use the best options (`algorithm`, etc.) in later runs.
using JuMP, KNITRO
m = Model(optimizer_with_attributes(KNITRO.Optimizer, "tuner" => 1)) # (1)
@variable(m, x, start = 1.2) #(2)
@variable(m, y)
@variable(m, z)
@variable(m, v >= 0.0) # (3)
@variable(m, -4.0 <= u <= 4.0) # (4)
mysquare(x) = x^2
register(m, :mysquare, 1, mysquare, autodiff = true) # (5)
@NLobjective(m, Min, mysquare(1 - x) + 100(y-x^2)^2 + u)
@constraint(m, z == x + y) # (6)
@constraint(m, v == 5.0) # (7)
optimize!(m)
The output says that the Knitro Tuner will explore up to 48 option combinations, and then it goes through a number of permutations.
At the end, it may give output such as
Tuner non-default option settings for this solve are:
algorithm: 1
bar_murule: 5
hessopt: 1
hessian_no_f: 1
linsolver: 4
This tells you which options you could set to get the fastest execution time (e.g. `m = Model(optimizer_with_attributes(KNITRO.Optimizer, "algorithm" => 1, "bar_murule" => 5))`, etc.)
However, unless the differences are dramatic, you may be better off leaving the default options.
- If something is a linear or a quadratic constraint, define it as such. Constrained optimizers have a much easier time dealing with linear/quadratic constraints
- Don't be afraid to add in extra variables with simple linear constraints or even nonlinear definitions. Commerical optimizers are very good at dealing with additional variables if they have a jacobian through auto-differentiation (or simply through linearity)
- If your box bound is not a numeric literal (i.e., `x >= k` for some constant `k`), then `x` always needs to be on the left. That is:
@variable(m, x >= k) # OK
@variable(m, k <= x) # not OK
AttributeNotSupported, this is likely because theJuMPinterface for KNITRO has a bug. If that feature is essential, consider using the KNITRO-specific API found here.
See https://www.artelys.com/docs/knitro/2_userGuide/gettingStarted/startPython.html#how-to-use-the-knitro-python-interface for setup instructions. After installation, try
from knitro import *
# Define the variables information
variables = Variables(nV=4, xLoBnds=[0,0,0,0])
# Define the objective information
# Default objGoal is set to 'minimize'
objective = Objective(objLinear=[[0, 1], [-4, -2]])
# Define the constraints information
constraints = Constraints(nC=2,
cLinear=[[0, 0, 0, 1, 1, 1],
[0, 1, 2, 0, 1, 3],
[1., 1., 1., 2., 0.5, 1.]],
cEqBnds=[5., 8.])
# Solve the problem
solution = optimize(variables=variables,
objective=objective,
constraints=constraints)