
11-03-2016 05:05 PM - edited 11-04-2016 10:30 AM

Hi everybody,

I am optimizing an integer (binary) linear model with **proc optmilp**. After many iterations, the solver runs out of memory. The process was moving in the right direction but started to plateau with a large gap (>300%), so I need to find ways to improve this further.

Here is what I have done so far:

- Changed the formulation of the model a couple of times in order to use fewer variables.
- Aggregated my data and split the problem into multiple smaller problems (covering smaller geographic areas one by one instead of trying to solve one nationwide model).
- Developed a method to give the optimization model a decent starting point (an initial feasible solution, so that it does not start from zeros). I am generating this initial solution with a clustering algorithm.
- Increased memory allocation (from 4 GB to 8, then 12, and recently up to 16 GB). I confirmed in the logs that the requested memory is indeed being assigned to the SAS session.
- Confirmed that SAS has plenty of temp space: 400 GB available at the beginning of the process (and records show that this space is not being consumed while the model is running).
- Tweaked some solver options (set cutting planes to aggressive).
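For reference, this is roughly how I am setting the allocation and checking actual usage (the values and invocation style are from my setup; site configuration may differ):

```
/* At SAS invocation (command line or config file): cap session memory */
sas -memsize 16G

/* Inside the session: make every step log its actual memory use */
options fullstimer;
```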

This is not a complex or uncommon model (the heart of it is a simple assignment problem), so there are probably many ways to improve upon it. I thought I would share this with the SAS optimization community, as people with more experience in this particular area may have lots to suggest. My goal is to solve this with SAS.

I created a “toy problem” with simulated test data (to protect the confidential details), but the structure of the model is essentially the same. I am posting three SAS programs with their respective logs:

- **optim_test_scenario1.sas:** Small model. This one solves easily; no problems here. I created it just to test the process. The first part of the script creates the test dataset; the last part has the actual optimization model.
- **optim_test_scenario2.sas:** Medium model. Same model structure but larger input datasets (which translate into more variables and constraints). This model ran out of memory (with 16 GB) but exited gracefully, so I have a solution saved. The solution was moving in the right direction (compared to the seed) but was not optimal.
- **optim_test_scenario3.sas:** Large model. Same model structure but many more variables and constraints. This model ran out of memory (with 16 GB) and did not exit gracefully, so I don’t have a solution saved.

My goal would be to make scenario2 work. That would probably be enough. But if there is a way to solve for something of the magnitude of scenario 3, that would be even better.

Here are the main questions in my mind:

- What ways may exist to formulate this problem more efficiently, using fewer variables and fewer resources while solving for the same minimization goal (without oversimplifying the problem)?
- What solver parameters or options may help this particular problem converge more quickly? My guess is that many opportunities exist here; I have not explored this enough, and I don’t have the time to test every possibility one by one.
- How much memory is typically needed for a problem of this size? (Maybe I’m asking for too little.) What do people typically use, and how does one determine the right sizing? (The log shows the size of the problem in terms of number of variables, constraints, etc.)
- Is there a way to have SAS start using temp space (disk) when memory is running out, so that the process does not abort?

Thanks to anybody who takes a look at this. I would appreciate any comments or feedback.

Carlos


All Replies


11-07-2016 12:41 PM

I took a quick look at your model (but haven’t run it), and I don’t see any problem with it.

You probably have already looked at these:

But those examples do the same as your code.

If you remove total_penalty from your model, does it run faster? (Of course, if it is part of the problem, then this is just a test.)

I would try setting the **priority** of the **on** variables higher than that of the **map** variables. Maybe that helps...

Also, if you set the branching **direction** to "Round up to nearest integer", you will probably reach a feasible solution faster.

160,078 binary variables is a lot. Maybe reduce the number of links (the VALID_MAPS set size) even more? I guess you already tried that.


11-07-2016 03:34 PM

@gergely_batho: Thanks for looking at the model. Yes, I had seen those articles and had used the tip shown in the second one to save memory. That is a great tip, indeed. Like you said, my problem is very similar to that model, just a bit larger.

If I remove the penalty and let the "MAX_LOCATIONS_ON" constraint kick in, then the gap decreases (which is good), but the model still does not converge and still runs out of memory. Reducing the number of links further would be equivalent to reducing the area that a given location may cover, so I would prefer not to attack the problem from that angle.

Thanks for the tip on priority and branching direction. I have added these changes to my model (after defining all the variables) and will now run it again:

```
*Set priority and branching direction (1=Round up to nearest integer);
for {w in LOCATIONS} on[w].priority = 5;
for {w in LOCATIONS} on[w].direction = 1;
for {<p,w> in VALID_MAPS} map[p,w].direction = 1;
```

If you have any other thoughts or suggestions, please advise!


11-07-2016 03:39 PM

I will take a look, but one thing I noticed right away is that you are not passing the initial solution to the solver. The most straightforward way to do this is with the PRIMALIN option in the SOLVE WITH MILP statement. Because you are instead calling PROC OPTMILP via a SUBMIT block, you should first save the solution to a data set:

```
create data primalin(drop=j) from [j] _VAR_=_VAR_.label _VALUE_=_VAR_.sol;
```

And then add PRIMALIN=primalin to the PROC OPTMILP call.
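For example, the call could then look something like this (a sketch; the data set names are assumed, and DATA= points at the MPS-format data set you are already creating):

```
/* Sketch: warm-start PROC OPTMILP from the saved solution */
proc optmilp data=mpsdata primalin=primalin primalout=solution;
run;
```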


11-08-2016 11:25 AM - edited 11-08-2016 11:43 AM

Rob, thanks for pointing out the initialization issue. Indeed, I was loading the initial solution into proc optmodel but was not passing that information along to proc optmilp. Thanks for the code snippet. I have implemented all the suggested changes in my models (see attached script) and will rerun them now. I will share the results when the run is over. I am not sure whether the two input datasets (MPS & PRIMALIN) pass everything, including priority and branching direction; perhaps we need to pass that explicitly too. If you have any other comments or suggestions, please let me know. Very much appreciated!


11-09-2016 03:20 PM - edited 11-23-2016 04:44 PM

Yes, the branching priorities and directions are included in the BRANCH section of the MPS data set.
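As a sketch of the round trip already in use (data set name assumed), the SAVE MPS statement in PROC OPTMODEL writes the problem, including the BRANCH section, to a data set that PROC OPTMILP can read:

```
/* inside PROC OPTMODEL, after the .priority and .direction suffixes are set */
save mps mpsdata;   /* the BRANCH section of mpsdata carries priorities and directions */
```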

A few other things to try, alone or in combination:

1. Relax the map variables to be >= 0 instead of binary.

2. Relax the ORDERs_mapped_correctly constraints to be >= 1 instead of = 1.

3. You have omitted the link constraints that appear in the doc examples, presumably to save memory. They are not required but do strengthen the formulation. To recover some of the strength without bloating the problem size, include these constraints instead:

```
con Cover {p in ORDERS}:
   sum{<(p),w> in VALID_MAPS} On[w] >= 1;
```

4. Use the OPTMILP option tuner.


11-23-2016 04:44 PM - edited 11-23-2016 09:59 PM

I have spent some time on this and can solve scenario 2 (optimal objective value is 347,566) in a few minutes by exploiting the problem structure. Explicitly, I relaxed the Map variables to be >= 0 instead of binary and then applied classical Benders decomposition, with a MILP master problem and an LP subproblem. The motivation is that the problem is easy once the On variables are fixed. See also this SAS Global Forum paper for a simplified approach that does not use the .DUAL suffix or exploit the independent blocks in the subproblem.
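Before the code, a sketch of where the optimality cuts come from (standard LP duality; the notation is mine, with y for On, x for Map, d for distance, and eta_p for Eta[p]). For a fixed master solution y-hat, block p of the subproblem and the resulting cut are:

```
\min_{x \ge 0} \sum_{w} d_{pw} x_{pw}
\quad \text{s.t.} \quad \sum_{w} x_{pw} = 1 \;\;(\mu_p), \qquad
x_{pw} \le \hat{y}_w \;\;(\lambda_{pw} \ge 0)

% By LP duality the optimal value is \mu_p^* - \sum_w \lambda_{pw}^* \hat{y}_w,
% and the same dual solution stays feasible for any other y, so this is valid:
\eta_p \;\ge\; \mu_p^* \;-\; \sum_{w} \lambda_{pw}^* \, y_w
```

That inequality is exactly the cut in the code below, with cut_constant[p] = ORDERs_mapped_correctly[p].dual and cut_coefficient[p,w] = -LOCATIONs_mapped_correctly[p,w].dual.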

```
%let relobjgap = 1e-4;
proc optmodel printlevel=0;
   /* Benders decomposition parameters */
   num infinity = constant('BIG');
   num best_lower_bound init 0; /* some finite lower bound on the original objective */
   num best_upper_bound init infinity;
   num gap = (if best_lower_bound ne 0 and best_upper_bound < infinity
      then (best_upper_bound - best_lower_bound) / abs(best_lower_bound)
      else .);

   /* declare MILP master problem over complicating variables */
   *Define data elements;
   put 'optmodel: Define data elements';
   set <num> ORDERS;
   set <num> LOCATIONS;
   num distance {ORDERS, LOCATIONS} init &inf_dist.;

   *Define variable 1;
   var On {LOCATIONS} binary init 0;

   *Read datasets 1;
   put 'optmodel: Read datasets 1';
   read data work.orders into ORDERS=[ORDER_ID];
   read data work.points into LOCATIONS=[LOCATION_ID] On=on;
   read data work.distances into [ORDER_ID LOCATION_ID] distance=&distance_var.;

   *Define variable 2;
   set VALID_MAPS = {p in ORDERS, w in LOCATIONS: distance[p,w] <= (&inf_dist.-10)};
   /* var Map {VALID_MAPS} binary init 0;*/
   var Map {VALID_MAPS} >= 0; /* relax to get correct duals */

   *Read datasets 2;
   put 'optmodel: Read datasets 2';
   read data work.distances into [ORDER_ID LOCATION_ID] Map=map;

   *Calculations;
   put 'optmodel: Calculations';
   impvar total_LOCATIONs_on = sum{w in LOCATIONS} On[w];
   impvar total_maps_on = sum{<p,w> in VALID_MAPS} Map[p,w];
   impvar total_distance = sum{<p,w> in VALID_MAPS} Map[p,w]*distance[p,w];
   impvar total_penalty = total_LOCATIONs_on * &LOCATION_distance_penalty.;
   set LOCATIONS_p {ORDERS} init {};
   for {<p,w> in VALID_MAPS} LOCATIONS_p[p] = LOCATIONS_p[p] union {w};

   *Mechanical constraints;
   put 'optmodel: Mechanical constraints';
   con ORDERs_mapped_correctly {p in ORDERS}:
      /* (sum{<(p),w> in VALID_MAPS} Map[p,w]) = 1;*/
      sum{w in LOCATIONS_p[p]} Map[p,w] = 1;
   con Cover {p in ORDERS}:
      /* sum{<(p),w> in VALID_MAPS} On[w] >= 1;*/
      sum{w in LOCATIONS_p[p]} On[w] >= 1;
   set SUPPORT init {};
   * con LOCATIONs_mapped_correctly {w in LOCATIONS}:
      (sum{<p,(w)> in VALID_MAPS} Map[p,w]) <= (w in SUPPORT) * &max_ORDERs_per_LOCATION.;
   con LOCATIONs_mapped_correctly {<p,w> in VALID_MAPS}:
      -Map[p,w] >= -(w in SUPPORT);

   *Business constraints;
   put 'optmodel: Business constraints';
   con max_LOCATIONs_on:
      total_LOCATIONs_on <= &final_max_LOCATIONs_on.;

   *Minimize distance;
   put 'optmodel: Set Goals';
   min goal = total_distance + total_penalty;

   *Initial solution check (seed);
   print "VALUES BEFORE SOLVING" total_LOCATIONs_on total_maps_on goal total_distance total_penalty;

   /* master objective */
   var Eta {ORDERS} >= 0;
   num Eta_sol {ORDERS};
   min MasterObjective = total_penalty + sum {p in ORDERS} Eta[p];

   /* optimality cuts */
   num num_optcuts {ORDERS} init 0;
   set OPTCUTS {p in ORDERS} = 1..num_optcuts[p];
   num cut_constant {p in ORDERS, OPTCUTS[p]};
   num cut_coefficient {p in ORDERS, w in LOCATIONS_p[p], OPTCUTS[p]} init 0;
   con OptimalityCut {p in ORDERS, optcut in OPTCUTS[p]}:
      Eta[p] >= cut_constant[p,optcut]
         + sum {w in LOCATIONS_p[p]} cut_coefficient[p,w,optcut] * On[w];
   /* omit feasibility cuts (assume Cover implies feasibility of subproblem) */

   problem Master include
      On Eta
      MasterObjective
      Cover
      max_LOCATIONs_on
      OptimalityCut;

   /* declare subproblem over Map here (same as original but with On fixed) */
   impvar SubproblemObjectiveBlock {p in ORDERS} = sum{w in LOCATIONS_p[p]} Map[p,w]*distance[p,w];
   min SubproblemObjective = sum {p in ORDERS} SubproblemObjectiveBlock[p];
   problem Subproblem include
      Map
      SubproblemObjective
      ORDERs_mapped_correctly LOCATIONs_mapped_correctly;

   num lower_bound;
   num upper_bound = SubproblemObjective.sol + &LOCATION_distance_penalty. * card(SUPPORT);
   set INCUMBENT init {};
   num Map_sol {VALID_MAPS};

   /* main loop */
   do until (gap ne . and gap <= &relobjgap);
      put 'Solving Master...';
      use problem Master;
      /* put num_optcuts[*];*/
      put (sum {p in ORDERS} num_optcuts[p])=;
      for {w in LOCATIONS} On[w] = (w in INCUMBENT);
      for {p in ORDERS} Eta[p] = max(Eta[p].lb,
         max {optcut in OPTCUTS[p]}
            (cut_constant[p,optcut] + sum {w in LOCATIONS_p[p]} cut_coefficient[p,w,optcut] * On[w])
      );
      solve with MILP / primalin logfreq=10000 target=(best_lower_bound) relobjgap=&relobjgap;
      if _solution_status_ = 'INFEASIBLE' then do;
         put 'ERROR: master is infeasible';
         stop;
      end;
      for {p in ORDERS} Eta_sol[p] = Eta[p].sol;
      lower_bound = _OROPTMODEL_NUM_['BEST_BOUND'];
      best_lower_bound = max(best_lower_bound, lower_bound);
      SUPPORT = {w in LOCATIONS: On[w].sol > 0.5};
      put SUPPORT=;

      put 'Solving Subproblem...';
      use problem Subproblem;
      solve with LP / algorithm=ip crossover=0 logfreq=0;
      if _solution_status_ = 'INFEASIBLE' then do;
         put 'ERROR: subproblem is infeasible';
         stop;
      end;
      else do;
         if best_upper_bound > upper_bound then do;
            best_upper_bound = upper_bound;
            INCUMBENT = SUPPORT;
            for {<p,w> in VALID_MAPS} Map_sol[p,w] = Map[p,w].sol;
            create data incumbent from [LOCATION_ID]=INCUMBENT;
         end;
         if lower_bound < upper_bound then do;
            for {p in ORDERS: Eta_sol[p] < SubproblemObjectiveBlock[p]} do;
               /* generate new optimality cut */
               num_optcuts[p] = num_optcuts[p] + 1;
               cut_constant[p,num_optcuts[p]] = ORDERs_mapped_correctly[p].dual;
               for {w in LOCATIONS_p[p]}
                  cut_coefficient[p,w,num_optcuts[p]] = -LOCATIONs_mapped_correctly[p,w].dual;
            end;
         end;
      end;
      put "BOUNDS: " best_lower_bound= best_upper_bound= gap=;
   end;
   put INCUMBENT=;

   /* recover optimal solution */
   for {w in LOCATIONS} On[w] = (w in INCUMBENT);
   for {<p,w> in VALID_MAPS} Map[p,w] = Map_sol[p,w];
quit;
```


11-23-2016 06:24 PM

Rob,

Thanks for the suggestion. Looks promising. I will review and try this approach.

Hope you have a Happy Thanksgiving. Thank you sir!

Carlos



11-30-2016 04:57 PM

Did you try this out? The following valid cuts seem to help by strengthening the master problem:

```
/* additional valid cuts */
/* smallest positive distance from p */
num spd {p in ORDERS} = min {w in LOCATIONS_p[p] diff {p}} distance[p,w];
/* if On[p] = 0 then Eta[p] >= spd[p] */
con OnOrPositiveDistance {p in ORDERS}:
   Eta[p] >= spd[p] * (1 - On[p]);
problem Master include
   On Eta
   MasterObjective
   Cover
   max_LOCATIONs_on
   OptimalityCut OnOrPositiveDistance;
```

For scenario 3, the optimal objective value is between 142.8M and 144.8M.


12-29-2016 04:38 PM

Rob,

Thank you again for this suggestion. I had to read a bunch to become familiar with the Benders technique, but I get it now. Thank you sir!

If I wanted to put back the original constraint that limits how many orders can be served by each location:

```
/* Removing this: */
con LOCATIONs_mapped_correctly {<p,w> in VALID_MAPS}:
   -Map[p,w] >= -(w in ON_SUPPORT);

/* Going back to this: */
con LOCATIONs_mapped_correctly {w in LOCATIONS}:
   sum{<p,(w)> in VALID_MAPS} map[p,w] <= (w in SUPPORT) * &max_ORDERs_per_LOCATION.;
```

What else would have to change in the model? I can see at least two changes to the optimality cuts. One, remove the negative sign from the dual value. Two, change how the "cut_coefficient" term is indexed (from [p,w] to [w], so to speak), since the original constraint is indexed by locations and not by pairs. This second change is what I am not sure how to implement, because it seems to break the block structure (which is based on orders). Any thoughts?

```
/* In this part of the code: */
for {p in ORDERS: Eta_sol[p] < SubproblemObjectiveBlock[p]} do;
   num_optcuts[p] = num_optcuts[p] + 1;
   cut_constant[p,num_optcuts[p]] = ORDERs_mapped_correctly[p].dual;
   /* replacing this code... */
   for {w in VALID_MAP_LOCATIONS[p]}
      cut_coefficient[p,w,num_optcuts[p]] = -LOCATIONs_mapped_correctly[p,w].dual;
end;

/* replacing that with something of the form... */
for {w in LOCATIONS}
   new_coefficient[w, new_num_optcuts[w]] = LOCATIONs_mapped_correctly[w].dual;

/* In the part of the code where optimality constraints are defined, we would need something along these lines */
num new_num_optcuts {LOCATIONS} init 0;
set new_OPTCUTS {w in LOCATIONS} = 1..new_num_optcuts[w];
num new_coefficient {w in LOCATIONS, new_OPTCUTS[w]};

/* But then the structure of the optimality cut would have to change... this is where I am stuck... */
con OptimalityCut {p in ORDERS, optcut in OPTCUTS[p]}:
   Eta[p] >= cut_constant[p,optcut] + sum {w in VALID_MAP_LOCATIONS[p]} cut_coefficient[p,w,optcut] * On[w]
      + sum {w in LOCATIONS} new_coefficient;
```

Looks like we would have to sacrifice the block structure in order to add that constraint in its original form. You may see an easier way to handle this; please let me know if you do.

Thank you sir. Hope you have a Happy New Year!


01-03-2017 11:20 AM - edited 01-03-2017 11:26 AM

Yes, the capacity constraint destroys the block structure, and the corresponding optimality cuts are weaker. You would need a single Eta variable instead of Eta[p] for each p. And you would also need master Benders feasibility cuts because the subproblem would not necessarily be feasible without them. What values of max_ORDERs_per_LOCATION are you interested in for the three scenarios?
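For reference, the generic shape of a Benders feasibility cut (textbook material, with my own notation: y for On, c_w for the capacity, and dual multipliers mu, lambda, nu for the assignment equalities, the linking inequalities, and the capacity constraints): if fixing the master solution makes the subproblem LP infeasible, its dual is unbounded along a ray, and that ray must be cut off for all future master solutions:

```
% dual ray (\bar{\mu}, \bar{\lambda}, \bar{\nu}) of the infeasible subproblem
% yields the feasibility cut
\sum_{p} \bar{\mu}_p \;-\; \sum_{p,w} \bar{\lambda}_{pw}\, y_w \;-\; \sum_{w} c_w\, \bar{\nu}_w\, y_w \;\le\; 0
```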


01-03-2017 07:02 PM

Rob, many thanks for the comments!

I like the proposed block structure a lot. My top priority is to get the problem to solve quickly, and this method clearly delivered on that front. The capacity constraint (in this particular case) is just the cherry on top.

Let me ask you a follow-up question. In your experience, is Benders decomposition a good approach to try when the problem doesn't have a block structure? In theory I think it would be, because one can use LP for the more numerous "map" variables and reduce the integer problem to the "on" variables.

But if in practice the biggest benefits tend to come from exploiting the block structures, then I am ready to accept the current approach. That would be a good solution for me. I would appreciate if you can let me know what you have seen in practice.

Regarding your question, with this simulated data, I would test the method with max orders of 5 or 7 per location. When running this in production, I would use larger numbers. The real data (which I cannot send) is more "clustered" than this simulated data, if that makes sense.

Thank you sir!


Solution

02-18-2017
10:30 PM


01-09-2017 06:13 PM

Yes, Benders decomposition applies even when the LP subproblem has only one block. In fact, the original paper by Benders in 1962 treated only that case. But exploiting block structure can certainly boost performance.

For completeness, here is one implementation of Benders decomposition for the capacitated version of your problem. To avoid the complications of dynamically generated feasibility cuts, I made VALID_MAPS the fully dense cross product of ORDERS and LOCATIONS. That change and the introduction of the SufficientCapacity constraint together imply that the subproblem will always be feasible. With max_orders_per_location = 5 and an optimality gap of 5%, scenario 2 solves quickly on my laptop.
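As in the block-structured version, the single aggregated optimality cut below comes from LP duality (a sketch in my own notation: y for On, c_w for capacity[w], and mu, lambda, nu for the duals of the assignment, linking, and capacity constraints):

```
\eta \;\ge\; \sum_{p} \mu_p^* \;-\; \sum_{p,w} \lambda_{pw}^*\, y_w \;-\; \sum_{w} c_w\, \nu_w^*\, y_w
```

which is what the code stores: cut_constant sums the ORDERs_mapped_correctly duals, and cut_coefficient[w] collects -LOCATIONs_mapped_correctly[p,w].dual over p minus capacity[w]*CapacityCon[w].dual.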

```
%let relobjgap = 0.05;
proc optmodel printlevel=0;
   /* Benders decomposition parameters */
   num infinity = constant('BIG');
   num best_lower_bound init 0; /* some finite lower bound on the original objective */
   num best_upper_bound init infinity;
   num gap = (if best_lower_bound ne 0 and best_upper_bound < infinity
      then (best_upper_bound - best_lower_bound) / abs(best_lower_bound)
      else .);

   /* declare MILP master problem over complicating variables */
   *Define data elements;
   put 'optmodel: Define data elements';
   set <num> ORDERS;
   set <num> LOCATIONS;
   num distance {ORDERS, LOCATIONS} init &inf_dist.;

   *Define variable 1;
   var On {LOCATIONS} binary init 0;

   *Read datasets 1;
   put 'optmodel: Read datasets 1';
   read data work.orders into ORDERS=[ORDER_ID];
   read data work.points into LOCATIONS=[LOCATION_ID] On=on;
   read data work.distances into [ORDER_ID LOCATION_ID] distance=&distance_var.;

   *Define variable 2;
   /* set VALID_MAPS = {p in ORDERS, w in LOCATIONS: distance[p,w] <= (&inf_dist.-10)};*/
   set VALID_MAPS = ORDERS cross LOCATIONS; /* make dense to avoid feasibility cuts */
   /* var Map {VALID_MAPS} binary init 0;*/
   var Map {VALID_MAPS} >= 0; /* relax to get correct duals */

   *Read datasets 2;
   put 'optmodel: Read datasets 2';
   read data work.distances into [ORDER_ID LOCATION_ID] Map=map;

   *Calculations;
   put 'optmodel: Calculations';
   impvar total_LOCATIONs_on = sum{w in LOCATIONS} On[w];
   impvar total_maps_on = sum{<p,w> in VALID_MAPS} Map[p,w];
   impvar total_distance = sum{<p,w> in VALID_MAPS} Map[p,w]*distance[p,w];
   impvar total_penalty = total_LOCATIONs_on * &LOCATION_distance_penalty.;

   *Mechanical constraints;
   put 'optmodel: Mechanical constraints';
   con ORDERs_mapped_correctly {p in ORDERS}:
      /* (sum{<(p),w> in VALID_MAPS} Map[p,w]) = 1;*/
      sum{w in LOCATIONS} Map[p,w] = 1;
   set SUPPORT init {};
   * con LOCATIONs_mapped_correctly {w in LOCATIONS}:
      (sum{<p,(w)> in VALID_MAPS} Map[p,w]) <= (w in SUPPORT) * &max_ORDERs_per_LOCATION.;
   con LOCATIONs_mapped_correctly {<p,w> in VALID_MAPS}:
      -Map[p,w] >= -(w in SUPPORT);
   num capacity {w in LOCATIONS} = min(&max_ORDERs_per_LOCATION., card(ORDERS));
   con CapacityCon {w in LOCATIONS}:
      -sum{p in ORDERS} Map[p,w] >= -(w in SUPPORT) * capacity[w];
   con SufficientCapacity:
      sum {w in LOCATIONS} capacity[w] * On[w] >= card(ORDERS);

   *Business constraints;
   put 'optmodel: Business constraints';
   con max_LOCATIONs_on:
      total_LOCATIONs_on <= &final_max_LOCATIONs_on.;

   *Minimize distance;
   put 'optmodel: Set Goals';
   min goal = total_distance + total_penalty;

   *Initial solution check (seed);
   print "VALUES BEFORE SOLVING" total_LOCATIONs_on total_maps_on goal total_distance total_penalty;

   /* master objective */
   var Eta >= 0;
   num Eta_sol;
   min MasterObjective = total_penalty + Eta;

   /* optimality cuts */
   num num_optcuts init 0;
   set OPTCUTS = 1..num_optcuts;
   num cut_constant {OPTCUTS};
   num cut_coefficient {LOCATIONS, OPTCUTS} init 0;
   con OptimalityCut {optcut in OPTCUTS}:
      Eta >= cut_constant[optcut] + sum {w in LOCATIONS} cut_coefficient[w,optcut] * On[w];
   /* omit feasibility cuts (assume SufficientCapacity implies feasibility of subproblem) */

   /* additional valid cuts */
   /* smallest positive distance from p */
   num spd {p in ORDERS} = min {w in LOCATIONS diff {p}} distance[p,w];
   /* if On[p] = 0 then Eta[p] >= spd[p] */
   * con OnOrPositiveDistance {p in ORDERS}:
      Eta[p] >= spd[p] * (1 - On[p]);
   con OnOrPositiveDistance:
      Eta >= sum {p in ORDERS} spd[p] * (1 - On[p]);

   problem Master include
      On Eta
      MasterObjective
      SufficientCapacity
      max_LOCATIONs_on
      OptimalityCut OnOrPositiveDistance;

   /* declare subproblem over Map here (same as original but with On fixed) */
   min SubproblemObjective = sum{<p,w> in VALID_MAPS} Map[p,w]*distance[p,w];
   problem Subproblem include
      Map
      SubproblemObjective
      ORDERs_mapped_correctly LOCATIONs_mapped_correctly CapacityCon;

   num lower_bound;
   num upper_bound = SubproblemObjective.sol + &LOCATION_distance_penalty. * card(SUPPORT);
   set INCUMBENT init {};
   num Map_sol {VALID_MAPS};

   /* main loop */
   do until (gap ne . and gap <= &relobjgap);
      put 'Solving Master...';
      use problem Master;
      put num_optcuts=;
      for {w in LOCATIONS} On[w] = (w in INCUMBENT);
      Eta = max(Eta.lb,
         sum {p in ORDERS} spd[p] * (1 - On[p]),
         max {optcut in OPTCUTS}
            (cut_constant[optcut] + sum {w in LOCATIONS} cut_coefficient[w,optcut] * On[w])
      );
      solve with MILP / primalin logfreq=10000 target=(best_lower_bound) relobjgap=&relobjgap;
      if _solution_status_ = 'INFEASIBLE' then do;
         put 'ERROR: master is infeasible';
         stop;
      end;
      Eta_sol = Eta.sol;
      lower_bound = _OROPTMODEL_NUM_['BEST_BOUND'];
      best_lower_bound = max(best_lower_bound, lower_bound);
      SUPPORT = {w in LOCATIONS: On[w].sol > 0.5};
      put SUPPORT=;

      put 'Solving Subproblem...';
      use problem Subproblem;
      solve with LP / algorithm=ip crossover=0 logfreq=0;
      if _solution_status_ = 'INFEASIBLE' then do;
         put 'ERROR: subproblem is infeasible';
         stop;
      end;
      else do;
         if best_upper_bound > upper_bound then do;
            best_upper_bound = upper_bound;
            INCUMBENT = SUPPORT;
            for {<p,w> in VALID_MAPS} Map_sol[p,w] = Map[p,w].sol;
            create data incumbent from [LOCATION_ID]=INCUMBENT;
         end;
         if lower_bound < upper_bound then do;
            if Eta_sol < SubproblemObjective then do;
               /* generate new optimality cut */
               num_optcuts = num_optcuts + 1;
               cut_constant[num_optcuts] = sum {p in ORDERS} ORDERs_mapped_correctly[p].dual;
               for {w in LOCATIONS}
                  cut_coefficient[w,num_optcuts] =
                     - sum{p in ORDERS} LOCATIONs_mapped_correctly[p,w].dual
                     - capacity[w] * CapacityCon[w].dual;
            end;
         end;
      end;
      put "BOUNDS: " best_lower_bound= best_upper_bound= gap=;
   end;
   put INCUMBENT=;

   /* recover optimal solution */
   for {w in LOCATIONS} On[w] = (w in INCUMBENT);
   for {<p,w> in VALID_MAPS} Map[p,w] = Map_sol[p,w];
quit;
```


01-13-2017 11:31 PM

Rob,

Thank you very much for this suggestion. It looks like this model has all the elements. I am going to run it when I come back to the office in a couple of weeks.

Thank you sir.

Carlos



02-18-2017 10:32 PM

Rob, I just completed the review of the results of my model using Benders decomposition, and it generated very good solutions. Thank you sir!