Hello,

I'm trying to solve the following optimization problem: min f(x) subject to h(x) > L, where x is n-dimensional and h is a piecewise linear function. h can also be written as h(x) = h_1(x_1) + ... + h_n(x_n), where each h_i is a piecewise linear function, e.g. h_i(x_i) = 0 if x_i < 5 and 10 if x_i >= 5. Note that each h_i is monotonically increasing. In the problem I'm trying to solve, n > 100 and each piecewise function h_i consists of around 25 "cutpoints". I have a dataset which describes h (columns: i, "start cutpoint", "end cutpoint", value). Is there an easy way to solve this problem?

I first tried to solve a really simple version of the problem: n = 2, f(x) = x_1 + x_2, and h(x) = 0 unless x_1 >= 4, in which case h(x) = 3. Furthermore I set 1 <= x_i <= 10 and L = 2, so the optimal solution is (4, 1). However, the following SAS code fails to deliver the optimal solution:

proc optmodel;
number n = 2;
var x{1..n} >=1 <=10;
minimize f = x[1]+x[2];
con loss: (x[1] >= 4)*3 >= 2;
solve with NLP /multistart=(Maxstarts=20);
print x[1] x[2];
run;

Without the multistart option, the solution is infeasible: (1.15, 1.15). With the multistart option the solution is better, but still not right: (4.8, 1.6). It seems to me that the solver is not able to work with the constraint. I tried to write the constraint in other ways, but so far I have not been able to achieve a satisfying solution.

Obviously the feasible region of the problem I'm trying to solve is complicated. However, since each h_i is monotonically increasing, the region is connected, so it should be possible to solve this problem, or am I missing something?

So my question is: how do I do an optimization with a piecewise linear constraint?

Greetings
Bernd
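As a sanity check on the toy problem (not SAS, and not an answer to the modeling question itself), here is a minimal Python sketch that exploits the piecewise structure directly: because each h_i is a step function, h is constant on every combination of pieces, so one can enumerate the piece combinations, keep the feasible ones, and take each piece's lower endpoint (since f = x_1 + x_2 is increasing). The piece lists below encode the toy example; the format (lower endpoint, value of h_i on the piece) is my own assumption, not the dataset layout described above.

```python
from itertools import product

# Toy instance: 1 <= x_i <= 10, constraint h(x) >= L with L = 2.
L = 2
# pieces[i] = list of (lower endpoint of piece, value of h_i on that piece),
# endpoints clipped to the variable bounds. Hypothetical encoding for the sketch.
pieces = {
    1: [(1.0, 0.0), (4.0, 3.0)],  # h_1 = 0 for x_1 < 4, 3 for x_1 >= 4
    2: [(1.0, 0.0)],              # h_2 = 0 everywhere
}

best = None
for combo in product(*pieces.values()):
    h_val = sum(v for _, v in combo)      # h is constant on this combination
    if h_val >= L:                        # feasibility check h(x) >= L
        x = [lo for lo, _ in combo]       # f is increasing, so take lower endpoints
        f_val = sum(x)
        if best is None or f_val < best[0]:
            best = (f_val, x)

print(best)  # (5.0, [4.0, 1.0]), i.e. x = (4, 1) with f = 5
```

This brute force confirms the claimed optimum (4, 1); of course it does not scale to n > 100 with 25 pieces each, it only illustrates that on each piece combination the constraint degenerates to a constant check.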
