Hi,
I am trying to run a PROC OPTMODEL model, but I am able to execute it only for 7,000 rows; it throws an error for anything more than 7,000.
Please find the details attached. Could you please let me know whether we can get it to execute?
Thanks and regards
I don't see any attachment. I just see this:
Hi Rob,
I have included it once again. Please let me know if you are able to see it now.
CustomerID | Slot1 | Slot2 | Slot3 | Slot4 | Slot5 | Slot6 | Slot7 | Slot8 | Slot9 | Slot10 | Slot11 | Slot12 | Slot13 | Slot14 | Slot15 | Call_Intensity |
100 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 3 |
101 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 1 |
102 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 3 |
103 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 3 |
104 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 2 |
105 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 4 |
106 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 2 |
107 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 5 |
108 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 3 |
109 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 2 |
110 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 4 |
111 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 4 |
112 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 5 |
113 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 5 |
114 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 5 |
115 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 2 |
116 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 2 |
117 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 1 |
118 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 2 |
119 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 2 |
120 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 2 |
121 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 4 |
122 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 5 |
123 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 1 |
124 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 5 |
125 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 1 |
126 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 1 |
127 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 5 |
128 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 1 |
129 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 1 |
130 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 4 |
131 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 3 |
132 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 2 |
133 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 5 |
134 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 1 |
135 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 1 |
136 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 3 |
137 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 2 |
138 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 4 |
139 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 1 |
140 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 4 |
141 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 3 |
142 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 3 |
143 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 4 |
144 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 5 |
145 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 |
146 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 4 |
147 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 5 |
148 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 1 |
149 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 2 |
150 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 1 |
151 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 2 |
152 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 3 |
153 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 2 |
154 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 5 |
155 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 4 |
156 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 5 |
157 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 1 |
158 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 4 |
159 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 1 |
160 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 1 |
161 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 3 |
162 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 4 |
163 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 2 |
164 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 3 |
165 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 1 |
166 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 2 |
167 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 4 |
168 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 3 |
169 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 3 |
170 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 3 |
171 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 4 |
172 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 1 |
173 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 4 |
174 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 5 |
175 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 2 |
176 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 1 |
177 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 2 |
178 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 3 |
179 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 1 |
180 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 4 |
181 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 5 |
182 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 1 |
183 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 2 |
184 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 4 |
185 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 5 |
186 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 4 |
187 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 1 |
188 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 2 |
189 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 4 |
190 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 2 |
191 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 3 |
192 | 0.9 | 0.85 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 4 |
193 | 0.8 | 0.75 | 0.7 | 0.65 | 0.6 | 0.55 | 0.5 | 0.45 | 0.4 | 0.35 | 0.3 | 0.25 | 1 | 0.95 | 0.9 | 1 |
proc import out= wfs.Sample_Data
datafile = 'Sample_Data.xlsx'
dbms = xlsx;
sheet = "Data";
getnames = yes;
run;
proc import out= wfs.Capacity
datafile = 'Sample_Data.xlsx'
dbms = xlsx;
sheet = "Capacity";
getnames = yes;
run;
Data wfs.chk;
set wfs.Sample_Data;
array cap_slot(15) cap_slot1-cap_slot15;
run;
Data wfs.prefvector_201805_opt_sample;
set wfs.Sample_Data;
/*Call Capacity*/
cap_slot1 = 1329 ;
cap_slot2 = 1401 ;
cap_slot3 = 1178 ;
cap_slot4 = 769 ;
cap_slot5 = 202 ;
cap_slot6 = 734 ;
cap_slot7 = 257 ;
cap_slot8 = 1114 ;
cap_slot9 = 269 ;
cap_slot10 = 1456 ;
cap_slot11 = 642 ;
cap_slot12 = 412 ;
cap_slot13 = 1401 ;
cap_slot14 = 126 ;
cap_slot15 = 643 ;
/*Actual Slot values read from dataset*/
/*
Slot1 = 1;
Slot2 = 1;
Slot3 = 1;
Slot4 = 1;
Slot5 = 1;
Slot6 = 1;
Slot7 = 1;
Slot8 = 1;
Slot9 = 1;
Slot10 = 1;
Slot11 = 1;
Slot12 = 1;
Slot13 = 1;
Slot14 = 1;
Slot15 = 1;
*/
/*Variable Slots*/
F_slot1 = 1;
F_slot2 = 1;
F_slot3 = 1;
F_slot4 = 1;
F_slot5 = 1;
F_slot6 = 1;
F_slot7 = 1;
F_slot8 = 1;
F_slot9 = 1;
F_slot10 = 1;
F_slot11 = 1;
F_slot12 = 1;
F_slot13 = 1;
F_slot14 = 1;
F_slot15 = 1;
/* cip=5;*/
Sum_F_slot1 = 1;
Sum_F_slot2 = 1;
Sum_F_slot3 = 1;
Sum_F_slot4 = 1;
Sum_F_slot5 = 1;
Sum_F_slot6 = 1;
Sum_F_slot7 = 1;
Sum_F_slot8 = 1;
Sum_F_slot9 = 1;
Sum_F_slot10 = 1;
Sum_F_slot11 = 1;
Sum_F_slot12 = 1;
Sum_F_slot13 = 1;
Sum_F_slot14 = 1;
Sum_F_slot15 = 1;
/*Priority Slots*/
P_slot1 = 1;
P_slot2 = 1;
P_slot3 = 1;
P_slot4 = 1;
P_slot5 = 1;
P_slot6 = 1;
P_slot7 = 1;
P_slot8 = 1;
P_slot9 = 1;
P_slot10 = 1;
P_slot11 = 1;
P_slot12 = 1;
P_slot13 = 1;
P_slot14 = 1;
P_slot15 = 1;
keep
CustomerID
Call_Intensity
cap_slot1-cap_slot15
Slot1-Slot15
F_slot1-F_slot15
cip
Sum_F_slot1-Sum_F_slot15
P_slot1-P_slot15
;
run;
Data wfs.prefvector_201805_opt_Sample1;
set wfs.prefvector_201805_opt_sample;
/*
if _N_ < 10000;
Cost=50;
*/
run;
proc sql ;
select count(*) into :_Nobs from wfs.prefvector_201805_opt_Sample1 ;
quit ;
%put &_Nobs. ;
proc optmodel;
/*proc optmilp data=prefvector_201805_opt_Sample1;*/
set<num> indx;
number
Call_Intensity{Indx},
cap_slot1 ,
cap_slot10 ,
cap_slot11 ,
cap_slot12 ,
cap_slot13 ,
cap_slot14 ,
cap_slot15 ,
cap_slot2 ,
cap_slot3 ,
cap_slot4 ,
cap_slot5 ,
cap_slot6 ,
cap_slot7 ,
cap_slot8 ,
cap_slot9 ,
cip{Indx},
CustomerID{Indx},
P_slot1{Indx},
P_slot10{Indx},
P_slot11{Indx},
P_slot12{Indx},
P_slot13{Indx},
P_slot14{Indx},
P_slot15{Indx},
P_slot2{Indx},
P_slot3{Indx},
P_slot4{Indx},
P_slot5{Indx},
P_slot6{Indx},
P_slot7{Indx},
P_slot8{Indx},
P_slot9{Indx},
Slot1{Indx},
Slot10{Indx},
Slot11{Indx},
Slot12{Indx},
Slot13{Indx},
Slot14{Indx},
Slot15{Indx},
Slot2{Indx},
Slot3{Indx},
Slot4{Indx},
Slot5{Indx},
Slot6{Indx},
Slot7{Indx},
Slot8{Indx},
Slot9{Indx},
Sum_F_slot1,
Sum_F_slot10,
Sum_F_slot11,
Sum_F_slot12,
Sum_F_slot13,
Sum_F_slot14,
Sum_F_slot15,
Sum_F_slot2,
Sum_F_slot3,
Sum_F_slot4,
Sum_F_slot5,
Sum_F_slot6,
Sum_F_slot7,
Sum_F_slot8,
Sum_F_slot9
;
var
F_slot1{Indx} Integer,
F_slot10{Indx} Integer,
F_slot11{Indx} Integer,
F_slot12{Indx} Integer,
F_slot13{Indx} Integer,
F_slot14{Indx} Integer,
F_slot15{Indx} Integer,
F_slot2{Indx} Integer,
F_slot3{Indx} Integer,
F_slot4{Indx} Integer,
F_slot5{Indx} Integer,
F_slot6{Indx} Integer,
F_slot7{Indx} Integer,
F_slot8{Indx} Integer,
F_slot9{Indx} Integer,
X1 {indx} integer,X2 {indx} integer,X3 {indx} integer,X4 {indx} integer,X5 {indx} integer,X6 {indx} integer,X7 {indx} integer,
X8 {indx} integer,X9 {indx} integer,X10 {indx} integer,X11 {indx} integer,X12 {indx} integer,X13 {indx} integer,X14 {indx} integer,
X15 {indx} integer
;
read data wfs.prefvector_201805_opt_Sample1 into indx=[_N_]
CustomerID = CustomerID
Call_Intensity = Call_Intensity
cap_slot1 = cap_slot1
cap_slot10 = cap_slot10
cap_slot11 = cap_slot11
cap_slot12 = cap_slot12
cap_slot13 = cap_slot13
cap_slot14 = cap_slot14
cap_slot15 = cap_slot15
cap_slot2 = cap_slot2
cap_slot3 = cap_slot3
cap_slot4 = cap_slot4
cap_slot5 = cap_slot5
cap_slot6 = cap_slot6
cap_slot7 = cap_slot7
cap_slot8 = cap_slot8
cap_slot9 = cap_slot9
/*cip = cip*/
F_slot1 = F_slot1
F_slot10 = F_slot10
F_slot11 = F_slot11
F_slot12 = F_slot12
F_slot13 = F_slot13
F_slot14 = F_slot14
F_slot15 = F_slot15
F_slot2 = F_slot2
F_slot3 = F_slot3
F_slot4 = F_slot4
F_slot5 = F_slot5
F_slot6 = F_slot6
F_slot7 = F_slot7
F_slot8 = F_slot8
F_slot9 = F_slot9
P_slot1 = P_slot1
P_slot10 = P_slot10
P_slot11 = P_slot11
P_slot12 = P_slot12
P_slot13 = P_slot13
P_slot14 = P_slot14
P_slot15 = P_slot15
P_slot2 = P_slot2
P_slot3 = P_slot3
P_slot4 = P_slot4
P_slot5 = P_slot5
P_slot6 = P_slot6
P_slot7 = P_slot7
P_slot8 = P_slot8
P_slot9 = P_slot9
Slot1 = Slot1
Slot10 = Slot10
Slot11 = Slot11
Slot12 = Slot12
Slot13 = Slot13
Slot14 = Slot14
Slot15 = Slot15
Slot2 = Slot2
Slot3 = Slot3
Slot4 = Slot4
Slot5 = Slot5
Slot6 = Slot6
Slot7 = Slot7
Slot8 = Slot8
Slot9 = Slot9
Sum_F_slot1 = Sum_F_slot1
Sum_F_slot10 = Sum_F_slot10
Sum_F_slot11 = Sum_F_slot11
Sum_F_slot12 = Sum_F_slot12
Sum_F_slot13 = Sum_F_slot13
Sum_F_slot14 = Sum_F_slot14
Sum_F_slot15 = Sum_F_slot15
Sum_F_slot2 = Sum_F_slot2
Sum_F_slot3 = Sum_F_slot3
Sum_F_slot4 = Sum_F_slot4
Sum_F_slot5 = Sum_F_slot5
Sum_F_slot6 = Sum_F_slot6
Sum_F_slot7 = Sum_F_slot7
Sum_F_slot8 = Sum_F_slot8
Sum_F_slot9 = Sum_F_slot9
;
number n = &_Nobs.; /* size of matrix */
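/* Link each auxiliary X variable to the corresponding F_slot variable (d1-d15) */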
con d1 {i in 1..n}: X1[i]= (F_slot1[i] );
con d2 {i in 1..n}: X2[i]= (F_slot2[i] );
con d3 {i in 1..n}: X3[i]= (F_slot3[i] );
con d4 {i in 1..n}: X4[i]= (F_slot4[i] );
con d5 {i in 1..n}: X5[i]= (F_slot5[i] );
con d6 {i in 1..n}: X6[i]= (F_slot6[i] );
con d7 {i in 1..n}: X7[i]= (F_slot7[i] );
con d8 {i in 1..n}: X8[i]= (F_slot8[i] );
con d9 {i in 1..n}: X9[i]= (F_slot9[i] );
con d10 {i in 1..n}: X10[i]= (F_slot10[i] );
con d11 {i in 1..n}: X11[i]= (F_slot11[i] );
con d12 {i in 1..n}: X12[i]= (F_slot12[i] );
con d13 {i in 1..n}: X13[i]= (F_slot13[i] );
con d14 {i in 1..n}: X14[i]= (F_slot14[i] );
con d15 {i in 1..n}: X15[i]= (F_slot15[i] );
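/* Bound each X between 0 and 1: Eq1-Eq15 are the upper bounds, Eq16-Eq30 the lower bounds */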
con Eq1 {i in 1..n}: X1[i] <= 1 ;
con Eq2 {i in 1..n}: X2[i] <= 1 ;
con Eq3 {i in 1..n}: X3[i] <= 1 ;
con Eq4 {i in 1..n}: X4[i] <= 1 ;
con Eq5 {i in 1..n}: X5[i] <= 1 ;
con Eq6 {i in 1..n}: X6[i] <= 1 ;
con Eq7 {i in 1..n}: X7[i] <= 1 ;
con Eq8 {i in 1..n}: X8[i] <= 1 ;
con Eq9 {i in 1..n}: X9[i] <= 1 ;
con Eq10 {i in 1..n}: X10[i] <= 1 ;
con Eq11 {i in 1..n}: X11[i] <= 1 ;
con Eq12 {i in 1..n}: X12[i] <= 1 ;
con Eq13 {i in 1..n}: X13[i] <= 1 ;
con Eq14 {i in 1..n}: X14[i] <= 1 ;
con Eq15 {i in 1..n}: X15[i] <= 1 ;
con Eq16 {i in 1..n}: X1[i] >= 0 ;
con Eq17 {i in 1..n}: X2[i] >= 0 ;
con Eq18 {i in 1..n}: X3[i] >= 0 ;
con Eq19 {i in 1..n}: X4[i] >= 0 ;
con Eq20 {i in 1..n}: X5[i] >= 0 ;
con Eq21 {i in 1..n}: X6[i] >= 0 ;
con Eq22 {i in 1..n}: X7[i] >= 0 ;
con Eq23 {i in 1..n}: X8[i] >= 0 ;
con Eq24 {i in 1..n}: X9[i] >= 0 ;
con Eq25 {i in 1..n}: X10[i] >= 0 ;
con Eq26 {i in 1..n}: X11[i] >= 0 ;
con Eq27 {i in 1..n}: X12[i] >= 0 ;
con Eq28 {i in 1..n}: X13[i] >= 0 ;
con Eq29 {i in 1..n}: X14[i] >= 0 ;
con Eq30 {i in 1..n}: X15[i] >= 0 ;
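/* Each customer can be assigned at most Call_Intensity slots in total */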
con Eq31 {i in 1..n}: (F_slot1[i]+ F_slot2[i]+F_slot3[i]+ F_slot4[i]+F_slot5[i]+ F_slot6[i]+F_slot7[i]+F_slot8[i]+ F_slot9[i]+F_slot10[i]+ F_slot11[i]+F_slot12[i]+ F_slot13[i]+F_slot14[i]+F_slot15[i]) <= Call_Intensity[i];
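/* Per-slot capacity limits summed across all customers (q1-q15) */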
con q1: sum{i in 1..n}(F_slot1[i])<= 100;
con q2: sum{i in 1..n}(F_slot2[i])<= 150;
con q3: sum{i in 1..n}(F_slot3[i])<= 100;
con q4: sum{i in 1..n}(F_slot4[i])<= 100;
con q5: sum{i in 1..n}(F_slot5[i])<= 100;
con q6: sum{i in 1..n}(F_slot6[i])<= 100;
con q7: sum{i in 1..n}(F_slot7[i])<= 129;
con q8: sum{i in 1..n}(F_slot8[i])<= 100;
con q9: sum{i in 1..n}(F_slot9[i])<= 229;
con q10: sum{i in 1..n}(F_slot10[i])<= 100;
con q11: sum{i in 1..n}(F_slot11[i])<= 100;
con q12: sum{i in 1..n}(F_slot12[i])<= 100;
con q13: sum{i in 1..n}(F_slot13[i])<= 100;
con q14: sum{i in 1..n}(F_slot14[i])<= 100;
con q15: sum{i in 1..n}(F_slot15[i])<= 100;
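/* Objective: maximize the total preference-weighted slot assignments */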
max Total_cost = sum{i in 1..n}
/* (P_slot1[i]+ P_slot2[i]+P_slot3[i]+ P_slot4[i]+P_slot5[i]+ P_slot6[i]+P_slot7[i]+P_slot8[i]+ P_slot9[i]+P_slot10[i]+ P_slot11[i]+P_slot12[i]+ P_slot13[i]+P_slot14[i]+P_slot15[i]) ;*/
((Slot1[i] * F_Slot1[i])+
(Slot2[i] * F_Slot2[i])+
(Slot3[i] * F_Slot3[i])+
(Slot4[i] * F_Slot4[i])+
(Slot5[i] * F_Slot5[i])+
(Slot6[i] * F_Slot6[i])+
(Slot7[i] * F_Slot7[i])+
(Slot8[i] * F_Slot8[i])+
(Slot9[i] * F_Slot9[i])+
(Slot10[i] * F_Slot10[i])+
(Slot11[i] * F_Slot11[i])+
(Slot12[i] * F_Slot12[i])+
(Slot13[i] * F_Slot13[i])+
(Slot14[i] * F_Slot14[i])+
(Slot15[i] * F_Slot15[i]));
solve ;
print F_slot1
F_slot2
F_slot3
F_slot4
F_slot5
F_slot6
F_slot7
F_slot8
F_slot9
F_slot10
F_slot11
F_slot12
F_slot13
F_slot14
F_slot15;
create data wfs.Optmodel from [i]
CustomerID=CustomerID
F_slot1 = F_slot1
F_slot10 = F_slot10
F_slot11 = F_slot11
F_slot12 = F_slot12
F_slot13 = F_slot13
F_slot14 = F_slot14
F_slot15 = F_slot15
F_slot2 = F_slot2
F_slot3 = F_slot3
F_slot4 = F_slot4
F_slot5 = F_slot5
F_slot6 = F_slot6
F_slot7 = F_slot7
F_slot8 = F_slot8
F_slot9 = F_slot9
call_intensity=call_intensity
;
run;
proc sql;
create table wfs.Sum_slot as
select
Sum(F_Slot1) as Sum_F_Slot1,
Sum(F_Slot2) as Sum_F_Slot2,
Sum(F_Slot3) as Sum_F_Slot3,
Sum(F_Slot4) as Sum_F_Slot4,
Sum(F_Slot5) as Sum_F_Slot5,
Sum(F_Slot6) as Sum_F_Slot6,
Sum(F_Slot7) as Sum_F_Slot7,
Sum(F_Slot8) as Sum_F_Slot8,
Sum(F_Slot9) as Sum_F_Slot9,
Sum(F_Slot10) as Sum_F_Slot10,
Sum(F_Slot11) as Sum_F_Slot11,
Sum(F_Slot12) as Sum_F_Slot12,
Sum(F_Slot13) as Sum_F_Slot13,
Sum(F_Slot14) as Sum_F_Slot14,
Sum(F_Slot15) as Sum_F_Slot15,
Sum(call_intensity) as Sum_call_intensity
from wfs.optmodel;
quit;
data wfs.sum_slot1;
set wfs.Sum_slot;
sum_slot=Sum_F_Slot1+
Sum_F_Slot2+
Sum_F_Slot3+
Sum_F_Slot4+
Sum_F_Slot5+
Sum_F_Slot6+
Sum_F_Slot7+
Sum_F_Slot8+
Sum_F_Slot9+
Sum_F_Slot10+
Sum_F_Slot11+
Sum_F_Slot12+
Sum_F_Slot13+
Sum_F_Slot14+
Sum_F_Slot15;
run;
You can reduce the code by a factor of 15 by using multiple indices, like X[1,i] instead of X1[i].
Also, you can omit constraints Eq1 to Eq30 by using the BINARY option (instead of INTEGER) in the VAR statement for the X variables.
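For example, an untested sketch of that restructuring could look like the following. It reads the same wfs.prefvector_201805_opt_Sample1 table, takes the per-slot limits from the right-hand sides of your q1-q15 constraints, and replaces the X and F_slot arrays with a single binary variable indexed by customer and slot; the names SLOTS, pref, Assign, TotalPreference, and wfs.Optmodel_compact are placeholders for illustration, not anything from your code.
proc optmodel;
   set SLOTS = 1..15;                    /* slot index replaces the 1-15 name suffixes */
   set <num> INDX;                       /* one member per input row */
   num CustomerID {INDX};
   num call_intensity {INDX};
   num pref {INDX, SLOTS};               /* the Slot1-Slot15 preference scores */
   /* per-slot limits, copied from the right-hand sides of q1-q15 */
   num cap {SLOTS} = [100 150 100 100 100 100 129 100 229 100 100 100 100 100 100];
   /* BINARY implies the 0-1 bounds, so Eq1-Eq30 are not needed */
   var Assign {INDX, SLOTS} binary;
   read data wfs.prefvector_201805_opt_Sample1 into INDX=[i=_N_]
      CustomerID=CustomerID
      call_intensity=Call_Intensity
      {j in SLOTS} <pref[i,j]=col('Slot'||j)>;
   /* each customer gets at most Call_Intensity slots (your Eq31) */
   con PerCustomer {i in INDX}:
      sum {j in SLOTS} Assign[i,j] <= call_intensity[i];
   /* per-slot capacity across all customers (your q1-q15) */
   con PerSlot {j in SLOTS}:
      sum {i in INDX} Assign[i,j] <= cap[j];
   max TotalPreference = sum {i in INDX, j in SLOTS} pref[i,j] * Assign[i,j];
   solve;
   create data wfs.Optmodel_compact from [i]=INDX
      CustomerID
      call_intensity
      {j in SLOTS} <col('F_slot'||j)=Assign[i,j]>;
quit;
Because BINARY already enforces the 0-1 bounds and the linking X variables are gone, the solver sees far fewer variables and constraints for the same model.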
Nevertheless, what you have solves quickly for me. What SAS/OR version are you using, and what does your log look like?
NOTE: Problem generation will use 4 threads.
NOTE: The problem has 299970 variables (299970 free, 0 fixed).
NOTE: The problem has 0 binary and 299970 integer variables.
NOTE: The problem has 459969 linear constraints (159999 LE, 149985 EQ, 149985 GE, 0 range).
NOTE: The problem has 899910 linear constraint coefficients.
NOTE: The problem has 0 nonlinear constraints (0 LE, 0 EQ, 0 GE, 0 range).
NOTE: The OPTMODEL presolver is disabled for linear problems.
NOTE: The MILP presolver value AUTOMATIC is applied.
NOTE: The MILP presolver removed 159363 variables and 449955 constraints.
NOTE: The MILP presolver removed 618696 constraint coefficients.
NOTE: The MILP presolver modified 0 constraint coefficients.
NOTE: The presolved problem has 140607 variables, 10014 constraints, and 281214 constraint
coefficients.
NOTE: The MILP solver is called.
NOTE: The parallel Branch and Cut algorithm is used.
NOTE: The Branch and Cut algorithm is using up to 4 threads.
Node Active Sols BestInteger BestBound Gap Time
0 1 2 1708.0000000 1708.0000000 0.00% 2
0 0 2 1708.0000000 1708.0000000 0.00% 2
NOTE: Optimal.
NOTE: Objective = 1708.
Hi Rob,
I was able to include only 1,000 records here. When these 1,000 records are extended to 100,000, it gives an out-of-memory error.
Thanks
Hi Rob,
Could you advise on how to proceed when the number of records is extended to 100,000? Is there a way we can fine-tune the model or define it in a different way?
Thanks
What SAS/OR version are you using, and what does your log look like?