The partitioning technique is reducing elapsed time by 50 to 75% (one job went from 60 mins to 15 mins, another from 80 mins to 25 mins). Most of the remaining time is consumed by overhead processes such as:

1) Preparing the source data (creating formats, sorting the source data, and determining the partition size when partitions are based on business keys)
2) Consolidating the partitioned target data into a single dataset
3) Cleaning up the intermediate tables

We have 1 processor chip; the other CPU resources are listed below:

cpuinfo: GenuineIntel Intel(R) Xeon(R) CPU E5-4655 v3 @ 2.90GHz
cpuinfo: Hz=3126.335 bogomips=5808.16
cpuinfo: ProcessorChips=1 PhyscalCores=6
cpuinfo: Hyperthreads=2 VirtualCPUs=48
# of CPUs: 48

Can you highlight the risks of load failures that we might expect in the future if CPU utilization goes beyond 95%? I am especially interested in whether pushing CPU utilization up to 100% during off-peak hours (for batch loads), when this is the only process running, can be considered an optimal way of utilizing the resources.

I had seen issues with work tables and views (the fix applied was replacing these with permanent tables). As highlighted in my previous post, SAS had difficulty closing sessions when it had created work tables for processing. I have executed the job several times and ensured no errors were thrown, but I want to be ready with fixes in advance for any issues that may be expected in the future.
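For context, the consolidation and cleanup overhead steps (2 and 3 above) follow a common SAS pattern; here is a minimal sketch of what that looks like when the intermediate work tables are replaced with permanent tables, as described. The library path and dataset names are assumptions for illustration only:

```sas
/* Assumed permanent library in place of WORK, per the fix described above */
libname perm '/data/perm';

/* Step 2: consolidate the partitioned target data into a single dataset.
   Partition names (target_part1, target_part2) are hypothetical. */
proc append base=perm.target data=perm.target_part1; run;
proc append base=perm.target data=perm.target_part2; run;

/* Step 3: clean up the intermediate tables */
proc datasets library=perm nolist;
    delete target_part1 target_part2;
quit;
```

Using a permanent library here (rather than WORK) also means the intermediate tables survive a session that fails to close cleanly, so the cleanup step can be rerun independently.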