12-14-2016 04:58 PM
I have used the solution below, kindly provided by the esteemed advisor Tom. It works well, but I have a problem with big data: exporting 3,000,000 observations takes more than 15 minutes, and I would like it to take less than 2 minutes. Is that possible?
data _null_;
  file log;
  set test(obs=1) test;           /* first pass of obs 1 supplies the header row */
  length __name $32 __length 8 __value $200;
  do while (1=1);
    call vnext(__name);           /* walk the PDV variable by variable */
    if lowcase(__name)='__name' then leave;   /* stop at our own temp variables */
    if _n_=1 then __value = __name;           /* header: the variable name itself */
    else __value = quote(strip(vvaluex(__name)));  /* data: quoted formatted value */
    __length = lengthn(__value);
    put __value $varying200. __length ';' @;  /* semicolon-delimited output */
  end;
  put;
run;
12-14-2016 05:24 PM
Run it with options fullstimer. If real time is considerably longer than CPU time, you are I/O bound and need to work on storage throughput.
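To make the comparison concrete: FULLSTIMER is a global option, so it can be switched on once before the step, and the log will then break out real time, user CPU time, and system CPU time for every step. A minimal sketch:

```sas
options fullstimer;  /* log now reports real time, user cpu time, system cpu time, memory */

/* run the export step here, then read the step's NOTE block in the log:        */
/* if real time is, say, 15 minutes while cpu time is well under a minute,      */
/* the step is waiting on disk I/O rather than computing, and faster storage    */
/* (or less data written) is what will help, not faster code.                   */
```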
12-19-2016 08:16 AM
If real time exceeds CPU time significantly (not just by a few percent), you either
- have to share CPU power with other processes. Run the appropriate system tools (Task Manager on Windows, nmon or topas on UNIX) to identify the processes competing for CPU, or
- are running into I/O bottlenecks.
Possible avenues for I/O tuning:
- separate disks that are being read from those that are being written (keep source and target libraries on physically separate disks)
- set up disk arrays, so that more than one disk handles a certain I/O load; this is called striping
- migrate to SSDs
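The first point is purely a matter of where the librefs point. A minimal sketch, where the paths are assumptions to be replaced by mount points on physically different disks:

```sas
/* source library on one physical disk, target on another (paths are examples only) */
libname src 'D:\sasdata';
libname tgt 'E:\sasout';

data tgt.test_copy;  /* reads from one disk while writing to the other */
  set src.test;
run;
```

The same idea applies when the target is an external file: point the FILE statement at a disk other than the one holding the source library.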
12-19-2016 08:02 AM
@Ksharp: thank you
put x1 x2 .........;
Putting the variable names explicitly does not meet my need, because the names can change from x1 x2 ... to toto, Wi, Yi and so on.
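One hedged alternative that avoids both hard-coded names and the per-variable VVALUEX calls: with the DSD option on the FILE statement, the ~ format modifier makes PUT (_ALL_) write every variable in the data set, quoted and delimited, in a single statement. This is a sketch, not tested against the poster's data; the file name and delimiter are assumptions, and unlike the VNEXT loop it does not write a header row:

```sas
data _null_;
  file 'export.txt' dsd dlm=';' lrecl=32767;  /* assumed output path and delimiter */
  set test;
  put (_all_) (~);  /* write all variables; ~ with DSD quotes each value */
run;
```

Because PUT resolves _ALL_ at compile time against whatever variables the SET statement brings in, the step keeps working when the variable names change.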