07-25-2015 01:41 AM
As a data scientist, one of the prominent challenges is extracting data from the server to a local machine. Is there any way to extract data from an Oracle server faster than the traditional DATA; SET; RUN; process?
The majority of our extractions are blanket extractions, i.e. without applying any filter. Is there a bulk download process that does not read every line of the data and instead downloads all the files in batch mode?
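For context, the traditional pull described above, and a variant that reduces network round trips with the SAS/ACCESS READBUFF= libname option, can be sketched as follows. The credentials, path, and table names are placeholders; the buffer value is illustrative and should be tuned for your environment.

```
/* Traditional pull: fetches rows from Oracle with the default buffer */
libname ora oracle user=myuser password=mypass path=orapath; /* placeholder connection */

data work.extract;
   set ora.big_table;   /* every row still crosses the network */
run;

/* Larger fetch buffer: fewer round trips per batch of rows */
libname ora2 oracle user=myuser password=mypass path=orapath readbuff=5000;

data work.extract2;
   set ora2.big_table;
run;
```

READBUFF= does not skip rows; it only changes how many rows Oracle returns per fetch, which is usually where the wall-clock time goes in a blanket extraction.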
08-07-2015 07:49 AM
Thanks Xia, this option really helped me reduce TAT. However, I am unable to increase the READBUFF value beyond 32000 on my local system. My SAS version is 9.3, and I am running it on a 32-bit processor.
08-07-2015 08:37 AM
Optimizing run time means decreasing the technical overhead. Read buffers are mostly sufficient; there should be no logging and no locking when reading (writing is different).
Your local system could be a bottleneck (SSD?), as could the network transfer rate. Optimizing buffering (BUFSIZE, ALIGNSASIOFILES) could help; see SAS/ACCESS(R) 9.2 for Relational Databases: Reference, Fourth Edition. Xia gave you that one already.
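A sketch of the options mentioned, for anyone following along. The values are illustrative, the connection details are placeholders, and SASTRACE= is added here only as a diagnostic aid to see where the time actually goes:

```
/* Larger SAS page size for the local output data set (value illustrative) */
options bufsize=64k;

/* Trace DBMS calls in the log to locate the bottleneck */
options sastrace=',,,d' sastraceloc=saslog nostsuffix;

libname ora oracle user=myuser password=mypass path=orapath readbuff=32000;

data work.extract;
   set ora.big_table;
run;
```

If the trace shows fetch time dominating, the network or Oracle side is the limit; if the data step itself dominates, look at local disk and BUFSIZE.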
Generally it is advised to work "federated": leave data at its source as much as possible, avoiding copies. With mining, that is not really possible.
As for views and other technical details on the Oracle side, it could be that Oracle itself is not faster and is the limitation. Ask your Oracle DBA about that area.