07-02-2013 01:25 PM
I found the following macro here : SAS(R) 9.2 Macro Language: Reference (Example 5)
%macro obsnvars(ds);
   %global dset nvars nobs;
   %let dset=&ds;
   %let dsid = %sysfunc(open(&dset));
   %if &dsid %then
      %do;
         %let nobs =%sysfunc(attrn(&dsid,NOBS));
         %let nvars=%sysfunc(attrn(&dsid,NVARS));
         %let rc = %sysfunc(close(&dsid));
         %put &dset has &nvars variable(s) and &nobs observation(s).;
      %end;
   %else
      %put Open for data set &dset failed - %sysfunc(sysmsg());
%mend obsnvars;
The macro works great on small datasets but for some reason I get this result :
mylib.my_huge_table has 27 variable(s) and -1 observation(s)
Anyone knows how to solve this?
Thank you for your help and time
07-02-2013 01:57 PM
I found this that may provide some insight:
Are we talking about tables as large as those in the article?
07-02-2013 02:35 PM
Thank you for the reply. I tried running a simple SQL count, and the query is still running (I started it 5 hours ago...).
When I try this code :
set mylib.my_huge_table nobs=nobs;
I get this result: nobs=9.0071993E15. If I understand correctly, that would mean there are about 9.0071993 * 10^15 records, i.e. more than nine quadrillion. I'm not sure that result is accurate. This is why I wanted to run the macro on this table.
Could you please provide me with more guidance on this issue?
Thank you for your help and time.
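For reference, the NOBS= line above is normally run inside a DATA _NULL_ step so that no observations are actually read. A minimal sketch (using the table name from the post above):

```
data _null_;
   /* NOBS= is resolved at compile time, before any observation is read */
   if 0 then set mylib.my_huge_table nobs=n;
   put 'Observation count: ' n=;
   stop;
run;
```

The IF 0 / STOP combination means the step finishes immediately regardless of the table's size.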
07-02-2013 03:07 PM
From your description, it sounds like the observation counter has overflowed and can no longer accurately hold the number of observations in the SAS data set. This limits you to sequential access, prevents indexing, and generally turns your data set into a giant tape file.
Here is an option:
*IF* you are running SAS 9.3 or later, you can recreate your SAS data set using the EXTENDOBSCOUNTER=YES data set option, like this:
data newdataset(extendobscounter=yes);
   set olddataset;
run;
I highly recommend using a different data set name, not reusing the original data set name.
The new copy of the data set will have a different internal structure than the original, but will allow you to query the number of observations, create indexes and access the data set randomly using POINT= option.
The down side is that the new data set cannot be read by SAS 9.2 or earlier releases due to the internal structure change.
07-10-2013 11:21 AM
I have one more question about the last suggestion in the post above.
If you get the following error in SAS Log,
"ERROR: File MYFILES.BIGFILE contains 2G -1 observations and cannot hold more because it contains an index or an Integrity Constraint that uses an index"
then you need to set EXTENDOBSCOUNTER=YES to get rid of the above error. Such an error happens when you are working with very big data sets (SAS Note 36112 - Managing large SAS® data sets that exceed the maximum number of observations). Actually, the SAS documentation says that by specifying the EXTENDOBSCOUNTER= option when you create an output SAS data file, the resulting 32-bit file behaves like a 64-bit file with regard to counters.
I have observed that after setting the option EXTENDOBSCOUNTER=YES, SAS performance decreases, i.e. SAS takes more time for the same task.
My question is: does anybody know about this option, and does it really affect the performance of SAS?
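As a side note, if the extended counter should apply to every data set created in a library rather than one file at a time, EXTENDOBSCOUNTER= can also be specified on the LIBNAME statement in SAS 9.3 and later. A sketch (the library path here is hypothetical, adjust it to your environment):

```
/* hypothetical path - replace with your actual library location */
libname mylib 'C:\data' extendobscounter=yes;

/* every data set written to mylib now gets the extended observation counter */
data mylib.newtable;
   set mylib.my_huge_table;
run;
```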
07-10-2013 11:33 AM
Does this work:
proc sql;
   title "Number of observations in Mylib.my_huge_table";
   select nobs from dictionary.tables
      where libname="MYLIB" and memname="MY_HUGE_TABLE";
quit;
The values of the library name and member name must be in capital letters, as that is the way they are stored in the dictionary tables.