08-21-2014 02:33 AM
I have a SAS table with numeric fields and I am appending it to a Greenplum table using bulkload through gpfdist. However, I get an error when there are values greater than 1 billion, even though the fields in Greenplum are declared bigint.
Is there an option that allows such large numbers to be loaded into Greenplum when using the bulkload option? Below is a sample of what I am currently using.
proc append base=gplib.tgttable (BULKLOAD=YES BL_DATAFILE="<file>" BL_HOST=<host> BL_PORT=<port> BL_ENCODING='LATIN1' BL_FORMAT=CSV BL_NULL='\NULL\')
            data=<source-dataset>;
run;
08-21-2014 03:22 AM
Bigint is not a type supported by SAS (classic); the value is converted to an 8-byte floating-point number. See:
- SAS/ACCESS(R) 9.4 for Relational Databases: Reference, Fifth Edition
There is a problem with values of more than about 15 digits: integers above 2**53 cannot be represented exactly in a double, so at best they become approximations.
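A minimal sketch that shows this limit (the exact cutoff is a property of double-precision arithmetic, not of any particular table):

```sas
/* Minimal sketch: a SAS numeric is an 8-byte double, so integers
   above 2**53 = 9007199254740992 cannot all be stored exactly.   */
data _null_;
  a = 9007199254740992;   /* 2**53, still representable exactly */
  b = a + 1;              /* rounds back down to 2**53          */
  put a= 20. b= 20.;
run;
```

Both PUT statements print the same value, which is how the precision loss shows up in the log.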
To overcome those type conversions, SAS has newer approaches:
- Base SAS(R) 9.4 Procedures Guide, Third Edition (PROC DS2)
These are for accessing data in a DBMS and using in-database technologies.
If you want to use the bigint type for some reason, my advice would be to first load the data into another type of field and have it converted later in Greenplum.
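A minimal sketch of that workaround, assuming a Greenplum staging table with a varchar key column (all table and column names here are hypothetical): carry the value as text so the CSV never contains exponential notation, then cast it server-side.

```sas
/* Sketch: write the key as text, bulkload into a varchar staging
   column, and cast to bigint inside Greenplum afterwards.        */
data work.stage;
  set work.src;                    /* hypothetical source table     */
  length id_char $20;
  id_char = strip(put(id, 20.));   /* fixed notation, no E-notation */
run;

proc append base=gplib.stagetable (BULKLOAD=YES BL_HOST=<host> BL_PORT=<port>)
            data=work.stage;
run;

/* Then, in Greenplum:
   insert into tgttable (id)
   select cast(id_char as bigint) from stagetable;                */
```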
It is a common misperception that everything numeric should be stored as a numeric type; data would be better defined as:
- measurements with a certain precision (floating point)
- classifications (character, possibly restricted to numeric values by a constraint)
- time (date, datetime, time; UTC or localized, or star time if you like)