11-26-2014 12:08 PM
I've created a table with a primary key on 2 columns. I need to import a file's contents into that table. I've searched the internet but did not find anything relevant about doing an import into a table with named columns and primary keys.
I've got almost 2 Gb of data and I don't want to change my script with insert.
Do you have any ideas, please?
11-26-2014 12:31 PM
Can you explain what you mean by "I don't want to change my script with insert" ? Seems to me that INSERT is the SQL way to insert new data into an existing table. - PG
11-26-2014 12:42 PM
PROC APPEND would do the job and also allow for Bulk Load depending on where your target table is stored.
You need to better describe what you have, what you need, and what you've already done that is working/not working in order for us to give you better advice.
- Where is your target table stored (SAS or a data base)?
- What does your source data look like (SAS table, data base staging table, "raw" data)?
- Are the columns in source and target the same?
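For illustration, the bare-bones form of the PROC APPEND approach (the library and table names here are placeholders, not from this thread):

```sas
/* Append the rows of a staging table to the existing target table. */
/* FORCE tolerates minor differences in variable attributes.        */
proc append base=mylib.target data=work.new_rows force;
run;
```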
11-27-2014 04:00 AM
Thanks for the answer.
@PGStats : I have a file with records. The file is close to 2 GB. The records look like this: `12121111;BigBoss;google.com;McDo,BK,ChipoGrill,Tacos,Tieboudiene`
I do not see myself writing an insert for each of the millions of lines I have.
@Patrick: The tables are stored in a SAS library located on a server. The lines look like this:
`12121111;BigBoss;google.com;McDo,BK,ChipoGrill,Tacos,Tieboudiene` and the table structure is like this
proc sql;
   create table heuri.table_repo
      (cookie varchar(100),
       partner varchar(100),   /* further columns omitted in the post */
       constraint pk_ok_pa primary key(cookie, partner));
quit;
11-27-2014 04:48 AM
With regards to INSERT, there is no need to specify each row of data; you can load from a query instead:
proc sql;
   /* either create the table from a query ... */
   create table HAVE as
      select * from NEW;
   /* ... or insert a query's rows into an existing table */
   insert into HAVE
      select * from NEW;
quit;
11-27-2014 04:52 AM
Long time no see, mate.
Indeed, the idea is not bad at all.
I can do a basic import and then, from that, do an insert-select to get what I want.
Let me have a try.
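That plan could look roughly like this (the file path, the PROC IMPORT options, and the staging table name are assumptions, not from the thread; the select assumes the imported columns line up with the target table's columns in number and order):

```sas
/* Basic import of the semicolon-delimited file into a work table */
proc import datafile='/path/to/my_file.txt'
            out=work.raw
            dbms=dlm
            replace;
    delimiter=';';
    getnames=no;
run;

/* Insert-select from the imported table into the constrained target */
proc sql;
    insert into heuri.table_repo
        select * from work.raw;
quit;
```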
11-27-2014 03:27 PM
I would use PROC APPEND over an SQL INSERT as it's faster.
proc sql;
   create table work.class (label='Student Data' bufsize=65536)
      (name char(8), sex char(1), age num, height num, weight num,
       constraint pk_name primary key(name));
quit;

proc append base=work.class data=sashelp.class force;
run;

proc contents data=work.class;
run;
11-27-2014 05:12 PM
Base SAS(R) 9.3 Procedures Guide, Second Edition (PROC DATASETS, APPEND statement): yep, you are right. But for those exceptions, when you are hit by them, removing and rebuilding solves a lot.
One of those exceptions is that with referential integrity constraints it will fall back on the V6 method, adding the index entries one by one (my lock-in).
11-27-2014 05:19 PM
Thanks for your help. However, PROC APPEND only applies when you already have a table ready. I do have a table with the constraints, but what I need to import into that constrained table is a file.
Should I do the usual import with PROC IMPORT? Because from what I read, APPEND is not for files... or am I wrong?
11-27-2014 07:06 PM
Step 1: Read external file into SAS table
Step 2: Load SAS table into target table (the one with your indexes).
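A minimal sketch of those two steps, assuming a semicolon-delimited file laid out like the sample record posted above (the file path, the variable lengths, and which field holds the partner key are all assumptions):

```sas
/* Step 1: read the external file into a SAS work table.        */
/* Field order guessed from the sample record:                  */
/* cookie;partner;domain;brands                                 */
data work.staging;
    infile '/path/to/my_file.txt' dlm=';' dsd truncover;
    length cookie $100 partner $100 domain $100 brands $200;
    input cookie $ partner $ domain $ brands $;
run;

/* Step 2: load the work table into the target table that       */
/* carries the primary key constraint.                          */
proc append base=heuri.table_repo data=work.staging force;
run;
```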
11-27-2014 09:14 PM
Your description of the problem is a little confusing. It sounds like you have a text file that you want to read and whose records you want to insert into an existing table. You also mentioned the size of the data, so perhaps your actual question is how to do this without making multiple copies of the data. You could use a data step view to read the data and append that to your existing table.
data new / view=new ;
   infile 'my_text_file' ;
   input var1 var2 var3 ... ;
run;

proc append data=new base=mylib.mytable ;
run;