I need to create as many EXECUTE statements in a pass-through as there are values in an ID variable.
I have table A, which contains some variables, one of them being the ID variable. Let's say I have 30 rows, so ID has values from 1 to 30.
I have to insert each one of those rows into an Oracle table, one by one.
So, using a macro, how can I handle multiple EXECUTE statements?
Also, does the EXECUTE statement commit the script it runs? Or must the process disconnect from Oracle in order to commit?
This paper here provides multiple options for inserting rows from a SAS table into an Oracle table.
Option DBCOMMIT lets you define after how many insert operations a commit is executed (or COMMIT=... for Proc DBLOAD).
I normally load SAS tables into Oracle using Proc Append with DBCOMMIT=0 (which issues a single commit only). I prefer doing it this way because in case something goes wrong I get a full rollback and I don't have to deal with partially loaded data.
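A minimal sketch of that approach (the libname credentials and table names below are placeholders, not from the original post):

```sas
/* Assign an Oracle library - user, password and path are placeholders */
libname ora oracle user=myuser password=mypass path=mydb;

/* Load the whole SAS table in one transaction.
   DBCOMMIT=0 issues a single commit at the very end,
   so if any row fails, everything rolls back. */
proc append base=ora.target_table (dbcommit=0)
            data=work.a;
run;
```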
There seems to be a shift in the task requested in the middle of your description. You start out talking about executing something in the remote database. I assume you meant running some type of stored procedure or command to manipulate data on the remote database.
Then you talk about inserting data into a table.
Which is it that you want to do?
If you have data in a SAS dataset and want to use it to generate the PROC SQL statement EXECUTE then just do that.
filename code temp;

data _null_;
  set have;
  file code;
  put 'execute (' /
      'my remote sql code with value of dataset variable x here ' X /
      ') by oracle;'
  ;
run;

proc sql;
  connect to oracle .... ;
  %include code / source2;
quit;
You lost me again.
The original Oracle INSERT statement should insert ALL of the records from the selected table.
Why do you need to run more than one INSERT?
Are you trying to read from DIFFERENT tables? Apply different WHERE clauses? Something else?
If the list of things that change (source table or where clause) is in a SAS dataset then just use normal SAS code to generate the Oracle statements you want to run. See my earlier answer. You can write the generated code to a file and %INCLUDE it. Or use CALL EXECUTE() to push it onto a stack to run after the data step that is executing the CALL EXECUTE() statements ends. You could possibly make the code a little easier by making a little macro that takes as inputs the parts that vary. Then the data step just needs to generate calls to the macro and not the full SQL statement.
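A sketch of the macro-plus-CALL-EXECUTE pattern described above. The macro name %INS, the connection details, and the Oracle table names are hypothetical; the idea is that the data step only generates one short macro call per row of the driver table:

```sas
/* Hypothetical macro: the part that varies (the ID value) is a parameter */
%macro ins(id);
  proc sql;
    connect to oracle (user=myuser password=mypass path=mydb);
    execute (
      insert into target_table
      select * from source_table where id = &id
    ) by oracle;
    disconnect from oracle;
  quit;
%mend ins;

/* Generate one macro call per row of SAS table A.
   %NRSTR delays macro execution until after this data step ends. */
data _null_;
  set a;
  call execute(cats('%nrstr(%ins(', id, '))'));
run;
```

Note that this sketch connects and disconnects once per row for simplicity; for many rows you would likely keep one connection open and generate only the EXECUTE statements, as in the %INCLUDE approach shown earlier.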
How would you do that?
If there are 10 rows and 1 has wrong values, all the inserts fail. How can I avoid that?
If there is not a very good business reason to do otherwise, I would want an "all or nothing" approach, meaning that if one of the rows hits an integrity constraint then everything gets rolled back. If you don't roll back everything then you have to deal with partial inserts, which makes things much harder to rerun after you've fixed the data issue, as now you have to work out the delta as well.
You can define DBCOMMIT=1 in your connection string. This will commit one row at a time. I'm not sure, though, what happens if a row fails a check; whether the other rows still get inserted.
If you want to go for a "load whatever you can" approach, then I'd Google Oracle forums like "Ask Tom" to figure out the pure Oracle syntax for implementing what you want. That will likely end up being PL/SQL. You can execute "anything" within a SAS EXECUTE block, including PL/SQL. So for such an approach I'd be using SQL Developer for my code, and only once I have something fully working on the Oracle side would I copy/paste the code into a SAS EXECUTE block.
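As an illustration of running Oracle-side code from SAS, here is a sketch of a PL/SQL block inside an EXECUTE statement. The connection details and table names are placeholders, and the DML error logging clause assumes an error table has already been created on the Oracle side (e.g. with DBMS_ERRLOG); treat this as a pattern to adapt, not working code:

```sas
proc sql;
  connect to oracle (user=myuser password=mypass path=mydb);
  /* Everything inside EXECUTE is pure Oracle syntax.
     This hypothetical PL/SQL block loads what it can and
     diverts failing rows into an error table. */
  execute (
    begin
      insert into target_table
      select * from staging_table
      log errors into err$_target_table reject limit unlimited;
      commit;
    end;
  ) by oracle;
  disconnect from oracle;
quit;
```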