Hello,
I want to ask: if I have a query like this

proc sql;
create table DATABASE.SKUSKA as
SELECT....
quit;

and this query runs for about 40 minutes, and the table DATABASE.SKUSKA is the base of a front-end table in an application, then when a user opens the application while the query is running, they see incomplete information. Is there a way to 1. run the query and then 2. replace the table quickly, so there are not 40 minutes of incomplete information in the application? How can I do it, if it is possible?
For example: run the query in the WORK library and, when the table is complete, just overwrite the DATABASE.SKUSKA table - something like the sketch below.
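A minimal sketch of that idea - building the table under a temporary name right away, so the final swap is just a quick rename. The name SKUSKA_NEW is a placeholder, and note that most SAS/ACCESS database engines do not support the CHANGE statement, so for a real database table the rename would have to happen on the database side:

/* build the fresh table under a temporary name */
proc sql;
   create table DATABASE.SKUSKA_NEW as
   SELECT ....   /* the long-running query */
   ;
quit;

/* quick swap once the new table is complete */
proc datasets lib=DATABASE nolist;
   delete SKUSKA;
   change SKUSKA_NEW=SKUSKA;
quit;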
Thanks.
The generic answer is: Yes, there will be options to improve things.
You need to provide more detail for a more detailed answer.
Questions
1. DATABASE.SKUSKA: is "database" a libref that points to a folder with SAS files, or is it pointing to an actual database? If it's a database, then which one - Oracle, SQL Server, ...?
2. What OS are you running your code on - Unix/Linux, Windows, other?
3. Do you have option XCMD set? (To answer this: proc options option=xcmd; run;)
4. We're talking SAS 9.4 and not Viya/CAS - right?
Do you know why your SQL takes 40 minutes? Would there be an opportunity to tweak the code for better performance?
"so when user come to application in the time when querry runs, there is incomplete information"
Unless for some database side trickle feed techniques it's normally "all or nothing" so not sure what your statement means.
It is a SQL database; we use SQL Server, and it runs so long because there are many rows in the table, a few millions.
@Pato485 wrote:
It is a SQL database; we use SQL Server, and it runs so long because there are many rows in the table, a few millions.
1. So you always create a new table in SQL and then it's a full load into this table?
2. Where are your source tables located?
Ideally, share the full SQL you're using. Are there any joins between SQL and SAS tables, or are all source tables either SQL or SAS - and which ones?
Push your data to a temporary/intermediate table in SQL Server first, and do the final update step in SQL Server with explicit pass-through.
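A minimal sketch of that pattern, assuming a libref SQLDB assigned to the SQL Server schema; SKUSKA_STAGE and dbo.SKUSKA are placeholder names, DROP TABLE IF EXISTS needs SQL Server 2016+, and the exact pass-through options depend on your SAS/ACCESS engine:

/* 1. load the fresh data into a staging table on the server */
proc sql;
   create table SQLDB.SKUSKA_STAGE as
   SELECT ....   /* the long-running query */
   ;
quit;

/* 2. swap it into place inside SQL Server - fast and near-atomic */
proc sql;
   connect using SQLDB;
   execute (
      begin transaction;
      drop table if exists dbo.SKUSKA;
      exec sp_rename 'dbo.SKUSKA_STAGE', 'SKUSKA';
      commit;
   ) by SQLDB;
   disconnect from SQLDB;
quit;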
In my personal experience, using a remote database as a backend for BI with SAS is suboptimal; it adds additional configuration/maintenance requirements and never performs as well as SAS on its own (unless the database is specifically configured for BI requirements and you have a VERY FAT network connection to it).
The ideal way to handle your issue would be this:
- run your BI exclusively in SAS
- run your SAS on UNIX
- create the new dataset under a new name in the target library first, then use the FDELETE function to remove the current physical file (not the library member!) and the RENAME function to rename the new file to the target name; a sketch follows below
The delete/rename happens so fast that there is no service interruption; users who have the file open will see the old contents until they close and reopen it.
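A minimal sketch of that delete/rename step, assuming SAS on Unix, a libref APPLIB for the target library, and the fresh table built as APPLIB.SKUSKA_NEW (all names are placeholders):

/* build the fresh table under a temporary name first */
proc sql;
   create table APPLIB.SKUSKA_NEW as
   SELECT ....   /* the long-running query */
   ;
quit;

data _null_;
   length path $256;
   /* physical path of the current table file, not the library member */
   path = cats(pathname('APPLIB'), '/skuska.sas7bdat');
   rc = filename('oldf', path);   /* fileref to the physical file */
   rc = fdelete('oldf');          /* remove the old file          */
   rc = filename('oldf', '');     /* deassign the fileref         */
   /* rename the new member to the production name */
   rc = rename('APPLIB.SKUSKA_NEW', 'SKUSKA', 'data');
run;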
Please share the query, and elaborate on how/when the source data is being updated.