lc_isp
Quartz | Level 8

Hi all,

 

I'm building a project that needs to upload a pre-existing SAS table from a Windows network share into SAS folders, as a preparation step before importing it into a SAS table. (I'm going this way because on the Windows network shares I have no control over files: fileexist() reports files exist even when they don't, I can't rename or delete them, and so on.)

 

The whole project is already built and, if I could reliably upload the table (Copy Files from PC to SAS) every time, I could even release v1.0. But I seem to be hitting nearly the same problem reported here, with the difference that, having suspected timeout/timing problems myself, I had already arranged the process flow the way it's suggested there (a workaround, not a complete fix). I'm also trying to be more "insistent" by stacking three retries of the same upload (if they all succeeded it wouldn't be a problem, because they overwrite each other):

 

[screenshot: process flow with the three stacked CopyDBNet2SAS tasks]

 

Then I run a CheckDBexist step, which exports a &dbexist global macro variable.

Finally, I made ImportDB_storico conditional on &dbexist's value: if it's 1, go on importing the data (db_storico.sas7bdat) into a SAS table (WORK.DB_STORICO); otherwise, go back to CopyDBNet2SAS_2 and try again. (In theory, if the error is persistent, that makes an infinite loop, and if the project was started by a scheduler it will have to be force-terminated after some long timeout, e.g. after 4 hours.)
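For context, CheckDBexist boils down to something like this sketch (a rough reconstruction; only &tempDBpath and the table name come from the real project, the rest is mine):

/* CheckDBexist (sketch): flag whether the uploaded table is in place. */
/* &tempDBpath is assumed to end with a slash, as in the upload log.   */
%global dbexist;
%let dbexist = %sysfunc(fileexist(&tempDBpath.db_storico.sas7bdat));
%put NOTE: dbexist=&dbexist;

The condition on &dbexist then lives in the EG flow itself, not in code.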

 

What I see when I run the whole project is SAS executing the first stacked CopyDB... step, failing to upload the file, and the next two CopyDB... steps doing exactly the same. If I open the logs of the three upload steps, I read this:

--------------------------------------------------------------------------------

Uploading files from machine "SCCTXSEP0713" to SAS session on "SASApp".

Using these settings:
Source file specification: &netDBpath.db_storico.sas7bdat
Destination path: &tempDBpath.
Overwrite existing files: Yes
Resolve macro references in file paths: Yes
Convert line endings for text files: No

NOTE: Resolving macro expressions in file paths. SAS log:

 

--------------------------------------------------------------------------------

with nothing more after "SAS log:", exactly like in the reference post I linked (which is from 2015).

 

On our system I work via Citrix, so my PC is just a client interface to SAS and runs nothing; my input and output files live on our intranet (Windows) network share, and on the SAS servers I can store only projects (*.egp files), not working data. I have no persistent access to SAS server storage, nor to the system logs, to read further details.

 

The weird thing is that if I execute the upload manually (running the CopyDB... task by itself), it succeeds and the file lands in the SAS folders.

Here's the log in that case:

--------------------------------------------------------------------------------

Uploading files from machine "SCCTXSEP0713" to SAS session on "SASApp".

Using these settings:
Source file specification: &netDBpath.db_storico.sas7bdat
Destination path: &tempDBpath.
Overwrite existing files: Yes
Resolve macro references in file paths: Yes
Convert line endings for text files: No

NOTE: Resolving macro expressions in file paths. SAS log:
1 The SAS System 12:47 Wednesday, May 10, 2023

Source files resolve to
\\direzione.sede.corp.sanpaoloimi.com\s\0171249\Rdocs\Uff_Regole_e_Strumenti\Exelia\DATI\db\db_storico.sas7bdat
Target folder resolves to /sas/BFD20/lab/appdata/Project/DCO_INV_AML_Regole_Strumenti/Exelia/TEMP_db/

NOTE: Source file resolved to \\direzione.sede.corp.sanpaoloimi.com\s\0171249\Rdocs\Uff_Regole_e_Strumenti\Exelia\DATI\db\db_storico.sas7bdat
NOTE: Target folder resolved to /sas/BFD20/lab/appdata/Project/DCO_INV_AML_Regole_Strumenti/Exelia/TEMP_db/

Uploading \\direzione.sede.corp.sanpaoloimi.com\s\0171249\Rdocs\Uff_Regole_e_Strumenti\Exelia\DATI\db\db_storico.sas7bdat to /sas/BFD20/lab/appdata/Project/DCO_INV_AML_Regole_Strumenti/Exelia/TEMP_db/db_storico.sas7bdat ...
...Uploaded db_storico.sas7bdat, 393.216 bytes

NOTE: Copied 393.216 bytes in 0,34 seconds.
NOTE: Total number of files processed: 1

--------------------------------------------------------------------------------

 

but when I run the whole project it fails (three times in a row) WITHOUT raising any error (which is why I was forced to add the CheckDBexist task). Needless to say, the following steps, which need the table in place, fail too.

If the Copy Files task were callable from a program, I would add a loop to retry until success, but I can't see its generated code (as I can with Query Builders, for example, where I can copy the code into a program and tweak it to my needs): it's a wizard, and it seems I can't call it from a program.
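For illustration, here's the kind of retry loop I have in mind. It's only a sketch: FCOPY copies server-side between two filerefs, so it assumes the network share is reachable from the SAS session, which in my environment it is not (that's exactly why I depend on the EG task; the fileref names and the 30-second pause are mine):

/* Sketch: retry a server-side binary copy of the table up to 3 times */
data _null_;
    do try = 1 to 3;
        /* recfm=n treats the .sas7bdat as a binary stream */
        rc1 = filename('srcf', "&netDBpath.db_storico.sas7bdat", 'disk', 'recfm=n');
        rc2 = filename('dstf', "&tempDBpath.db_storico.sas7bdat", 'disk', 'recfm=n');
        rc  = fcopy('srcf', 'dstf');          /* 0 = success */
        if rc = 0 then do;
            put 'NOTE: upload succeeded on try ' try=;
            leave;
        end;
        msg = sysmsg();                       /* why did it fail? */
        put 'WARNING: copy failed on try ' try= / msg;
        slept = sleep(30, 1);                 /* wait 30 seconds, then retry */
    end;
run;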

 

Thanks in advance for any solution/suggestion. If I forgot to add useful details, just let me know and I'll update this post.

 

13 REPLIES
SASKiwi
PROC Star

I suggest you talk to your SAS administrator about setting up the same Windows folder share on your SAS App server. That way you can avoid the EG Copy Files task completely and just read from the folder share using a LIBNAME statement in your SAS server session.
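For what it's worth, a rough sketch of what I mean (the mount point is hypothetical; only the table name comes from your post):

/* Assuming the share is mounted on the (Unix) SAS App server: */
libname netdb "/mnt/rdocs/Exelia/DATI/db" access=readonly;

data work.db_storico;    /* pull the table into WORK for processing */
    set netdb.db_storico;
run;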

lc_isp
Quartz | Level 8

TYVM SASKiwi, but there I run into the problem I mentioned in my previous post: "on the SAS servers I can store only projects (*.egp files), not working data. I have no persistent access to SAS server storage".

In fact, after I opened an internal support ticket, the support personnel "suggested" that I use a SAS folder to store the (few-KB) table I need. I followed their suggestion and had no further upload problems (and, since downloads give me no problems either, I'm keeping a backup copy on the network share I was previously using, "just in case" someone removes that table from the SAS folders).

Nowadays, still trying to solve the problem structurally, I'm trying to reach the IT SAS management to find out whether I'm really allowed to keep data there or not. In the meantime, the db_storico.sas7bdat table is happily working from SAS folders... but I feel like a pirate, if you know what I mean. 😉

SASKiwi
PROC Star

TYVM SASKiwi, but there I run into the problem I mentioned in my previous post: "on the SAS servers I can store only projects (*.egp files), not working data. I have no persistent access to SAS server storage".

 

I suggest you talk to your management in that case to get this restriction fixed. Working around it by keeping a copy in your SAS folder with a backup on a Windows share seems very clunky to me.

lc_isp
Quartz | Level 8

I totally agree; it's a "patch solution" I had to adopt because of my current limitations.

But just today I found a way around that: I discovered we have an internal service which includes "data support" (i.e., storage available for our tables/DBs), scheduling, and so on.

At the moment the problem I'm seeing is budget-related but, as you correctly said, the solution exists somewhere in my company.

 

All that said, I still hope SAS will fix the problem I referred to in my initial post, which seems to have been around since at least 2015: it's 2023, and it's a little sad the problem is still there, most probably also because of the Citrix virtualization in the middle, I bet. Nowadays "cloud working" and virtualization should be far better understood than they were years ago.

SASKiwi
PROC Star

If you want to continue with your workaround then I'd suggest contacting Tech Support.

 

Also, have you tried writing to your SASUSER library? These days it is usual for this to be set to read-only, but you might be lucky and be able to write to it. You can check whether this is possible by running:

proc options option=rsasuser;
run;
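If the log shows NORSASUSER, SASUSER is writable; RSASUSER means it's read-only. Or just try a throwaway write (the dataset name is arbitrary):

/* If this runs without an ERROR, SASUSER is writable */
data sasuser._writetest;
    x = 1;
run;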

As far as I'm concerned, it is your company's policy of not providing permanent SAS data storage for users that is the problem, not SAS.

lc_isp
Quartz | Level 8

Checking whether I could write to the SASUSER library was one of my first tests: it's read-only.

Also, as far as I know, SASUSER is a "personal" library: I mean that (if it weren't read-only) I could store objects that I can access but my colleagues can't see, while we need "per office" storage, which we "solved" by means of the SAS folder we're using (the network-share export is just a backup, as I said).

 

Finally, I can agree when you say the problem is "your company's policy", which I can't argue about with my company. On the other hand, before the upload problem appeared, my project was running correctly, uploading its own table from our network storage: from some point onward I started having the problem, which I finally "solved" by keeping my table in SAS folders. As this "solution" is at risk, I'm keeping a network-share backup too: if SAS had kept uploading its table without a problem, I wouldn't have been forced (half by my company's policy, but also half by SAS itself) to adopt this "solution".

 

I'm not here to claim "it's a SAS problem". I know the source could lie in any of three zones: my company's policy, the Citrix virtualization, or SAS itself. The only thing I know for sure is that I have the problem, and the solution I'm using carries some risk.

SAS could solve the "SAS and Citrix" part, which is what I was hoping for, while I can't solve the "my company's policy" part (but, as I said, I may have a workaround: it only depends on the budget and my boss's OK).

 

Thank you again for your time and support,

AlanC
Barite | Level 11

Your environment is complex, so solving this will be tough without being in the systems to do the diagnosis. Here are some things to check:

 

  • Security changes
  • Timeouts
    • This is what first came to mind. A process that has run fine for a long time and suddenly fails may be failing because of growth: data grows over time, and the process may simply be timing out now. Check DB and network timeouts.
  • Use network tools, such as Fiddler, to try to determine what is happening. You need more diagnostic tools.
  • I doubt it is Citrix. Fiddler can decrypt HTTPS so you can see the traffic, but I wouldn't start there.

Upload/download issues are often simply a timeout at one layer or another. 

 

https://github.com/savian-net
lc_isp
Quartz | Level 8

TYVM for your suggestions, Alan. Unfortunately, in my company I'm more of a "SAS user" (I develop, but can't access the low-level layers I'd need to debug the network dataflow) than a system administrator. From my netadmin past I have some idea of what you mean, but my role is restricted by company policy (yes, SAS isn't the only thing that's restricted ;-)), so the best I can do is ask our support; I can't force them to review company policy, or even just to solve my problem. The "solution" they gave me, which I described to SASKiwi too, is "unstable": one team told me about it (and it effectively solved the problem), but I doubt the "SAS storage managers" (whoever they are) agree with the support team.

 

What I can say, for now, is that I'm uploading/downloading nearly the same amount of data each time: we're not talking Big Data here, just 350-380 KB (I pay a lot of attention to not wasting storage I shouldn't even be using). Downloading always succeeds; uploading initially worked, then I had to move the upload task earlier in my project (because of checks I now need to run) and it started giving me problems.

 

As I said, if I execute the upload task by hand (with the project not running), it succeeds 100% of the time. If I execute the same task while the project is running, it fails 99% of the time (the 1% of successes seems random and probably comes down to pure luck: more than "timeouts", I sense some kind of "internal conflict" between SAS processes working in parallel but, as usual, I can't be sure).

 

Every week the data grows by about 50 observations on average, so just a few KB.

Now that I keep the table directly in SAS folders (as a sas7bdat), importing it into the WORK library works 100% of the time... with the "possible issues" I mentioned already.

AlanC
Barite | Level 11

It 100% sounds like a timeout issue. You don't experience it in the UI because the UI handles the refresh and may also have a longer timeout period.

 

Something may have the file locked during the upload, so the upload doesn't happen until later than it appears. The retry loop would then time out because the file is in use. It may seem like it is "working" when it is simply waiting.

https://github.com/savian-net
lc_isp
Quartz | Level 8

Thanks for the hint, Alan. I'll dig deeper, inside SAS, into the possible lock on read: maybe something really is locking my table on the SAS side and generating the problem. That's very interesting.

 

I can probably rule out network-share "locks", because I already verified, in another project, that SAS seems to have no problem (correctly) reading even open Excel files from our network shares (as long as it's READ access, of course): that was the customer's concern on that other project, but it seems to be a non-problem on that side.

 

It could still be a SAS-side lock problem, which arose when I had to move my import earlier.
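For the record, this is the classic probe I plan to use to see whether a table can be opened for reading at a given moment (a sketch; WORK.DB_STORICO is just the example table):

/* Try to open the table for input; a blocking lock makes this fail */
%macro islocked(ds);
    %local dsid;
    %let dsid = %sysfunc(open(&ds, i));
    %if &dsid = 0 %then %put WARNING: cannot open &ds: %sysfunc(sysmsg());
    %else %do;
        %put NOTE: &ds opened fine (no blocking lock);
        %let dsid = %sysfunc(close(&dsid));
    %end;
%mend islocked;
%islocked(work.db_storico)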

Tom
Super User

You should put pressure on the IT support staff at your company to find a solution. If their "policy" is interfering with your ability to do your job, then it is their job to find a solution. The computer systems are there to support the users.

 

I have found that modern IT people do not understand that users actually need to use computers to do computing. They are too used to building point-and-click systems where users only make selections.

ChrisHemedinger
Community Manager

Adding user-defined links to ensure proper sequence is the key. There is a known problem where, in rare cases, the downstream Copy Files process tries to execute too soon -- it's a threading race condition. I've always seen customers work around it by adding one or more user-defined links to force EG to complete all prior tasks first.

lc_isp
Quartz | Level 8

Hi Chris, and thank you for the explanation.

Reading the previous thread (the 2015 one), I had already tried doing what you mention.

Without going into too much detail: the project is a little complex and is summarized below:

[screenshot: full project process flow, v0.2.6.82]

  1. It starts top-left, obviously, by reading a filtering table I'll use later;
  2. then it pauses, because it needs other data I'm about to load;
  3. at this point I first load our weekly data (around 50 obs);
  4. then comes the "tower structure" I added to retry (3 times) loading the "historical DB" (a single 350-380 KB table) as a file into local SAS folders (which I should be forbidden to use, but as temp files I probably can). This upload, which fails here (like the next two), used to sit much later in the project (after point 10), because back then I only needed to update the data; now I need extra checks early on (e.g. whether the weekly data is already in the historical table, which may mean someone forgot to refresh it);
  5. the 2nd and 3rd upload retries fail too, and they keep failing even when I place wait tasks between them (a program calling sleep(); see the sketch after this list);
  6. so, after the failed loads, CheckDBexist fails too (nothing was uploaded) and the project stops there; if the upload succeeded, the DB/table was imported;
  7. then calculations are done on date intervals (for the weekly data);
  8. then the project diverges: the upper-right zone filters the whole weekly data down to some users only
  9. and mails the results to our customers;
  10. the bottom-right tasks update the historical table and finally export the weekly file (renamed so as not to replace the one in the network folder) and the historical "DB" too.
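For reference, each wait task is just a tiny program along these lines (the 30-second pause is an arbitrary value of mine):

/* Wait task: pause before the next upload retry */
data _null_;
    rc = sleep(30, 1);    /* 30 seconds; the second argument is the unit */
run;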

What's shown above is an old version, though (v0.2.6.82); the current one (v0.2.9.89) skips the DB/table upload by keeping the table in the SAS folder, so the import always happens, and it only downloads the table, at the project's end, as a safety backup copy. The current workaround works (SAS has no problem importing from its own folders) and we'll see where that table ends up: we're evaluating an internal service which told me they can give us file storage on SAS (so just like what I'm doing right now, but not "as a pirate", even if I'm only "pirating" a very few KB of room ;-)))

 

If SAS fixes its race-condition part, I could go back to uploading everything fresh from our network, using SAS folders only temporarily (as I should), or maybe I'll modify the project further and sidestep the cause of the race condition myself, who knows?

