Hi, I'm new to DI Studio. I have a simple job that runs a pass-through query against an Oracle database. It's a big query, so I expect it to run for a few hours, and I scheduled it to run overnight. In DI Studio I created a job, which became a flow, and I scheduled the flow in SMC. The whole job is just one step that runs the pass-through query as user-written code.

This morning I came in to see how everything went, and I'm not sure: either the job aborted after an hour, or it's still running. : )

On the SAS server, I can see the log file from the job in ..../Lev1/SASApp/BatchServer/Logs. It stops right at the pass-through query:
MPRINT(QUERY): proc SQL;
MPRINT(QUERY): Connect to oracle(PATH=XXXXXX UID=XXXXXXX PASSWORD="XXXXXXXXXXXX");
MPRINT(QUERY): Create Table out.mydata As Select * from Connection to oracle (
select ...
And that is where the log ends, at the SELECT statement. That is what I might expect to see if the query were still running on Oracle and had not yet returned results to the SAS server.

If I look for out.mydata on the SAS server, I see the temporary file mydata.sas7bdat.lck. It looks like just a shell dataset, and its mod time is the time the job started, which is also what I would expect if the job were in progress.

I don't have access to the Oracle server to see if anything is running there. When I run the top command on the SAS server (Linux), I don't see any processes that look like this query. The mod date on the log is almost exactly one hour after the job started, which makes me wonder if it somehow timed out (on the SAS server side or on the Oracle side) and was aborted.

Anyway, I'm looking for suggestions on how to diagnose this, particularly whether there is a way for me to check the status of the job from the SAS server side, or whether I need to call the Oracle DBA and ask them to start reviewing their logs.

Thanks,
--Q.
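P.S. While waiting for replies, one thing I'm planning to try on the next run is turning on SAS/ACCESS tracing so the log timestamps every round trip to Oracle. A rough sketch of what I have in mind (untested; the v$session query only works if the DBA grants my Oracle account SELECT on that view, and the column list assumes a standard v$session):

```
/* Log all DBMS activity (SQL sent, return codes, timings) to the SAS log.
   ',,,d' traces DBMS calls; no$stsuffix tidies the trace output. */
options sastrace=',,,d' sastraceloc=saslog no$stsuffix fullstimer;

proc sql;
  connect to oracle (path=XXXXXX uid=XXXXXXX password="XXXXXXXXXXXX");

  /* If (and only if) the DBA grants read access to v$session, this
     would let me check from the SAS side whether my session is still
     active on the Oracle end. */
  select * from connection to oracle
    ( select sid, status, last_call_et, sql_id
        from v$session
       where username = 'XXXXXXX' );

  disconnect from oracle;
quit;
```

Even without v$session access, I'm hoping the SASTRACE output would at least show whether the session dies at the one-hour mark or just goes quiet while Oracle churns.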