04-16-2015 09:57 AM
04-16-2015 10:02 AM
Bearing in mind that the X command merely sends that text to the OS environment, why do you think that, when run through DOS for instance, it would actually do anything? DOS has no "new" command. Maybe describe what you are attempting to achieve, as it appears you are mixing up writing SAS code and sending instructions to an OS.
To create a new variable you put that assignment in a data step:
data want; /* 'want' is a placeholder dataset name */
  a_new_variable=123; /* or could be "something" if you want a character variable */
run;
/* Or you can set it with a LENGTH or ATTRIB statement, or in SQL with ALTER TABLE */
04-16-2015 01:04 PM
So let me see if I understand:
You have a value in SAS
that you want to send to the OS
so that you can reformat it
and bring it back into SAS?
I suspect any reformatting you need to do with dates can likely be done directly in SAS.
04-17-2015 01:22 AM
If you set a UNIX environment variable from a SAS session, it is automatically lost at the end of that session.
To create a permanent UNIX environment variable, you have to set it in a configuration file for the respective user, i.e. $HOME/.profile for logins, or in the SAS invocation script. For what you want to achieve, a UNIX environment variable is not the solution.
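A quick shell sketch of that behaviour (the variable name is just an illustration): an export made in a child process never reaches the parent, which is why a value set from a spawned command disappears, and why persisting it means writing to $HOME/.profile instead.

```shell
# MYVAL is an illustrative name, not any SAS or UNIX convention.
unset MYVAL
sh -c 'export MYVAL=123'              # a child process sets and exports it...
echo "after child: ${MYVAL:-unset}"   # ...but the parent still sees nothing
# To make a value permanent, append an export to the user's profile:
# echo 'export MYVAL=123' >> "$HOME/.profile"   # picked up at the next login
```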
You could record such data in an external file in a location where it will be preserved so that other jobs can read it.
Or you could set up a SAS dataset to store such values.
data preserv_value;
  length jobname $32 value $100;
  jobname = "my_job_name";
  rundate = date();
  value = put('31dec2015'd, best5.);
run;
proc append base=somelib.preserv_value data=preserv_value; /* base= is the permanent dataset */
run;
Or, if you are using a proper scheduling system, you could simply create some output that is "caught" by the scheduling system and preserved as a parameter for other jobs.
04-16-2015 06:18 PM
Yes, but you cannot retrieve them in another X session. I believe that SAS is generating a sub-process to run the command so that sub-process cannot write back into the environment variables of the parent session. One way to get the results from an operating system command is to use the PIPE engine on the INFILE statement and read the results as data.
For example, here is a trivial example that passes a macro variable to KSH, uses KSH arithmetic to divide it by 10, and returns the value (assuming the macro variable &x is already set):
data _null_;
  infile "x=&x; y=$((x/10)); echo $y" pipe;
  input y;
  put y=;
run;
You could even use the FILEVAR option on the INFILE statement to generate the values to pass and the commands to run from a data set:
data _null_;
  set have; /* 'have' is a placeholder dataset containing variable x */
  cmd = catx(';', cats('x=',x), 'y=$((x/10))', 'echo $y');
  infile oscmd pipe filevar=cmd;
  input y;
  put x= y=;
run;
04-17-2015 03:32 AM
This brings to mind Milo-testing (time travelling) and more of those advanced testing/validation approaches.
The time/date is an important piece of information in the business processes here.
What you have:
1/- business code (Develop, Test, Acceptance, Production): version management at development and release management for the changes.
2/- business data (Develop, Test, Acceptance, Production): the faked data (Dev/Test), the depersonalized data (Acceptance), and the real values related to the normal operating processes of the business.
With Acceptance (data) you can:
a/ verify new business code using production code or acceptance code (regression testing)
b/ verify infrastructural changes (hardware, OS, middleware), in that case using acceptance data with business production code
c/ educate new staff (humans as the object to improve/test)
d/ do technical research for problems you have worked around with a quick bypass but need to investigate better
e/ do a partial pre-run of a critical, costly job (monthly/yearly) that you revert before that job really executes
With Develop and Test there is commonly a need, in bigger projects, to be able to run tests in parallel, aside from the questions of parallel development (version management).
Rather a very detailed list of common practices.
The one that is not mentioned is the assumed run-date. Within a production environment the system date is often coded as today(). But logically that approach is incorrect, as it does not let you verify past or future dates with the same business logic/data.
You should have some standard for that in your environment. Those could be:
a/ a parameter (option, SET) given to the process when the process starts
b/ an external file that is read in; this could be an executable serving as an alternative today() function, a flat file, or some RDBMS table value
The best approach is something that can be retrieved by all kinds of programs in your environment, not focusing only on SAS processes.
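A minimal sketch of option b/, assuming a made-up file name and date format: the assumed run date lives in one flat file that any job, shell or SAS, can read.

```shell
# /tmp/rundate.txt and the ISO date format are assumptions for illustration.
printf '2015-04-17\n' > /tmp/rundate.txt    # written once, e.g. by the scheduler
RUNDATE=$(cat /tmp/rundate.txt)             # any shell job can pick it up...
echo "assumed run date: $RUNDATE"
# ...and a SAS job could read the same file with an INFILE/INPUT statement.
```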
Combining those with a scheduler system that runs all or most of your processes is a goal that goes with this. It will also cover the needed exceptions for the production processing.
It is not uncommon for many processes to have their own reference date defined in some parameter file. It defines the logical interval to be processed for that logical process.
You can see I) the system datetime, II) the assumed process run date-time, and III) the logical interval as three independent pieces of information, and treat them separately.
Mostly the idea is the same as Kurt's.
The only difference is that when a UNIX script takes part in II), the assumed date/time processing, that UNIX script must also be able to reference that date. Reading a file or using a scheduler can solve that.
04-17-2015 03:42 AM
A big post-scriptum:
In order to use the capabilities of your operating system, you must get a grip on its working principles, or the OS will run you instead of the other way round.
04-17-2015 10:30 AM
Check the SYSPARM system option.
This example shows the SYSPARM= system option and the SYSPARM function:
data _null_;
  if sysparm() = 'yes' then put 'SYSPARM value is yes';
run;
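For comparison, the same pattern sketched as a shell function (the function name and parameter value are illustrative stand-ins): one string is handed over at invocation and tested inside the job, just as a SYSPARM value pairs with the sysparm() function.

```shell
# 'job' and its argument are illustrative stand-ins for a SAS invocation
# that passes a SYSPARM value on the command line.
job() {
  sysparm="$1"                        # plays the role of the SYSPARM value
  if [ "$sysparm" = "yes" ]; then
    echo "optional step executed"
  else
    echo "optional step skipped"
  fi
}
job yes
job no
```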