DATA Step, Macro, Functions and more

Parallel development in DI Studio 4.21M2

Contributor
Posts: 38

Parallel development in DI Studio 4.21M2

Dear all,

I am wondering what the best practice for a development/maintenance setup with DI Studio 4.21M2 is.

What I am talking about is:

There are two repositories: "Development (DEV)" and "Maintenance (MAINT)". A release (e.g. 250 jobs) is developed in DEV and, once tested successfully, copied over to MAINT. From there the jobs are deployed as SAS code and moved to the Production (PROD) environment (batch mode). When a bug in PROD needs to be fixed, it is fixed in MAINT while the next release is being developed in DEV.

What is the best or common way to copy all relevant jobs, tables, utilities, macros etc. from DEV to MAINT? I know about exporting everything into multiple SPKs, which need to be imported in the correct order (tables, utilities, macros, jobs) and - as has often happened in the past - we lose mappings or connections to included utilities or jobs due to some DI Studio bugs.

How do you guys handle such a scenario?

Thanks,

Thomas

SAS Employee
Posts: 27

Parallel development in DI Studio 4.21M2

Posted in reply to thomash123

Metadata export packages are the typical vehicle for migrating jobs.  Can you be more specific about the mappings and connections that you are losing?  You might also try posting this question to the data management forum at  http://communities.sas.com/community/sas_enterprise_data_management_integration.

Contributor
Posts: 38

Re: Parallel development in DI Studio 4.21M2

Hi,

what I am referring to are, for example, included jobs or custom transformations that are no longer included after the import for whatever reason, or cannot be imported because they already exist in the other repository on the same machine. There are bug reports for this, but we don't have to worry about that in this thread.

What I am asking myself is this: Our data warehouse is (at the moment) quite small, so it is still manageable to build SPKs for tables, macros, jobs etc. and import them in the correct order. But this always takes a while and requires a lot of manual effort.

How is this solved in bigger setups where it is not possible to export 1000 jobs and 300 tables into a huge number of SPKs and import them separately back into the other repository?

Can this be done automatically?

Thomas

SAS Employee
Posts: 27

Re: Parallel development in DI Studio 4.21M2

Posted in reply to thomash123

SAS 9.2 provides utilities for batch promotion of metadata.  This would allow you to build scripts to do partial promotions.  See http://support.sas.com/resources/papers/sgf2008/migratemetadata.pdf and go to page 6 ("Batch Partial Promotion").
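To illustrate what such a script could look like: the sketch below only *builds and prints* the ExportPackage/ImportPackage command lines in dependency order (tables, then macros, then jobs) instead of running them. The tool path, profile names, and folder names are made-up examples, and the option names (`-profile`, `-package`, `-objects`, `-target`) should be verified against the batch-tools documentation for your SAS 9.2 installation.

```python
# Hypothetical dry-run sketch of batch partial promotion from DEV to MAINT.
# Nothing is executed here; the commands are just printed so the ordering
# logic is visible. All paths, profiles, and folders are example values.

TOOLS = "/opt/SAS/SASPlatformObjectFramework/9.2"  # adjust to your install

# Import order matters: objects must exist before anything that references them.
FOLDERS = ["Data/Tables", "Code/Macros", "Jobs"]

def promotion_commands(folders, src_profile="DEVprofile", tgt_profile="MAINTprofile"):
    """Return export/import command pairs, one pair per folder, in order."""
    cmds = []
    for folder in folders:
        pkg = "/tmp/%s.spk" % folder.replace("/", "_")
        cmds.append('%s/ExportPackage -profile %s -package %s -objects "/%s(Folder)"'
                    % (TOOLS, src_profile, pkg, folder))
        cmds.append('%s/ImportPackage -profile %s -package %s -target "/%s"'
                    % (TOOLS, tgt_profile, pkg, folder))
    return cmds

if __name__ == "__main__":
    for cmd in promotion_commands(FOLDERS):
        print(cmd)
```

Once the command lines are confirmed against the paper, the `print` calls can be replaced with actual invocations (e.g. via a shell script or a scheduler), which removes the manual per-SPK clicking entirely.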
