TripsDDD
Calcite | Level 5

Hi,

 

I know a little SAS DI at a high level (not ignorance, I promise; my role to date just hasn't required me to know more, until now!), so please bear with me if this is a daft problem/question...

 

This is on AWS EC2 Cloud Infrastructure with a SAS 9.4 DI Installation on Linux.

 

We are attempting to build bespoke CI/CD pipelines using GitLab CI (with a GitLab Runner, plus some bespoke Python and Git CLI commands) and are endeavouring to store SAS artifacts at a more granular 'binary' level. So rather than exporting everything, or multiple jobs/flows etc., to a single .spk file to move from one environment (or from GitLab) to the next, we're attempting to create a single .spk file for each lower-level artifact, e.g. Table/Job/Flow.

 

Functionally, the solution works well. The issue is that for every item listed in a controlling 'Manifest' file (a YAML file that orchestrates the order, config, etc.) a separate 'connect, export, then disconnect' is issued for each export from DI. Ignoring GitLab for the moment and the ability to retrieve from there, this is for the initial deployment build, where the code needs to be exported from the dev/source environment and make its way to GitLab before it can go anywhere else.
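To make that concrete, each manifest entry ends up driving something roughly like the sketch below (the manifest keys, host name, credential handling and ExportPackage batch-tool options are illustrative placeholders, not our exact configuration):

import os
import subprocess

import yaml  # PyYAML

# Assumed install location of the SAS batch export tool; adjust for your deployment.
EXPORT_TOOL = "/opt/sas/SASPlatformObjectFramework/9.4/ExportPackage"

with open("manifest.yml") as f:
    manifest = yaml.safe_load(f)

# Each manifest item might look like:
#   - path: /Shared Data/Jobs/load_customer(Job)
#     spk: load_customer.spk
for item in manifest["artifacts"]:
    # Every invocation connects to the metadata server, exports one object,
    # then disconnects - this is where the 20-30 seconds per item is lost.
    subprocess.run(
        [
            EXPORT_TOOL,
            "-host", "metadata.dev.example.com",   # placeholder host
            "-port", "8561",                       # default metadata server port
            "-user", os.environ["SAS_META_USER"],
            "-password", os.environ["SAS_META_PASS"],
            "-package", item["spk"],
            "-objects", item["path"],
            "-log", item["spk"] + ".log",
        ],
        check=True,
    )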

 

This works fine for 50-100 objects/artifacts, but some of our core component builds are likely to be 1,500-2,000+ distinct items, so performance is proving an issue when we test a full system build: roughly 20-30 seconds is lost on each item to the connect, disconnect, then connect again for the next item (and so on and so forth), which across a couple of thousand items adds up to many hours of pure connection overhead.

 

To summarise, this is executing the DI export package from the command line. My question is: is there any way to maintain the connection and export multiple packages, without having to disconnect and then reconnect for the next one in the queue (the driving manifest file)?

 

I suppose the real question, irrespective of how the call/execution is made, is: is there a way to force the connection to remain open between export executions?

 

Again, apologies if the terminology is incorrect or I'm articulating this badly; any help/guidance is much appreciated.

 

Many Thanks,

Dave

2 REPLIES
LinusH
Tourmaline | Level 20

I think you have quite a specific setup (but an interesting one!), so I suggest you open a track with SAS Tech Support in parallel.

Data never sleeps
Patrick
Opal | Level 21

Binary files/.spk's under version control are always a challenge, and I've never seen "the solution". What you describe is the most advanced approach I have ever heard of.

I agree with @LinusH that for the question you're asking it's best to contact SAS Tech Support. I'm really interested in where you end up with this, so please keep us posted.

 

"with a SAS 9.4 DI Installation on Linux"

DIS is a Windows client. Your SAS compute server, where the DIS-generated code executes, is Linux. The code (the .sas files) generated by DIS is produced by the Windows client based on what's in metadata on the SAS Metadata Server. The code generated depends on the version of the DIS client (an upgraded DIS version can generate different code from unchanged metadata; the same is true for the XML in the .spk, which is basically a .zip archive).
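You can see that for yourself by opening an .spk with any zip tool; a minimal sketch (the file name is just a placeholder):

import zipfile

with zipfile.ZipFile("load_customer.spk") as spk:
    for name in spk.namelist():
        print(name)  # the packaged metadata XML and any content files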

 

In the following are just a few things for your consideration.

There are two possible promotion paths. One is to create .spk's in source, import them into target and re-deploy. The other option is to use DIS in DEV only: you deploy the jobs and then version control and promote the deployed, tested .sas files, so DIS is only used in Dev. If you keep your folder structures identical in all environments, then only some limited automated code changes will be required (like generated libnames or connection strings with credentials), and these can be done as part of automated deployment scripts, along the lines of the sketch below. I know of one site with heaps of DIS jobs which has taken this approach very successfully.
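As an illustration only (the directory layout, paths and hostnames below are made up), that automated code change step can be as simple as a search-and-replace pass over the deployed .sas files:

import re
from pathlib import Path

# Example rewrites only - adapt the patterns to whatever differs between environments.
REWRITES = {
    r"/data/dev/": "/data/prod/",                          # physical paths in LIBNAME statements
    r"devserver\.example\.com": "prodserver.example.com",  # server names in connection strings
}

for sas_file in Path("deployed_jobs").rglob("*.sas"):
    code = sas_file.read_text()
    for pattern, replacement in REWRITES.items():
        code = re.sub(pattern, replacement, code)
    sas_file.write_text(code)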


DIS is the ETL client for SAS 9.4

SAS Viya uses SAS Studio as its main client and just adds capabilities/transformations based on what you license. SAS 9.4 DI Studio capabilities will become part of SAS Studio under Viya - which is different technology.

As far as I know there will be a migration path for SAS 9.4 DIS jobs to SAS Studio under Viya ... but things are not fully there yet.

If you start to talk to SAS then I feel it might also be very worthwhile for you to get some more information about how things will work with Viya, so you can already implement a DevOps process that's as future-proof as possible.

 

That a new version of SAS 9.4 DIS can generate different code is a challenge when it comes to version control. On the other hand, I've been grateful for new versions of DIS generating better and more efficient code, and I would expect that to continue with SAS Studio under Viya. It's something you will need to address and have an approach for when it comes to version control, for example: only re-generate code when a business-driven change is required, or re-generate, fully re-test, version control and promote all generated code that changes after a SAS version upgrade.
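One hypothetical way to support an 'only act on business-driven changes' policy is to compare regenerated code against what is already in version control while ignoring pure regeneration noise; a sketch (the header prefix it filters on, and the file paths, are assumptions):

import difflib
from pathlib import Path

def meaningful_diff(old: str, new: str) -> list:
    # Drop header lines that only carry generation date/time - the exact
    # prefix DIS stamps into generated code is an assumption, adjust as needed.
    strip = lambda text: [line for line in text.splitlines()
                          if not line.lstrip().startswith("* Generated")]
    return list(difflib.unified_diff(strip(old), strip(new), lineterm=""))

old_code = Path("repo/jobs/load_customer.sas").read_text()
new_code = Path("export/jobs/load_customer.sas").read_text()

if meaningful_diff(old_code, new_code):
    print("Real change - re-test, commit and promote")
else:
    print("Regeneration noise only - no commit needed")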
