So, as an “old school” DI person, I’m not completely au fait with source control tools like SVN and Git, but I do have the SVN module running on my DI install on my PC, and have dabbled with it a bit.
Back to the “old school”: for me, a lot of this comes back to the promotion processes and ceremonies. The Dev/Test/Prod pattern is powerful, especially when you consider the artefacts involved in the promotion process itself: the exported SPKs that carry new and changed objects between environments or levels.
Thinking about the lifecycle of code over time, those SPKs do capture all changes and versions over time, albeit only at a “finished enough to promote” level of granularity.
So, establishing a strong regime for retaining and documenting all these SPKs puts you in a position of control over change over time, with the ability to roll back through the catalogue of SPKs.
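A minimal sketch of what such a retention regime might look like on disk; all of the paths, release names, and manifest fields here are hypothetical, just to illustrate the idea of pairing each promoted SPK with its own documentation:

```shell
# Hypothetical SPK catalogue layout: one dated folder per promotion,
# holding the exported package plus a manifest describing it.
SPK_CATALOGUE="$(mktemp -d)"      # stand-in for e.g. a real /sas/promotion/spk_catalogue
RELEASE="2024-06-01_release42"    # hypothetical release identifier

mkdir -p "$SPK_CATALOGUE/$RELEASE"

# Pretend this is the package exported from Dev for this promotion
echo "placeholder spk content" > "$SPK_CATALOGUE/$RELEASE/etl_jobs.spk"

# The manifest documents what the package carried and how to roll back
cat > "$SPK_CATALOGUE/$RELEASE/MANIFEST.txt" <<EOF
Release:  $RELEASE
Objects:  job_load_customers (changed), job_dedupe (new)
Ticket:   CHG-1234
Rollback: re-import the previous release's SPK
EOF

ls "$SPK_CATALOGUE/$RELEASE"
```

The point is simply that the SPK and its context travel together, so rolling back is a matter of walking the catalogue rather than reconstructing history from memory.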
Couple this with a well-considered approach to the folder structure at “source level”, separating out utility/toolbox objects, and I think you end up in a strong position with respect to overall control.
That said, it isn’t the same as running Git over a corpus of text-based source code; but then, DIS isn’t a corpus of text-based source code.
There was, though, a paper a few years back about using Git to manage text-based items in SAS environments, such as the autocall macro area and configuration files. Something like this could be considered as an additional step, run over the deployed code directory, if only to create a view of the changes over time at a deployed-job level.
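As a rough sketch of that additional step: initialise a Git repository in the deployed code directory and commit a snapshot after each promotion, so `git log` and `git diff` give you a change history at the job level. The directory location, job name, and commit identity below are all assumptions for the sake of the example:

```shell
# Hypothetical: keep a Git history of the deployed job code directory.
# Using a temp dir here as a stand-in for something like /sas/deployed_jobs.
DEPLOY_DIR="$(mktemp -d)"
cd "$DEPLOY_DIR"

# One-off initialisation, with an assumed identity for snapshot commits
git init -q
git config user.email "di-snapshots@example.com"
git config user.name  "DI Snapshot"

# Pretend a promotion has just (re)deployed a job's generated code
echo "data work.out; set work.in; run;" > some_job.sas

# Run after each promotion, e.g. from cron or the promotion script itself
git add -A
git commit -q -m "Snapshot after promotion $(date +%Y-%m-%d)"

# Review what changed in a given job over time
git log --oneline -- some_job.sas
```

This doesn’t make DIS metadata text-based, of course; it just gives you a browsable, diff-able record of the deployed code as a side effect of the promotion ceremony.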
Yes, Data Prep in Viya isn’t currently addressing the same range of use cases as DIS does. My view is that we will see DIS in play for quite a while yet; certainly, existing sites are likely to persist, as the depth of functionality in DIS will take quite a while to refactor. There’s over a decade of elaboration and finessing invested in DIS.