
SAS Viya IaC for Azure + Viya deployment via an Azure DevOps pipeline

Started ‎01-20-2021 by
Modified ‎10-25-2021 by
Views 6,854

hackathonImage.jpg

SAS is sponsoring a global hackathon, #HackinSAS. The event is designed to explore how to use data for social good in new and creative ways. I won’t go into details on the hackathon in this article, but I highly encourage you to refer to the #HackinSAS page as well as the Hacker’s Hub on the SAS Communities for more information. What I do want to cover in this post is how we went about creating the environment we’ll provide for each team participating in the hackathon.

 

Article Mission

Introduce the possibilities Azure DevOps Pipelines bring in the context of automating SAS Viya deployments.

 

Pre-requisites

This article relies heavily on two separate sassoftware GitHub repositories:

SAS Viya 4 IaC for Azure

SAS Viya 4 Deployment

 

These repositories contain the tools, scripts, and code for deploying SAS Viya in an Azure environment. I highly recommend becoming familiar with both projects before continuing.

 

Foundations

I have been using the powerful SAS Viya Infrastructure as Code (IaC) repository (referenced above) to create the necessary Azure compute resources for a SAS Viya deployment. The project contains Terraform scripts to provision the Microsoft Azure cloud infrastructure resources required to deploy SAS Viya 4 products. After the infrastructure is created, I deploy SAS Viya with the code in the other referenced repository.

 

 

Azure DevOps pipelines

The Azure infrastructure for #HackinSAS requires environments for 100 teams, and each team needs access to its own SAS Viya environment. We are still researching the most agile and efficient way to provide all that infrastructure, so I decided to start looking at Azure DevOps pipelines to automate this process as much as possible.

 

Azure Pipelines combines continuous integration (CI) and continuous delivery (CD) to consistently test and build your code and ship it to any target. Consider it an automation server, similar to Jenkins. My goal is to share with you how you can integrate the Viya 4 IaC Terraform scripts and the Viya 4 deployment scripts into these pipelines. This requires some familiarity with automation servers and the SAS Viya IaC project.

 

The graphic below represents the entire deployment process. I’ll refer to it in the Pipeline initialization section.

 

pipelineWorkflow.png

 

Pipeline initialization

Consider the first DevOps pipeline the initialization process before we can kick off the main pipeline. Its first task creates a container within an existing Azure Storage Account to store the terraform.tfstate file. Each participating team gets its own unique terraform.tfstate file, because that file contains all the information about the Azure cloud infrastructure created for the team.
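As a sketch, that first task could be an Azure CLI step like the one below. The service connection name is an assumption, not something from this setup:

```yaml
steps:
  - task: AzureCLI@2
    displayName: Create tfstate container for the team
    inputs:
      azureSubscription: 'hackathon-service-connection'  # assumed service connection name
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az storage container create \
          --name "$(team)" \
          --account-name tstatehackathon \
          --account-key "$(ARM_ACCESS_KEY)"
```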

 

The second task of the starter pipeline builds a Docker container image, tags it with the name of the team, and pushes it to an Azure Container Registry (step 3). This Docker image runs a few times in the second pipeline, when we plan and apply the Terraform plan. Here are the Docker commands the first pipeline executes to prepare the Docker image:

 

docker build -t viya4-iac-azure-hack:$(team) .
docker tag viya4-iac-azure-hack:$(team) $(location).azurecr.io/viya4-iac-azure-hack:$(team)
docker push $(location).azurecr.io/viya4-iac-azure-hack:$(team)

 

As you can see, we make use of variables to identify the team and location. The pipeline can't start without these required parameters. The Dockerfile (shown below) used for the container image build is the same as the one found in the SAS IaC GitHub project. Note that one additional parameter, ARM_ACCESS_KEY, is required; this value provides access to the Azure Storage Account. I also modified the terraform init command by adding backend parameters that point to the location of the terraform.tfstate file. During the pipeline execution, the __team__ string is replaced with the name of the hackathon team.

 

...
ENV ARM_ACCESS_KEY=__storagekey__
RUN apk --update --no-cache add git openssh \
  && terraform init -backend-config="key=__team__-terraform.tfstate" -backend-config="container_name=__team__" /viya4-iac-azure
...
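The article doesn't show the token substitution step itself; one simple way to do it is with sed. In this sketch, Dockerfile.demo is a stand-in for the real Dockerfile and "redpandas" is a hypothetical team name:

```shell
# Fill the __team__ token before the docker build (sketch).
# In the pipeline, the value would come from the $(team) variable.
team=redpandas   # hypothetical team name
printf 'RUN terraform init -backend-config="container_name=__team__"\n' > Dockerfile.demo
sed -i "s/__team__/${team}/g" Dockerfile.demo
cat Dockerfile.demo   # container_name is now "redpandas"
```

The same pattern works for the __storagekey__ and __location__ tokens used elsewhere in the files.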

 

To get this working, you also need to add the remaining backend details to the main.tf of the Terraform scripts.

 

terraform {
    backend "azurerm" {
       resource_group_name = "tstate"
       storage_account_name = "tstatehackathon"
    }
}

 

The pipeline execution (step 4) normally finishes in around two minutes. Below is a representation of the results of the pipeline execution.

 

pipelineExecutionResults.png

 

Main pipeline

Now it's time for the bulk of the work: setting up an Azure Kubernetes Service (AKS) cluster (step 5) and deploying SAS Viya. We set up the AKS cluster with the Docker image we pushed earlier to the Azure Container Registry. To deploy SAS Viya, we make use of another very handy project on GitHub, SAS Viya 4 Deployment.
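The plan and apply runs could then look roughly like the docker run call below, following the Docker instructions in the viya4-iac-azure project. The $(team) and $(location) pipeline variables come from this setup, but the workspace mount path is an assumption:

```shell
# Illustrative plan run; apply works the same way with "apply" as the command.
docker run --rm \
  --env-file azure_docker_creds.env \
  -v "$(pwd):/workspace" \
  "$(location).azurecr.io/viya4-iac-azure-hack:$(team)" \
  plan -var-file=/workspace/terraform.tfvars
```

Because the Dockerfile configured the azurerm backend, the state is read from and written to the team's storage container rather than a local file.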

 

Just as with the first pipeline, the idea is to check in all the necessary, parameterized files to an Azure DevOps project with an associated Azure Repos Git repository:

  • azure_docker_creds.env
  • terraform.tfvars
  • ansible-vars-iac-azure.yaml
  • TLS certificates

 

The first file contains the authentication information we source during a specific stage in the pipeline. It allows the Azure DevOps pipeline to authenticate Terraform so it can access our Azure subscription, as described in the instructions for the SAS Viya 4 IaC for Azure project. Again, we need to add an additional variable to the azure_docker_creds.env file: ARM_ACCESS_KEY provides access to the backend Azure storage so the Terraform state file can be saved there. Below is a representation of the file contents.

 

TF_VAR_subscription_id="00000000-0000-0000-0000-000000000000"
TF_VAR_tenant_id="00000000-0000-0000-0000-000000000000"
TF_VAR_client_id="00000000-0000-0000-0000-000000000000"
TF_VAR_client_secret="00000000-0000-0000-0000-000000000000"
ARM_ACCESS_KEY=__storagekey__
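A sketch of how that sourcing step works; the heredoc simply recreates the placeholder file from above so the example is self-contained:

```shell
# Source the creds file so Terraform's child processes see the variables.
cat > azure_docker_creds.env <<'EOF'
TF_VAR_subscription_id="00000000-0000-0000-0000-000000000000"
ARM_ACCESS_KEY=__storagekey__
EOF

set -a                       # auto-export every variable assigned from here on
. ./azure_docker_creds.env   # defines the TF_VAR_* values and ARM_ACCESS_KEY
set +a
echo "key=$ARM_ACCESS_KEY"   # prints key=__storagekey__
```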

 

The pipeline queries the value of the storage key with the appropriate credentials during the run. In the terraform.tfvars and ansible-vars-iac-azure.yaml files, I've added a few placeholders whose values are replaced during the run.
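One way a pipeline step can look up that key and expose it as a secret variable; the service connection and variable names here are illustrative, not taken from this setup:

```yaml
- task: AzureCLI@2
  displayName: Query the storage account key
  inputs:
    azureSubscription: 'hackathon-service-connection'  # assumed service connection name
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      key=$(az storage account keys list \
        --resource-group tstate \
        --account-name tstatehackathon \
        --query '[0].value' --output tsv)
      echo "##vso[task.setvariable variable=storagekey;issecret=true]$key"
```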

 

Below is a representation of the files.

 

terraform.tfvars

# **************** REQUIRED VARIABLES ****************
# These required variables' values MUST be provided by the User
prefix = "hackinsas-__team__"
location = "__location__" # e.g., "eastus2"
# **************** REQUIRED VARIABLES ****************

 

ansible-vars-iac-azure.yaml

## Cluster
NAMESPACE: hackinsas-__team__
....
JUMP_SVR_HOST: __jmpip__
JUMP_SVR_USER: jumpuser
JUMP_SVR_PRIVATE_KEY: /data/auto-private-key-__team__.pem
...
## Ingress
V4_CFG_INGRESS_TYPE: ingress
V4_CFG_INGRESS_FQDN: __team__-hackinsas.vectorlabs.sas.com 
V4_CFG_TLS_MODE: full-stack # [full-stack|front-door|disabled]
V4_CFG_TLS_CERT: /data/vector.crt
V4_CFG_TLS_KEY: /data/vector.key
V4_CFG_TLS_TRUSTED_CA_CERTS: /data/vector-cacerts.crt
..

 

With all of this in place, I kick off the Azure DevOps pipeline. The process takes about 30 minutes to complete, during which the infrastructure setup and the SAS Viya deployment happen automatically for me. This allows me to quickly spin up the necessary environments as teams enroll for the hackathon. In the graphic below I start the pipeline manually, but it can also be kicked off with specific triggers.
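For example, a trigger block in the pipeline YAML could start a run automatically whenever the team parameter files change; the branch and paths below are assumptions:

```yaml
# Illustrative trigger: run on changes to the team files instead of manually.
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - terraform.tfvars
      - ansible-vars-iac-azure.yaml
```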

 

runPipeline.png

 

 

During its execution, the pipeline creates an AKS cluster with six node pools and kicks off a full SAS Viya deployment. About an hour later, you can start working in the environment.

 

Details of the AKS cluster

System node pool: default_nodepool_vm_type = "Standard_D4_v2"

Node Pool    VM type      Min nodes   Max nodes
system       D4_v2        2           2
CAS          E16ds_v4     1           1
compute      E16ds_v4     1           1
connect      E8ds_v4      1           1
stateless    E16ds_v4     1           2
stateful     E16ds_v4     1           3
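In the IaC project, pools like these are defined in the node_pools map of terraform.tfvars. A sketch of what the cas entry could look like — the field names follow the project's sample files, so verify them against the repo version you use:

```hcl
node_pools = {
  cas = {
    "machine_type" = "Standard_E16ds_v4"
    "min_nodes"    = 1
    "max_nodes"    = 1
    # taints/labels steer SAS workloads onto the right pool
    "node_taints"  = ["workload.sas.com/class=cas:NoSchedule"]
    "node_labels"  = { "workload.sas.com/class" = "cas" }
  }
}
```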

 

As you can see from the screenshot below, I split the work in the pipeline over different stages and jobs. Feel free to take up the challenge of providing more structure and better naming conventions 😊. I also added an option to destroy the infrastructure and clean up all the Azure resources when the hackathon is finished.

 

pipelineStages.png

 

If you check the details of the image below, you'll see I'm publishing two artifacts. The build artifact is a build-team.zip file that contains all the manifests, the kustomization.yaml, and the site-config folder of the deployment. The drop folder holds the files needed to connect to the Kubernetes cluster and the jump server created by the IaC Terraform scripts. With those, I can access and customize the SAS Viya environment if required.
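As an illustration, the drop files could be used along these lines; the exact file names depend on what the IaC scripts generated, so treat the paths below as assumptions:

```shell
export KUBECONFIG=./drop/hackinsas-__team__-aks-kubeconfig.conf
kubectl get nodes                          # verify the six node pools are up
ssh -i ./drop/auto-private-key-__team__.pem jumpuser@__jmpip__
```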

 

publishedArtifacts.png

 

 

Finally

If all of this is a bit abstract for you, I recorded a video demonstrating the Azure DevOps pipeline. So, yes, we are getting ready for the hackathon!

Comments

Mind-bending! Well done Frederik!

Cannot wait for that Video. this is awesome!

Hi Fredrik,

 

Did you make a video documenting all these steps?

 

Where can I find it? We are struggling a little bit installing our own environment on Azure.

 

Best regards,

Morten

Hi @mortenlangvik @McDiddles , it took a while but I now recorded a video with the latest release of the pipeline.


Hi @FrederikV ,

 

Really liked this work and I would like to replicate it for our environment as well.
Would it be possible for you to share the DevOps pipeline YAML files?

Thanks,

Vaibhav

This is great Fredrick, Good work. Does this environment include all sas viya solutions like risk management, we want to deploy a new sas model for our risk management team.

@toby4all  

Hi,

my deployment pipeline didn't include the risk solutions, but you can deploy any SAS software order with this kind of automation. This deployment pipeline is based on the viya4-iac-azure GitHub project (https://github.com/sassoftware/viya4-iac-azure) and the viya4-deployment GitHub project (https://github.com/sassoftware/viya4-deployment); the pipeline runs the content of both projects. Extra risk software components would require specific customizations to be added.

Frederik

Thank you very much for your response.

I understand everything better now. I can use this IAC github project to
deploy a sas environment for our risk modeling platform on the cloud like
Azure and we can deploy models and decisions from published from sas
directly to the already provisioned resources on Azure.

Thanks once again.

Regards
