
Automatically destroying AWS infrastructure for SAS Viya

Started ‎07-14-2021 by
Modified ‎07-14-2021 by

In my last post, Manually destroying AWS infrastructure for SAS Viya, I showed a point-and-click approach for using the AWS Console web site to find and delete unwanted SAS Viya resources in AWS. Like any UI-driven approach, it's easy to follow, but it doesn't scale when the process needs frequent repeating.

 

In our SAS employee workshops, we encourage students to delete their infrastructure when they're done, to save SAS significant costs. Still, problems happen, and we want an automated process that periodically sweeps the environment and deletes anything too old.

 

Now that we understand the challenge and the available tools a bit better, here are the approaches we're looking at to get this done.

Terraform

If you've tried the viya4-iac-aws project, then you're probably familiar with the Terraform software utility. This is a very good and predictable automation tool to create, manage, and delete infrastructure in AWS (or other cloud providers).

 

When things are in full swing, Terraform keeps track of the infrastructure in its .tfstate file. If you have a valid state file for your SAS Viya environment, then you'll definitely want to use Terraform to destroy any resources you're no longer using.
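With a valid state file, the teardown is a single Terraform run. Here's a minimal sketch; the variables file name is a placeholder for whatever file you used with viya4-iac-aws:

```shell
# Run from the directory that holds your .tfstate.
# sas-viya.tfvars is a placeholder for the variables file used at build time.
terraform destroy -var-file=sas-viya.tfvars

# Non-interactive variant, pointing at an explicit state file:
terraform destroy -auto-approve -state=terraform.tfstate -var-file=sas-viya.tfvars
```

Terraform walks its dependency graph in reverse, so resources are removed in a workable order without you having to sequence anything yourself.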

 

However, in my misadventures with deploying SAS Viya 4, I found myself on multiple occasions having lost or otherwise invalidated my Terraform state file. So I needed an alternate approach to delete multiple resources when Terraform wasn't an option. Further, I needed something that could scale up to wipe out multiple SAS Viya environments in one fell swoop.

cloud-nuke

When casting around for a known solution to the problem of deleting many ad hoc resources in AWS - that is, when you don't have a valid state file to direct Terraform's destroy - one utility suggested prominently is the cloud-nuke project from the team at gruntwork.io.

 

As the name suggests, it's a powerful utility with significant consequences if used indiscriminately. They even offer some pretty dire warnings about its use:

 

...this tool is HIGHLY DESTRUCTIVE and deletes all resources! This should never be used in a production environment!

So take care when using it.

 

Cloud-nuke does offer some options to narrow the scope of deletion by resource type, region, or age. So that's great - we don't really have to blow away EVERYTHING when we use it.
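For example (flag names per the cloud-nuke documentation at the time of writing; check `cloud-nuke aws --help` for your version):

```shell
# Limit the sweep to a single region
$ cloud-nuke aws --region us-east-1 --older-than 24h

# Only consider one resource type
$ cloud-nuke aws --resource-type s3 --older-than 24h

# Sweep everything except S3 buckets
$ cloud-nuke aws --exclude-resource-type s3 --older-than 24h
```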

 

For our employee workshops, we want to delete all student-built resources that are more than 8 hours old. With a simple command, this can be accomplished:

$ cloud-nuke aws --older-than 8h

With results similar to:

[cloud-nuke] INFO[2021-07-01T14:12:47-04:00] The following resource types will be nuked:
– ami
– asg
– ebs
– ec2
– ecscluster
– ecsserv
– eip
– ekscluster
– elb
– elbv2
– iam
– lambda
– lc
– nat-gateway
– rds
– s3
– secretsmanager
– snap
– sqs
– transit-gateway
– transit-gateway-attachment
– transit-gateway-route-table
Retrieving active AWS resources in [eu-north-1, ap-south-1, eu-west-3, eu-west-2, eu-west-1, ap-northeast-3, ap-northeast-2, ap-northeast-1, sa-east-1, ca-central-1, ap-southeast-1, ap-southeast-2, eu-central-1, us-east-1, us-east-2, us-west-1, us-west-2, global]
… … …
Getting – 1-18 buckets of batch 1/1
The following 15 AWS resources will be nuked:
* ami ami-0cabf1cb05bd65e0a us-east-1
* snap snap-0f963596648e6711c us-east-1
* lambda SAS-Viya-test1-SASViyaStac-IsLicensePublicFunction-10RXMAJYBUWUL us-east-1
* lambda SAS-Viya-sasbcl1-SASViyaSt-IsLicensePublicFunction-1JVRM7V6SVHQX us-east-1
* lambda SAS-Viya-sasbcl1-SASViyaStack-XA01-DelCertFunction-ILC3CL1UCDHU us-east-1
* lambda SAS-Viya-test1-SASViyaStack-1IAU0K-LicenseFunction-H7VQTNZZBH1S us-east-1
* lambda SAS-Viya-test1-SASViyaStack-1IAU0K-DelCertFunction-SJRD2TC8E627 us-east-1
* lambda SAS-Viya-sasbcl1-SASViyaStack-XA01-LicenseFunction-C0JSO176P6M4 us-east-1
* s3 export-to-s3-turogb-us-east-1 us-east-1
* s3 robcollum us-east-1
* s3 sas-config-bucket-182696677754 us-east-1
* s3 sujeobbucket001 us-east-1
* s3 depot-cloudup us-east-1
* s3 sasbcl-test1 us-east-1
* s3 cf-templates-1th317d1r3j29-us-east-1 us-east-1

All we have to do now is set up a recurring job that runs at regular intervals, and we can keep the workshops from accidentally overrunning our AWS budget. Great!
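A simple cron entry can serve as that recurring job. This is a sketch, assuming cloud-nuke is on the cron user's PATH and AWS credentials are available to it; the schedule and log path are illustrative, and `--force` (which skips the interactive confirmation) should be checked against your version's documentation:

```shell
# Sweep at the top of every hour; --force skips the confirmation prompt
0 * * * *  cloud-nuke aws --older-than 8h --force >> /var/log/cloud-nuke.log 2>&1
```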

 

But…

 

What if we needed finer-grained control over what gets deleted? See, for the employee workshops, we have students sign in to AWS using their SAS credentials and then assume the IAM account/role we designate for the workshop.

 

From the perspective of AWS, every student is effectively the same person. So when cloud-nuke runs, it doesn't differentiate by who created a resource - it just deletes anything over 8 hours old (and, optionally, of the types or in the regions you specify).

 

This is fine for our employee workshops because we're treating all students the same. However, if you need that level of differentiation, you'll have to do more. Since cloud-nuke is an open-source project, one possibility is to contribute additional differentiation back to the project.
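A lighter-weight alternative is to lean on resource tags: if each student's session applies an identifying tag, you can enumerate that student's resources before deciding what to delete. A sketch using the AWS CLI's Resource Groups Tagging API; the tag key and value here are hypothetical:

```shell
# List the ARNs of every resource tagged for a particular student
$ aws resourcegroupstaggingapi get-resources \
      --tag-filters Key=workshop-user,Values=student42 \
      --query 'ResourceTagMappingList[].ResourceARN'
```

Note that not every AWS resource type supports tagging, so this complements rather than replaces an age-based sweep.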

 

Coda

For AWS, the easiest path to comprehensively deleting the AWS resources backing your SAS Viya environment is still to use Terraform (à la the viya4-iac-aws project). But if your Terraform state file is out of sync with the actual environment, or lost, then manually deleting resources is your next step.

 

Deleting items by hand using the AWS Console is effective, but there's little guidance on the exact sequence of events to get it done. That's why I wrote my previous article.

 

Better automation is key to keeping AWS costs under control - not only for our SAS employee workshops, as described for my specific use case, but just as much for our customers in the real world.

