Many of us are now using cloud-based productivity tools like Microsoft Office 365, hosted in Microsoft Azure. And some of us are using SAS software in the cloud, too. For those of us who use SAS to read and create Microsoft Excel documents, cloud-based files add an extra wrinkle when we automate the process. They also add some exciting possibilities!
The Microsoft 365 suite offers APIs to discover, fetch, and update our documents using code. In this video, I show you how to use SAS programs to reach into your Microsoft OneDrive, Microsoft Teams and SharePoint cloud to read and update your files. The approach relies on the REST APIs in Microsoft Office 365 and on the HTTP procedure in SAS.
Once you have the basic connection working, you can use the Microsoft Graph API to work with other objects too, such as SharePoint lists.
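For example, once you have an OAuth access token in hand, a single PROC HTTP call can pull back the items in a SharePoint list. The snippet below is only a minimal sketch: &access_token, &siteId, and &listId are placeholders for your own token and identifiers, and the exact data set names the JSON engine creates depend on the shape of the response.

/* Sketch: read the items of a SharePoint list via the Microsoft Graph API.      */
/* Assumes &access_token holds a valid OAuth token, and that &siteId and &listId */
/* identify your site and list -- placeholders, not values from this article.    */
filename listrsp temp;

proc http
  url="https://graph.microsoft.com/v1.0/sites/&siteId./lists/&listId./items?expand=fields"
  oauth_bearer="&access_token"
  out=listrsp;
run;

/* The JSON libname engine turns the response into data sets we can read */
libname splist json fileref=listrsp;

/* Each item's fields typically land in a table named VALUE_FIELDS; run          */
/* PROC DATASETS LIB=SPLIST to see the members if your response looks different. */
data list_items;
  set splist.value_fields;
run;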
Chris, just following up after our email exchange to also let the community know how valuable these techniques have been for us! Using your tutorials and your code, along with input from Joseph Henry, we are successfully integrating our SAS 9.4 Grid with our SharePoint Online (Office 365) environment. Thanks so much for sharing with all of us!
Thanks, Chris, for this valuable post! It made my job easy.
Chris,
I am having trouble getting our logic to work for SharePoint lists. We have it working to get a file from SharePoint and to save a file back to SharePoint (with some issues around the 200-record limit and large files). But we also have a requirement to read from and write to a SharePoint list. Do you have any SAS code examples for this?
Greg Kokanour
SCE
@gkokanour I don't have code samples for these scenarios. The 200 record limit (typical for APIs like this) usually requires you to process the JSON response and read the remaining "count" or get a URL endpoint that will fetch the next batch in the list...until the list is exhausted.
@ChrisHemedinger I am having trouble getting the folder IDs for the records past the initial 200 limit. How would I go about processing the response again (what SAS code logic) to get the next 200 records?
When you say process the JSON response: this is the code I am running that hits the 200-record limit. Do I just run it a second time?
filename resp TEMP;

proc http
  url="https://graph.microsoft.com/v1.0/me/drives/&driveId./items/root/children"
  oauth_bearer="&access_token"
  out=resp;
run;

libname jresp json fileref=resp;

/* Create a data set with the top-level paths/files in the drive */
data paths;
  set jresp.value;
run;
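One way to get past that limit, sketched here rather than taken from a tested example: after each call, check the response for an @odata.nextLink member and, if it is present, call PROC HTTP again with that URL, repeating until it no longer appears. The loop below assumes &access_token and &driveId are set up as in the code above, and that the JSON engine's ALLDATA table carries the nextLink value intact.

/* Sketch: fetch all children of a folder, following @odata.nextLink paging.  */
%macro get_all_children(first_url=, outds=all_items);
  %local nexturl;
  %let nexturl = &first_url;

  /* start with an empty output data set */
  proc datasets lib=work nolist;
    delete &outds;
  quit;

  %do %while(%length(%superq(nexturl)) > 0);
    filename resp temp;

    proc http url="%superq(nexturl)"
      oauth_bearer="&access_token"
      out=resp;
    run;

    libname jresp json fileref=resp;

    /* append this batch of items (FORCE tolerates minor column differences) */
    proc append base=&outds data=jresp.value force;
    run;

    /* look for a next-page URL; ALLDATA lists every member of the response.  */
    /* The key is '@odata.nextLink'; if the URL is very long it may be        */
    /* truncated here, in which case parse the response file directly.        */
    %let nexturl=;
    data _null_;
      set jresp.alldata;
      where index(p1, 'nextLink') > 0;
      call symputx('nexturl', value);
    run;

    libname jresp clear;
    filename resp clear;
  %end;
%mend get_all_children;

%get_all_children(
  first_url=https://graph.microsoft.com/v1.0/me/drives/&driveId./items/root/children);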