
Prompt, Generate, Repeat: Using Azure OpenAI In-Context Learning Custom Step in SAS Studio


Large Language Models (LLMs) have rapidly become a powerful tool in the data and analytics world.  With just a simple instruction, LLMs can execute an array of tasks such as summarization, translation, question-answering, and much, much more.  Apart from their pre-existing knowledge, which was distilled during a training process, LLMs also exhibit a behavior known as in-context learning: the ability to use the context provided to inform their response.  This means that the LLM doesn't need to be fine-tuned to perform these tasks; rather, it simply responds based on the instructions, examples, and context given at the time of execution.
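As a quick illustration of in-context learning, compare a zero-shot prompt with a few-shot prompt (a minimal sketch; the review texts and labels are hypothetical, and the message format follows the common chat-completion convention of system/user/assistant roles):

```python
# Zero-shot: the model receives only an instruction and the input text.
zero_shot = [
    {"role": "system", "content": "You classify customer review sentiment."},
    {"role": "user", "content": "Classify the sentiment of: 'Great service!'"},
]

# Few-shot: illustrative examples precede the real input, steering both
# the model's behavior and the format of its answer.
few_shot = [
    {"role": "system", "content": "You classify customer review sentiment."},
    {"role": "user", "content": "Classify the sentiment of: 'Terrible wait times.'"},
    {"role": "assistant", "content": "Negative"},
    {"role": "user", "content": "Classify the sentiment of: 'Friendly, helpful staff.'"},
    {"role": "assistant", "content": "Positive"},
    {"role": "user", "content": "Classify the sentiment of: 'Great service!'"},
]
```

No model weights change between the two requests; the extra examples included in the prompt are the only difference.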

 

To facilitate interaction with an LLM and to take advantage of in-context learning within SAS Viya, we have developed a SAS Studio Custom Step, a low-code component that enables users to complete specific tasks in a reusable and streamlined manner.  The LLM - Azure OpenAI In-context Learning custom step (available on GitHub at https://github.com/SundareshSankaran/LLM-Azure-OpenAI-In-context-Learning ) helps you interact with your LLM to perform tasks based on your input data, all while using the UI to guide your prompt engineering techniques.  Specifically, this custom step supports in-context learning, where illustrative examples can be included in the prompt to guide the model's response.  By adding it to your SAS Studio flows, you can quickly bring LLM capabilities to your analytics pipelines, whether you are summarizing customer feedback, translating responses, or experimenting with ways to extract insights from your text data.  And while this step focuses on completing a single task, it's also a building block toward agentic systems, where agents can reason, make decisions, and act within your workflow.  This custom step helps pave that path, one prompt at a time.

 

Watch the short demo below to see the custom step in action.

 

Figure 1: Using LLM - Azure OpenAI In-context Learning custom step to summarize customer reviews.

 

 

Requirements

To use this step, you will need the following:

  1. A SAS Viya 4 environment, version 2025.02 or later.
  2. Python configured and available to your SAS environment.
  3. The following Python packages installed:
    • openai
    • pandas
    • swat
  4. A valid Azure OpenAI service with a large language model deployed.  Refer here for instructions.
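If you want to verify the Python prerequisites before adding the step, a small check like the following can help (a sketch; the package list mirrors the requirements above, and `importlib.util.find_spec` only confirms that the packages are importable, not that they are correctly configured):

```python
import importlib.util

def missing_packages(required=("openai", "pandas", "swat")):
    """Return the names from `required` that are not importable in this environment."""
    return [pkg for pkg in required if importlib.util.find_spec(pkg) is None]

# An empty list means all three packages are available to your SAS environment's Python.
print(missing_packages())
```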

 

 

How to Get Started

Once you have all of the requirements, use the following steps to get started: 

  1. (Optional) Start a CAS session if you are using CAS tables.
  2. Add a table to your flow.
    • Keep in mind that a text column will serve as context in your prompt to your LLM.  So, make sure that the table you choose has the appropriate text needed for your prompt.
  3. Add the custom step to your flow.
    • Add the LLM – Azure OpenAI In-context Learning step into your flow from the Shared steps tab and connect the input table to the input port on the custom step.
  4. Select your text column.
    • On the Parameters tab, select your text column that you’d like to use in your prompt.
      Figure 2:  Select text column to use as context within your prompt.
  5. Write your prompt.
    • Provide a system prompt.  System prompts typically set guidelines and instructions for the LLM.
      Figure 3:  An example of a system prompt.

    • Provide a user prompt.  Here, you’ll add your question or command for the LLM.
      Figure 4: Example of a user prompt for summarization.
    • (Optional) Add examples to guide the model’s response.  This is where in-context learning comes into play.  You can include one example (one-shot prompting), a few examples (few-shot prompting), or none (zero-shot prompting) depending on how much guidance you want to give the model.
      Figure 5: An illustrative example showing the LLM how to respond.
  6. Choose whether to add your question to the output table.
    • If you'd like to include your user prompt as a column in the output table, check the checkbox next to Add question to output.
  7. Set your model parameters.
    • Temperature: Controls how creative or focused the model's response is.  Lower values generate more focused answers; higher values allow for more variation.  Default is 1.
    • Top P: Sets a probability threshold for selecting the next word, another way to manage randomness in the response.  Default is 1.
    • Max Tokens: Limits the length of the model's output.  As a rule of thumb, one token equals about four characters of English text.  Default is none.
    • Frequency Penalty: Discourages repetition by penalizing words that appear multiple times in the response.  Default is 0.
    • Presence Penalty: Encourages novelty by applying a penalty to any word that has already been used, even once.  Default is 0.
      Figure 6: Output specifications related to the model parameters and the output table.
  8. Configure your model connection.
    • Provide the name of your Azure OpenAI model deployment.
    • Provide the path to your API key file. This can be stored as a text file within the SAS server. 
    • Provide your endpoint URL, model region, and API version. 
      Figure 7: Example of required details about your model needed for successful execution.
  9. Connect the output table. 
    • This table will contain all the original columns, the response from the LLM, and the user prompt if the Add question to output box is checked.
  10. Run your flow and view the results.
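Under the hood, the settings above map onto a standard Azure OpenAI chat-completion request. The following sketch shows how the prompts, illustrative examples, and model parameters could be assembled into one request payload (the function name and all placeholder values are hypothetical; only the parameter names mirror the fields in the step's UI):

```python
def build_request(deployment, system_prompt, user_prompt, examples=None,
                  temperature=1.0, top_p=1.0, max_tokens=None,
                  frequency_penalty=0.0, presence_penalty=0.0):
    """Assemble chat messages and parameters mirroring the custom step's settings."""
    messages = [{"role": "system", "content": system_prompt}]
    # Each (user, assistant) example pair becomes an in-context demonstration.
    for example_input, example_output in (examples or []):
        messages.append({"role": "user", "content": example_input})
        messages.append({"role": "assistant", "content": example_output})
    messages.append({"role": "user", "content": user_prompt})

    params = {
        "model": deployment,          # the Azure OpenAI deployment name
        "messages": messages,
        "temperature": temperature,
        "top_p": top_p,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
    }
    if max_tokens is not None:        # "Default is none" means no length cap is sent
        params["max_tokens"] = max_tokens
    return params

# With the openai package installed and your connection details in hand, the
# request would be sent roughly like this (placeholders, not real values):
#   from openai import AzureOpenAI
#   client = AzureOpenAI(api_key=key, api_version=api_version, azure_endpoint=endpoint)
#   response = client.chat.completions.create(**build_request(...))
```

The step applies this request once per row of your input table, with the selected text column supplying the context for each prompt.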

 

This custom step makes it easy to integrate Azure OpenAI’s language models into your SAS Studio flows. Once configured, you can start generating your own LLM-driven insights based on your text data.  Enhancements are on the way to provide even more flexibility in how you interact with LLMs, so stay tuned for future updates!  Feel free to contact me with any questions or comments!

 

