Large Language Models (LLMs) have rapidly become a powerful tool in the data and analytics world. With a simple instruction, LLMs can perform an array of tasks such as summarization, translation, question answering, and much more. Beyond the knowledge distilled into them during training, LLMs also exhibit a behavior known as in-context learning: the ability to use the context provided in the prompt to inform their response. In-context learning means that an LLM doesn't need to be fine-tuned to perform these tasks; instead, it simply responds based on the instructions, examples, and context given at the time of execution.
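To make the idea concrete, here is a minimal sketch of an in-context (few-shot) prompt in the chat-message format that Azure OpenAI models accept. No model is called; the review texts and labels are invented purely for illustration. The point is that the examples live in the prompt itself, so the model picks up the task at inference time with no fine-tuning.

```python
# A few-shot prompt for sentiment labeling. The illustrative
# user/assistant pairs teach the model the task in-context;
# only the final user message is the real input to classify.
messages = [
    {"role": "system",
     "content": "You label customer reviews as Positive or Negative."},
    # Illustrative examples the model learns from at inference time:
    {"role": "user",
     "content": "Review: The delivery was fast and the product works great."},
    {"role": "assistant", "content": "Positive"},
    {"role": "user",
     "content": "Review: It broke after two days and support never replied."},
    {"role": "assistant", "content": "Negative"},
    # The actual input to classify:
    {"role": "user",
     "content": "Review: Setup was painless and the battery lasts all week."},
]
```

Swapping the examples is all it takes to turn the same pattern into summarization, translation, or extraction, which is exactly what the custom step's prompt fields let you do without writing any code.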
To facilitate interaction with an LLM and take advantage of in-context learning within SAS Viya, we have developed a SAS Studio custom step, a low-code component that enables users to complete specific tasks in a reusable and streamlined manner. The LLM - Azure OpenAI In-context Learning custom step (available on GitHub at https://github.com/SundareshSankaran/LLM-Azure-OpenAI-In-context-Learning) helps you interact with your LLM to perform tasks on your input data, all while using a UI that guides your prompt engineering. Specifically, this custom step supports in-context learning, where illustrative examples can be included in the prompt to guide the model's response. By adding it to your SAS Studio flows, you can quickly bring LLM capabilities to your analytics pipelines, whether you are summarizing customer feedback, translating responses, or experimenting with ways to extract insights from your text data. And while this step focuses on completing a single task, it's also a building block toward agentic systems in which agents can reason, make decisions, and act within your workflow. This custom step helps pave that path, one prompt at a time.
Watch the short demo below to see the custom step in action.
Figure 1: Using LLM - Azure OpenAI In-context Learning custom step to summarize customer reviews.
To use this step, you will need the following:
Once you have all of the requirements, use the following steps to get started:
Figure 2: Select text column to use as context within your prompt.
Figure 3: An example of a system prompt.
Figure 4: Example of a user prompt for summarization.
Figure 5: An illustrative example showing the LLM how to respond.
Figure 6: Output specifications related to the model parameters and the output table.
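For readers curious what model parameters like these look like on the wire, here is a hedged sketch of a typical Azure OpenAI chat-completions request body. `temperature` and `max_tokens` are standard fields of the chat-completions API; the specific values and prompt text below are illustrative, not the step's defaults.

```python
import json

# Illustrative request body for a summarization task.
# Lower temperature -> more deterministic summaries;
# max_tokens caps the length of each generated response.
body = {
    "messages": [
        {"role": "system",
         "content": "Summarize the customer review in one sentence."},
        {"role": "user",
         "content": "Review: Great screen, weak speakers, decent price."},
    ],
    "temperature": 0.2,
    "max_tokens": 100,
}
payload = json.dumps(body)  # serialized JSON sent with the request
```

The custom step surfaces these knobs in the UI, so tuning them is a matter of changing a field rather than editing a payload by hand.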
Figure 7: Example of required details about your model needed for successful execution.
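The connection details the step asks for map directly onto how Azure OpenAI addresses a deployed model: a resource endpoint, a deployment name, and an API version. The sketch below shows that mapping; the endpoint, deployment name, and API key are placeholders, and the API version shown is just one of the published versions.

```python
# Placeholder connection details (supply your own values):
endpoint = "https://my-resource.openai.azure.com"  # your Azure OpenAI resource
deployment = "gpt-4o-mini"                         # your model deployment name
api_version = "2024-02-01"                         # an available API version

# Azure OpenAI routes chat-completions requests per deployment:
url = (f"{endpoint}/openai/deployments/{deployment}"
       f"/chat/completions?api-version={api_version}")
headers = {"api-key": "<YOUR-API-KEY>",            # never hard-code real keys
           "Content-Type": "application/json"}
```

If any of these values are wrong, the request fails before the model ever sees your prompt, so double-checking them is the first troubleshooting step.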
This custom step makes it easy to integrate Azure OpenAI’s language models into your SAS Studio flows. Once configured, you can start generating your own LLM-driven insights based on your text data. Enhancements are on the way to provide even more flexibility in how you interact with LLMs, so stay tuned for future updates! Feel free to contact me with any questions or comments!