Watch this Ask the Expert session to learn how SAS Event Stream Processing supports ONNX format models for inferencing using ONNX Runtime technology. We’ll cover how data scientists can integrate trained ONNX format models, developed in Python, for streaming analytics on the edge and in the cloud.
Watch the webinar
You will learn:
How to take advantage of models like neural nets for computer vision use cases including object detection and classification.
How the integration of ONNX Runtime with SAS Event Stream Processing combines streaming analytics with simplified testing and deployment on different GPU and CPU accelerated hardware.
How you can deploy ONNX format models to gain maximum benefit using sample projects provided through GitHub examples.
The questions from the Q&A segment held at the end of the webinar are listed below, and the slides from the webinar are attached.
Q&A
Why are pre- and post- processing steps required?
The pre-processing step may be required to standardize the input into the format the model expects; it typically includes image decoding, data normalization, and tensor conversion. Similarly, post-processing may be needed to transform the model output into actionable information (e.g., bounding boxes for object detection). The exact steps depend on how the model was coded, so some models require more complex processing than others. Techniques are emerging, still at an early stage of development, to move most of the pre- and post-processing code inside the model itself, which could reduce or eliminate the need for these external steps in the future.
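As an illustration of the pre- and post-processing steps described above, here is a minimal sketch in Python using only NumPy. It assumes a hypothetical ImageNet-style classifier that expects a normalized NCHW float tensor; the mean/std values and tensor layout are common conventions, not specifics of any SAS ESP sample.

```python
import numpy as np

def preprocess(image_hwc,
               mean=(0.485, 0.456, 0.406),
               std=(0.229, 0.224, 0.225)):
    """Turn a decoded H x W x 3 uint8 image into a model-ready tensor."""
    x = image_hwc.astype(np.float32) / 255.0   # scale pixels to [0, 1]
    x = (x - mean) / std                       # per-channel normalization
    x = np.transpose(x, (2, 0, 1))             # HWC -> CHW tensor layout
    return np.expand_dims(x, 0)                # add batch dimension: NCHW

def postprocess(logits):
    """Turn raw model output into actionable information: class id + score."""
    e = np.exp(logits - logits.max())          # numerically stable softmax
    probs = e / e.sum()
    return int(np.argmax(probs)), float(probs.max())
```

For an object-detection model the post-processing stage would instead decode bounding-box coordinates and apply confidence filtering, but the structure is the same: raw tensors in, actionable results out.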
Where can I find examples so I can use this in my company?
ONNX Runtime and ESP integration samples are available on GitHub at this link: https://github.com/sassoftware/iot-sas-esp-onnx-runtime. These examples will be updated over time to include additional models. Feel free to open a GitHub issue if you need support or would like to request that an additional model be covered.
Are all ONNX Runtime execution providers supported?
For now, we decided to support only the most relevant ones: CUDA, TensorRT, and OpenVINO. However, we are interested in hearing about your specific requirements and use cases that might involve additional execution providers.
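To make the execution-provider choice concrete, here is a hedged sketch of how one might select providers in priority order with the ONNX Runtime Python API. The provider names are ONNX Runtime's standard registry names; which ones are actually available depends on the installed onnxruntime build and the hardware present. The `select_providers` helper is illustrative, not part of any SAS ESP sample.

```python
# Preferred execution providers, in priority order.
PREFERRED = [
    "TensorrtExecutionProvider",   # NVIDIA TensorRT (GPU)
    "CUDAExecutionProvider",       # NVIDIA CUDA (GPU)
    "OpenVINOExecutionProvider",   # Intel OpenVINO (CPU/iGPU/VPU)
    "CPUExecutionProvider",        # default fallback, always built in
]

def select_providers(available):
    """Return the preferred providers that are actually available,
    falling back to the CPU provider if none match."""
    chosen = [p for p in PREFERRED if p in available]
    return chosen or ["CPUExecutionProvider"]

# Usage (requires the onnxruntime package and a model file):
# import onnxruntime as ort
# session = ort.InferenceSession(
#     "model.onnx",
#     providers=select_providers(ort.get_available_providers()))
```

ONNX Runtime tries the providers in the order given and falls through to the next one for any operator a provider cannot handle, which is what makes the same model deployable across different GPU- and CPU-accelerated hardware.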
Recommended Resources
Operationalize ONNX Models with SAS ESP
ONNX Integration with Microsoft Blog Post
SAS Event Stream Processing Page (product and solution briefs)
Monthly SAS Viya Release Highlights
Want more tips? Be sure to subscribe to the Ask the Expert board to receive follow-up Q&A, slides and recordings from other SAS Ask the Expert webinars.