
Neural Network Models: Supervised Learning in SAS Visual Data Mining and Machine Learning


Neural Network Models (PROC NNET)

In a previous post, I summarized the supervised learning regression models. In this post, I'll explore neural network models.

 

Artificial neural networks attempt to mimic the human brain. Neural networks are universal approximators, meaning they can model virtually any input-output relationship. They are composed of processing elements, commonly called units or neurons.

 

Examples of neural networks include multilayer perceptrons (MLPs) and radial basis function (RBF) networks. A multilayer perceptron can be thought of as a regression of the target on the hidden units, where the hidden units are themselves regressions on the original inputs.
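
To make that concrete for a single hidden layer (a minimal sketch; the notation is mine, not SAS's): each hidden unit is a regression on the inputs passed through an activation function, and the prediction is a regression on the hidden units,

\[ h_j = \tanh\Big(b_j + \sum_i w_{ij} x_i\Big), \qquad \hat{y} = g\Big(c + \sum_j v_j h_j\Big), \]

where the w's and v's are connection weights, the b's and c are biases, and g is the output activation (for example, the identity for an interval target or the softmax for a nominal one).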

 

[Figure: MLP with two hidden layers]

 

As the name “multilayer” implies, there are multiple layers. Each hidden unit applies an activation function (the neural-network analogue of a link function), often the hyperbolic tangent. The connection weights are tuned based on “experience”: repeated training iterations in which the network's outputs are compared against the correct values and the weights are adjusted accordingly.

 

SAS PROC NNET trains a multilayer perceptron neural network, which requires minimizing a nonlinear objective function. Finding a global minimum is impractical, but a good solution can usually be found by training the neural network repeatedly from different sets of initial weight values. The optimization algorithms available in PROC NNET are the limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) algorithm and stochastic gradient descent (SGD). Currently, up to 10 hidden layers are allowed. An autoencoder, in which the inputs rather than a target are used as the output layer, can be trained as well.
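
For concreteness, here is a minimal sketch of a PROC NNET call. The CAS table mycas.train and the variable names are hypothetical, and the option spellings should be verified against the documentation for your release:

proc nnet data=mycas.train;
   input x1 x2 x3 / level=int;                 /* interval inputs (hypothetical variables) */
   target y / level=nom;                       /* nominal (classification) target */
   hidden 10;                                  /* first hidden layer: 10 units */
   hidden 5;                                   /* second hidden layer: 5 units */
   optimization algorithm=lbfgs maxiter=300;   /* LBFGS optimizer */
   train outmodel=mycas.nnet_model numtries=5 seed=12345;  /* 5 random restarts */
run;

The numtries= option requests repeated training from different random initial weights, which is the usual guard against poor local minima mentioned above.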

 

Hyperparameter tuning is available in PROC NNET to find the best values for the number of hidden layers, the number of hidden units in each layer, the L1 and L2 regularization parameters, the annealing rate (for SGD optimization), and the learning rate (for SGD optimization). Several objective functions can be chosen for the tuning process, as can several search methods, including one based on a genetic algorithm.
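
A hedged sketch of requesting autotuning follows (PROC NNET provides an AUTOTUNE statement, but the suboption names below are my assumptions and should be checked against the documentation):

proc nnet data=mycas.train;
   input x1 x2 x3 / level=int;
   target y / level=nom;
   /* assumed suboptions: searchmethod=ga picks the genetic-algorithm search; */
   /* the listed tuning parameters match those described above                */
   autotune searchmethod=ga
            tuningparameters=(nhidden nunits regl1 regl2 learningrate annealingrate);
   train outmodel=mycas.nnet_tuned;
run;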
