Hi,
Is it possible to automatically vary the number of hidden units and layers to optimize neural network training?
In my tests, I have to do this manually, so I would like a better way to do it.
Thanks
Regards,
Bryan Silveira
Hey Bryan - simultaneously training a model and tuning its hyperparameters (in this case, the architecture of the network) is not really a good approach. The optimizer adjusts the model parameters (the weights) based on gradients of the loss function with respect to those weights; completely changing the mathematical operations that produced that loss function mid-training would thwart those updates. Unless the hyperparameter adjustment is built into the training process itself, it will not work.
That being said, I think what you are really getting at is the ability to train a model for a given architecture, make adjustments to the architecture, train again, make more adjustments, train again, and so on, all automatically and intelligently. Hyperparameter tuning is a big area of research, and we have an implementation built into our modeling procs in SAS Viya (e.g., PROC NNET). For 9.4 (EM), you would have to do this in SAS code and write a macro to loop over values for the number of layers and neurons. In our Viya implementation, Autotune (see video here), we use optimization strategies to search the space much more effectively than a plain grid search.
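To make the "loop over values of the number of layers and neurons" idea concrete, here is a minimal sketch in Python of that outer tuning loop. It is not SAS macro code: `train_and_score` is a hypothetical stand-in for one full training run (in 9.4 that would be a call to your training proc inside a macro `%DO` loop), and the toy error formula exists only so the example runs on its own.

```python
from itertools import product

def train_and_score(n_layers, n_units):
    """Hypothetical stand-in for one complete training run of a network
    with the given architecture, returning its validation error.
    Toy surrogate: pretends error is minimized at 2 layers x 32 units."""
    return abs(n_layers - 2) + abs(n_units - 32) / 32.0

def grid_search(layer_grid, unit_grid):
    """Train one model per (layers, units) pair and keep the best.

    Each candidate architecture gets its own full training run; the
    architecture is never changed in the middle of a run.
    """
    best = None
    for n_layers, n_units in product(layer_grid, unit_grid):
        err = train_and_score(n_layers, n_units)
        if best is None or err < best[0]:
            best = (err, n_layers, n_units)
    return best

best_err, best_layers, best_units = grid_search([1, 2, 3], [8, 16, 32, 64])
```

Note that this exhaustive grid grows multiplicatively with each grid you add, which is exactly why smarter search strategies (like those in Autotune) pay off as the hyperparameter space gets larger.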