Hi there community, Happy New Year!
Here are a couple of deep learning-related tutorials to get you going.
I think you’ll enjoy the explanation of dropout in deep learning by @RobertBlanchard in this tutorial. He tells a story about a “nose neuron” during training. “Dropout forces neurons in your model to become more generalists as opposed to specialists,” he explains.
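The video has the full explanation, but as a rough illustration (my own sketch, not from the tutorial, and in Python rather than SAS), the standard “inverted dropout” trick zeroes each unit with probability p during training and scales the survivors so the expected activation stays the same:

```python
import numpy as np

def dropout(x, p, rng, training=True):
    """Inverted dropout: zero each unit with probability p and scale
    the survivors by 1/(1-p) so expected activations are unchanged.
    At inference (training=False) the layer is a no-op."""
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(0)
acts = np.ones((4, 8))           # toy activations
out = dropout(acts, p=0.5, rng=rng)
# each unit is either dropped (0.0) or kept and rescaled (2.0)
```

Because any unit can vanish on a given pass, no single neuron (the “nose neuron” of the story) can be solely responsible for a prediction.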
@RobertBlanchard then tells us how to use batch normalization in a deep learning model. Batch normalization is typically used to solve – or at least mitigate – the internal covariate shift problem. Watch to learn more.
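For intuition before you watch (again, my own NumPy sketch rather than anything from the video), batch normalization standardizes each feature across the mini-batch and then applies a learnable scale (gamma) and shift (beta):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Training-mode batch normalization: normalize each feature over
    the batch dimension, then apply learnable scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))  # shifted, scaled inputs
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
# each column of y now has (approximately) zero mean and unit variance
```

By keeping each layer’s inputs on a stable scale from batch to batch, this reduces the internal covariate shift the tutorial discusses.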
(Comments are closed on this message, but you can visit YouTube and leave a comment on the video. Subscribe to the SAS Users YouTube channel to get more like it!)
Anna