Jph1030
Calcite | Level 5
Hello!

This probably sounds fairly simple, but I am brand new to this. Does anyone have tips on how to reduce the misclassification rate when using a Neural Network node? Thank you!
2 REPLIES 2
ger15xxhcker
Quartz | Level 8
There are several ways to reduce the misclassification rate when using a neural network node in SAS Enterprise Miner, including:

1. Increase the number of layers and nodes in the neural network.
2. Increase the amount of training data or reduce overfitting.
3. Perform feature selection or scaling of features.
4. Add regularization to the neural network architecture to reduce overfitting (see the sketch after this list).
5. Adjust learning parameters such as the learning rate.
6. Experiment with different activation functions.
7. Increase or decrease the number of iterations used to train the model.
8. Try a different optimizer, such as Adam.
9. Try ensemble methods such as bagging or boosting.
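
In Enterprise Miner most of these are node properties rather than code, but as a rough illustration of items 3–5 (input scaling, regularization, and the learning rate), here is a minimal Python/scikit-learn sketch; the synthetic data, network size, and parameter values are assumptions for demonstration only, not tuned recommendations:

```python
# Minimal sketch: scaled inputs, L2 regularization, explicit learning rate.
# The data and all parameter values below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Stand-in for the modeling table: binary target, 20 inputs.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

model = make_pipeline(
    StandardScaler(),                            # item 3: scale the inputs
    MLPClassifier(hidden_layer_sizes=(20, 10),   # item 1: layers and neurons
                  alpha=1e-3,                    # item 4: L2 regularization
                  learning_rate_init=0.001,      # item 5: learning rate
                  max_iter=500,                  # item 7: training iterations
                  early_stopping=True,           # stop when validation stalls
                  random_state=42))
model.fit(X_train, y_train)

# Misclassification rate = 1 - accuracy, on training and validation data.
print("Train misclassification:", 1 - model.score(X_train, y_train))
print("Valid misclassification:", 1 - model.score(X_valid, y_valid))
```

When comparing settings, it is the validation (or test) misclassification rate that should drive the choice, not the training one.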
sbxkoenk
SAS Super FREQ

Hello,

 

It's better to give a bit more info!

That makes it easier for us to give a focused answer.

 

  • Do you have a binary target or a multi-class (multinomial) target variable?
  • Is the event of interest a rare level?
  • Do you have an abundance of observations such that you can do data splitting? (there are alternatives if not)
  • What's the dimensionality in your input space (number of x-variables)? Ever heard about dimensionality reduction or feature engineering?
  • Are you running into overfitting problems (like good results on TRAIN data, but poor results on VALID data)? A quick check is sketched below this list.
  • ...
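
For a quick illustration of the last two points (data splitting and the TRAIN-versus-VALID gap), here is a minimal Python/scikit-learn sketch; the synthetic data, event rate, and regularization values are assumptions for demonstration:

```python
# Minimal sketch of the TRAIN/VALID overfitting check and the rare-event caveat.
# Data, event rate, and parameter values are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Imbalanced binary target: roughly 10% events.
X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

for alpha in (1e-5, 1e-2):  # weak vs. stronger L2 regularization
    nn = MLPClassifier(hidden_layer_sizes=(100,), alpha=alpha,
                       max_iter=1000, random_state=0).fit(X_tr, y_tr)
    print(f"alpha={alpha}: "
          f"train misclass = {1 - nn.score(X_tr, y_tr):.3f}, "
          f"valid misclass = {1 - nn.score(X_va, y_va):.3f}")

# With a rare event, also compare against the trivial baseline:
# always predicting the non-event already yields a low misclassification rate.
print("Majority-class baseline misclass:", round(y_va.mean(), 3))
```

A large gap between the TRAIN and VALID rates points to overfitting; a VALID rate that barely beats the majority-class baseline suggests the misclassification rate alone is not telling the whole story for a rare event.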

Koen

