Both weight decay (a regularization technique that adds a penalty on the weights to the cost function to discourage overfitting) and early stopping (halting training when performance on a validation set stops improving) are general machine-learning concepts, not unique to neural networks.
Many practitioners use both, and there is no clear evidence that one consistently works better than the other. Neural networks overfit data very easily, so I would recommend using both.
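To make the two ideas concrete, here is a minimal sketch of both techniques applied together in plain NumPy, on a toy linear regression rather than a neural network (the mechanics are the same). The hyperparameter values (`lr`, `weight_decay`, `patience`) are illustrative choices, not recommendations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise, split into train/validation sets
X = rng.normal(size=(100, 1))
y = 2 * X[:, 0] + rng.normal(scale=0.1, size=100)
X_tr, y_tr, X_va, y_va = X[:80], y[:80], X[80:], y[80:]

def mse(w, X, y):
    return np.mean((X @ w - y) ** 2)

w = np.zeros(1)
lr, weight_decay = 0.1, 1e-3          # illustrative hyperparameters
best_w, best_val = w.copy(), np.inf
patience, wait = 10, 0                # early-stopping patience

for epoch in range(1000):
    # Weight decay: gradient of MSE plus an L2 penalty on the weights
    grad = 2 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr) + 2 * weight_decay * w
    w -= lr * grad

    # Early stopping: track the best validation loss seen so far
    val = mse(w, X_va, y_va)
    if val < best_val - 1e-6:         # validation improved; keep this model
        best_val, best_w, wait = val, w.copy(), 0
    else:                             # stale epoch; stop after `patience` of them
        wait += 1
        if wait >= patience:
            break

print(best_w)
```

The model restored at the end is the one that performed best on validation data (`best_w`), not the final-epoch weights; that restoration step is what makes early stopping a regularizer rather than just a time-saver.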
Aurora Peddycord-Liu