Problem

Whenever I train a neural network, I only run it for a few epochs (1 to 3). This is because I am training on a weak CPU, and running the network through many epochs would take a long time.

However, whenever my neural network performs poorly, rather than running it for more epochs, I try to optimize the hyperparameters instead (sketched below). This approach has generally worked for me, since my neural networks are fairly simple.
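For concreteness, here is a minimal sketch of the workflow described above: the epoch budget stays fixed at 3 while one hyperparameter (the learning rate, in this illustration) is swept. The dataset, grid values, and network size are all placeholder assumptions, not from the original post.

```python
# Minimal sketch (scikit-learn, synthetic data): tune a hyperparameter
# while keeping the epoch budget fixed at 3, as described above.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

best_lr, best_score = None, -1.0
for lr in (1e-4, 1e-3, 1e-2, 1e-1):  # hypothetical search grid
    clf = MLPClassifier(
        hidden_layer_sizes=(16,),
        learning_rate_init=lr,
        max_iter=3,        # for the 'adam'/'sgd' solvers this is the epoch count
        random_state=0,
    )
    clf.fit(X_train, y_train)  # will warn that it did not converge in 3 epochs
    score = clf.score(X_val, y_val)
    if score > best_score:
        best_lr, best_score = lr, score

print(f"best learning rate: {best_lr} (val accuracy {best_score:.3f})")
```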

But is training a neural network in this manner bad practice? Are there disadvantages to jumping straight to hyperparameter optimization rather than training the network for more epochs?
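One way to probe this question empirically is to look at the learning curve: if the training and validation losses are still falling at epoch 3, more epochs would likely help; if the validation loss has already flattened, hyperparameter tuning is the more promising lever. Below is a hedged sketch of that check using PyTorch with a synthetic toy problem; the data, architecture, and learning rate are illustrative assumptions only.

```python
# Minimal sketch (PyTorch, synthetic data): track per-epoch losses to see
# whether the model is still improving when training stops at epoch 3.
import torch
from torch import nn

torch.manual_seed(0)

# Hypothetical toy data: 1,000 samples, 20 features, binary labels.
X = torch.randn(1000, 20)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)
X_train, X_val = X[:800], X[800:]
y_train, y_val = y[:800], y[800:]

model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):  # a few more than 3, just to see the trend
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(X_train), y_train)  # full-batch step, for brevity
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val)
    print(f"epoch {epoch + 1}: train={loss.item():.4f}  val={val_loss.item():.4f}")
    # Still falling at epoch 3 -> more epochs would help.
    # Flat validation loss  -> tune hyperparameters instead.
```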

No correct solution

License: CC-BY-SA with attribution. Not affiliated with datascience.stackexchange.