#StackBounty: #machine-learning #loss-functions #training-error Learning curve vs training (loss) curve?

Bounty: 50

In machine learning, there are two commonly used plots to identify overfitting.

One is the learning curve, which plots the training and test error (y-axis) against the training-set size (x-axis).

The other is the training (loss/error) curve, which plots the training and test error (y-axis) against the number of iterations/epochs of a single model (x-axis).

Why do we need both curves? Specifically, what does a learning curve tell us that a training curve does not? (If the goal is just to detect whether a model overfits, the training curve seems much cheaper to produce.)
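To make the distinction concrete, here is a minimal sketch of how each curve is typically computed, using scikit-learn on a synthetic dataset. The estimator, dataset, and epoch count are illustrative choices, not anything prescribed by the question: the learning curve retrains the model from scratch at each training-set size, while the training curve tracks one model across epochs.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import learning_curve, train_test_split

X, y = make_classification(n_samples=1000, random_state=0)

# Learning curve: error vs. training-set size.
# A fresh model is fit (with cross-validation) at each size.
sizes, train_scores, test_scores = learning_curve(
    SGDClassifier(random_state=0), X, y,
    train_sizes=np.linspace(0.1, 1.0, 5), cv=5)
lc_train_err = 1 - train_scores.mean(axis=1)
lc_test_err = 1 - test_scores.mean(axis=1)

# Training curve: error vs. epochs for ONE model on a fixed split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SGDClassifier(random_state=0)
classes = np.unique(y)
tc_train_err, tc_test_err = [], []
for epoch in range(20):
    clf.partial_fit(X_tr, y_tr, classes=classes)  # one pass over the data
    tc_train_err.append(1 - clf.score(X_tr, y_tr))
    tc_test_err.append(1 - clf.score(X_te, y_te))
```

Note the cost difference the question alludes to: the training curve comes almost for free as a by-product of fitting one model, whereas the learning curve requires retraining at every size (times the number of CV folds).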

