#StackBounty: #logistic #regularization #overfitting Why is logistic regression particularly prone to overfitting?

Bounty: 200

Why does “the asymptotic nature of logistic regression” make it particularly prone to overfitting? (source)

I understand that log loss (cross-entropy), $-[y \log y' + (1-y)\log(1-y')]$, grows without bound as the predicted probability $y'$ approaches $1-y$, the opposite of the true label $y$.
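For concreteness, a minimal sketch in plain Python evaluating that single-example loss: for a positive example ($y = 1$), the loss blows up as the predicted probability approaches 0.

```python
import math

def log_loss(y, y_pred):
    """Cross-entropy for one example with true label y and predicted probability y_pred."""
    return -(y * math.log(y_pred) + (1 - y) * math.log(1 - y_pred))

# For a true label y = 1, the loss grows without bound as y' -> 0:
for y_pred in (0.5, 0.1, 0.01, 1e-6):
    print(f"y' = {y_pred:<8}  loss = {log_loss(1, y_pred):.3f}")
```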

But why does that imply that “the asymptotic nature of logistic regression keep[s] driving loss towards 0 in high dimensions without regularization”?

In my mind, just because the loss can grow quickly when a prediction is confidently wrong, it doesn’t follow that the model would therefore try to fully interpolate the training data.
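The behavior the quoted claim describes can be seen in a minimal sketch (plain NumPy; the tiny linearly separable 2-D dataset and the batch-gradient-descent setup are my own assumptions, not from the original post): on separable data, unregularized logistic regression keeps shrinking the loss toward 0 by inflating the weight norm.

```python
import numpy as np

# A tiny, deliberately separable 2-D dataset (the sign of the first feature decides the class).
X = np.array([[-3.0, 0.0], [-2.0, 1.0], [-1.0, -1.0],
              [1.0, 0.5], [2.0, -0.5], [3.0, 1.0]])
y = np.array([0, 0, 0, 1, 1, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)   # no intercept, no regularization
lr = 0.5

for step in range(1, 20001):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / len(y)   # gradient of the mean log loss, with no penalty term
    w -= lr * grad
    if step % 5000 == 0:
        # numerically stable mean log loss: log(1 + exp(-s * z)) with s = +/-1
        loss = np.mean(np.logaddexp(0.0, -(2 * y - 1) * (X @ w)))
        print(f"step {step:6d}  ||w|| = {np.linalg.norm(w):6.2f}  loss = {loss:.2e}")
```

The loss never settles at a positive value: because the data are separable, scaling $w$ up always reduces it further, so the weight norm keeps growing and the predicted probabilities are pushed asymptotically toward 0 and 1. That runaway scaling is exactly what a regularization penalty would stop.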

