*Bounty: 200*

Why does *“the asymptotic nature of logistic regression”* make it particularly prone to overfitting? (source)

I understand that **LogLoss** (cross entropy) grows quickly as the predicted probability $y'$ approaches the opposite of the true label $y$, i.e. as $y' \to 1 - y$:
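For reference, by LogLoss I mean the standard cross-entropy for a binary label $y \in \{0,1\}$ and predicted probability $y' \in (0,1)$:

$$\text{LogLoss} = -\,y \log y' - (1-y)\log(1-y'),$$

which is unbounded: it diverges to $+\infty$ as $y' \to 1-y$, and only reaches $0$ in the limit $y' \to y$.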

but **why** does that imply that *“the asymptotic nature of logistic regression keep[s] driving loss towards 0 in high dimensions without regularization”*?

In my mind, just because the loss can grow quickly (when a prediction gets very close to the exact opposite of the true label), that doesn’t mean the model would therefore try to fully interpolate the training data.