Bias Variance tradeoff.

I have read many articles about this, and every time I stumble on what it actually means intuitively.

So, here is my attempt to explain it in layman's terms.

Low bias means your model has low error on the training data. In other words, the model is almost perfectly aligned with the training data; pushed too far on its own, this is #overfitting.

Low variance means the model's error on testing (unseen) data stays close to its training error, i.e. the model has been generalised. Chasing only this with an overly simple model is #underfitting.

The ideal condition would be low bias and low variance, which we will mostly not get. So we need to trade off between the two in a way that drives us towards that ideal goal.
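
Here is a minimal sketch (my own, not part of the original note) of how that tradeoff shows up in practice. The noisy sine data, the polynomial degrees 1 and 15, and the scikit-learn models are all illustrative assumptions: degree 1 is too simple (high bias, underfits), degree 15 fits the training points almost perfectly (low bias) but its test error blows up (high variance, overfits).

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 1, 60)).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + rng.normal(0, 0.2, 60)  # noisy sine data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for degree in (1, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    # degree=1: high bias, both errors stay high (underfitting)
    # degree=15: very low training error, but much larger test error (overfitting)
    print(f"degree={degree}: train MSE={train_err:.3f}, test MSE={test_err:.3f}")
```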

Solution: since we don't have much hold on the testing/unseen data, we can improve generalisation by adding a penalty while training the model. Adding this penalty is called regularisation, e.g. the ridge and lasso techniques.
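
A minimal sketch of that idea, assuming scikit-learn's `Ridge` (L2 penalty) and `Lasso` (L1 penalty); the synthetic dataset and the `alpha=1.0` penalty strength are illustrative assumptions, and in practice alpha would be tuned (e.g. by cross-validation):

```python
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic data with many features but few informative ones,
# so an unpenalised fit tends to overfit the noise.
X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [("plain", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),   # L2 penalty shrinks coefficients
                    ("lasso", Lasso(alpha=1.0))]:  # L1 penalty can zero some out
    model.fit(X_train, y_train)
    print(name, "test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```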
