I have read many articles about this, but every time I stumbled over what it actually means intuitively.
So, here is my attempt to explain it in layman's terms.
Low bias means low error on your training data. In other words, your model is aligned almost perfectly with your training data. Low bias combined with high variance means the model has memorized the noise in the training data. #overfitting
Low variance means your model's predictions stay stable when the data changes, so it generalizes well. Low variance combined with high bias means the model is too simple to capture the pattern at all. #underfitting
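A quick sketch of the two failure modes, using plain NumPy (the toy dataset and polynomial degrees are made up for illustration): a straight line is too rigid for a curved signal (high bias), while a very high-degree polynomial bends through almost every training point (low bias on training data, but high variance on anything new).

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth curve.
x_train = np.linspace(0, 3, 20)
y_train = np.sin(x_train) + rng.normal(0, 0.2, x_train.size)

def train_mse(degree):
    # Fit a polynomial of the given degree and measure error
    # on the SAME data it was trained on.
    coeffs = np.polyfit(x_train, y_train, degree)
    pred = np.polyval(coeffs, x_train)
    return np.mean((pred - y_train) ** 2)

# Degree 1 (a line) is too rigid: high bias, high training error.
# Degree 9 can chase the noise: very low training error (low bias),
# but its predictions swing wildly on unseen inputs (high variance).
print(train_mse(1), train_mse(9))
```

The higher-degree fit will always show the lower training error here, which is exactly why training error alone cannot tell you the model is good.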
The ideal condition would be low bias and low variance, which we mostly will not get. So, we need to trade off in such a way that we keep driving towards that ideal goal.
Solution: since we don't have much hold on the testing/unseen data, we can instead add a penalty while training the model to improve generalization. Adding this penalty is called regularisation, e.g. the ridge and lasso techniques.
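A minimal sketch of the ridge penalty in plain NumPy (the toy data is made up; in practice you would use a library such as scikit-learn). Ridge adds a penalty proportional to the squared size of the weights to the training loss, which leads to the closed-form solution below; increasing the penalty strength lambda shrinks the weights towards zero, trading a little bias for lower variance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data: 30 samples, 5 features.
X = rng.normal(size=(30, 5))
w_true = np.array([2.0, -1.0, 0.5, 0.0, 3.0])
y = X @ w_true + rng.normal(0, 0.1, 30)

def ridge(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^-1 X^T y.
    # The lam * I term is the penalty that discourages large weights;
    # lam = 0 reduces this to ordinary least squares.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Larger lambda -> stronger penalty -> smaller weights.
for lam in (0.0, 1.0, 100.0):
    print(lam, np.linalg.norm(ridge(X, y, lam)))
```

Lasso works the same way in spirit but penalizes the absolute size of the weights instead, which can push some weights exactly to zero.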