Discovering Data

#100DaysOfDataScience

Day 18 - bias-variance trade-off

7/23/2018

The following is an amalgamation of several different sources.

Models are often approximations: they simplify reality, and the error introduced by those simplifying assumptions is bias. Some algorithms, such as decision trees, have low bias, while linear regression has high bias. Higher bias enables faster learning, but it can result in underfitting.
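A quick toy sketch of underfitting (my own illustration, not from any of the sources): fit a straight line, a high-bias model, to data that is actually quadratic, and look at the error on the training data itself.

```python
import numpy as np

# Illustrative example: data generated from a quadratic curve.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100)
y = x**2 + rng.normal(scale=0.1, size=x.size)

# A degree-1 fit is too simple for this data: that rigidity is bias.
coeffs = np.polyfit(x, y, deg=1)
pred = np.polyval(coeffs, x)
train_mse = np.mean((y - pred) ** 2)

# The line fits poorly even on the data it was trained on: underfitting.
print(round(train_mse, 2))
```

No amount of extra training data fixes this, because the model family simply cannot represent the curve.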

There will also often be some error from sensitivity to fluctuations in the training data; this is variance. Again, some algorithms show high variance while others show less: for example, KNN has high variance while LDA has low variance. High variance results in overfitting.
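Variance can be seen directly by refitting a model on many fresh training samples and watching how much its prediction at one point jumps around. A toy sketch (again my own, with a flexible high-degree polynomial standing in for any high-variance model like 1-NN):

```python
import numpy as np

# A very flexible model tracks noise in each training sample,
# so its predictions swing between samples: that spread is variance.
rng = np.random.default_rng(1)

def fit_and_predict(deg, x_query):
    # Draw a fresh noisy training sample from the same underlying line.
    x = np.linspace(0, 1, 15)
    y = 2 * x + rng.normal(scale=0.3, size=x.size)
    coeffs = np.polyfit(x, y, deg)
    return np.polyval(coeffs, x_query)

# Predict at one point with 200 resampled training sets each.
preds_simple = [fit_and_predict(1, 0.5) for _ in range(200)]
preds_flexible = [fit_and_predict(12, 0.5) for _ in range(200)]

# The flexible model's predictions are far more spread out.
print(np.std(preds_simple) < np.std(preds_flexible))
```

The degree-12 model is chasing the noise, which is exactly what overfitting means.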

The goal in ML is to get both low bias and low variance. However:

  • Increasing the bias will decrease the variance.
  • Increasing the variance will decrease the bias.
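The trade-off can be estimated numerically. In this sketch (my own, under the assumption that the true function is a sine curve) I fit polynomials of increasing degree on many resampled training sets and estimate bias² and variance at a single query point:

```python
import numpy as np

rng = np.random.default_rng(2)

def true_f(x):
    return np.sin(2 * np.pi * x)

x_query = 0.3
results = {}
for deg in (1, 3, 9):
    preds = []
    for _ in range(300):
        # Fresh noisy training sample each trial.
        x = rng.uniform(0, 1, 20)
        y = true_f(x) + rng.normal(scale=0.2, size=x.size)
        preds.append(np.polyval(np.polyfit(x, y, deg), x_query))
    preds = np.array(preds)
    bias_sq = (preds.mean() - true_f(x_query)) ** 2
    variance = preds.var()
    results[deg] = (bias_sq, variance)

# Moving from degree 1 to degree 9, bias^2 falls while variance rises.
for deg, (b, v) in results.items():
    print(deg, round(b, 4), round(v, 4))
```

The total expected error is (roughly) bias² + variance + irreducible noise, so the best model complexity sits somewhere in the middle.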

The bias-variance trade-off is a central problem in supervised learning. I think the moral of this story is: always beware of anyone claiming they have a model that is always 100% accurate.



