
Bias‑Variance Tradeoff Explained

Key Points

  • The speaker illustrates underfitting and overfitting with simple graphs, showing that too few training epochs leave the model unable to capture the data, while too many epochs cause it to memorize every point.
  • Bias is described as the systematic error between predictions and true values; high bias oversimplifies the data and leads to underfitting.
  • Variance is the variability of predictions across the dataset; high variance causes the model to memorize training points and results in overfitting.
  • A model with high bias and low variance underfits, whereas a model with high variance and low bias overfits, so the goal is to achieve both low bias and low variance.
  • The bias‑variance trade‑off is visualized as total error versus model complexity, where increasing complexity reduces bias but raises variance, and the optimal model balances the two to minimize overall error.
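The underfitting/overfitting behavior in the points above can be reproduced in a few lines. Below is a minimal NumPy sketch (toy sine data and the polynomial degrees are illustrative assumptions, not taken from the video): a degree-1 fit underfits (high bias, large error on both train and test data), while a very high-degree fit nearly memorizes the training points (high variance, tiny training error).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: noisy samples of a smooth underlying function.
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0.025, 0.975, 20)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, x_test.size)

def poly_errors(degree):
    """Fit a polynomial of the given degree; return (train_mse, test_mse)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 15):
    train_mse, test_mse = poly_errors(degree)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

Running this, degree 1 shows large error everywhere (underfit), degree 15 shows near-zero training error but worse test error (overfit), and an intermediate degree tends to do best on the held-out points.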

Full Transcript

# Bias‑Variance Tradeoff Explained

**Source:** [https://www.youtube.com/watch?v=tUs0fFo7ki8](https://www.youtube.com/watch?v=tUs0fFo7ki8)
**Duration:** 00:04:34

## Sections

- [00:00:00](https://www.youtube.com/watch?v=tUs0fFo7ki8&t=0s) **Untitled Section**
- [00:03:07](https://www.youtube.com/watch?v=tUs0fFo7ki8&t=187s) **Finding the Ideal Model Complexity** - The speaker explains that increasing model complexity lowers bias but raises variance, and the optimal solution is to select a complexity that balances both to minimize total error and avoid over- or underfitting.

## Full Transcript
0:00 As a machine learning engineer, you may have experienced this dilemma.
0:04 You've cleaned and processed your data and now it's time to train your machine learning model.
0:09 Let's draw an example of what your graph might look like.
0:12 For a data set with these following points as an example, you're probably expecting a graph that looks like this.
0:22 However, after training your machine learning model, you find out that it looks like this.
0:28 So obviously the data is underfitting and the model wasn't able to learn the training data well enough.
0:36 So we can fix that by training the data for a longer amount of time.
0:40 However, now it looks like this.
0:44 It's fitting almost every single data point exactly.
0:47 So it looks like the model has learned the training data a little too well.
0:52 Well, why does this happen?
0:54 In a previous video in our channel, we talked about how overfitting and underfitting can affect machine learning models.
1:00 But let's dive deeper into the root cause of the problem, which is bias and variance.
1:05 So what do those terms mean?
1:08 Bias and variance are two types of error that can lead to underfitting or overfitting in machine learning models.
1:15 Let's talk about bias first.
1:17 Bias can be defined as the difference between the predicted values and the actual values, also known as the ground truth.
1:27 When the bias is high, the model fails to recognize patterns in the data and it starts to oversimplify the data.
1:36 When the data is oversimplified so much that it's not able to recognize patterns or complexities at all, we can call that underfitting.
1:46 Let's talk about variance next.
1:49 Variance can be defined as the variability in predictions for each value in the data set.
1:56 When the variance is high, the model basically memorizes all of the points in the training data set, instead of learning the overall complexity and patterns behind the data.
2:08 When this happens, we call that overfitting.
2:12 In short, a model with high bias and low variance will tend to underfit.
2:19 And on the other hand, a model with high variance and low bias will tend to overfit.
2:26 We don't want our graph to look like either one of these graphs.
2:30 Ideally, we want a model that has both low bias and low variance.
2:35 In other words, we want a model that is able to recognize complexities and patterns in the training data, but also on data that it hasn't seen before.
2:45 This is known as the bias-variance trade-off.
2:49 All right, let's take a closer look at a graph that you may have seen before that illustrates the bias-variance trade-off.
2:59 This is a graph that shows how the total amount of error changes as model complexity increases.
3:09 We can think of model complexity as a way to measure how well a model is able to recognize relationships and patterns in data.
3:21 We notice that as the model complexity increases, the total amount of bias decreases.
3:30 We also notice that as the model complexity increases, the amount of variance increases.
3:42 And as the variance and bias change, the total amount of error also changes.
3:52 So our overall goal is to minimize both the variance and bias such that we can get the lowest amount of error, and that will usually be in this sweet spot right here.
4:06 This is our ideal complexity.
4:14 So, in short, the best way to fix and prevent overfitting and underfitting is to find the ideal complexity in the model that allows you to reduce both variance and bias, while also reducing the total amount of error.
4:30 Thanks for watching and, as always, please remember to like and subscribe.
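The error-versus-complexity graph the speaker describes can also be estimated by simulation. The sketch below (a toy setup with assumed settings: a sine ground truth, 200 resampled training sets, polynomial models) trains the same model class on many datasets and measures bias² as the gap between the average prediction and the truth, and variance as the spread of predictions across datasets. As degree grows, bias² falls while variance rises, which is exactly the trade-off in the graph.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_fn(x):
    return np.sin(2 * np.pi * x)

x_eval = np.linspace(0.1, 0.9, 50)          # points where error is measured
n_datasets, n_points, noise = 200, 25, 0.3  # simulation settings (assumed)

def bias2_and_variance(degree):
    """Estimate bias^2 and variance of a polynomial model of the given degree
    by averaging its predictions over many resampled training sets."""
    preds = np.empty((n_datasets, x_eval.size))
    for i in range(n_datasets):
        x = rng.uniform(0, 1, n_points)
        y = true_fn(x) + rng.normal(0, noise, n_points)
        coeffs = np.polyfit(x, y, degree)
        preds[i] = np.polyval(coeffs, x_eval)
    mean_pred = preds.mean(axis=0)
    bias2 = np.mean((mean_pred - true_fn(x_eval)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias2, variance

for degree in (1, 3, 9):
    b2, var = bias2_and_variance(degree)
    print(f"degree {degree}: bias^2 {b2:.4f}, variance {var:.4f}, "
          f"total {b2 + var:.4f}")
```

The intermediate degree typically gives the lowest bias² + variance sum, which corresponds to the "sweet spot" of ideal complexity in the transcript.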