# Lecture 25 – Regression and Least Squares¶

## DSC 10, Spring 2023¶

### Announcements¶

• Lab 7 is due tomorrow at 11:59 PM.
• It is true that your lowest lab is dropped.
• However, it's a bad idea to simply ignore this lab, because it's the only assignment on regression, which will be tested on the Final Exam.
• The Final Project is due on Tuesday 6/6 at 11:59 PM.
• Issues saving your Final Project notebook? Watch this video!
• The Final Exam is on Saturday, 6/10 from 7-10 PM. More details to come over the weekend, but start prepping by working through old Final Exams at practice.dsc10.com.
• Take them as if they're a real exam – time yourself, and don't use resources other than the reference sheet.
• The Grade Report has been updated – it reflects your scores on everything other than Lab 7, next week's discussion, the Final Project, and the Final Exam.
• Please fill out CAPEs! We will have an internal End-of-Quarter survey as well, to be released this weekend.
• The application to be a DSC 10 tutor in the fall has been released! The application can be found here, and more information can be found here.

### Agenda¶

• The regression line in standard units.
• The regression line in original units.
• Outliers.
• Errors in prediction.

## The regression line in standard units¶

### Example: Predicting heights 👪 📏¶

Recall, in the last lecture, we aimed to use a mother's height to predict her adult son's height.

### Correlation¶

Recall, the correlation coefficient $r$ of two variables $x$ and $y$ is defined as the

• average value of the
• product of $x$ and $y$
• when both are measured in standard units.
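This definition translates directly into code. A minimal sketch with NumPy (the arrays below are made up for illustration):

```python
import numpy as np

def standard_units(arr):
    # Convert an array to standard units: SDs above/below the mean.
    return (arr - arr.mean()) / arr.std()

def correlation(x, y):
    # r = average of the product of x and y, measured in standard units.
    return (standard_units(x) * standard_units(y)).mean()

# Hypothetical example data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # exactly linear in x
correlation(x, y)
```

Since this made-up data is exactly linear, $r$ comes out to exactly $1$.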

### The regression line¶

• The regression line is the line through $(0,0)$ with slope $r$, when both variables are measured in standard units.
• We use the regression line to make predictions!
• Example: If a mother's height is 0.5 SDs above the average mother's height, and $r = 0.32$, our prediction is that her son's height will be $0.5 \cdot 0.32 = 0.16$ SDs above the average son's height.
• Issue: To use this form of the regression line, we need to know mothers' heights in standard units, but it would be more convenient to think in terms of inches.

## The regression line in original units¶

### Reflection¶

Each time we wanted to predict the height of an adult son given the height of his mother, we had to:

1. Convert the mother's height from inches to standard units.
2. Multiply by the correlation coefficient to predict the son's height in standard units.
3. Convert the son's predicted height from standard units back to inches.

This is inconvenient – wouldn't it be great if we could express the regression line itself in inches?
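Concretely, the three-step process looks like this in code. The means, SDs, and $r$ below are hypothetical placeholder values, not the actual statistics of the heights dataset:

```python
# Hypothetical summary statistics -- placeholders, not the real dataset's values.
mother_mean, mother_sd = 64.0, 2.0   # mothers' heights, in inches
son_mean, son_sd = 69.0, 2.5         # sons' heights, in inches
r = 0.32

mother = 62.0                                   # mother's height in inches
mother_su = (mother - mother_mean) / mother_sd  # 1. convert to standard units
son_su = r * mother_su                          # 2. multiply by r
son = son_su * son_sd + son_mean                # 3. convert back to inches
son  # 68.2 inches, under these made-up numbers
```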

### From standard units to original units¶

When $x$ and $y$ are in standard units, the regression line is given by

$$\text{predicted } y_{\text{(su)}} = r \cdot x_{\text{(su)}}$$

What is the regression line when $x$ and $y$ are in their original units (e.g. inches)?

### The regression line in original units¶

• We can work backwards from the relationship $$\text{predicted } y_{\text{(su)}} = r \cdot x_{\text{(su)}}$$ to find the line in original units.
$$\frac{\text{predicted } y - \text{mean of }y}{\text{SD of }y} = r \cdot \frac{x - \text{mean of } x}{\text{SD of }x}$$
• Note that $r$, $\text{mean of } x$, $\text{mean of } y$, $\text{SD of } x$, and $\text{SD of } y$ are constants – if you have a DataFrame with two columns, you can determine all 5 values.
• Re-arranging the above equation into the form $\text{predicted } y = mx + b$ yields the formulas:
$$\boxed{m = r \cdot \frac{\text{SD of } y}{\text{SD of }x}, \: \: b = \text{mean of } y - m \cdot \text{mean of } x}$$
• $m$ is the slope of the regression line and $b$ is the intercept.

Let's implement these formulas in code and try them out.

Below, we compute the slope and intercept of the regression line between mothers' heights and sons' heights (in inches).
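The notebook's code cell isn't reproduced here, but the computation follows directly from the boxed formulas. A self-contained sketch (the short arrays are stand-ins for the real heights table, so they won't reproduce the numbers quoted below):

```python
import numpy as np

def standard_units(arr):
    return (arr - arr.mean()) / arr.std()

def correlation(x, y):
    return (standard_units(x) * standard_units(y)).mean()

def slope(x, y):
    # m = r * (SD of y) / (SD of x)
    return correlation(x, y) * y.std() / x.std()

def intercept(x, y):
    # b = mean of y - m * mean of x
    return y.mean() - slope(x, y) * x.mean()

# Stand-in data; the real notebook uses the full mother/son heights table.
mothers = np.array([60.0, 62.0, 63.0, 65.0, 67.0])
sons = np.array([66.0, 68.0, 67.0, 70.0, 71.0])
m, b = slope(mothers, sons), intercept(mothers, sons)
```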

So, the regression line is

$$\text{predicted son's height in inches} = 0.365 \cdot \text{mother's height in inches} + 45.858$$

### Making predictions¶

What's the predicted height of a son whose mother is 62 inches tall?

What if the mother is 55 inches tall? 73 inches tall?
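Plugging each height into the regression line above:

```python
def predict_son_height(mother_height):
    # Regression line from the text: predicted son = 0.365 * mother + 45.858.
    return 0.365 * mother_height + 45.858

predict_son_height(62)  # 68.488 inches
predict_son_height(55)  # 65.933 inches
predict_son_height(73)  # 72.503 inches
```

Notice that the predictions for unusually short and unusually tall mothers are pulled back toward the average son's height.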

## Outliers¶

### The effect of outliers on correlation¶

Consider the dataset below. What is the correlation between $x$ and $y$?

### Removing the outlier¶

Takeaway: Even a single outlier can have a massive impact on the correlation, and hence the regression line. Look for these before performing regression. Always visualize first!
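To see this numerically, here is a hypothetical dataset that is perfectly linear except for a single extreme point:

```python
import numpy as np

def correlation(x, y):
    x_su = (x - x.mean()) / x.std()
    y_su = (y - y.mean()) / y.std()
    return (x_su * y_su).mean()

x = np.arange(10.0)
y_clean = np.arange(10.0)           # perfectly linear: r = 1
x_out = np.append(x, 9.0)
y_out = np.append(y_clean, -50.0)   # one extreme outlier

correlation(x, y_clean)   # 1.0
correlation(x_out, y_out) # far from 1 -- the single outlier dominates
```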

## Errors in prediction¶

### Motivation¶

• We've presented the regression line in standard units as the line through the origin with slope $r$, given by $\text{predicted } y_{\text{(su)}} = r \cdot x_{\text{(su)}}$. Then, we used this equation to find a formula for the regression line in original units.
• In examples we've seen so far, the regression line seems to fit our data pretty well.
• But how well?
• What makes the regression line good?
• Would another line be better?

### Example: Without the outlier¶

We think our regression line is a good fit, because most data points fall close to it – the red error lines in the plot are quite short.

### Measuring the error in prediction¶

$$\text{error} = \text{actual value} - \text{prediction}$$
• A good prediction line is one where the errors tend to be small.
• To measure the rough size of the errors, for a particular set of predictions:
1. Square the errors so that they don't cancel each other out.
2. Take the mean of the squared errors.
3. Take the square root to fix the units.
• This is called the root mean squared error (RMSE).
• Notice the similarities to computing the SD!
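The three steps above translate into a short function. A sketch, with a tiny made-up example:

```python
import numpy as np

def rmse(actual, predicted):
    errors = actual - predicted      # error = actual value - prediction
    squared = errors ** 2            # 1. square so errors don't cancel out
    mean_squared = squared.mean()    # 2. mean of the squared errors
    return np.sqrt(mean_squared)     # 3. square root to fix the units

# Made-up actual values and predictions.
actual = np.array([66.0, 68.0, 70.0])
predicted = np.array([67.0, 68.0, 69.0])
rmse(actual, predicted)  # sqrt((1 + 0 + 1) / 3)
```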

### Root mean squared error (RMSE) of the regression line's predictions¶

First, let's compute the regression line's predictions for the entire dataset.

To find the RMSE, we need to start by finding the errors and squaring them.

Now, we need to find the mean of the squared errors, and take the square root of that. The result is the RMSE of the regression line's predictions.

The RMSE of the regression line's predictions is about 2.2. Is this big or small, relative to the predictions of other lines? 🤔

### Root mean squared error (RMSE) of an arbitrary line's predictions¶

• We've been using the regression line to make predictions. But we could use a different line!
• To make a prediction for x using an arbitrary line defined by slope and intercept, compute x * slope + intercept.
• For this dataset, if we choose a different line, we will end up with different predictions, and hence a different RMSE.

Let's compute the RMSEs of several different lines on the same dataset.
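A sketch of that comparison on a made-up dataset (the notebook's actual dataset and choice of lines will differ):

```python
import numpy as np

def rmse_of_line(x, y, slope, intercept):
    # Predictions from an arbitrary line, then the RMSE of its errors.
    predictions = slope * x + intercept
    return np.sqrt(((y - predictions) ** 2).mean())

# Stand-in dataset, roughly following y = 2x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Different lines give different RMSEs on the same data.
for slope, intercept in [(2.0, 0.0), (1.5, 1.0), (0.0, 6.0)]:
    print(slope, intercept, rmse_of_line(x, y, slope, intercept))
```

Lines that track the data closely produce small RMSEs; the flat line produces a much larger one.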

### Finding the "best" prediction line by minimizing RMSE¶

• RMSE describes how well a line fits the data. The lower the RMSE of a line is, the better it fits the data.
• There are infinitely many slopes and intercepts, and thus infinitely many RMSEs. How do we find the combination of slope and intercept with the lowest RMSE?
• If you take DSC 40A, you'll learn how to do this using calculus. For now, we'll use a function that can do it automatically – minimize.

### Aside: minimize¶

• The function minimize takes in a function as an argument, and returns the inputs to the function that produce the smallest output.
• For instance, we know that the minimizing input to the function $f(x) = (x - 5)^2 + 4$ is $x = 5$. minimize can find this, too:
• The minimize function uses calculus and intelligent trial-and-error to find these inputs; you don't need to know how it works under the hood.
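The course's minimize function is provided for you, so you won't implement it yourself. Purely to illustrate the trial-and-error idea, here is a toy one-dimensional minimizer (this is not how the real function is implemented):

```python
def f(x):
    return (x - 5) ** 2 + 4

def minimize_1d(f, start=0.0, step=0.1, iters=1000):
    # Toy stand-in for minimize: repeatedly nudge x downhill, using a
    # numerical estimate of the slope of f at x to pick the direction.
    x = start
    for _ in range(iters):
        slope = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
        x -= step * slope
    return x

minimize_1d(f)  # close to 5, the minimizing input
```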

### Finding the "best" prediction line by minimizing RMSE¶

We'll use minimize on rmse to find the slope and intercept of the line with the smallest RMSE.

Do these numbers look familiar?

### Coincidence?¶

The slopes and intercepts we got using both approaches look awfully similar... 👀

### The regression line is the best line!¶

• It turns out that the regression line we defined earlier minimizes the root mean squared error (RMSE) among all lines.
$$m = r \cdot \frac{\text{SD of } y}{\text{SD of }x}$$

$$b = \text{mean of } y - m \cdot \text{mean of } x$$
• It is the best line (in the sense of lowest RMSE), regardless of what our data looks like!
• All equivalent names:
• Line of “best fit”.
• Least squares line.
• Regression line.
• The technique of finding the slope and intercept that have the lowest RMSE is called the method of least squares.
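As an empirical sanity check of this claim, on made-up data: the line given by the formulas should have an RMSE no larger than that of any other line. Below, we compare it against many randomly perturbed lines (a random search, standing in for minimize):

```python
import numpy as np

def rmse_of_line(x, y, slope, intercept):
    return np.sqrt(((y - (slope * x + intercept)) ** 2).mean())

# Made-up dataset.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.0, 2.5, 4.0, 4.5, 6.5, 6.0])

# Slope and intercept from the least squares formulas.
r = (((x - x.mean()) / x.std()) * ((y - y.mean()) / y.std())).mean()
m = r * y.std() / x.std()
b = y.mean() - m * x.mean()

# No randomly perturbed line achieves a lower RMSE than the formula line.
rng = np.random.default_rng(0)
best = rmse_of_line(x, y, m, b)
trials = [rmse_of_line(x, y, m + dm, b + db)
          for dm, db in rng.normal(0, 1, size=(1000, 2))]
min(trials) >= best  # True
```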

### Quality of fit¶

• The regression line describes the "best linear fit" for a given dataset.
• The formulas for the slope and intercept work no matter what the shape of the data is.
• But the line is only meaningful if the relationship between $x$ and $y$ is roughly linear.

### Example: Non-linear data¶

What's the regression line for this dataset?
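As a stand-in example: for data shaped like a parabola that is symmetric in $x$, the formulas still produce a line, but a useless one. Since $r = 0$, the "regression line" is flat at the mean of $y$:

```python
import numpy as np

x = np.arange(11.0)   # 0, 1, ..., 10
y = (x - 5) ** 2      # clearly non-linear, symmetric about x = 5

# Apply the usual formulas anyway.
r = (((x - x.mean()) / x.std()) * ((y - y.mean()) / y.std())).mean()
m = r * y.std() / x.std()
b = y.mean() - m * x.mean()
(r, m, b)  # r and m are (essentially) 0, so the "best line" is flat at mean y
```

The formulas never fail to produce a line; they just can't tell you that a line is the wrong model. That's why visualizing the data first matters.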