# Week 13: Regression Inference and Residuals

Now that you have learned linear regression techniques, it is time to look at how the models we create behave. In other words, how well do they perform? Was linear regression the right choice? To answer this, we look at the residuals and at the root mean squared error (RMSE) of those residuals — which, because least-squares residuals average to zero, is simply their standard deviation. Interestingly, in the CS domain we normally create models by optimizing for the minimum possible error: we take the derivative of the error, set it to zero, and solve for the slope and intercept that achieve the minimum. This approach is called **Least Squares**, and it produces the **exact** same line as the **Linear Regression** line. Additionally, this week we will look at how linear regression can be used for inference. In practice, it turns out that a happy marriage between **linear regression** and **bootstrapping** accomplishes this by creating confidence intervals!
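The ideas above can be sketched in a few lines of NumPy. This is a minimal illustration on a made-up dataset (the data, the `fit_least_squares` helper, and the numbers of resamples are all hypothetical, not taken from the course materials): fit the least-squares line, compute the residuals and their RMSE, and then bootstrap the rows to get a 95% confidence interval for the slope.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: a roughly linear relationship with noise.
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 5.0 + rng.normal(0, 2.0, 200)

def fit_least_squares(x, y):
    """Slope and intercept that minimize squared error (closed form)."""
    slope = np.corrcoef(x, y)[0, 1] * np.std(y) / np.std(x)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

slope, intercept = fit_least_squares(x, y)

# Residuals: how far each observed y falls from the line's prediction.
residuals = y - (slope * x + intercept)
# RMSE: the typical size of a residual.
rmse = np.sqrt(np.mean(residuals ** 2))

# Bootstrap: resample the rows with replacement, refit the line each
# time, and collect the slopes to form a 95% confidence interval.
boot_slopes = []
for _ in range(1000):
    idx = rng.integers(0, len(x), len(x))
    s, _ = fit_least_squares(x[idx], y[idx])
    boot_slopes.append(s)
ci_low, ci_high = np.percentile(boot_slopes, [2.5, 97.5])
```

Because the residuals of the least-squares line sum to zero, `rmse` here coincides with `np.std(residuals)`, which is the connection drawn above.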

## Discussion

- Worksheet
- Slides
- Slides (Annotated)
- Solutions

## Lab

## Meme Submission

## Spotify Playlist

Contribute to the class playlist