25 Introduction to Linear Regression

Jenna Lehmann

Linear regression is a method for finding the best-fitting straight line through a set of data points. It is closely related to correlation, and statistics such as r and r^2 still appear, but the purpose of regression is prediction: using one variable to predict another. The best-fitting line is the one that minimizes the total squared error (the sum of the squared vertical distances) between the data points and the line.

The equation used for regression is Y = a + bX, or some variation of that. If you remember from algebra class, this is the same as the familiar Y = mx + b; both are equations of a straight line. Although you may be asked to report r and r^2, the goal of regression is to find values for the slope (b) and the y-intercept (a) that create the line that best fits the data.
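The least-squares slope and intercept can be computed directly from the data. Below is a minimal sketch using made-up example values (not data from this chapter): the slope b is the sum of products of deviations divided by the sum of squared deviations of X, and the intercept a follows from the fact that the line passes through the point of means.

```python
# Made-up example data (hypothetical values for illustration only)
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# b = SP / SSx: sum of products of deviations over sum of squares for X
sp = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
ssx = sum((x - mean_x) ** 2 for x in xs)

b = sp / ssx             # slope
a = mean_y - b * mean_x  # y-intercept (the line passes through the means)

print(f"Y = {a:.2f} + {b:.2f}X")
```

For these example values the fitted line comes out to roughly Y = 0.05 + 1.99X, which you can check against a statistics package such as `scipy.stats.linregress`.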


Standard Error of the Estimate

A regression equation makes predictions, and the precision of those predictions is measured by the standard error of the estimate. The standard error of the estimate is a measure of the accuracy of predictions made with a regression line, and it reflects how widely the data points are scattered around that line (the strength of the correlation). In other words, it tells you how far, on average, the points tend to fall from the prediction line.
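The idea above can be sketched in a few lines of code. This is a minimal illustration with made-up data (the same hypothetical values as the earlier sketch): the standard error of the estimate is the square root of the residual sum of squares divided by n - 2, where the residuals are the vertical distances from each point to the fitted line.

```python
import math

# Made-up example data (hypothetical values for illustration only)
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Least-squares slope and intercept
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

# Residual sum of squares: squared vertical distances to the line
ss_residual = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Divide by n - 2 because two parameters (a and b) were estimated
s_est = math.sqrt(ss_residual / (n - 2))
print(f"standard error of the estimate = {s_est:.3f}")
```

A small value here means the points hug the line tightly (strong correlation, precise predictions); a large value means they are widely scattered.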

Here is a playlist of videos that may be helpful[1][2]:


This chapter was originally posted to the Math Support Center blog at the University of Baltimore on June 18, 2019.


  1. Longstreet, D. [statisticsfun]. (2012, February 5). An Introduction to Linear Regression Analysis [Video]. YouTube. https://www.youtube.com/watch?v=zPG4NjIkCjc&list=PLF596A4043DBEAE9C&index=1
  2. Longstreet's resources, available through his "statisticsfun" channel, are extensive. While the title may be off-putting, "My Book Sucks" is an incredibly useful CC-BY licensed resource: https://www.youtube.com/user/statisticsfun/about
