Reading from text, Lecture 18
- 11.6 - Here the point is estimating individual observations: given a particular
x-value, what is our estimate of the y-value?
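A quick sketch of that estimate in numpy (the data here are made up for illustration; the formulas are the usual least-squares ones):

```python
import numpy as np

# Hypothetical data, just for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares estimates of the intercept b0 and slope b1.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Estimated y at a particular x-value.
x_new = 3.5
y_hat = b0 + b1 * x_new
```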
- 11.7 - Remember, using the simple linear equation as a model is itself an
assumption.
- 11.8 - The Analysis of Variance approach is another way of looking at that
test of whether or not the slope of the line is zero. In that sense it is less
general than what came before, but it is a useful way of conceptualizing what
is usually considered to be the important point.
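The ANOVA decomposition can be sketched numerically: SST splits into SSR + SSE, and F = MSR/MSE on (1, n-2) degrees of freedom tests whether the slope is zero (made-up data; numpy and scipy assumed):

```python
import numpy as np
from scipy import stats

# Hypothetical data, just for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.9, 3.1, 4.8, 5.1, 6.3])
n = len(x)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
fitted = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total variation
ssr = np.sum((fitted - y.mean()) ** 2)  # explained by the line
sse = np.sum((y - fitted) ** 2)         # residual

f_stat = (ssr / 1) / (sse / (n - 2))    # MSR / MSE, df = (1, n-2)
p_value = stats.f.sf(f_stat, 1, n - 2)  # small p => reject "slope is zero"
```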
- 11.9 - Even more useful is a design where the independent variable has values
that are used more than once: a designed experiment, with "groups". Now we can
really test whether or not the model is a good fit.
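With replicated x-levels, SSE splits into pure error (replicates around their group means) and lack of fit; a large lack-of-fit F says the straight line is a poor model. A sketch with made-up replicated data:

```python
import numpy as np
from scipy import stats

# Hypothetical designed experiment: each x-level is replicated ("groups").
x = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 4.0, 4.0])
y = np.array([1.1, 1.3, 2.0, 2.4, 2.9, 3.1, 4.2, 3.8])
n = len(x)

# Fit the straight line and get its SSE.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
sse = np.sum((y - (b0 + b1 * x)) ** 2)

# Pure error: variation of replicates around their own group means.
levels = np.unique(x)
sse_pe = sum(np.sum((y[x == v] - y[x == v].mean()) ** 2) for v in levels)
sse_lof = sse - sse_pe                 # what's left is lack of fit

k = len(levels)                        # number of distinct x-levels
f_lof = (sse_lof / (k - 2)) / (sse_pe / (n - k))
p_lof = stats.f.sf(f_lof, k - 2, n - k)  # small p => the line is a bad fit
```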
- 11.10 - Data transformations can serve a couple of useful purposes. One is
to take data that probably come from a non-linear function and transform them so
that they follow a linear function, letting us use all the machinery we just developed.
Another is to make the data conform to the "equal variances" assumption. Also in
this section are plotting procedures like the ones we have been using, and for
the same kinds of reasons. They are based on using the residuals which, remember,
we said act somewhat like the "scores" from before.
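For example, if the data come from an exponential model y = a·e^(bx) (a hypothetical choice here), taking logs gives ln y = ln a + bx, which is linear, so the usual fit applies to the transformed data:

```python
import numpy as np

# Hypothetical, noise-free data from y = 2 * exp(0.5 * x), for illustration.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * np.exp(0.5 * x)

z = np.log(y)                          # transformed response: ln y = ln a + b x
b = np.sum((x - x.mean()) * (z - z.mean())) / np.sum((x - x.mean()) ** 2)
ln_a = z.mean() - b * x.mean()
a = np.exp(ln_a)                       # back-transform the intercept
```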
- 11.11 - Case study
- 11.12 - How to calculate correlation, which (remember back to The Algebra of
Expectation?) is a measure of the linear relationship between two variables.
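The calculation itself is short: r = Sxy / sqrt(Sxx · Syy), the sample analogue of cov(X,Y)/(σx·σy). A sketch with made-up numbers:

```python
import numpy as np

# Hypothetical data, just for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 5.0, 4.0, 5.0])

sxy = np.sum((x - x.mean()) * (y - y.mean()))
sxx = np.sum((x - x.mean()) ** 2)
syy = np.sum((y - y.mean()) ** 2)
r = sxy / np.sqrt(sxx * syy)           # sample correlation coefficient
```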
Multiple Linear Regression
- 12.1 - Notice what makes a model "linear", and what would make it non-linear.
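One way to see the distinction: "linear" means linear in the coefficients, not in x. A quadratic in x is still a linear model, because the columns (1, x, x²) enter linearly; something like y = b0·e^(b1·x) is not. A small sketch (made-up, noise-free data):

```python
import numpy as np

# Hypothetical data from y = 1 + 2x - 0.5x^2, for illustration.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x - 0.5 * x ** 2

# Quadratic in x, but linear in the coefficients: ordinary least squares works.
X = np.column_stack([np.ones_like(x), x, x ** 2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
# By contrast, y = b0 * exp(b1 * x) is NOT linear in b1.
```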
- 12.2 - Shows a couple of examples of MLR models
- 12.3 - Shows how to set up the matrix version, and calculate the estimates
of the coefficients
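The matrix setup can be sketched directly: stack a column of ones and the predictors into X, then the normal equations give β̂ = (XᵀX)⁻¹Xᵀy (made-up, noise-free data below so the fit is exact):

```python
import numpy as np

# Hypothetical data from y = 1 + 2*x1 + 3*x2, for illustration.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y = 1.0 + 2.0 * x1 + 3.0 * x2

# Design matrix: a column of ones for the intercept, then the predictors.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Solves the normal equations beta_hat = (X'X)^(-1) X'y;
# lstsq is numerically safer than forming the inverse explicitly.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```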
- 12.4 - How to calculate the next thing you need: the estimate of
σ². Note that the ANOVA method once again only tests one thing: that
all of the coefficients except the intercept equal zero. (Or, all slopes zero)
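A sketch of both pieces with simulated data (all numbers here are made up): s² = SSE/(n - k - 1) estimates σ², and F = MSR/MSE on (k, n-k-1) degrees of freedom tests that all slopes are zero.

```python
import numpy as np
from scipy import stats

# Simulated hypothetical data: n observations, k = 2 predictors.
rng = np.random.default_rng(0)
n, k = 30, 2
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.5, size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sse = resid @ resid
s2 = sse / (n - k - 1)                 # the estimate of sigma^2

sst = np.sum((y - y.mean()) ** 2)
ssr = np.sum((X @ beta_hat - y.mean()) ** 2)
f_stat = (ssr / k) / s2                # tests H0: all slopes are zero
p_value = stats.f.sf(f_stat, k, n - k - 1)
```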
- 12.5 - Tests of individual coefficients, confidence intervals for the mean at a
particular combination of independent variable values, confidence intervals for individual
points at a particular combination of independent variable values.
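The individual-coefficient tests and intervals come from the diagonal of s²(XᵀX)⁻¹, which gives each coefficient's standard error. A sketch on simulated (made-up) data:

```python
import numpy as np
from scipy import stats

# Simulated hypothetical data: n observations, 2 predictors.
rng = np.random.default_rng(1)
n = 25
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([0.5, 1.5, 0.0]) + rng.normal(size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
s2 = (resid @ resid) / (n - 3)         # n - k - 1, with k = 2 predictors

# Standard errors from the diagonal of s^2 * (X'X)^(-1).
cov_beta = s2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov_beta))

t_stats = beta_hat / se                # tests H0: beta_j = 0, one per coefficient
t_crit = stats.t.ppf(0.975, n - 3)
ci_low = beta_hat - t_crit * se        # 95% CI for each coefficient
ci_high = beta_hat + t_crit * se
```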
- We will not go on to fitting models empirically (e.g. stepwise regression)