Lecture 22

Multiple Logistic Regression

Just as in OLS regression, logistic
regression can be used with more than one predictor. The analysis options are
similar to those in OLS regression: one can select variables with a stepwise
procedure, enter the predictors simultaneously, or enter them in blocks.

The interpretation is similar.
Slopes __and__ odds ratios represent the "partial" prediction of
the dependent variable. A slope for a given predictor represents the average
change in the log odds of y for each one-unit change in x, holding constant the
effects of the other predictors. For instance, we might examine the prediction
of CHD by age, controlling for or holding constant the effects of gender. We
might expect, for instance, that men will have a greater risk of CHD, so if our
sample contains men and women, we might want to "partial out" the effects of
gender on CHD. The odds ratio then represents the risk of CHD given an increase
of 1 year in age, controlling for or independent of gender.
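A minimal sketch of this CHD example, using simulated data and a hand-rolled Newton-Raphson fit (in practice one would use SPSS or a packaged routine). All variable names, sample sizes, and coefficient values here are hypothetical, chosen only to illustrate how exponentiated slopes give adjusted odds ratios:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit logistic regression by Newton-Raphson; X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ beta))      # predicted probabilities
        W = p * (1 - p)                      # IRLS weights
        # Newton step: beta += (X'WX)^-1 X'(y - p)
        beta += np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))
    return beta

# hypothetical data: CHD predicted by age, controlling for gender
rng = np.random.default_rng(0)
n = 500
age = rng.uniform(30, 70, n)
male = rng.integers(0, 2, n).astype(float)   # 1 = male
true_logit = -7 + 0.09 * age + 0.6 * male    # assumed model used to simulate CHD
chd = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), age, male])
beta = fit_logistic(X, chd)

# exp(slope) = odds ratio for a 1-unit increase, holding the other predictor constant
or_age = np.exp(beta[1])    # risk per additional year of age, independent of gender
or_male = np.exp(beta[2])   # risk for men vs. women, independent of age
```

The key point is the last two lines: each exponentiated slope is a "partial" odds ratio, adjusted for the other predictor in the model.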

Another example might be activity
level (or exercise). Because people who are more active will have a lower risk
of CHD, it might be hypothesized that the relationship between age and CHD is
partly or completely due to declining activity levels associated with age.
Thus, activity level might be considered a third variable which is responsible
for our initial relationship between age and CHD. If we controlled for activity
level, we might see a decline or an elimination of the predictive effect of
age. The odds ratios would tell us the __independent__ risk of activity
level and age for CHD.

As with OLS regression, logistic
regression can test interaction effects or curvilinear relationships, and the
analysis for these proceeds similarly. With interactions, a third,
multiplicative term is computed which is the product of the two predictors. All
three are used to predict the dependent variable. With curvilinear effects, the
predictor is squared to produce a new variable that tests the curvilinear
relationship. It is usually recommended that the two variables (squared and not
squared) be entered together to test the linear and curvilinear relationships
of the predictor to the dependent variable.
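The construction of these terms can be sketched as follows. The variables (age and an activity indicator) are hypothetical, and centering before squaring is an optional step that reduces collinearity between the linear and squared terms:

```python
import numpy as np

age = np.array([40.0, 50.0, 60.0])
active = np.array([1.0, 0.0, 1.0])     # hypothetical activity indicator

# interaction: a third, multiplicative term, the product of the two predictors
age_x_active = age * active

# curvilinear effect: square the predictor to make a new variable
age_c = age - age.mean()               # centering reduces collinearity
age_sq = age_c ** 2

# all three terms enter together to predict the dependent variable
X_interaction = np.column_stack([np.ones_like(age), age, active, age_x_active])
# linear and squared terms entered together to test both relationships
X_curvilinear = np.column_stack([np.ones_like(age), age_c, age_sq])
```

Either design matrix would then be passed to the logistic regression routine in place of the original predictors.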

As with OLS, we also want to test
the overall predictive efficiency of all predictors together. With SPSS, a
researcher might have to run several logistic regression analyses to get the
appropriate difference in chi-squares. For instance, if the researcher chooses
to do a stepwise procedure to select the significant predictors, he or she
might have to rerun the analysis to get the difference in fit between the
no-predictor model and the full model containing all the predictors.
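The model chi-square behind that comparison can be sketched directly: it is twice the difference in log-likelihood between the no-predictor (intercept-only) model and the full model, with degrees of freedom equal to the number of predictors. The data and fitting routine below are a hand-rolled illustration, not SPSS output:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Newton-Raphson fit; X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        beta += np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))
    return beta

def loglik(X, y, beta):
    """Binomial log-likelihood of the fitted model."""
    p = 1 / (1 + np.exp(-X @ beta))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# hypothetical data with one predictor (age)
rng = np.random.default_rng(1)
n = 400
age = rng.uniform(30, 70, n)
y = (rng.random(n) < 1 / (1 + np.exp(-(-6 + 0.1 * age)))).astype(float)

X_null = np.ones((n, 1))                      # no-predictor model
X_full = np.column_stack([np.ones(n), age])   # full model

ll_null = loglik(X_null, y, fit_logistic(X_null, y))
ll_full = loglik(X_full, y, fit_logistic(X_full, y))

# model chi-square: difference in fit between the two models, df = 1 here
chi_sq = 2 * (ll_full - ll_null)
```

In SPSS this statistic is what one assembles by running the two models separately and differencing their chi-squares (or -2 log-likelihoods).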

My coverage and the text's coverage
of logistic regression have been introductory. If you would like to learn
more about logistic regression, the most accessible source I have found is the
following book:

Hosmer, D.W., & Lemeshow, S.
(1989). Applied logistic regression. New York: Wiley & Sons.