In 1898, Hermon Bumpus, an American biologist working at Brown University, collected data on one of the first examples of natural selection directly observed in nature. Immediately following a bad winter storm, he collected 136 English house sparrows, Passer domesticus, and brought them indoors. Of these birds, 64 had died during the storm, but 72 recovered and survived. By comparing measurements of physical traits, Bumpus demonstrated physical differences between the dead and living birds. He interpreted this finding as evidence for natural selection as a result of this storm:
bumpus <- read_csv("http://wilkelab.org/classes/SDS348/data_sets/bumpus_full.csv")
## Parsed with column specification:
## cols(
## Sex = col_character(),
## Age = col_character(),
## Survival = col_character(),
## Length = col_integer(),
## Wingspread = col_integer(),
## Weight = col_double(),
## Skull_Length = col_double(),
## Humerus_Length = col_double(),
## Femur_Length = col_double(),
## Tarsus_Length = col_double(),
## Sternum_Length = col_double(),
## Skull_Width = col_double()
## )
bumpus$Survival <- factor(bumpus$Survival)
head(bumpus)
## # A tibble: 6 x 12
## Sex Age Survival Length Wingspread Weight Skull_Length Humerus_Length
## <chr> <chr> <fct> <int> <int> <dbl> <dbl> <dbl>
## 1 Male Adult Alive 154 241 24.5 31.2 17.4
## 2 Male Adult Alive 160 252 26.9 30.8 18.7
## 3 Male Adult Alive 155 243 26.9 30.6 18.6
## 4 Male Adult Alive 154 245 24.3 31.7 18.8
## 5 Male Adult Alive 156 247 24.1 31.5 18.2
## 6 Male Adult Alive 161 253 26.5 31.8 19.8
## # ... with 4 more variables: Femur_Length <dbl>, Tarsus_Length <dbl>,
## # Sternum_Length <dbl>, Skull_Width <dbl>
The data set has three categorical variables (Sex, with levels Male and Female; Age, with levels Adult and Young; and Survival, with levels Alive and Dead) and nine numerical variables that record various aspects of the birds' anatomy, such as wingspread, weight, etc.
Problem 1: Make a logistic regression model that can predict survival status from all other predictor variables. (Include the categorical predictors Sex and Age.) Then do backwards selection, removing the predictor with the highest P value one at a time, until you are only left with predictors that have P < 0.1. How many and which predictors remain in the final model?
# R code goes here.
Discussion goes here.
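One possible approach (a sketch, not a definitive solution; the model object name glm_full is my own choice): fit the full model with glm() and family = binomial, inspect the coefficient table, then refit repeatedly after dropping the predictor with the largest P value until all remaining predictors satisfy P < 0.1.

```r
library(tidyverse)

bumpus <- read_csv("http://wilkelab.org/classes/SDS348/data_sets/bumpus_full.csv")
bumpus$Survival <- factor(bumpus$Survival)

# full model: family = binomial makes glm() fit a logistic regression
glm_full <- glm(Survival ~ Sex + Age + Length + Wingspread + Weight +
                  Skull_Length + Humerus_Length + Femur_Length +
                  Tarsus_Length + Sternum_Length + Skull_Width,
                data = bumpus, family = binomial)
summary(glm_full)

# read off the predictor with the largest P value in the summary,
# remove it from the formula, refit, and repeat until every
# remaining predictor has P < 0.1
```

Note that backwards selection by P value is done here manually, one refit per dropped predictor, so the intermediate summaries document each elimination step.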
Problem 2: Make a plot of the fitted probability as a function of the linear predictor, colored by survival. Make a density plot that shows how the two outcomes are separated by the linear predictor. Interpret your plots in 1-2 sentences. If you had to choose a cut-off value on the linear predictor to classify birds as alive or dead, where would it be?
# R code goes here.
Discussion goes here.
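A sketch of how these plots could be built, assuming glm_final holds the reduced model from Problem 1 (a hypothetical name): predict() on a fitted glm returns the linear predictor by default and the fitted probability with type = "response".

```r
# collect linear predictor, fitted probability, and outcome in one data frame
lr_data <- data.frame(
  predictor = predict(glm_final),                  # linear predictor
  prob = predict(glm_final, type = "response"),    # fitted probability
  Survival = bumpus$Survival
)

# fitted probability vs. linear predictor, colored by survival status
ggplot(lr_data, aes(x = predictor, y = prob, color = Survival)) +
  geom_point()

# density of the linear predictor for each outcome shows their separation
ggplot(lr_data, aes(x = predictor, fill = Survival)) +
  geom_density(alpha = 0.5)
```

Where the two density curves cross is a natural candidate for the cut-off, since a linear predictor of 0 corresponds to a fitted probability of 0.5.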
Problem 3: Add rugs to both the top and bottom of the plot above. BONUS: Add a curve for the logistic function. Explain how you created this curve in 1-2 sentences.
# R code goes here.
Discussion goes here.
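A sketch of one way to add the rugs and the bonus curve, again assuming the lr_data data frame from Problem 2 (a hypothetical name): geom_rug() with sides = "t" or sides = "b" draws tick marks along the top or bottom edge, and plogis() is base R's inverse logit, so stat_function(fun = plogis) traces the logistic curve over the linear-predictor axis.

```r
ggplot(lr_data, aes(x = predictor, y = prob, color = Survival)) +
  geom_point() +
  # rug for one outcome along the top edge, the other along the bottom
  geom_rug(data = filter(lr_data, Survival == "Alive"), sides = "t") +
  geom_rug(data = filter(lr_data, Survival == "Dead"), sides = "b") +
  # BONUS: plogis(x) = 1/(1 + exp(-x)) maps the linear predictor
  # to a probability, so this overlays the logistic function
  stat_function(fun = plogis, color = "black")
```

The curve needs no precomputed data because stat_function evaluates plogis directly over the range of x shown in the plot.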