e-Statistics

Inference on parameters

The logistic regression model

$\mathrm{logit}(p_j) = \beta_0 + \beta_1 x_{1j} + \beta_2 x_{2j} + \cdots + \beta_k x_{kj}$

is obtained for $j=1,\ldots,n$. The probability $p_j$ of a Yes response is identified either by (i) a pair of variables giving the count $N_{1j}$ of Yes responses and the count $N_{0j}$ of No responses, or (ii) a binary response variable $Y_j$ which records Yes or No ("1" or "0"). The predictors $x_{1j}$ to $x_{kj}$ are recorded for all the different groups or conditions, which may be summarized in $n$ such combinations.
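The logit link and its inverse can be sketched in plain Python; this is a minimal illustration, and the function names and numbers below are ours, not part of any particular package:

```python
import math

def logit(p):
    """Log-odds of a probability p in (0, 1)."""
    return math.log(p / (1.0 - p))

def inv_logit(eta):
    """Inverse logit (logistic function): maps log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-eta))

# With coefficients (beta_0, ..., beta_k) and predictor values x_{1j}, ..., x_{kj},
# the model gives p_j = inv_logit(beta_0 + beta_1*x_{1j} + ... + beta_k*x_{kj}).
beta = [-1.0, 0.5]   # hypothetical intercept and one slope
x = 2.0              # hypothetical predictor value
p = inv_logit(beta[0] + beta[1] * x)   # here beta_0 + beta_1*x = 0, so p = 0.5
```

Whatever the coefficients, the inverse logit always returns a value strictly between 0 and 1, which is why the model is stated on the log-odds scale.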

  1. To start over, clear the model formula.

  2. From the data above, either (i) a pair of columns giving the "Yes" count and the "No" count is selected one by one, or (ii) a single response column is treated as a binary variable.

  3. A model formula is built for the predictors $x_{1j}$ up to $x_{kj}$ (independent variables) by setting the predictor columns one by one.
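The fitting that these steps set up can be sketched numerically. Below is a minimal Newton-Raphson maximum-likelihood fit of a one-predictor logistic model in plain Python; the data and function name are hypothetical, chosen only to illustrate the computation:

```python
import math

def fit_logistic(xs, ys, iters=25):
    """Fit logit(p) = b0 + b1*x by Newton-Raphson (maximum likelihood)."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        g0 = g1 = 0.0          # gradient of the log-likelihood
        h00 = h01 = h11 = 0.0  # observed information (negative Hessian)
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            r = y - p          # residual y_j - p_j
            w = p * (1.0 - p)  # weight p_j (1 - p_j)
            g0 += r
            g1 += r * x
            h00 += w
            h01 += w * x
            h11 += w * x * x
        # Newton step: solve the 2x2 system (information) * delta = gradient.
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (-h01 * g0 + h00 * g1) / det
    return b0, b1

# Hypothetical binary responses that tend toward "Yes" (1) for larger x.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
```

At the maximum-likelihood solution the gradient of the log-likelihood vanishes, which is what the iteration drives toward.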

The results of fitting the logistic regression model are presented in the table below.

For each parameter, from the intercept $\beta_0$ to the slope coefficients $\beta_1,\ldots,\beta_k$, the summary shows:

  1. The confidence interval (Lower, Upper) for $\beta_i$ is calculated by the profile likelihood method.
  2. The null hypothesis $H_0: \beta_i = 0$ is tested, and the p-value is obtained from the Wald test statistic.
  3. Each slope parameter $\beta_i$ is interpreted as the log odds ratio (OR) for its covariate (that is, predictor). Thus, the estimated odds ratio is $e^{\hat{\beta}_i}$. Likewise, a confidence interval for the OR is obtained by applying the exponential transformation to the (Lower, Upper) limits for $\beta_i$.
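The Wald p-value and the odds-ratio transformation described above can be illustrated as follows; the estimate, standard error, and interval limits are hypothetical numbers, not output from any actual fit:

```python
import math

def wald_p_value(beta_hat, se):
    """Two-sided p-value for H0: beta_i = 0 from the Wald statistic z = beta_hat / se."""
    z = beta_hat / se
    # P(|Z| > |z|) for standard normal Z, via the complementary error function.
    return math.erfc(abs(z) / math.sqrt(2.0))

# Hypothetical slope estimate with its profile-likelihood interval (Lower, Upper).
beta_hat, lower, upper = 0.8, 0.2, 1.4
se = 0.3   # hypothetical standard error

p_value = wald_p_value(beta_hat, se)
odds_ratio = math.exp(beta_hat)              # OR estimate is e^{beta_hat}
or_ci = (math.exp(lower), math.exp(upper))   # CI for OR by exponentiating (Lower, Upper)
```

Because the exponential function is increasing, the transformed endpoints keep their order, so (e^{Lower}, e^{Upper}) is a valid interval for the odds ratio.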


© TTU Mathematics