Logistic Regression - Background


Logistic regression can be binomial or multinomial. Binomial or binary logistic regression refers to the situation in which the observed outcome can have only two possible types (e.g., "dead" vs. "alive", "success" vs. "failure", or "yes" vs. "no"). Multinomial logistic regression refers to cases where the outcome can have three or more possible types (e.g., "better" vs. "no change" vs. "worse"). Generally, the outcome is coded as "0" and "1" in binary logistic regression, as this leads to the most straightforward interpretation. The target group (referred to as a "case") is usually coded as "1" and the reference group (referred to as a "noncase") as "0". The binomial distribution has a mean equal to the proportion of cases, denoted P, and a variance equal to the product of the proportions of cases and noncases, PQ, where Q is the proportion of noncases, or 1 − P. Accordingly, the standard deviation is simply the square root of PQ.

Logistic regression is used to predict the odds of being a case based on the predictor(s). The odds are defined as the probability of a case divided by the probability of a noncase. The odds ratio is the primary measure of effect size in logistic regression; it compares the odds that membership in one group will lead to a case outcome with the odds that membership in some other group will lead to a case outcome. The odds ratio (denoted OR) is simply the odds of being a case for one group divided by the odds of being a case for another group. An odds ratio of one indicates that the odds of a case outcome are equally likely for both groups under comparison; the further the odds ratio deviates from one, the stronger the relationship. The odds ratio has a floor of zero but no ceiling (upper limit): theoretically, it can increase without bound.
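As a small numerical illustration of these definitions, the sketch below computes the odds for two hypothetical groups and the odds ratio comparing them; the counts are invented purely for the example.

```python
# Minimal sketch: odds and odds ratio for two hypothetical groups.
# All counts below are invented purely for illustration.
cases_a, noncases_a = 30, 70      # group A: 30 cases, 70 noncases
cases_b, noncases_b = 10, 90      # group B: 10 cases, 90 noncases

p_a = cases_a / (cases_a + noncases_a)   # proportion of cases in A (P)
q_a = 1 - p_a                            # proportion of noncases in A (Q)
odds_a = p_a / q_a                       # odds of being a case in group A

p_b = cases_b / (cases_b + noncases_b)
odds_b = p_b / (1 - p_b)                 # odds of being a case in group B

odds_ratio = odds_a / odds_b             # OR: odds in A relative to odds in B
print(f"odds A = {odds_a:.3f}, odds B = {odds_b:.3f}, OR = {odds_ratio:.3f}")
# OR > 1 means a case outcome is more likely in group A than in group B.
```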

Like other forms of regression analysis, logistic regression makes use of one or more predictor variables, which may be either continuous or categorical. Also, as in other linear regression models, the expected value (average value) of the response variable is modelled as a function of the predictors; the expected value of a Bernoulli distribution is simply the probability of a case. In other words, the null model (the intercept-only model, with no predictors) fits only the base rate of a case, and the model including one or more predictors refines that fit. Unlike ordinary linear regression, however, logistic regression is used to predict binary outcomes (Bernoulli trials) rather than continuous outcomes. Given this difference, logistic regression takes the natural logarithm of the odds (referred to as the logit, or log-odds) to create a continuous criterion. The logit of success is then fit to the predictors using regression analysis. Results on the logit scale are not intuitive, however, so the logit is converted back to the odds via the exponential function (the inverse of the natural logarithm). Therefore, although the observed variables in logistic regression are categorical, the predicted scores are actually modelled as a continuous variable (the logit). The logit is referred to as the link function in logistic regression: although the output of logistic regression is binomial and displayed in a contingency table, the logit is the underlying continuous criterion upon which linear regression is conducted.
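To make the link-function step concrete, here is a minimal sketch (with an arbitrarily chosen probability, purely for illustration) of the logit transform and its inverse, showing how exponentiating the log-odds recovers the odds.

```python
import math

def logit(p):
    """Natural log of the odds: maps a probability in (0, 1) onto the whole real line."""
    return math.log(p / (1 - p))

def inv_logit(x):
    """Inverse of the logit (the logistic function): maps log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

p = 0.75                          # hypothetical probability of a case
log_odds = logit(p)               # continuous criterion that is fit to the predictors
odds = math.exp(log_odds)         # exponentiating the logit recovers the odds (here 3.0)
print(f"p = {p}, logit = {log_odds:.3f}, odds = {odds:.3f}, "
      f"back to p = {inv_logit(log_odds):.3f}")
```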
