# Hazard Ratio - Definition

## Definition

The instantaneous hazard rate is the limit of the number of events per unit time divided by the number at risk, as the time interval approaches 0:

$$h(t) = \lim_{\Delta t \to 0} \frac{\text{observed events in } [t, t + \Delta t)}{N(t)\,\Delta t}$$

where $N(t)$ is the number at risk at the beginning of the interval.
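As a rough numerical sketch of this limiting definition, the hazard on small intervals can be estimated as events per unit time divided by the number at risk at each interval's start. The event times, cohort size, and interval width below are all hypothetical:

```python
# Minimal sketch with hypothetical data: piecewise-constant estimate of
# the hazard rate h(t) ~ events per unit time / number at risk N(t).
event_times = [1.2, 2.5, 2.7, 4.0, 5.5, 6.1, 8.3]  # assumed event times
n_start = 10                                        # subjects at risk at t = 0
dt = 2.0                                            # interval width

hazards = []
t = 0.0
at_risk = n_start
while at_risk > 0 and t < max(event_times):
    events = sum(1 for e in event_times if t <= e < t + dt)
    # events per unit time, divided by the number at risk at interval start
    hazards.append(events / (dt * at_risk))
    at_risk -= events
    t += dt

print(hazards)
```

Shrinking `dt` toward zero (with enough subjects) recovers the instantaneous hazard in the limit.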

The hazard ratio is the effect on this hazard rate of a difference, such as group membership (for example, treatment or control, male or female), as estimated by regression models that treat the log of the hazard rate as a function of a baseline hazard $h_0(t)$ and a linear combination of explanatory variables:

$$\log h(t \mid X) = \log h_0(t) + \beta_1 X_1 + \cdots + \beta_p X_p$$

Such models are generally classed as proportional hazards regression models (they differ in their treatment of the underlying pattern of the hazard over time); the best-known proportional hazards models are the Cox semiparametric proportional hazards model, and the exponential, Gompertz, and Weibull parametric models.

For two individuals who differ only in the relevant membership (e.g., treatment vs. control), their predicted log-hazards will differ additively by the relevant parameter estimate $\beta$, which is to say that their predicted hazards will differ multiplicatively by the anti-log of the estimate, $e^{\beta}$. Thus $e^{\beta}$ can be considered a hazard ratio, that is, the ratio between the predicted hazard for a member of one group and that for a member of the other group, holding everything else constant.
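This additive-on-the-log-scale, multiplicative-on-the-hazard-scale relationship can be checked directly. The coefficient and baseline hazard below are hypothetical, not fitted values:

```python
import math

# Sketch under an assumed proportional hazards model:
# log h(t | x) = log h0(t) + beta * x, with a hypothetical coefficient.
beta = -0.693  # assumed log-hazard coefficient for group membership (x = 1)

def hazard(h0, x, beta):
    """Hazard for an individual with covariate x and baseline hazard h0."""
    return h0 * math.exp(beta * x)

# Two individuals who differ only in group membership:
h_treated = hazard(0.10, 1, beta)
h_control = hazard(0.10, 0, beta)

hr = h_treated / h_control
print(hr)  # equals exp(beta), regardless of the baseline hazard h0
```

Changing the baseline hazard `0.10` to any other positive value leaves `hr` unchanged, which is the proportional hazards property.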

For a continuous explanatory variable, the same interpretation applies to a unit difference in that variable.
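Concretely, for a continuous covariate each one-unit difference multiplies the hazard by the same factor, so a $k$-unit difference multiplies it by that factor raised to the $k$-th power. The per-unit coefficient below is an illustrative assumption (e.g. per year of age):

```python
import math

# Hypothetical coefficient: assumed log-hazard increase per unit of a
# continuous covariate (e.g. per year of age).
beta = 0.05

hr_per_unit = math.exp(beta)        # hazard ratio for a 1-unit difference
hr_per_10 = math.exp(beta * 10)     # hazard ratio for a 10-unit difference

print(round(hr_per_unit, 4))
print(round(hr_per_10, 4))          # same as hr_per_unit ** 10
```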

Other HR models have different formulations and the interpretation of the parameter estimates differs accordingly.

In simplified terms, the hazard ratio is used to describe a time-to-event outcome in survival analysis. It is the ratio of the rates at which subjects in two groups experience events, where a slower rate suggests longer event-free survival. This type of analysis is frequently used to evaluate a drug's ability to prevent disease as a function of time.
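The rate-ratio reading can be illustrated with a small simulation. Assuming constant hazards (exponential event times), the hazard ratio equals the ratio of the groups' event rates; the rates and sample size below are hypothetical:

```python
import random

random.seed(0)

# Hypothetical simulation: exponential event times in two groups. Under
# constant hazards, the hazard ratio is the ratio of the event rates.
rate_control = 0.20   # assumed hazard in the control group
true_hr = 0.5         # assumed treatment effect (hazard ratio)
rate_treated = rate_control * true_hr

n = 20000
t_control = [random.expovariate(rate_control) for _ in range(n)]
t_treated = [random.expovariate(rate_treated) for _ in range(n)]

# For exponential data, the MLE of the rate is events / total follow-up time.
est_control = n / sum(t_control)
est_treated = n / sum(t_treated)

est_hr = est_treated / est_control
print(round(est_hr, 3))  # close to the assumed true hazard ratio of 0.5
```

Here the treated group's slower event rate (estimated HR below 1) corresponds to longer average event-free survival.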