
Least squares estimators are crucial in linear modeling, providing unbiased and efficient estimates of population parameters. Understanding their properties helps us assess the reliability of our models and make accurate predictions.

The Gauss-Markov assumptions lay the foundation for these properties. When satisfied, they ensure our estimates are unbiased and have the smallest variance among all linear unbiased estimators, making them the best choice for accurate analysis.

Gauss-Markov Assumptions and Implications

Conditions for Desirable OLS Estimator Properties

  • The Gauss-Markov assumptions are a set of conditions that, when satisfied, ensure that the ordinary least squares (OLS) estimators have desirable properties, such as unbiasedness and efficiency (a simulation sketch follows this list)
  • The first assumption states that the linear regression model is correctly specified, meaning that the relationship between the dependent variable and the independent variables is linear, and no important variables are omitted
  • The second assumption is that the expected value of the error term, conditional on the independent variables, is zero, which rules out any systematic relationship between the error term and the independent variables
  • The third assumption requires that the error terms have constant variance (homoscedasticity) across all observations
    • Violating this assumption leads to heteroscedasticity, which affects the efficiency of the OLS estimators
  • The fourth assumption states that the error terms are uncorrelated with each other (no autocorrelation)
    • Autocorrelation can lead to biased standard errors and inefficient estimates
  • The fifth assumption is that the independent variables are not perfectly multicollinear, meaning that one independent variable cannot be an exact linear combination of the others, which would make it impossible to estimate the individual effects of the variables
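To make the setup concrete, here is a minimal NumPy sketch (the sample size, regressors, and true coefficients are our own illustrative choices) that simulates data satisfying all five assumptions and computes the OLS estimate directly from the normal equations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data satisfying the Gauss-Markov assumptions: a correctly
# specified linear model, E[e|X] = 0, constant error variance,
# uncorrelated errors, and no perfect multicollinearity in X.
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta = np.array([1.0, 2.0, -0.5])      # true (hypothetical) coefficients
eps = rng.normal(scale=1.0, size=n)    # homoscedastic, independent errors
y = X @ beta + eps

# OLS estimate from the normal equations: solve (X'X) b = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # should be close to [1.0, 2.0, -0.5]
```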

Consequences of Violating Gauss-Markov Assumptions

  • If the Gauss-Markov assumptions are violated, the OLS estimators may lose some of their desirable properties
  • Omitting important variables or misspecifying the functional form of the model can lead to biased and inconsistent estimates
  • Non-zero expected value of the error term conditional on the independent variables can introduce bias in the estimates
  • Heteroscedasticity (non-constant variance of the error terms) leads to inefficient estimates and incorrect standard errors, affecting hypothesis testing and confidence intervals
  • Autocorrelation (correlation among the error terms) results in inefficient estimates and biased standard errors, which can lead to incorrect inferences
  • Perfect multicollinearity makes it impossible to estimate the individual effects of the collinear variables and can lead to unstable and unreliable estimates, as the sketch below demonstrates
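A quick sketch of that last point, using a deliberately constructed design matrix of our own: when one column is an exact linear combination of others, X'X loses full rank, so the inverse (X'X)⁻¹ required by the OLS formula does not exist:

```python
import numpy as np

rng = np.random.default_rng(1)

# The fourth column is an exact linear combination of columns 2 and 3,
# so the design matrix is perfectly multicollinear.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2, 2 * x1 + 3 * x2])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))  # 3, not 4: X'X is rank-deficient
print(np.linalg.cond(XtX))         # enormous condition number: no stable inverse
```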

Unbiasedness of OLS Estimators

Definition and Importance of Unbiasedness

  • Unbiasedness means that the expected value of the OLS estimator is equal to the true population parameter
    • On average, the estimator will provide the correct value of the parameter
  • Unbiasedness is a desirable property because it ensures that the estimator is centered around the true value of the parameter, without systematically over- or under-estimating it
  • Unbiased estimators are important for making accurate inferences and predictions based on the estimated model

Proving the Unbiasedness of OLS Estimators

  • To prove the unbiasedness of the OLS estimators, we start with the linear regression model y = Xβ + ε, where y is the dependent variable, X is the matrix of independent variables, β is the vector of coefficients, and ε is the error term
  • The OLS estimator for β is given by β̂ = (X'X)⁻¹X'y, where X' denotes the transpose of X, and (X'X)⁻¹ is the inverse of the matrix product X'X
  • Substituting the linear regression model into the OLS estimator formula and simplifying, we obtain β̂ = β + (X'X)⁻¹X'ε
  • Taking the expected value of both sides and applying the second Gauss-Markov assumption (E[ε|X] = 0), we find that E[β̂] = β, which proves the unbiasedness of the OLS estimators under the Gauss-Markov assumptions (a Monte Carlo check follows this list)
    • This result holds because the expected value of the error term is zero, and the expected value of a constant (the true β) is the constant itself
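The algebraic proof can also be checked numerically. Here is a small Monte Carlo sketch (the sample size, replication count, and true β are arbitrary choices of ours): averaging β̂ over many simulated samples should recover the true β:

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo check of unbiasedness: E[beta_hat] should equal beta.
n, reps = 200, 5000
beta = np.array([1.0, 2.0, -0.5])      # true (hypothetical) coefficients
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])

estimates = np.empty((reps, len(beta)))
for r in range(reps):
    eps = rng.normal(size=n)           # satisfies E[e|X] = 0
    y = X @ beta + eps
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)

# The average of the estimates converges to the true beta
print(estimates.mean(axis=0))  # approximately [1.0, 2.0, -0.5]
```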

Variance-Covariance Matrix of OLS Estimators

Definition and Interpretation of the Variance-Covariance Matrix

  • The variance-covariance matrix of the OLS estimators, denoted by Var(β̂), provides information about the variability and relationships among the estimated coefficients
  • The diagonal elements of this matrix represent the variances of the individual coefficients, which measure the precision of the estimates
  • The off-diagonal elements represent the covariances between pairs of coefficients, indicating how the estimates of different coefficients are related to each other

Derivation of the Variance-Covariance Matrix

  • To derive the variance-covariance matrix, we start with the expression for the OLS estimator: β̂ = β + (X'X)⁻¹X'ε
  • Taking the variance of both sides and applying the properties of variance, we obtain Var(β̂) = (X'X)⁻¹X'Var(ε)X(X'X)⁻¹
  • Under the third Gauss-Markov assumption (homoscedasticity), Var(ε) = σ²I, where σ² is the constant variance of the error terms, and I is an identity matrix
  • Substituting this into the variance formula, we get Var(β̂) = σ²(X'X)⁻¹, which is the variance-covariance matrix of the OLS estimators (a numerical comparison follows this list)
    • This result shows that the variance-covariance matrix depends on the variance of the error terms (σ²) and the inverse of the cross-product matrix of the independent variables, (X'X)⁻¹
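As a check on the derivation, a small simulation sketch (setup values are our own) comparing the theoretical matrix σ²(X'X)⁻¹ with the empirical covariance of β̂ across repeated samples; the diagonal entries are the coefficient variances and the off-diagonal entries the covariances:

```python
import numpy as np

rng = np.random.default_rng(3)

n, reps, sigma = 200, 5000, 1.0
beta = np.array([1.0, 2.0, -0.5])
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])

# Theoretical variance-covariance matrix: sigma^2 (X'X)^{-1}
V_theory = sigma**2 * np.linalg.inv(X.T @ X)

# Empirical covariance of beta_hat across many simulated samples
estimates = np.empty((reps, len(beta)))
for r in range(reps):
    y = X @ beta + rng.normal(scale=sigma, size=n)
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)
V_empirical = np.cov(estimates, rowvar=False)

print(np.diag(V_theory))   # variances of the individual coefficients
print(V_theory[0, 1])      # covariance between the first two coefficients
print(V_empirical)         # should agree closely with V_theory
```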

Efficiency of OLS Estimators

Definition and Importance of Efficiency

  • Efficiency refers to the property of an estimator having the smallest variance among all unbiased estimators
    • An efficient estimator provides the most precise estimates of the population parameters
  • Efficiency is important because it ensures that the estimator makes the best use of the available data, providing the most accurate and reliable estimates possible
  • Efficient estimators are desirable for hypothesis testing and constructing confidence intervals, as they lead to more powerful tests and narrower intervals

Gauss-Markov Theorem and the Efficiency of OLS Estimators

  • The Gauss-Markov theorem states that, under the Gauss-Markov assumptions, the OLS estimators are the best linear unbiased estimators (BLUE) of the population parameters
    • "Best" in this context means that the OLS estimators have the smallest variance among all linear unbiased estimators
  • The efficiency of the OLS estimators is a result of the Gauss-Markov assumptions, particularly the assumptions of homoscedasticity and no autocorrelation
  • When these assumptions hold, the OLS estimators are not only unbiased but also efficient, providing the most precise estimates of the population parameters
  • If the Gauss-Markov assumptions are violated, such as in the presence of heteroscedasticity or autocorrelation, the OLS estimators may no longer be efficient
    • In such cases, alternative estimation methods, such as generalized least squares (GLS) or robust standard errors, may be more appropriate to obtain efficient estimates, as the comparison sketched below illustrates
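A sketch of that last point under an assumed heteroscedastic design (error variance proportional to x², a choice of ours, with correct weights supplied to GLS): both OLS and GLS remain unbiased, but GLS achieves the smaller sampling variance:

```python
import numpy as np

rng = np.random.default_rng(4)

# Heteroscedastic model: Var(e_i) = x_i^2, so OLS is unbiased but
# inefficient, while GLS weighting each observation by 1/x_i^2 is BLUE.
n, reps = 200, 5000
x = rng.uniform(1.0, 3.0, size=n)
X = np.column_stack([np.ones(n), x])
beta = np.array([1.0, 2.0])

ols = np.empty((reps, 2))
gls = np.empty((reps, 2))
for r in range(reps):
    y = X @ beta + rng.normal(size=n) * x   # error spread grows with x
    ols[r] = np.linalg.solve(X.T @ X, X.T @ y)
    Xw = X / x[:, None]**2                  # rows weighted by 1/x_i^2
    gls[r] = np.linalg.solve(Xw.T @ X, Xw.T @ y)

print(ols.var(axis=0))  # larger sampling variance for OLS
print(gls.var(axis=0))  # smaller variance: GLS is more efficient here
```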