
6.1 Least squares and maximum likelihood estimation

2 min read · July 25, 2024

Mathematical biology uses estimation techniques to understand complex biological systems. Least squares estimation and maximum likelihood estimation are two key methods for fitting models to data and inferring parameters.

These techniques have different strengths and applications in biology. Least squares is simpler and works well for linear models, while maximum likelihood is more flexible and can handle various probability distributions.

Estimation Techniques in Mathematical Biology

Principles of least squares estimation

  • Least squares estimation minimizes sum of squared differences between observed and predicted values
  • Objective function: $S = \sum_{i=1}^n (y_i - f(x_i, \beta))^2$, where $y_i$ are observed values, $f(x_i, \beta)$ are predicted values, and $\beta$ are model parameters
  • Applied in linear regression, nonlinear curve fitting, and model calibration (population growth models; see the sketch after this list)
  • Assumes normally distributed errors and homoscedasticity (constant error variance)
  • Computationally efficient and provides unbiased estimates under certain conditions
  • Sensitive to outliers and may not be optimal for non-Gaussian error distributions (skewed data)
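
To make this concrete, here is a minimal sketch of nonlinear least squares in Python: calibrating a logistic growth model to noisy synthetic data with scipy.optimize.curve_fit. The parameter values and noise level below are illustrative assumptions, not taken from the text.

```python
# Minimal sketch: least squares calibration of a logistic growth model.
# All numbers here are made-up illustrations.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, N0):
    """Logistic growth: N(t) = K / (1 + ((K - N0) / N0) * exp(-r * t))."""
    return K / (1 + ((K - N0) / N0) * np.exp(-r * t))

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 25)
true_K, true_r, true_N0 = 100.0, 0.8, 5.0          # assumed "true" values
y = logistic(t, true_K, true_r, true_N0) + rng.normal(0, 3, t.size)

# curve_fit minimizes S = sum_i (y_i - f(t_i, beta))^2 over beta = (K, r, N0)
popt, pcov = curve_fit(logistic, t, y, p0=[80, 0.5, 2])
print("estimates (K, r, N0):", popt)
print("standard errors:", np.sqrt(np.diag(pcov)))
```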

Application of maximum likelihood estimation

  • Maximum likelihood estimation (MLE) finds parameter values maximizing likelihood of observing given data
  • Likelihood function: $L(\theta \mid x) = P(x \mid \theta)$, where $\theta$ are model parameters and $x$ is observed data
  • MLE process (illustrated in the sketch after this list):
    1. Define data probability distribution
    2. Construct likelihood function
    3. Take logarithm of likelihood function
    4. Find maximum by setting derivatives to zero
  • Used in population genetics (allele frequency estimation), phylogenetic tree reconstruction, and epidemiological models (disease transmission rates)
  • Asymptotically efficient and consistent estimator
  • Computationally intensive for complex models and requires knowledge of underlying probability distribution
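
As a concrete illustration of the four steps, here is a minimal sketch of allele frequency estimation under a binomial model; the counts are hypothetical, and the numerical optimum can be checked against the closed-form answer k/n obtained by setting the derivative of the log-likelihood to zero.

```python
# Minimal sketch of the four-step MLE recipe: allele frequency estimation.
# k copies of allele A observed among n sampled alleles, modeled as
# Binomial(n, p). The counts below are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

k, n = 37, 100  # hypothetical sample: 37 A alleles out of 100

# Steps 1-3: binomial model, likelihood L(p) proportional to p^k (1-p)^(n-k),
# log-likelihood l(p) = k*log(p) + (n - k)*log(1 - p) (constants dropped)
def neg_log_lik(p):
    return -(k * np.log(p) + (n - k) * np.log(1 - p))

# Step 4: maximize l(p) numerically, here by minimizing -l(p) on (0, 1)
res = minimize_scalar(neg_log_lik, bounds=(1e-9, 1 - 1e-9), method="bounded")
print("numerical MLE:", res.x)    # approximately 0.37
print("closed form k/n:", k / n)  # dl/dp = 0 gives p_hat = k/n
```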

Least squares vs maximum likelihood

  • Both estimate model parameters for linear and nonlinear models
  • Least squares assumes normally distributed errors, MLE accommodates various distributions; under Gaussian errors the two coincide (see the sketch after this list)
  • MLE more efficient for large samples, least squares more robust for small samples
  • Least squares simpler to implement, MLE more computationally demanding
  • MLE flexible in handling different data types, least squares primarily for continuous data
  • Least squares provides easily interpretable fit measures (R²), MLE offers likelihood-based model comparison (AIC, BIC)
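
One way to see the connection between the two methods: for a linear model with independent, identically distributed Gaussian errors, maximizing the likelihood reproduces the ordinary least squares solution exactly. The sketch below, on made-up data, checks this numerically.

```python
# Minimal sketch (made-up data): Gaussian MLE reproduces the OLS solution.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0, 5, 40)
y = 2.0 + 1.5 * x + rng.normal(0, 0.5, x.size)

# Least squares: closed-form solution via the normal equations
X = np.column_stack([np.ones_like(x), x])
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]

# Gaussian MLE: minimize the negative log-likelihood over (a, b, sigma)
def neg_log_lik(params):
    a, b, log_sigma = params
    sigma = np.exp(log_sigma)  # reparameterize so sigma stays positive
    resid = y - (a + b * x)
    return x.size * np.log(sigma) + np.sum(resid**2) / (2 * sigma**2)

beta_mle = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0]).x[:2]
print("least squares:", beta_ls)   # the two agree to numerical precision
print("gaussian MLE: ", beta_mle)
```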

Implementation of estimation algorithms

  • Programming languages: Python, R, MATLAB
  • Libraries: NumPy, SciPy (numerical computations), Statsmodels (statistical modeling), Scikit-learn (machine learning)
  • Least squares algorithms:
    • Ordinary least squares (OLS) for linear models
    • Nonlinear least squares (NLS) for nonlinear models
    • Iterative methods (Gauss-Newton, Levenberg-Marquardt)
  • MLE algorithms:
    • Newton-Raphson method
    • Expectation-Maximization (EM) algorithm
    • Gradient descent and variants
  • Optimization techniques: Conjugate gradient method, Quasi-Newton methods (BFGS)
  • Model diagnostics: residual analysis, goodness-of-fit tests, cross-validation techniques (k-fold); a brief worked sketch follows this list
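
Putting the pieces together, here is a minimal Statsmodels sketch on made-up data: an OLS fit whose results object exposes the fit and comparison measures mentioned above (R², AIC, BIC) along with the residuals used for diagnostics.

```python
# Minimal sketch: OLS with statsmodels on made-up data, reporting
# fit and model-comparison measures plus residuals for diagnostics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)
y = 3.0 + 0.7 * x + rng.normal(0, 1.0, x.size)

X = sm.add_constant(x)          # design matrix with intercept column
model = sm.OLS(y, X).fit()      # ordinary least squares

print(model.params)             # intercept and slope estimates
print("R^2:", model.rsquared)
print("AIC:", model.aic, "BIC:", model.bic)
resid = model.resid             # residuals for diagnostic plots and tests
```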
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.