
Probability distributions are the backbone of statistical inference. They help us model and understand the likelihood of different outcomes in random events. From coin flips to customer arrivals, these mathematical tools give us a way to quantify uncertainty.

In this section, we'll explore key types of distributions, their properties, and real-world applications. We'll learn how to calculate probabilities, expected values, and variances, equipping us with essential skills for data analysis and decision-making under uncertainty.

Probability Distributions: Properties and Applications

Discrete vs Continuous Distributions

  • Discrete probability distributions model random variables with specific, countable values (number of successes in fixed trials)
  • Continuous probability distributions model random variables with any value in a range (height or weight measurements)
  • Bernoulli distribution models a single trial with two outcomes (coin flip)
  • Binomial distribution extends the Bernoulli to model successes in a fixed number of independent trials (number of heads in 10 coin flips)
  • Poisson distribution models events in a fixed interval, assuming independent occurrences at a constant rate (number of customers arriving at a store in one hour)
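The three discrete distributions above can be sketched directly from their formulas. This is a minimal standard-library example; the parameter values (10 flips, a rate of 5 per hour) are illustrative, not from any particular dataset.

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """P(X = k): probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k): probability of exactly k events when the average rate is lam."""
    return exp(-lam) * lam**k / factorial(k)

# Probability of exactly 4 heads in 10 fair coin flips
print(round(binomial_pmf(4, 10, 0.5), 4))   # 0.2051

# Probability of exactly 3 arrivals when the average is 5 per hour
print(round(poisson_pmf(3, 5), 4))          # 0.1404
```

The Bernoulli case is just `binomial_pmf` with `n = 1`.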

Uniform and Normal Distributions

  • Uniform distribution models equally likely outcomes within a range
    • Discrete uniform: rolling a fair die
    • Continuous uniform: selecting a random point on a line segment
  • Normal (Gaussian) distribution characterized by bell-shaped curve
    • Widely used to model natural phenomena (human height, IQ scores)
    • Defined by mean (μ) and standard deviation (σ)
    • Approximately 68% of data falls within 1σ of mean, 95% within 2σ, 99.7% within 3σ
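The 68-95-99.7 rule above can be checked numerically: for any normal distribution, P(μ − kσ ≤ X ≤ μ + kσ) = erf(k/√2), so the standard library's error function is enough.

```python
from math import erf, sqrt

def normal_within(k):
    """P(mu - k*sigma <= X <= mu + k*sigma) for any normal distribution."""
    return erf(k / sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} sigma: {normal_within(k):.4f}")
# within 1 sigma: 0.6827
# within 2 sigma: 0.9545
# within 3 sigma: 0.9973
```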

Properties and Applications

  • Probability distributions describe likelihood of outcomes
  • Used to calculate probabilities, expected values, and variances
  • Help in decision-making and risk assessment (insurance pricing, quality control)
  • Enable statistical inference and hypothesis testing
  • Facilitate simulation and modeling of complex systems (weather patterns, financial markets)

Modeling Real-World Phenomena with Distributions

Selecting Appropriate Distributions

  • Identify distribution based on random variable characteristics and problem context
  • Binomial for fixed trials with two outcomes (quality control in manufacturing)
  • Poisson for random events over time or space (customer arrivals, website traffic)
  • Normal for continuous variables clustering around mean (human height, test scores)
  • Exponential for waiting times between Poisson events (time between earthquakes)
  • Uniform for equally likely outcomes (random number generation, simple games)

Calculating Probabilities and Outcomes

  • Use probability distributions to compute outcome likelihoods
    • Binomial: Probability of 3 defective items in a batch of 100
    • Poisson: Likelihood of 5 customers arriving in 10 minutes
    • Normal: Probability of a person being taller than 6 feet
  • Calculate ranges of outcomes in real-world situations
    • Confidence intervals for population parameters
    • Prediction intervals for future observations
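The three example probabilities above can be sketched in Python. The parameter values here (a 2% defect rate, an average of 3 arrivals per 10 minutes, and a mean height of 69 inches with a 3-inch standard deviation) are illustrative assumptions, not values from the text.

```python
from math import comb, erf, exp, factorial, sqrt

# Binomial: exactly 3 defective items in a batch of 100, assuming a 2% defect rate
p_binom = comb(100, 3) * 0.02**3 * 0.98**97
print(f"P(3 defective)   = {p_binom:.4f}")   # 0.1823

# Poisson: exactly 5 arrivals in 10 minutes, assuming 3 arrivals per 10 minutes
p_pois = exp(-3) * 3**5 / factorial(5)
print(f"P(5 arrivals)    = {p_pois:.4f}")    # 0.1008

# Normal: taller than 72 in (6 ft), assuming mean 69 in and sd 3 in
def normal_cdf(x, mu, sigma):
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

print(f"P(height > 72in) = {1 - normal_cdf(72, 69, 3):.4f}")   # 0.1587
```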

Applications in Various Fields

  • Finance: modeling stock prices, risk assessment (Black-Scholes model)
  • Biology: population genetics, epidemiology (spread of diseases)
  • Physics: quantum mechanics, thermodynamics (Maxwell-Boltzmann distribution)
  • Engineering: reliability analysis, signal processing (Gaussian noise)
  • Social sciences: survey analysis, demographic studies (income distribution)

Probability Mass vs Density Functions

Characteristics and Differences

  • Probability Mass Functions (PMFs) used for discrete random variables
  • Probability Density Functions (PDFs) used for continuous random variables
  • PMFs assign probabilities to specific values, sum of all probabilities equals 1
  • PDFs represent relative likelihood, total area under curve equals 1
  • PMF probabilities calculated directly by evaluating function at points
  • PDF probabilities calculated by integrating function over interval
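The sum-vs-integral distinction above can be verified numerically: a PMF's values sum to exactly 1, while a PDF's area (approximated here with a Riemann sum) integrates to 1.

```python
from math import comb, exp, pi, sqrt

# PMF: probabilities at each point sum to 1 (binomial with n=10, p=0.3)
pmf_total = sum(comb(10, k) * 0.3**k * 0.7**(10 - k) for k in range(11))
print(round(pmf_total, 6))   # 1.0

# PDF: area under the curve equals 1 (standard normal, Riemann sum over [-6, 6])
def std_normal_pdf(x):
    """Relative likelihood at x; note this is a density, not a probability."""
    return exp(-x * x / 2) / sqrt(2 * pi)

dx = 0.001
area = sum(std_normal_pdf(-6 + i * dx) * dx for i in range(12000))
print(round(area, 4))        # 1.0
```

Evaluating `std_normal_pdf` at a point gives a density, not a probability; only the integral over an interval is a probability.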

Representation and Visualization

  • PMFs typically represented as bar graphs or point plots
    • Binomial distribution: bar graph showing probabilities for each number of successes
    • Poisson distribution: point plot of event probabilities
  • PDFs represented as continuous curves
    • Normal distribution: bell-shaped curve
    • Exponential distribution: decreasing curve starting at its maximum on the y-axis

Cumulative Distribution Functions

  • Cumulative distribution function (CDF) can be derived from either a PMF or a PDF
  • CDF represents probability of random variable being less than or equal to given value
  • For discrete distributions, CDF is step function
  • For continuous distributions, CDF is smooth curve
  • Relationship between PMF/PDF and CDF
    • Discrete: CDF(x) = sum of PMF values up to and including x
    • Continuous: CDF(x) = integral of PDF from negative infinity to x
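The discrete relationship above is just a running sum of PMF values. A minimal sketch with a fair six-sided die, using exact fractions to avoid rounding:

```python
from fractions import Fraction

# PMF of a fair six-sided die: each face has probability 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def die_cdf(x):
    """P(X <= x): sum the PMF over all values up to and including x."""
    return sum(p for value, p in pmf.items() if value <= x)

print(die_cdf(3))     # 1/2  (P(X <= 3) = 3/6)
print(die_cdf(2.5))   # 1/3  (same as die_cdf(2): the CDF is a step function)
```

Evaluating the CDF between integers returns the same value as at the integer below it, which is exactly the step-function behavior described above.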

Calculating Probabilities, Expected Values, and Variances

Probability Calculations

  • Compute probabilities using appropriate distribution function (PMF or PDF)
    • Discrete: P(X = x) = PMF(x)
    • Continuous: P(a ≤ X ≤ b) = ∫[a to b] PDF(x) dx
  • Calculate cumulative probabilities using CDF
    • P(X ≤ x) = CDF(x)
    • P(X > x) = 1 - CDF(x)
  • Examples:
    • Binomial: Probability of at least 7 successes in 10 trials with p = 0.6
    • Normal: Probability of a randomly selected person being between 5'8" and 6'2" tall
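The binomial example above works directly from the CDF identities: "at least 7 successes" is the complement of "at most 6", so P(X ≥ 7) = 1 − CDF(6). A standard-library sketch:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for a binomial random variable."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binom_cdf(x, n, p):
    """P(X <= x): cumulative sum of PMF values up to and including x."""
    return sum(binom_pmf(k, n, p) for k in range(x + 1))

# P(at least 7 successes in 10 trials with p = 0.6) = 1 - P(X <= 6)
print(round(1 - binom_cdf(6, 10, 0.6), 4))   # 0.3823
```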

Expected Values and Variances

  • Expected value (mean) represents long-term average of random variable
    • Discrete: E(X) = Σ x * PMF(x)
    • Continuous: E(X) = ∫ x * PDF(x) dx
  • Variance measures spread or dispersion of random variable
    • Var(X) = E[(X - μ)²] = E(X²) - [E(X)]²
  • Standard deviation is square root of variance
  • Law of the Unconscious Statistician (LOTUS) for functions of random variables
    • E[g(X)] = Σ g(x) * PMF(x) for discrete
    • E[g(X)] = ∫ g(x) * PDF(x) dx for continuous
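The discrete formulas above can be sketched for a fair six-sided die, including the shortcut Var(X) = E(X²) − [E(X)]², where E(X²) comes from LOTUS with g(x) = x²:

```python
# Discrete expected value, variance, and LOTUS for a fair six-sided die
pmf = {x: 1/6 for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())               # E(X) = 3.5
var = sum((x - mean)**2 * p for x, p in pmf.items())    # Var(X) = 35/12
ex2 = sum(x**2 * p for x, p in pmf.items())             # LOTUS with g(x) = x^2

print(mean, round(var, 4))           # 3.5 2.9167
print(round(ex2 - mean**2, 4))       # 2.9167  (same variance via the shortcut)
```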

Advanced Concepts

  • Moment-generating functions derive moments of probability distributions
    • M(t) = E[e^(tX)]
    • First derivative at t=0 gives mean, second derivative at t=0 gives E(X²), from which variance follows as E(X²) - [E(X)]²
  • Linearity of expectation for linear combinations of random variables
    • E(aX + bY) = aE(X) + bE(Y)
  • Examples:
    • Calculate expected value and variance of number of heads in 20 coin flips
    • Determine mean and standard deviation of waiting time between bus arrivals
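The first example above can be checked by brute force: summing over the full Binomial(20, 0.5) PMF reproduces the closed-form answers np = 10 and np(1 − p) = 5.

```python
from math import comb, sqrt

# Number of heads in 20 fair coin flips: X ~ Binomial(n=20, p=0.5)
n, p = 20, 0.5
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())
var = sum(k**2 * q for k, q in pmf.items()) - mean**2

print(round(mean, 4), round(var, 4))   # 10.0 5.0  (matches np and np(1-p))
print(round(sqrt(var), 4))             # 2.2361  (standard deviation)
```

The same mean also follows from linearity of expectation: each flip contributes E = 0.5, and twenty of them sum to 10 without any PMF computation.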
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.