Engineering Probability Unit 4 – Discrete Random Variables & Distributions

Discrete random variables are fundamental to probability theory, describing outcomes that can be counted. This unit explores their types, including Bernoulli, binomial, geometric, Poisson, and hypergeometric, each with unique characteristics and applications.
The unit covers probability mass functions, cumulative distribution functions, expected values, and variances. These tools help analyze discrete random variables, calculate probabilities, and make predictions in various real-world scenarios, from quality control to customer arrivals.
Key Concepts and Definitions
Discrete random variables take on a countable number of distinct values
Sample space is the set of all possible outcomes of a random experiment
Events are subsets of the sample space
Probability is a measure of the likelihood of an event occurring
Random variables assign numerical values to outcomes in a sample space
Discrete random variables have a probability distribution that specifies the probability of each possible value
Independence means the occurrence of one event does not affect the probability of another event
Types of Discrete Random Variables
Bernoulli random variables have only two possible outcomes (success or failure)
Binomial random variables count the number of successes in a fixed number of independent Bernoulli trials
Characterized by the number of trials n and the probability of success p
Geometric random variables count the number of trials until the first success occurs
Poisson random variables count the number of events occurring in a fixed interval of time or space
Characterized by the average rate of occurrence λ
Hypergeometric random variables count the number of successes in a fixed number of draws without replacement from a finite population
Probability Mass Functions (PMF)
PMF, denoted P(X = x), gives the probability that a discrete random variable X takes on a specific value x
PMF satisfies two conditions:
P(X = x) ≥ 0 for all x
Σ_x P(X = x) = 1 (the probabilities over all possible values sum to 1)
PMF can be represented as a table, graph, or formula
PMF allows calculation of probabilities for specific values or ranges of values
Example: For a fair six-sided die, the PMF is P(X = x) = 1/6 for x = 1, 2, 3, 4, 5, 6
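The two PMF conditions are easy to verify numerically. A minimal Python sketch for the die example above, where the dict `pmf` is an illustrative representation of the PMF table:

```python
from fractions import Fraction

# PMF of a fair six-sided die: P(X = x) = 1/6 for x = 1..6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Check the two PMF conditions.
assert all(p >= 0 for p in pmf.values())   # P(X = x) >= 0 for all x
assert sum(pmf.values()) == 1              # probabilities sum to 1

# Probability of a specific value and of a range of values.
p_three = pmf[3]                           # P(X = 3) = 1/6
p_low = sum(pmf[x] for x in (1, 2))        # P(X <= 2) = 1/3
print(p_three, p_low)
```

Using exact fractions avoids floating-point round-off in the sum-to-one check.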
Cumulative Distribution Functions (CDF)
CDF, denoted F(x) = P(X ≤ x), gives the probability that a random variable X takes on a value less than or equal to x
CDF is a non-decreasing function with values between 0 and 1
CDF can be obtained by summing the PMF values up to and including x
F(x) = Σ_{t ≤ x} P(X = t)
CDF allows calculation of probabilities for intervals and ranges of values
Example: For a Bernoulli random variable with p = 0.7, the CDF is F(0) = 0.3 and F(1) = 1
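The summation that builds the CDF from the PMF is easy to mechanize. A small sketch, assuming the PMF is stored as a dict; `cdf_from_pmf` is a hypothetical helper name:

```python
def cdf_from_pmf(pmf):
    """Build F(x) = sum_{t <= x} P(X = t) by accumulating PMF values in order."""
    total = 0.0
    cdf = {}
    for x in sorted(pmf):
        total += pmf[x]
        cdf[x] = total
    return cdf

# Bernoulli(p = 0.7), matching the example above.
bernoulli_pmf = {0: 0.3, 1: 0.7}
F = cdf_from_pmf(bernoulli_pmf)
print(F)  # F(0) = 0.3, F(1) = 1.0
```

Note the running total makes F non-decreasing by construction, ending at 1.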
Expected Value and Variance
Expected value (mean) of a discrete random variable X is the weighted average of all possible values
E(X) = Σ_x x · P(X = x)
Variance measures the spread or dispersion of a random variable around its expected value
Var(X) = E[(X − E(X))²] = E(X²) − [E(X)]²
Standard deviation is the square root of the variance
Expected value and variance provide important summary statistics for a random variable
Linearity of expectation: E(aX + bY) = aE(X) + bE(Y) for constants a and b
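The expectation and variance formulas above translate directly into code. A sketch reusing the fair-die PMF from the earlier example; the helper names are illustrative:

```python
from fractions import Fraction

def expectation(pmf):
    """E(X) = sum_x x * P(X = x)."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Var(X) = E(X^2) - [E(X)]^2 (shortcut form of the variance formula)."""
    ex = expectation(pmf)
    ex2 = sum(x * x * p for x, p in pmf.items())
    return ex2 - ex * ex

die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))  # 7/2
print(variance(die))     # 35/12
```

For the die: E(X) = 21/6 = 7/2 and Var(X) = 91/6 − (7/2)² = 35/12, matching the printed values.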
Common Discrete Distributions
Bernoulli distribution: Models a single trial with two possible outcomes (success with probability p, failure with probability 1 − p)
Binomial distribution: Models the number of successes in a fixed number of independent Bernoulli trials
PMF: P(X = k) = C(n, k) p^k (1 − p)^(n−k) for k = 0, 1, ..., n, where C(n, k) is the binomial coefficient
Geometric distribution: Models the number of trials until the first success occurs
PMF: P(X = k) = (1 − p)^(k−1) p for k = 1, 2, ...
Poisson distribution: Models the number of events occurring in a fixed interval of time or space
PMF: P(X = k) = e^(−λ) λ^k / k! for k = 0, 1, 2, ...
Hypergeometric distribution: Models the number of successes in a fixed number of draws without replacement from a finite population
PMF: P(X = k) = C(K, k) C(N − K, n − k) / C(N, n) for max(0, n − (N − K)) ≤ k ≤ min(n, K), where N is the population size, K the number of successes in the population, and n the number of draws
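Each of the PMF formulas above can be written directly with Python's standard library (the function names here are illustrative, not from any particular package):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """C(n, k) p^k (1-p)^(n-k) for k = 0..n."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def geometric_pmf(k, p):
    """(1-p)^(k-1) p for k = 1, 2, ..."""
    return (1 - p)**(k - 1) * p

def poisson_pmf(k, lam):
    """e^(-lam) lam^k / k! for k = 0, 1, 2, ..."""
    return exp(-lam) * lam**k / factorial(k)

def hypergeometric_pmf(k, N, K, n):
    """C(K, k) C(N-K, n-k) / C(N, n) over the valid range of k."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Spot check: each PMF sums to 1 over its support (Poisson truncated at 100).
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
assert abs(sum(poisson_pmf(k, 4.0) for k in range(100)) - 1) < 1e-12
```

`math.comb` (Python 3.8+) computes the binomial coefficient C(n, k) exactly.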
Properties and Applications
Discrete random variables have a countable number of possible values
Discrete distributions are characterized by their PMF, CDF, expected value, and variance
Discrete distributions are used to model various real-world phenomena:
Bernoulli: Success/failure of a single trial (coin flip, defective item)
Binomial: Number of successes in a fixed number of trials (defective items in a batch, successful free throws)
Geometric: Number of trials until the first success (number of attempts to win a game)
Poisson: Number of events in a fixed interval (number of customers arriving per hour, number of defects per unit area)
Hypergeometric: Number of successes in a fixed number of draws without replacement (defective items in a sample from a lot)
Discrete distributions can be used to calculate probabilities, make predictions, and inform decision-making
Problem-Solving Techniques
Identify the type of discrete random variable and its parameters
Write the PMF or CDF based on the given information
Use the PMF or CDF to calculate probabilities for specific values or ranges of values
P(X = x) for a specific value x
P(a ≤ X ≤ b) for an interval [a, b]
Calculate the expected value and variance using the formulas or properties of the specific distribution
Apply the linearity of expectation to solve problems involving multiple random variables
Use the Poisson approximation to the binomial distribution when n is large and p is small, setting λ = np
Recognize when to use each type of discrete distribution based on the problem context and assumptions
Interpret the results in the context of the problem and make appropriate conclusions or decisions
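The Poisson approximation in the techniques above can be checked numerically. A sketch with illustrative parameters n = 1000 and p = 0.002 (so λ = np = 2), comparing the exact binomial PMF with the Poisson PMF:

```python
from math import comb, exp, factorial

# Binomial(n, p) ≈ Poisson(λ = n·p) when n is large and p is small.
n, p = 1000, 0.002
lam = n * p  # λ = 2.0

for k in range(5):
    exact = comb(n, k) * p**k * (1 - p)**(n - k)   # binomial PMF
    approx = exp(-lam) * lam**k / factorial(k)     # Poisson PMF
    print(f"k={k}: binomial={exact:.6f}, poisson={approx:.6f}")
```

For these parameters the two PMFs agree to about three decimal places, which is why the approximation is useful when n makes the binomial coefficient expensive to evaluate.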