
Data Interview Question

Estimating Coin's Head Probability via Maximum Likelihood


Solution & Explanation

The problem involves estimating the probability of a biased coin landing on heads using Maximum Likelihood Estimation (MLE). Here's a breakdown of the solution:

Step 1: Understanding the Problem

  • N: Total number of coin flips.
  • K: Number of times the coin landed on heads.
  • p: Probability of the coin landing on heads (unknown).

Step 2: Setting Up the Likelihood Function

The likelihood function for observing K heads in N flips, assuming each flip is independent and identically distributed, follows a binomial distribution:

L(p \mid K, N) = \binom{N}{K} p^K (1-p)^{N-K}

Where:

  • \binom{N}{K} is the binomial coefficient, which is constant with respect to p.
  • p^K: probability of observing K heads.
  • (1-p)^{N-K}: probability of observing N-K tails.
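As a quick sketch, the likelihood above can be evaluated directly in Python. The counts K = 7 and N = 10 below are arbitrary example values chosen for illustration:

```python
from math import comb

def likelihood(p: float, k: int, n: int) -> float:
    """Binomial likelihood of observing k heads in n independent flips
    when the head probability is p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Example: 7 heads in 10 flips, evaluated at a few candidate values of p.
# The likelihood peaks near p = 0.7, foreshadowing the MLE result K/N.
for p in (0.5, 0.7, 0.9):
    print(f"L({p}) = {likelihood(p, 7, 10):.4f}")
```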

Step 3: Converting to Log-Likelihood

To simplify differentiation, we take the natural logarithm of the likelihood function:

\log L(p) = \log \binom{N}{K} + K \log(p) + (N-K) \log(1-p)

The binomial coefficient term is constant with respect to p, so it can be dropped when maximizing.

Step 4: Differentiating the Log-Likelihood

Differentiate the log-likelihood function with respect to p:

\frac{d}{dp} \log L(p) = \frac{K}{p} - \frac{N-K}{1-p}
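The analytic derivative above can be sanity-checked against a central finite difference. This is a minimal sketch; K = 7, N = 10, and the evaluation point p = 0.4 are assumed example values:

```python
from math import log

K, N = 7, 10  # example counts: 7 heads in 10 flips

def log_likelihood(p: float) -> float:
    # Log-likelihood with the constant binomial-coefficient term dropped
    return K * log(p) + (N - K) * log(1 - p)

def d_log_likelihood(p: float) -> float:
    # Closed-form derivative: K/p - (N-K)/(1-p)
    return K / p - (N - K) / (1 - p)

# Central finite difference agrees with the closed form
p, h = 0.4, 1e-6
numeric = (log_likelihood(p + h) - log_likelihood(p - h)) / (2 * h)
print(abs(numeric - d_log_likelihood(p)) < 1e-4)  # True
```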

Step 5: Setting the Derivative to Zero

To find the maximum likelihood estimate, set the derivative equal to zero:

\frac{K}{p} - \frac{N-K}{1-p} = 0

Step 6: Solving for p

Rearrange and solve the equation:

\frac{K}{p} = \frac{N-K}{1-p}

K(1-p) = p(N-K)

K - Kp = pN - pK

K = pN

p = \frac{K}{N}

This result makes intuitive sense: the probability of heads is estimated by the proportion of heads observed in the total number of flips.
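The closed-form result can also be confirmed by brute force: on a fine grid of candidate probabilities, none beats K/N. The counts below are arbitrary example values:

```python
from math import log

def log_likelihood(p: float, k: int, n: int) -> float:
    # Binomial log-likelihood, constant term dropped
    return k * log(p) + (n - k) * log(1 - p)

k, n = 7, 10   # example data: 7 heads in 10 flips
p_hat = k / n  # the closed-form MLE

# Brute-force check: the grid maximizer matches the closed-form estimate
grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=lambda p: log_likelihood(p, k, n))
print(best == p_hat)  # True
```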

Conclusion

The Maximum Likelihood Estimator (MLE) for the probability of a biased coin landing on heads is simply the ratio of the number of heads observed to the total number of flips, p = \frac{K}{N}. This estimator maximizes the likelihood of observing the given data under the assumption of a binomial distribution.