\(% Differentiation % https://tex.stackexchange.com/a/60546/ \newcommand{\Diff}{\mathop{}\!\mathrm{d}} \newcommand{\DiffFrac}[2]{\frac{\Diff #1}{\Diff #2}} \newcommand{\DiffOp}[1]{\frac{\Diff}{\Diff #1}} \newcommand{\Ndiff}[1]{\mathop{}\!\mathrm{d}^{#1}} \newcommand{\NdiffFrac}[3]{\frac{\Ndiff{#1} #2}{\Diff {#3}^{#1}}} \newcommand{\NdiffOp}[2]{\frac{\Ndiff{#1}}{\Diff {#2}^{#1}}} % Evaluation \newcommand{\LEvalAt}[2]{\left.#1\right\vert_{#2}} \newcommand{\SqEvalAt}[2]{\left[#1\right]_{#2}} % Epsilon & Phi \renewcommand{\epsilon}{\varepsilon} \renewcommand{\phi}{\varphi} % Sets \newcommand{\NN}{\mathbb{N}} \newcommand{\ZZ}{\mathbb{Z}} \newcommand{\QQ}{\mathbb{Q}} \newcommand{\RR}{\mathbb{R}} \newcommand{\CC}{\mathbb{C}} \newcommand{\PP}{\mathbb{P}} \renewcommand{\emptyset}{\varnothing} % Probability \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Corr}{Corr} \DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Expt}{E} \DeclareMathOperator{\Prob}{P} % Distribution \DeclareMathOperator{\Binomial}{B} \DeclareMathOperator{\Poisson}{Po} \DeclareMathOperator{\Normal}{N} \DeclareMathOperator{\Exponential}{Exp} \DeclareMathOperator{\Geometric}{Geo} \DeclareMathOperator{\Uniform}{U} % Complex Numbers \DeclareMathOperator{\im}{Im} \DeclareMathOperator{\re}{Re} % Missing Trigonometric & Hyperbolic functions \DeclareMathOperator{\arccot}{arccot} \DeclareMathOperator{\arcsec}{arcsec} \DeclareMathOperator{\arccsc}{arccsc} \DeclareMathOperator{\sech}{sech} \DeclareMathOperator{\csch}{csch} \DeclareMathOperator{\arsinh}{arsinh} \DeclareMathOperator{\arcosh}{arcosh} \DeclareMathOperator{\artanh}{artanh} \DeclareMathOperator{\arcoth}{arcoth} \DeclareMathOperator{\arsech}{arsech} \DeclareMathOperator{\arcsch}{arcsch} % UK Notation \DeclareMathOperator{\cosec}{cosec} \DeclareMathOperator{\arccosec}{arccosec} \DeclareMathOperator{\cosech}{cosech} \DeclareMathOperator{\arcosech}{arcosech} % Paired Delimiters \DeclarePairedDelimiter{\ceil}{\lceil}{\rceil} \DeclarePairedDelimiter{\floor}{\lfloor}{\rfloor} \DeclarePairedDelimiter{\abs}{\lvert}{\rvert} \DeclarePairedDelimiter{\ang}{\langle}{\rangle} % Vectors \newcommand{\vect}[1]{\mathbf{#1}} \newcommand{\bvect}[1]{\overrightarrow{#1}} % https://tex.stackexchange.com/a/28213 % \DeclareMathSymbol{\ii}{\mathalpha}{letters}{"10} % \DeclareMathSymbol{\jj}{\mathalpha}{letters}{"11} % \newcommand{\ihat}{\vect{\hat{\ii}}} % \newcommand{\jhat}{\vect{\hat{\jj}}} \newcommand{\ihat}{\textbf{\^{ı}}} \newcommand{\jhat}{\textbf{\^{ȷ}}} \newcommand{\khat}{\vect{\hat{k}}} % Other Functions \DeclareMathOperator{\sgn}{sgn} \DeclareMathOperator{\tr}{tr} % Other Math Symbols \DeclareMathOperator{\modulo}{mod} \newcommand{\divides}{\mid} \newcommand{\notdivides}{\nmid} \newcommand{\LHS}{\text{LHS}} \newcommand{\RHS}{\text{RHS}} \newcommand{\degree}{^{\circ}}\)

2017.2.12 Question 12

  1. Let \(X \sim \Poisson (\lambda )\) and \(Y \sim \Poisson (\mu )\) be independent. Both take non-negative integer values, so for any non-negative integer \(r\), we have \begin {align*} \Prob (X + Y = r) & = \sum _{t = 0}^{r} \Prob (X = t, Y = r - t) \\ & = \sum _{t = 0}^{r} \Prob (X = t) \Prob (Y = r - t) \\ & = \sum _{t = 0}^{r} \frac {\lambda ^t}{e^\lambda \cdot t!} \cdot \frac {\mu ^{r - t}}{e^\mu \cdot (r - t)!} \\ & = \frac {1}{e^{\lambda + \mu }} \cdot \sum _{t = 0}^{r} \frac {\lambda ^t \mu ^{r - t}}{t! (r - t)!} \\ & = \frac {1}{e^{\lambda + \mu } r!} \cdot \sum _{t = 0}^{r} \frac {r! \lambda ^t \mu ^{r - t}}{t! (r - t)!} \\ & = \frac {1}{e^{\lambda + \mu } r!} \cdot \sum _{t = 0}^{r} \binom {r}{t} \lambda ^t \mu ^{r - t} \\ & = \frac {1}{e^{\lambda + \mu } r!} (\lambda + \mu )^r \\ & = \frac {(\lambda + \mu )^r}{e^{\lambda + \mu } r!}, \end {align*}

    where the penultimate equality is the binomial theorem. This is precisely the probability mass function of \(\Poisson (\lambda + \mu )\), and hence \(X + Y \sim \Poisson (\lambda + \mu )\).
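    The algebra can be sanity-checked numerically. Below is a minimal Monte Carlo sketch in Python (assuming NumPy is available; the rates \(\lambda = 2\), \(\mu = 3\) and the sample size are arbitrary illustrative choices): it samples independent Poisson pairs and compares the empirical distribution of \(X + Y\) with the \(\Poisson (\lambda + \mu )\) probability mass function.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 3.0, 200_000  # arbitrary illustrative rates and sample size

# Sample independent X ~ Po(lam) and Y ~ Po(mu), and form the sum X + Y
samples = rng.poisson(lam, n) + rng.poisson(mu, n)

# Compare the empirical pmf of X + Y with the Po(lam + mu) pmf for small r
for r in range(8):
    empirical = np.mean(samples == r)
    theoretical = (lam + mu) ** r * exp(-(lam + mu)) / factorial(r)
    print(f"r={r}: empirical={empirical:.4f}  Po(lam+mu) pmf={theoretical:.4f}")
```

    The two columns should agree to within Monte Carlo error, which is of order \(1/\sqrt{n}\).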

  2. We consider the conditional distribution of the number of fish Adam has caught, given the total number caught. Given \(X + Y = k\), the only values that \(X\) can take are \(0, 1, \dots , k\), so for \(x = 0, 1, \dots , k\), using part 1 for \(\Prob (X + Y = k)\), we have \begin {align*} \Prob (X = x \mid X + Y = k) & = \frac {\Prob (X = x, X + Y = k)}{\Prob (X + Y = k)} \\ & = \frac {\Prob (X = x, Y = k - x)}{\Prob (X + Y = k)} \\ & = \frac {\Prob (X = x) \cdot \Prob (Y = k - x)}{\Prob (X + Y = k)} \\ & = \frac {\frac {\lambda ^x}{e^{\lambda } x!} \cdot \frac {\mu ^{k - x}}{e^{\mu } (k - x)!}}{\frac {(\lambda + \mu )^{k}}{e^{\lambda + \mu } k!}} \\ & = \frac {\lambda ^x \mu ^{k - x}}{(\lambda + \mu )^{k}} \cdot \frac {k!}{x! (k - x)!} \\ & = \binom {k}{x} \cdot \left (\frac {\lambda }{\lambda + \mu }\right )^{x} \cdot \left (\frac {\mu }{\lambda + \mu }\right )^{k - x}. \end {align*}

    This is precisely the probability mass function of the binomial distribution \(\Binomial \left (k, \frac {\lambda }{\lambda + \mu }\right )\), so \[ \left (X \mid X + Y = k\right ) \sim \Binomial \left (k, \frac {\lambda }{\lambda + \mu }\right ). \]
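    The same check works for the conditional distribution: restricting the simulated pairs to those with \(X + Y = k\) should reproduce the \(\Binomial \left (k, \frac {\lambda }{\lambda + \mu }\right )\) probability mass function. A minimal sketch, with the same arbitrary illustrative rates and \(k = 4\):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
lam, mu, k, n = 2.0, 3.0, 4, 500_000  # arbitrary illustrative values

x = rng.poisson(lam, n)
y = rng.poisson(mu, n)
conditioned = x[x + y == k]  # keep only the runs in which X + Y = k

p = lam / (lam + mu)
for value in range(k + 1):
    empirical = np.mean(conditioned == value)
    theoretical = comb(k, value) * p**value * (1 - p) ** (k - value)
    print(f"x={value}: empirical={empirical:.4f}  B(k, p) pmf={theoretical:.4f}")
```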

  3. Adam catching the first fish corresponds to the event \(X = 1\) given that \(X + Y = 1\). Hence, by part 2 with \(k = 1\), the probability is \[ \Prob (X = 1 \mid X + Y = 1) = \binom {1}{1} \cdot \left (\frac {\lambda }{\lambda + \mu }\right )^{1} \cdot \left (\frac {\mu }{\lambda + \mu }\right )^{1 - 1} = \frac {\lambda }{\lambda + \mu }. \]
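    Equivalently, the waiting times to Adam's and Eve's first catches are independent exponential random variables with rates \(\lambda \) and \(\mu \), and \(\frac {\lambda }{\lambda + \mu }\) is the standard probability that the rate-\(\lambda \) variable is the smaller. A quick simulation of this race (rates again arbitrary illustrative values) agrees:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, mu, n = 2.0, 3.0, 500_000  # arbitrary illustrative rates

# Times to each person's first catch; note that NumPy parameterises the
# exponential distribution by its scale (mean), i.e. 1/rate.
t_adam = rng.exponential(1 / lam, n)
t_eve = rng.exponential(1 / mu, n)

print("empirical P(Adam catches first):", np.mean(t_adam < t_eve))
print("lam / (lam + mu):               ", lam / (lam + mu))
```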
  4. With probability \(\frac {\lambda }{\lambda + \mu }\), Adam catches the first fish. In that case, the total waiting time is the time until the first fish is caught (with expected value \(\frac {1}{\lambda + \mu }\), since catches occur at combined rate \(\lambda + \mu \)), plus the further wait for Eve’s first fish (with expected value \(\frac {1}{\mu }\), by the memorylessness of the exponential distribution). The same reasoning applies with \(\lambda \) and \(\mu \) exchanged if Eve catches the first fish.

    Hence, the expected time is \begin {align*} & \phantom {=} \frac {\lambda }{\lambda + \mu } \cdot \left (\frac {1}{\lambda + \mu } + \frac {1}{\mu }\right ) + \frac {\mu }{\lambda + \mu } \cdot \left (\frac {1}{\lambda + \mu } + \frac {1}{\lambda }\right ) \\ & = \frac {1}{\lambda + \mu } \cdot \left (\frac {\lambda }{\lambda + \mu } + \frac {\lambda }{\mu } + \frac {\mu }{\lambda + \mu } + \frac {\mu }{\lambda }\right ) \\ & = \frac {1}{\lambda + \mu } \cdot \left (1 + \frac {\lambda ^2 + \mu ^2}{\lambda \mu }\right ) \\ & = \frac {\lambda ^2 + \lambda \mu + \mu ^2}{\lambda \mu (\lambda + \mu )}. \end {align*}
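    As a check, this is the expected value of \(\max (T_A, T_E)\) for independent \(T_A \sim \Exponential (\lambda )\) and \(T_E \sim \Exponential (\mu )\), the times of the two first catches, and a short simulation (rates again arbitrary illustrative values) reproduces the formula. With \(\lambda = 2\) and \(\mu = 3\), both quantities should be close to \(19/30 \approx 0.6333\).

```python
import numpy as np

rng = np.random.default_rng(3)
lam, mu, n = 2.0, 3.0, 1_000_000  # arbitrary illustrative values

# Both have caught a fish once the later of the two first catches occurs,
# i.e. at time max(t_adam, t_eve).
t_adam = rng.exponential(1 / lam, n)
t_eve = rng.exponential(1 / mu, n)

empirical = np.mean(np.maximum(t_adam, t_eve))
formula = (lam**2 + lam * mu + mu**2) / (lam * mu * (lam + mu))
print(f"empirical: {empirical:.4f}  formula: {formula:.4f}")
```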