(a) We compare the truth table with the indicators.
$$
\begin{array}{ccc}
E_1 & E_2 & A \\
\hline
\text{T} & \text{T} & \text{T} \\
\text{T} & \text{F} & \text{T} \\
\text{F} & \text{T} & \text{T} \\
\text{F} & \text{F} & \text{F}
\end{array}
\qquad\qquad
\begin{array}{ccc}
I_1 & I_2 & I_A \\
\hline
1 & 1 & 1 \\
1 & 0... | Let $E_1, E_2, E_3$ be events. Let $I_1, I_2, I_3$ be the corresponding indicators such that $I_1 = 1$ if $E_1$ occurs and $I_1 = 0$ otherwise.
(a) Let $I_A = 1 - (1 - I_1)(1 - I_2)$. Verify that $I_A$ is the indicator for the event $A$, where $A = (E_1 \lor E_2)$ (that is, "$E_1$ or $E_2$"), and show that
$$
\operat... |
Let $R$ be “rain”, $\bar{R}$ be “dry”, $P$ be “rain predicted”
We require $\operatorname*{Pr}(R\mid P)$ . By Bayes’ theorem, this is
$$
\operatorname{Pr}(R\mid P)=\frac{\operatorname{Pr}(R)\operatorname{Pr}(P\mid R)}{\operatorname{Pr}(R)\operatorname{Pr}(P\mid R)+\operatorname{P... | In a certain place it rains on one third of the days. The local evening newspaper attempts to predict whether or not it will rain the following day. Three quarters of rainy days and three fifths of dry days are correctly predicted by the previous evening’s paper. Given that this evening’s paper predicts rain, what is t... |
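As a quick numerical check of the Bayes' theorem step (a Python sketch; all figures come from the problem statement: rain on one third of days, 3/4 of rainy days and 3/5 of dry days correctly predicted):

```python
# Bayes' theorem for Pr(R | P): rain given that rain is predicted.
p_rain = 1 / 3                # Pr(R)
p_pred_rain = 3 / 4           # Pr(P | R): rainy day correctly predicted
p_pred_dry = 1 - 3 / 5        # Pr(P | dry): dry day wrongly predicted as rain

num = p_rain * p_pred_rain
post = num / (num + (1 - p_rain) * p_pred_dry)
print(post)  # 15/31, approximately 0.484
```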
Let $D$ be "1st defective item is the 13th to be made."
We require $\operatorname{Pr}(X = i \mid D)$ for $i = 0, \dots, 5$.
Now,
$$
\operatorname{Pr}(D \mid X = i) = \left(1 - \frac{i}{100}\right)^{12} \left(\frac{i}{100}\right)
$$
and
$$
\operatorname{Pr}(X = i) = \frac{1}{6}.
$$
By Bayes' theorem,
$$
\oper... | A machine is built to make mass-produced items. Each item made by the machine has a probability $p$ of being defective. Given the value of $p$ , the items are independent of each other. Because of the way in which the machines are made, $p$ could take one of several values. In fact $p=X/100$ where $X$ has a discrete un... |
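The posterior can be evaluated exactly (a sketch, assuming only the uniform prior $\operatorname{Pr}(X=i)=1/6$ and the likelihood $\operatorname{Pr}(D\mid X=i)$ given above):

```python
from fractions import Fraction

# Pr(X = i | D): exact posterior over the six possible values of X,
# where D = "the first defective item is the 13th to be made".
prior = Fraction(1, 6)
post = {}
for i in range(6):
    p = Fraction(i, 100)                   # p = X/100
    post[i] = prior * (1 - p) ** 12 * p    # prior * Pr(D | X = i)
total = sum(post.values())
post = {i: float(v / total) for i, v in post.items()}
```

Note that $i=0$ gives posterior probability zero, since a machine with $p=0$ can never produce a defective item.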
Let $D$ be “2 out of 5 imperfect.” Let $M$ be “machine defective” and let $\bar{M}$ be “machine not defective.”
We require $\operatorname*{Pr}(M\mid D)$ .
Now
$$
\operatorname{Pr}(D\mid M)=\binom{5}{2}\,0.2^{2}\,0.8^{3}
$$
and
$$
\mathrm{Pr}(D\mid\bar{M})=\left(\begin{array}... | There are five machines in a factory. Of these machines, three are working properly and two are defective. Machines which are working properly produce articles each of which has independently a probability of 0.1 of being imperfect. For the defective machines this probability is 0.2.
A machine is chosen at random an... |
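A numerical check (Python sketch; the prior $\operatorname{Pr}(M)=2/5$ follows from 2 defective machines out of 5, and the imperfection probabilities 0.2 and 0.1 are from the problem):

```python
from math import comb

# Pr(M | D): chosen machine is defective given 2 imperfect articles out of 5.
lik_def = comb(5, 2) * 0.2**2 * 0.8**3   # Pr(D | M),     defective machine
lik_ok = comb(5, 2) * 0.1**2 * 0.9**3    # Pr(D | M-bar), working machine
post = (2 / 5) * lik_def / ((2 / 5) * lik_def + (3 / 5) * lik_ok)
```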
Prior probabilities: $\operatorname{Pr}(A)=0.6$, $\operatorname{Pr}(B)=0.2$, $\operatorname{Pr}(C)=0.2$. Likelihoods: $\operatorname{Pr}(6\mid A)=1/6$, $\operatorname{Pr}(6\mid B)=0.8$, $\operatorname{Pr}(6\mid C)=0.04$. Prior $\times$ likelihood:
$$
\begin{array}{r}{\operatorname*{Pr}(A)\operatorname*{Pr}(6\mid A)=0.... | A dishonest gambler has a box containing 10 dice which all look the same. However there are actually three types of dice.
There are 6 dice of type $A$ which are fair dice with $\operatorname*{Pr}(6\mid A)=1/6$ (where $\operatorname*{Pr}(6\mid A)$ is the probability of getting a 6 in a throw of a type $A$ die). There... |
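The prior $\times$ likelihood step can be checked numerically (Python sketch using the priors and likelihoods listed in the solution):

```python
# Posterior probabilities of die type given a single throw showing a 6.
priors = {"A": 0.6, "B": 0.2, "C": 0.2}
lik = {"A": 1 / 6, "B": 0.8, "C": 0.04}
joint = {t: priors[t] * lik[t] for t in priors}   # prior * likelihood
total = sum(joint.values())                       # Pr(6)
post = {t: joint[t] / total for t in priors}
```

Although type A dice are the most common a priori, a single 6 makes type B the most probable.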
Prior $\times$ likelihood:
$$
\operatorname{Pr}(x)\operatorname{Pr}(y\mid x)=\binom{5}{x}0.6^{x}\,0.4^{5-x}\binom{x}{y}0.3^{y}\,0.7^{x-y}=\frac{5!}{(5-x)!\,y!\,(x-y)!}\left(\frac{0.6\times0.7}{0.4}\r... | In a forest area of Northern Europe there may be wild lynx. At a particular time the number $X$ of lynx can be between 0 and 5 with
$$
\operatorname*{Pr}(X=x)={\left(\begin{array}{l}{5}\\ {x}\end{array}\right)}\,0.6^{x}0.4^{5-x}\quad(x=0,\ldots,5).
$$
A survey is made but the lynx is difficult to spot and, given ... |
Notation:
$M$: Migration started. $\bar{M}$: Migration not started. $W$: No fish in 60 minutes.
Prior: $\mathrm{Pr}(M)=0.4,\ \mathrm{Pr}(\bar{M})=0.6$
(a) Likelihood:
$$
\begin{array}{rcl}\operatorname{Pr}(W\mid M)&=&e^{-60/20}=e^{-3}=0.04979\\ \operatorname{Pr}(W\mid\bar{M})&=&1\end{array}... | A particular species of fish makes an annual migration up a river. On a particular day there is a probability of 0.4 that the migration will start. If it does then an observer will have to wait $T$ minutes before seeing a fish, where $T$ has an exponential distribution with mean 20 (i.e. an exponential(0.05) distributi... |
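Combining the prior and these likelihoods by Bayes' theorem gives the posterior probability that the migration has started; a numerical sketch:

```python
from math import exp

# Pr(M | W): migration has started, given no fish seen in 60 minutes.
lik_m = exp(-60 / 20)   # Pr(W | M): exponential(1/20) waiting time exceeds 60
lik_not = 1.0           # Pr(W | M-bar): no fish possible before migration starts
post = 0.4 * lik_m / (0.4 * lik_m + 0.6 * lik_not)
```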
(a) i.
$$
\begin{array}{rcl}\displaystyle\int_{0}^{\infty}f^{(0)}(\lambda)\,d\lambda&=&\displaystyle k_{0}\left\{\int_{0}^{\infty}e^{-\lambda}\,d\lambda+\int_{0}^{\infty}\lambda e^{-\lambda}\,d\lambda\right\}\\&=&k_{0}\{1+1\}=2k_{0}\end{array}
$$
Hence $k_{0}=1/2... | We are interested in the mean, $\lambda$ , of a Poisson distribution. We have a prior distribution for $\lambda$ with density
$$
f^{(0)}(\lambda)=\left\{\begin{array}{c c}{{0}}&{{(\lambda\leq0)}}\\ {{k_{0}(1+\lambda)e^{-\lambda}}}&{{(\lambda>0)}}\end{array}\right..
$$
(a) i. Find the value of $k_{0}$ .ii. Find th... |
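The value $k_0=1/2$ can be sanity-checked by crude numerical integration (a Python sketch; the step size and upper cutoff are arbitrary choices, not part of the solution):

```python
from math import exp

# Numerical check that the prior integrates to one with k0 = 1/2:
# the integral of (1 + x) * exp(-x) over (0, inf) should equal 2.
h = 1e-3   # grid step; integrand is negligible beyond x = 50
integral = sum((1 + i * h) * exp(-i * h) * h for i in range(1, 50000))
```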
(a) i.
$$
\begin{array}{rcl}\displaystyle\int_{0}^{1}f^{(0)}(\theta)\,d\theta&=&\displaystyle k_{0}\left\{\int_{0}^{1}\theta^{2}(1-\theta)\,d\theta+\int_{0}^{1}\theta(1-\theta)^{2}\,d\theta\right\}\\&=&\displaystyle k_{0}\left\{\frac{\Gamma(3)\Gamma(2)}{\Gamma(5)}+\frac{\Gamma(2)\G... | We are interested in the parameter, $\theta$ , of a binomial $(n,\theta)$ distribution. We have a prior distribution for $\theta$ with density
$$
f^{(0)}(\theta)=\left\{\begin{array}{c c}{k_{0}\{\theta^{2}(1-\theta)+\theta(1-\theta)^{2}\}}&{(0<\theta<1)}\\ {0}&{(\mathrm{otherwise})}\end{array}\right..
$$
(a) i. F... |
(a) i.
$$
\int_{0}^{1}f^{(0)}(\theta)\ d\theta=k_{0}\int_{0}^{1}\theta^{2}(1-\theta)^{3}\ d\theta=k_{0}{\frac{\Gamma(3)\Gamma(4)}{\Gamma(7)}}.
$$
Hence
$$
k_{0}={\frac{\Gamma(7)}{\Gamma(3)\Gamma(4)}}={\frac{6!}{2!3!}}={\frac{6\times5\times4}{2}}={\underline{{60}}}.
$$
ii. Prior mean
$$
\begin{array}{l l ... | We are interested in the parameter $\theta$ , of a binomial $(n,\theta)$ distribution. We have a prior distribution for $\theta$ with density
$$
f^{(0)}(\theta)=\left\{\begin{array}{c c}{k_{0}\theta^{2}(1-\theta)^{3}}&{(0<\theta<1)}\\ {0}&{(\mathrm{otherwise})}\end{array}\right..
$$
(a) i. Find the value of $k_{0... |
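The normalising constant can be checked directly from the factorial form of the Gamma function (Python sketch):

```python
from math import factorial

# Normalising constant of the prior k0 * theta^2 * (1 - theta)^3:
# k0 = Gamma(7) / (Gamma(3) * Gamma(4)) = 6! / (2! * 3!)
k0 = factorial(6) // (factorial(2) * factorial(3))
```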
(a) i. Value of $k_{0}$ :
$$
\int_{0}^{\infty}\lambda^{3}e^{-\lambda}\ d\lambda=\int_{0}^{\infty}\lambda^{4-1}e^{-\lambda}\ d\lambda=\Gamma(4)=3!=6
$$
Hence
$$
k_{0}=\frac{1}{6}.
$$
ii. Prior mean:
$$
\begin{array}{l l l}{\displaystyle\mathrm{E}_{0}(\lambda)}&{=}&{\displaystyle\int_{0}^{\infty... | We are interested in the parameter $\lambda$ of a Poisson(λ) distribution. We have a prior distribution for $\lambda$ with density
$$
f^{(0)}(\lambda)=\left\{\begin{array}{l l}{{0}}&{{(\lambda<0)}}\\ {{k_{0}\lambda^{3}e^{-\lambda}}}&{{(\lambda\geq0)}}\end{array}\right..
$$
(a) i. Find the value of $k_{0}$ .
ii... |
(a) i. The expression given is proportional to the prior density since
$$
\frac{\Gamma(6)}{\Gamma(2)\Gamma(4)}=\frac{5!}{1!\,3!}=20
\qquad\mathrm{and}\qquad
\frac{\Gamma(2)}{\Gamma(1)\Gamma(1)}=1
$$
Now we only need to show that... | In a fruit packaging factory apples are examined to see whether they are blemished. A sample of $n$ apples is examined and, given the value of a parameter $\theta$ , representing the proportion of aples which are blemished, we regard $x$ , the number of blemished apples in the sample, as an obervation from the binomial... |
(a) In the prior $a=1.5$ and $b=1.5$ . So the mean is
$$
{\frac{a}{a+b}}={\frac{1.5}{3.0}}=\underline{{0.5}}.
$$
The variance is
$$
\frac{a b}{(a+b)^{2}(a+b+1)}=\frac{1.5\times1.5}{3^{2}\times4}=\frac{1}{16}
$$
so the standard deviation is
$$
\frac{1}{4}=\underline{{0.25}}.
$$
(b) Using R the prior pr... | In a small survey, a random sample of 50 people from a large population is selected. Each person is asked a question to which the answer is either “Yes” or “No.” Let the proportion in the population who would answer “Yes” be $\theta$ . Our prior distribution for $\theta$ is a beta(1.5, 1.5) distribution. In the survey,... |
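The prior mean and standard deviation in part (a) can be verified with the standard beta moment formulas (Python sketch):

```python
# Mean and standard deviation of the beta(1.5, 1.5) prior.
a = b = 1.5
mean = a / (a + b)
sd = (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5
```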
(a) The mean is $a/b=3$ and the variance is $a/b^{2}=4$ . So
$$
{\frac{9}{4}}={\frac{a^{2}/b^{2}}{a/b^{2}}}=a,
$$
giving $a=2.25$ and
$$
b={\frac{2.25}{3}}={\underline{{0.75}}}.
$$

Figure 1: Prior (dashes) and posterior (sol... | The populations, $n_{i}$ , and the number of cases, $x_{i}$ , of a disease in a year in each of six districts are given in the table below.
| Population $n_{i}$ | Cases $x_{i}$ |
|---|---|
| 120342 | 2 |
| 235967 | 5 |
| 243745 | ... |
Since the prior distribution is uniform, the prior density is constant, so the posterior density is proportional to the likelihood. The likelihood is
$$
L=\prod_{i=1}^{4}\prod_{j=1}^{4}p_{i j}^{n_{i j}}
$$
where $n_{i j}$ is the observed number of transitions from rock $i$ to rock $j$ .
The posterior d... | Geologists note the type of rock at fixed vertical intervals of six inches up a quarry face. At this quarry there are four types of rock. The following model is adopted.
The conditional probability that the next rock type is $j$ given that the present type is $i$ and given whatever has gone before is $p_{i j}$ . Cle... |
(a) Prior mean:
$$
{\frac{a}{b}}=16,
$$
Prior variance:
$$
{\frac{a}{b^{2}}}=64.
$$
Hence $\underline{{a=4}}$ and $b=0.25$ .
(b) From the data, $s=\sum_{i=1}^{20}x_{i}=400$. Prior density proportional to
$$
\lambda^{4-1}e^{-0.25\lambda}
$$
Likelihood proportional to
$$
\prod... | The numbers of sales of a particular item from an Internet retail site in each of 20 weeks are recorded. Assume that, given the value of a parameter $\lambda$ , these numbers are independent observations from the Poisson(λ) distribution.
Our prior distribution for $\lambda$ is a gamma $(a,b)$ distribution.
(a) Ou... |
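The moment equations in part (a) and the conjugate update implied by the prior $\times$ likelihood step can be sketched numerically (the gamma-Poisson posterior gamma$(a+s,\,b+n)$ is the standard conjugate result, with $s=400$, $n=20$ from the data):

```python
# Recover the gamma(a, b) prior from its stated mean 16 and variance 64.
b = 16 / 64          # (a/b) / (a/b**2)
a = 16 * b
# Standard gamma-Poisson conjugate update with s = 400 sales over n = 20 weeks.
a1, b1 = a + 400, b + 20
post_mean = a1 / b1
```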
(a) Variance of beta $(a,b)$ :
$$
\frac{a b}{(a+b+1)(a+b)^{2}}
$$
Variance of $\mathrm{{beta}}(a,a)$ :
$$
{\frac{a^{2}}{(2a+1)(2a)^{2}}}={\frac{1}{4(2a+1)}}
$$
$$
{\frac{1}{4(2a+1)}}={\frac{1}{4^{2}}}\;\;\;\Rightarrow\;\;\;(2a+1)=4\;\;\Rightarrow\;\;\;{\underline{{a=1.5}}}
$$
(b) Prior: beta(... | In a medical experiment, patients with a chronic condition are asked to say which of two treatments, A, B, they prefer. (You may assume for the purpose of this question that every patient will express a preference one way or the other). Let the population proportion who prefer A be $\theta$ . We observe a sample of $n$... |
(a) Median
$$
e^{-\lambda m}=\frac12\quad\mathrm{so}\quad\lambda m=\log2\quad\mathrm{so}\quad m=\frac{\log2}{\lambda}.
$$
(b) We have $\lambda=(\log2)/m$ so
$$
\begin{array}{l c l}{{k_{1}=\displaystyle\frac{\log2}{46.2}}}&{{=}}&{{\underline{{{0.0150}}}}}\\ {{k_{2}=\displaystyle\frac{\log2}{6.0}}}&{... | The survival times, in months, of patients diagnosed with a severe form of a terminal illness are thought to be well modelled by an exponential $(\lambda)$ distribution. We observe the survival times of $n$ such patients. Our prior distribution for $\lambda$ is a $\mathrm{gamma}(a,b)$ distribution.
(a) Prior beliefs... |
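Evaluating $\lambda=(\log 2)/m$ at the two stated medians (46.2 and 6.0 months) gives the quoted values (Python sketch):

```python
from math import log

# lambda = log(2) / m at the two prior medians for the survival median m.
k1 = log(2) / 46.2
k2 = log(2) / 6.0
```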
(a) Prior distribution is Dirichlet(4,2,2,3).
So $A_{0}=4+2+2+3=11$. The prior means are
$$
{\frac{a_{0,i}}{A_{0}}}.
$$
The prior variances are
$$
\frac{a_{0,i}}{(A_{0}+1)A_{0}}-\frac{a_{0,i}^{2}}{A_{0}^{2}(A_{0}+1)}.
$$
Prior means:
$$
\begin{array}{r c l c r}{{\theta_{11}:}}&{{\quad}}&{{\frac{4}{11}... | I recorded the attendance of students at tutorials for a module. Suppose that we can, in some sense, regard the students as a sample from some population of students so that, for example, we can learn about the likely behaviour of next year’s students by observing this year’s. At the time I recorded the data we had had... |
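The prior means and variances of the Dirichlet(4,2,2,3) distribution can be computed directly from the formulas above (Python sketch):

```python
# Means and variances of the Dirichlet(4, 2, 2, 3) prior.
a = [4, 2, 2, 3]
A = sum(a)  # A0 = 11
means = [ai / A for ai in a]
variances = [ai / ((A + 1) * A) - ai**2 / (A**2 * (A + 1)) for ai in a]
```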
Let $N=\sum_{j=1}^{J}n_{j}$. The likelihood is
$$
\begin{array}{rcl}L&=&\displaystyle\prod_{j=1}^{J}\prod_{i=1}^{n_{j}}(2\pi)^{-1/2}\tau^{1/2}\exp\left\{-\frac{\tau}{2}(y_{i,j}-\mu_{j})^{2}\right\}\\&=&\displaystyle(2\pi)^{-N/2}\tau^{N/2}\exp... | Suppose that we have $J$ samples and, given the parameters, observation $i$ in sample $j$ is
$$
y_{i,j}\sim N(\mu_{j},\ \tau^{-1})
$$
for $i=1,\dots,n_{j}$ and $j=1,\dots,J$ .
Let $\underline{{\boldsymbol{\mu}}}=(\mu_{1},\ldots,\mu_{J})^{T}$ , let $\bar{\underline{{y}}}=(\bar{y}_{1},\dots,\bar{y}_{J})^{T}$ , a... |
Likelihood:
$$
\begin{array}{r c l}{{L}}&{{=}}&{{\displaystyle\prod_{i=1}^{n}\frac{\beta^{\alpha}y_{i}^{\alpha-1}e^{-\beta y_{i}}}{\Gamma(\alpha)}}}\\ {{}}&{{=}}&{{\displaystyle\frac{\beta^{n\alpha}}{[\Gamma(\alpha)]^{n}}T_{2}^{\alpha-1}e^{-\beta T_{1}}}}\\ {{}}&{{=}}&{{g(\alpha,\beta,T_{1},T_{2})h(y)}}\end{array}
... | We make $n$ observations $y_{1},\ldots,y_{n}$ , which, given that values of parameters $\alpha,~\beta$ , are independent observations from a $\mathrm{gamma}(\alpha,\beta)$ distribution. Show that the statistics $T_{1},\ T_{2}$ are sufficient for $\alpha,~\beta$ where
$$
T_{1}=\sum_{i=1}^{n}y_{i}\qquad\qquad\qquad\ma... |
(a) Prior mean: $M_{0}=2.5$
Prior precision:
$$
P_{0}=\frac{1}{0.5^{2}}=4
$$
Data precision:
$$
n\tau={\frac{10}{0.05^{2}}}=4000
$$
Posterior precision: $P_{1}=4+4000=4004$
Sample mean: $\bar{y}=3.035$
Posterior mean:
$$
M_{1}={\frac{4\times2.5+4000\times3.035}{4004}}=3.0345
$$
Posterior vari... | Ten measurements are made using a scientific instrument. Given the unknown value of a quantity $\theta$ , the natural logarithms of the measurements are independent and normally distributed with mean $\log\theta$ and known standard deviation 0.05.
Our prior distribution is such that $\log\theta$ has a normal distrib... |
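The precision-weighted posterior update for $\log\theta$ can be checked numerically (Python sketch using the values from the solution: prior mean 2.5 with sd 0.5, ten measurements with known sd 0.05, and sample mean 3.035):

```python
# Normal posterior for log(theta) with known data precision.
p0, m0 = 1 / 0.5**2, 2.5     # prior precision and mean
nt = 10 / 0.05**2            # data precision n * tau
p1 = p0 + nt                 # posterior precision
m1 = (p0 * m0 + nt * 3.035) / p1
```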
(a) Prior density proportional to
$$
\prod_{j=1}^{12}\theta_{j}^{2-1}
$$
Likelihood proportional to
$$
\prod_{j=1}^{12}\theta_{j}^{x_{j}}
$$
Posterior density proportional to
$$
\prod_{j=1}^{12}\theta_{j}^{x_{j}+2-1}
$$
i.e. $\mathrm{Dirichlet}(x_{1}+2,\ x_{2}+2,\ \ldots,\ x_{12}+2)$. Posterior distr... | Walser (1969) gave the following data on the month of giving birth for 700 women giving birth for the first time. The births took place at the University Hospital of Basel, Switzerland.
| Month | No. of births | Month | No. of births | Month | No. of bi... |
(a) Likelihood:
$$
\begin{array}{r c l}{{{\cal L}}}&{{=}}&{{\displaystyle\prod_{i=1}^{m}\left(\begin{array}{c}{{n}}\\ {{x_{i}}}\end{array}\right)\theta^{x_{i}}(1-\theta)^{n-x_{i}}}}\\ {{}}&{{=}}&{{\displaystyle\left\{\prod_{i=1}^{m}\left(\begin{array}{c}{{n}}\\ {{x_{i}}}\end{array}\right)\right\}\theta^{s}(1-\theta)... | Potatoes arrive at a crisp factory in large batches. Samples are taken from each batch for quality checking. Assume that each potato can be claasified as “good” or “bad” and that, given the value of a parameter $\theta$ , potatoes are independent and each has probability $\theta$ of being “bad.”
(a) Suppose that $m$... |
(a) Prior distribution is Dirichlet(4,2,2,3). So $A_{0}=4+2+2+3=11$. The prior means are
$$
{\frac{a_{0,i}}{A_{0}}}.
$$
The prior variances are
$$
\frac{a_{0,i}}{(A_{0}+1)A_{0}}-\frac{a_{0,i}^{2}}{A_{0}^{2}(A_{0}+1)}.
$$
Prior means:
$$
\begin{array}{r c l c r}{{\theta_{11}:}}&{{\quad}}&{{\frac{4}{11}}}&... | (Some of this question is also in Problems 4). I recorded the attendance of students at tutorials for a module. Suppose that we can, in some sense, regard the students as a sample from some population of students so that, for example, we can learn about the likely behaviour of next year’s students by observing this yea... |
From the data
$$
\sum_{i=1}^{n}y_{i}=1028.9\qquad\qquad\qquad\sum_{i=1}^{n}y_{i}^{2}=53113.73
$$
$$
{\bar{y}}=51.445\qquad\quad s_{n}^{2}={\frac{1}{n}}\sum_{i=1}^{n}(y_{i}-{\bar{y}})^{2}={\frac{1}{n}}\left\{53113.73-{\frac{1}{20}}1028.9^{2}\right\}=9.09848
$$
(a) Prior mean: $M_{0}=60.0$
Prior precision: $P... | Samples are taken from twenty wagonloads of an industrial mineral and analysed. The amounts in ppm (parts per million) of an impurity are found to be as follows.
We regard these as independent samples from a normal distribution with mean $\mu$ and variance $\sigma^{2}=\tau^{-1}$ .
Find a 95% posterior hpd interva... |
(a) We have
$$
\begin{array}{r c l}{{P_{0}}}&{{=}}&{{0.01}}\\ {{P_{d}}}&{{=}}&{{n\tau=30\times0.04=1.2}}\\ {{P_{1}}}&{{=}}&{{0.01+1.2=1.21}}\\ {{M_{0}}}&{{=}}&{{20}}\\ {{\bar{y}}}&{{=}}&{{22.4}}\\ {{M_{1}}}&{{=}}&{{\frac{P_{0}M_{0}+P_{d}\bar{y}}{P_{1}}=\frac{0.01\times20+1.2\times22.4}{1.21}=22.380}}\end{array}
$$ ... | We observe a sample of 30 observations from a normal distribution with mean $\mu$ and precision $\tau$ . The data, $y_{1},\dotsc,y_{30}$ , are such that
$$
\sum_{i=1}^{30}y_{i}=672\qquad\qquad\qquad{\mathrm{and}}\qquad\qquad\sum_{i=1}^{30}y_{i}^{2}=16193.
$$
(a) Suppose that the value of $\tau$ is known to be 0.0... |
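The update in part (a) follows the same precision-weighting as the solution's calculation; a numerical sketch:

```python
# Posterior for mu with known precision tau = 0.04 and n = 30 observations.
p0, m0 = 0.01, 20.0
pd = 30 * 0.04                  # data precision n * tau = 1.2
ybar = 672 / 30                 # sample mean
p1 = p0 + pd                    # posterior precision
m1 = (p0 * m0 + pd * ybar) / p1
```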
Data:
$$
n=15\qquad\sum y=-284\qquad\sum y^{2}=6518
$$
$$
{\bar{y}}=-18.9333\qquad\quad s_{n}^{2}={\frac{1}{15}}\left\{6518-{\frac{284^{2}}{15}}\right\}={\frac{1140.9333}{15}}=76.06222
$$
Calculate posterior:
$$
\begin{array}{r l}{d_{0}}&{=0.7}\\ {v_{0}}&{=2.02/0.7=2.8857}\\ {c_{0}}&{=0.003}\\ {m_{0}}&{=0}\... | The following data come from the experiment reported by MacGregor et al. (1979). They give the supine systolic blood pressures (mm Hg) for fifteen patients with moderate essential hypertension. The measurements were taken immediately before and two hours after taking a drug.
<html><body><table><tr><td>Patient</td><t... |
(a) The likelihood is
$$
\begin{array}{r c l}{{{\cal L}}}&{{=}}&{{\displaystyle\prod_{i=1}^{n}2\rho^{2}t_{i}\exp[-(\rho t_{i})^{2}]}}\\ {{}}&{{=}}&{{\displaystyle2^{n}\rho^{2n}\left(\prod_{i=1}^{n}t_{i}\right)\exp[-\rho^{2}\sum_{i=1}^{n}t_{i}^{2}]}}\end{array}
$$
The log likelihood is
$$
l=n\log2+2n\log\rho+\s... | The lifetimes of certain components are supposed to follow a Weibull distribution with known shape parameter $\alpha=2$ . The probability density function of the lifetime distribution is
$$
f(t)=\alpha\rho^{2}t\exp[-(\rho t)^{2}]
$$
for $0<t<\infty$ .
We will observe a sample of $n$ such lifetimes where $n$ is... |
(a) $\lambda\sim\mathrm{gamma}(5,1)$ so $2\lambda\sim\mathrm{gamma}(5,1/2)$, i.e. gamma(10/2, 1/2), i.e. $\chi_{10}^{2}$. From tables, the $95\%$ interval is $3.247<2\lambda<20.48$. That is
$$
\underline{{1.6235}}<\lambda<10.24
$$
(b) Prior density prop. to $\lambda^{5-1}e^{-\lambda}$ .
Likelihood
$$
L=\prod_{i... | Given the value of $\lambda$ , the number $X_{i}$ of transactions made by customer $i$ at an online store in a year has a $\mathrm{Poisson}(\lambda)$ distribution, with $X_{i}$ independent of $X_{j}$ for $i\neq j$ . The value of $\lambda$ is unknown. Our prior distribution for $\lambda$ is a gamma(5,1) distribution. ... |
Prior:
$$
\tau\sim\mathrm{gamma}\left(\frac{4}{2},\ \frac{18}{2}\right)\quad\mathrm{so}\quad d_{0}=4,\ d_{0}v_{0}=18,\ v_{0}=4.5.
$$
$$
\mu\mid\tau\sim N(500,~(0.005\tau)^{-1})\quad\mathrm{so}\quad m_{0}=500,~c_{0}=0.005.
$$
Data:
$$
\sum y=9857,\;\;\;\;\;n=20,\;\;\;\;\bar{y}=\frac{9857}{20}=492.85
$$
$$... | The amounts of rice, by weight, in 20 nominally 500g packets are determined. The weights, in $\mathrm{g}$ , are as follows.
496 506 495 491 488 492 482 495 493 496
487 490 493 495 492 498 491 493 495 489
Assume that, given the values of parameters $\mu,\ \tau$ , the weights are independent and each has a norma... |
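The conditional posterior mean of $\mu$ given $\tau$ under the stated normal-gamma prior ($m_0=500$, $c_0=0.005$) can be sketched as below; the full solution also updates the degrees of freedom $d$ and scale $v$ for $\tau$:

```python
# Normal-gamma update for mu | tau with the rice-packet data.
c0, m0 = 0.005, 500.0       # prior: mu | tau ~ N(m0, (c0 * tau)^-1)
n, ybar = 20, 9857 / 20
c1 = c0 + n                 # posterior "sample size" weight
m1 = (c0 * m0 + n * ybar) / c1
```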
(a) Likelihood:
$$
L=\prod_{i=1}^{8}\frac{e^{-\lambda_{i}}\lambda_{i}^{y_{i}}}{y_{i}!}
$$
Log likelihood:
$$
\begin{array}{r c l}{l}&{=}&{-\displaystyle\sum\lambda_{i}+\sum y_{i}\log\lambda_{i}-\sum\log(y_{i}!)}\\ &{=}&{\displaystyle-\sum\lambda_{i}+\sum y_{i}(\alpha+\beta t_{i})-\sum\log(y_{i}!)}\end{array}
$... | A machine which is used in a manufacturing process jams from time to time. It is thought that the frequency of jams might change over time as the machine becomes older. Once every three months the number of jams in a day is counted. The results are as follows.
$$
{\begin{array}{l}{{\mathrm{Observation~}}i}\\ {{\math... |