#StackBounty: #bayesian #estimation #bootstrap Is it acceptable to use Bootstrap/Jackknife to estimate the variance of a MAP estimator?

Bounty: 50

Suppose we obtain a point estimate using a maximum a posteriori estimator $\hat{\theta}_{MAP}$. Note that I'm aware Bayesian approaches generally do not seek point estimates, but suppose this is an example where a point estimator is specifically needed, and we want to incorporate some prior information through a prior distribution. Then we need some way to quantify the accuracy of our estimator $\hat{\theta}_{MAP}$, and suppose that an analytical approach to deriving it is not available.

My question is whether it is acceptable to use standard bootstrap or jackknife methods to estimate the variance of $\hat{\theta}_{MAP}$. For example, suppose we obtain thousands of bootstrap resamples, each yielding a bootstrap estimate $\hat{\theta}_{MAP}^b$, and estimate the standard error of our estimator from these bootstrap estimates.

The reason I am confused is that every mention of the standard bootstrap/jackknife I have read considers it solely in the frequentist domain, and I can't really find any reference to the ordinary bootstrap/jackknife in connection with Bayesian point estimates. Again, I guess this is because Bayesian approaches generally do not seek point estimates but rather descriptive statistics of the posterior.
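For concreteness, here is a minimal sketch of the kind of procedure being asked about, assuming a normal mean with known data variance and a conjugate normal prior so that the MAP estimate has a closed form; every name and parameter value below is an illustrative assumption, not something from the question.

## Sketch: nonparametric bootstrap of a closed-form MAP estimate (illustrative values)
set.seed(1)
x      <- rnorm(50, mean = 2, sd = 1)   # observed sample
sigma2 <- 1                             # data variance, assumed known
mu0    <- 0; tau2 <- 10                 # prior: theta ~ N(mu0, tau2)

# MAP (= posterior mode) of the mean under a normal likelihood / normal prior
map_est <- function(x) {
  n <- length(x)
  (mu0 / tau2 + sum(x) / sigma2) / (1 / tau2 + n / sigma2)
}
theta_map <- map_est(x)

# Bootstrap: recompute the MAP on each resample and take the SD of the estimates
B        <- 2000
boot_map <- replicate(B, map_est(sample(x, replace = TRUE)))
c(MAP = theta_map, bootstrap_SE = sd(boot_map))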


Get this bounty!!!

#StackBounty: #hypothesis-testing #estimation #maximum-likelihood #unbiased-estimator Testing the equality of two multivariate mean vec…

Bounty: 100

Let $X_1,\ldots,X_{n_1}$ be an i.i.d. sample from $N_p(\mu_1,\Sigma)$ and let $Y_1,\ldots,Y_{n_2}$ be an independent sample from $N_p(\mu_2,\Sigma)$, for some $\mu_1,\mu_2 \in \mathbb{R}^p$ and some invertible, $p\times p$ positive definite matrix $\Sigma$.

I would like to test the hypothesis $H_0 : \mu_1=\mu_2$ vs. $H_1 : \mu_1 \neq \mu_2$. I would also like to find the maximum likelihood estimators under $H_0$, i.e. $(\hat{\mu}_0,\hat{\Sigma}_0)$, which maximise the constrained likelihood function $\overline{L}(\mu,\Sigma)=L(\mu,\mu,\Sigma)$.

And so here’s how I try to do it:

From previous deductions, I have found that the unbiased (bias-corrected) estimator of $\Sigma$ is $S=\frac{1}{n_1+n_2-2}\biggl(\sum^{n_1}_{i=1}(x_i-\hat{\mu}_1)(x_i-\hat{\mu}_1)^T+\sum^{n_2}_{i=1}(y_i-\hat{\mu}_2)(y_i-\hat{\mu}_2)^T\biggr)$

And so if we set $S_x=\frac{1}{n_1-1}\sum^{n_1}_{i=1}(x_i-\hat{\mu}_1)(x_i-\hat{\mu}_1)^T$ and $S_y=\frac{1}{n_2-1}\sum^{n_2}_{i=1}(y_i-\hat{\mu}_2)(y_i-\hat{\mu}_2)^T$,

$\Rightarrow S_p=\frac{n_1-1}{n_1+n_2-2}S_x+\frac{n_2-1}{n_1+n_2-2}S_y$ is the pooled covariance matrix.

The $T^2$ test statistic for testing $H_0 : \mu_1=\mu_2$ is $$T^2 = \biggl(\frac{1}{n_1}+\frac{1}{n_2}\biggr)^{-1}(\hat{\mu}_1-\hat{\mu}_2)^T S_p^{-1}(\hat{\mu}_1-\hat{\mu}_2)\sim \frac{(n_1+n_2-2)p}{n_1+n_2-p-1}F_{p,n_1+n_2-p-1}$$

Hence a $100(1-alpha)$% confidence region can be formed from

$$\mathbb{P}\biggl(T^2\leq \frac{(n_1+n_2-2)p}{n_1+n_2-p-1}F_{p,n_1+n_2-p-1}(\alpha)\biggr)=1-\alpha$$

So, in this context, can I say that $\hat{\mu}_0=\hat{\mu}_1=\hat{\mu}_2$?

And then $\hat{\Sigma}_0=S_0=\frac{1}{n_1+n_2-2}\biggl(\sum^{n_1}_{i=1}(x_i-\hat{\mu}_0)(x_i-\hat{\mu}_0)^T+\sum^{n_2}_{i=1}(y_i-\hat{\mu}_0)(y_i-\hat{\mu}_0)^T\biggr)$

Would this be correct?
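As a numerical cross-check of the formulas above, here is a short sketch (simulated data, my own illustration) that computes the pooled covariance matrix and the two-sample Hotelling $T^2$ statistic:

## Sketch: two-sample Hotelling's T^2 with a pooled covariance matrix (simulated data)
set.seed(1)
p  <- 3; n1 <- 30; n2 <- 40
X  <- matrix(rnorm(n1 * p), n1, p)               # sample 1, true mean 0
Y  <- matrix(rnorm(n2 * p, mean = 0.3), n2, p)   # sample 2, true mean 0.3

xbar <- colMeans(X); ybar <- colMeans(Y)
Sp   <- ((n1 - 1) * cov(X) + (n2 - 1) * cov(Y)) / (n1 + n2 - 2)  # pooled covariance

d  <- xbar - ybar
T2 <- as.numeric((1 / (1 / n1 + 1 / n2)) * t(d) %*% solve(Sp) %*% d)

# Under H0: T^2 ~ (n1 + n2 - 2) p / (n1 + n2 - p - 1) * F_{p, n1 + n2 - p - 1}
Fstat <- (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p) * T2
pval  <- pf(Fstat, p, n1 + n2 - p - 1, lower.tail = FALSE)
c(T2 = T2, F = Fstat, p.value = pval)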


Get this bounty!!!

#StackBounty: #estimation #maximum-likelihood #pdf How to fit an unnormalized parametric distribution with MLE?

Bounty: 50

I’m somewhat familiar with parametric estimation using MLE in the context of fitting the parameters of a distribution given a sample. Is there a way of generalizing this approach to unnormalized models (for instance, neural networks)? Naïvely maximizing the predicted log-likelihood would simply lead to the model predicting high values everywhere.
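One way to make the naive approach workable in low dimensions (my own sketch, under the assumption that the unnormalized model can be integrated numerically) is to compute the normalizing constant inside the log-likelihood, so that "predicting high values everywhere" is penalized; in higher dimensions this is where techniques such as score matching or noise-contrastive estimation are usually brought in.

## Sketch: MLE for an unnormalized 1-D density f_theta(x) = exp(g_theta(x)) / Z(theta),
## with the normalizing constant Z(theta) computed by numerical integration
set.seed(1)
x <- rnorm(200, mean = 1, sd = 0.7)      # observed sample

# unnormalized log-density: quadratic "energy" with parameters theta = (mu, log sd)
log_f <- function(u, theta) -(u - theta[1])^2 / (2 * exp(theta[2])^2)

neg_loglik <- function(theta) {
  logZ <- log(integrate(function(u) exp(log_f(u, theta)),
                        lower = -Inf, upper = Inf)$value)
  -(sum(log_f(x, theta)) - length(x) * logZ)
}

fit <- optim(c(0, 0), neg_loglik)        # should recover roughly mean 1, sd 0.7
c(mean = fit$par[1], sd = exp(fit$par[2]))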


Get this bounty!!!

#StackBounty: #estimation #inference #standard-error Standard error of estimated covariance

Bounty: 50

Let $X_1,\ldots,X_n$ and $Y_1,\ldots,Y_n$ be two independent random samples from $\mathcal{N}(\mu, \sigma^2)$, where both $\mu$ and $\sigma$ are unknown parameters.

I estimate their covariance using:
$$\widehat{\operatorname{cov}}(X, Y) = \operatorname{E}\big[(X_i - \operatorname{E}[X])(Y_i - \operatorname{E}[Y])\big]$$

where $\operatorname{E}[X]$ and $\operatorname{E}[Y]$ are replaced by the corresponding sample means.

How do I calculate the standard error of $\widehat{\operatorname{cov}}(X, Y)$?
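If an analytical expression is not required, a nonparametric bootstrap over the paired observations gives a serviceable standard error; the sketch below uses simulated data and is purely illustrative, not part of the question.

## Sketch: bootstrap standard error of the sample covariance (simulated data)
set.seed(1)
n <- 100
x <- rnorm(n, mean = 5, sd = 2)
y <- rnorm(n, mean = 5, sd = 2)          # independent of x, so the true covariance is 0

cov_hat <- cov(x, y)

B <- 5000
boot_cov <- replicate(B, {
  idx <- sample.int(n, replace = TRUE)   # resample the (x_i, y_i) pairs jointly
  cov(x[idx], y[idx])
})
c(cov_hat = cov_hat, bootstrap_SE = sd(boot_cov))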


Get this bounty!!!

#StackBounty: #estimation #binomial #beta-distribution #measurement-error How to model errors around the estimation of proportions – wi…

Bounty: 100

I have a situation I’m trying to model. I would appreciate any ideas on how to model this, or if there are known names for such a situation.

Background:

Let’s assume we have a large number of movies ($M$). For each movie, I’d like to know the proportion of people in the population who enjoy watching it. So for movie $m_1$ we’d say that a proportion $p_1$ of the population would answer "yes" to the question "did you enjoy watching this movie?". And likewise for movie $m_j$ we’d have proportion $p_j$ (up to movie $m_M$).

We sample $n$ people and ask each of them to say whether they enjoyed watching each of the movies $m_1, m_2, \ldots, m_M$. We can now easily build estimates of $p_1, \ldots, p_M$ using standard point estimators, and build confidence intervals for these estimates using the standard methods (ref).

But there is a problem.

Problem: measurement error

Some of the people in the sample do not bother to answer truthfully; they just answer yes/no regardless of their true preference. Luckily, for some subset of the $M$ movies, we know the true proportion of people who like them. So let’s assume that $M$ is very large, but that for the first 100 movies (under some indexing) we know the real proportion.
So we know the real values of $p_1, p_2, \ldots, p_{100}$, and we have their estimates $\hat p_1, \hat p_2, \ldots, \hat p_{100}$. We still want confidence intervals for $p_{101}, p_{102}, \ldots, p_M$ that take this measurement error into account, using our estimators $\hat p_{101}, \hat p_{102}, \ldots, \hat p_M$.

I could imagine some simple model such as:

$$\hat p_i \sim N(p_i, \epsilon^2 + \eta^2)$$

where $\eta^2$ is for the measurement error.

Questions:

  1. Are there other reasonable models for this type of situation?
  2. What are good ways to estimate $\eta^2$ (for the purpose of building confidence intervals)? For example, would using $\hat \eta^2 = \frac{1}{n-1}\sum (p_i - \hat p_i)^2$ make sense? Or would it make sense to first take some transformation of the $p_i$ and $\hat p_i$ values (using the logit, probit, or some other transformation from the $(0,1)$ scale to the $(-\infty,\infty)$ scale)? A rough sketch along these lines appears after this list.
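The rough sketch referred to above (all quantities simulated, an assumption about one workable approach rather than the asker's method): estimate the excess variance $\eta^2$ on the logit scale from the 100 calibration movies, then add it to the sampling variance when building intervals for the remaining movies.

## Sketch: estimating eta^2 from the calibration movies on the logit scale (simulated)
set.seed(1)
n      <- 500                                   # respondents per movie
p_true <- runif(100, 0.2, 0.8)                  # known proportions for the first 100 movies
p_hat  <- rbinom(100, n, 0.9 * p_true + 0.1 * 0.5) / n   # estimates contaminated by careless answers

logit <- function(p) log(p / (1 - p))
resid <- logit(p_hat) - logit(p_true)

sampling_var <- 1 / (n * p_hat * (1 - p_hat))   # delta-method variance of logit(p_hat)
eta2_hat <- max(0, var(resid) - mean(sampling_var))

# widened 95% interval for a new movie with estimate p_new, built on the logit scale
p_new <- 0.35
ci    <- logit(p_new) + c(-1, 1) * 1.96 *
         sqrt(1 / (n * p_new * (1 - p_new)) + eta2_hat)
plogis(ci)                                      # back-transform to the probability scale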


Get this bounty!!!

#StackBounty: #estimation #confirmatory-factor How to calculate confirmatory factor analysis by hand?

Bounty: 50

I am trying to learn how confirmatory factor analysis works, and the way I learn best is by understanding how the calculations work by hand using a pen, paper, and a calculator, and then replicating the calculation manually in R. I’m also trying to learn to understand formula notation. In particular, I’m looking for guidance on how to calculate

  1. Factor scores and factor loadings
  2. Model fit indices: Chi sq., RMSEA, CFI, TLI, SRMR

Through this process, I want to answer other questions such as:

  • Why are factor loadings set at 1?
  • How to use values from a CFA in a measurement model with a path analysis model using latent variables?

Here is example data from the HolzingerSwineford1939 dataset from the lavaan package. I selected a few variables: id, school, x1 - x9

dput(ex_data)
structure(list(id = c(1L, 2L, 3L, 4L, 5L, 6L, 7L, 8L, 9L, 11L, 
12L, 13L, 14L, 15L, 16L, 17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 
25L, 26L, 27L, 28L, 29L, 30L, 31L, 33L, 34L, 35L, 36L, 38L, 39L, 
40L, 41L, 42L, 43L, 44L, 45L, 46L, 47L, 48L, 49L, 50L, 51L, 52L, 
54L, 55L, 56L, 57L, 58L, 60L, 62L, 63L, 64L, 65L, 66L, 67L, 68L, 
69L, 70L, 71L, 72L, 73L, 74L, 75L, 76L, 77L, 78L, 79L, 80L, 81L, 
82L, 83L, 85L, 86L, 87L, 88L, 89L, 90L, 91L, 93L, 94L, 95L, 96L, 
97L, 98L, 99L, 100L, 101L, 102L, 103L, 104L, 105L, 106L, 108L, 
109L, 110L, 111L, 112L, 113L, 114L, 115L, 116L, 117L, 118L, 119L, 
120L, 121L, 122L, 123L, 124L, 125L, 126L, 127L, 129L, 130L, 131L, 
132L, 133L, 134L, 135L, 136L, 137L, 138L, 139L, 140L, 142L, 143L, 
144L, 145L, 146L, 147L, 148L, 149L, 150L, 151L, 152L, 153L, 154L, 
155L, 156L, 157L, 158L, 159L, 160L, 162L, 163L, 164L, 165L, 166L, 
167L, 168L, 201L, 202L, 203L, 204L, 205L, 206L, 208L, 209L, 210L, 
211L, 212L, 213L, 214L, 215L, 216L, 217L, 218L, 219L, 220L, 221L, 
222L, 223L, 224L, 225L, 226L, 227L, 228L, 229L, 230L, 231L, 232L, 
233L, 234L, 235L, 236L, 237L, 238L, 239L, 240L, 241L, 242L, 243L, 
244L, 245L, 246L, 247L, 248L, 249L, 250L, 251L, 252L, 253L, 254L, 
256L, 257L, 258L, 259L, 260L, 261L, 262L, 263L, 264L, 265L, 266L, 
267L, 268L, 269L, 270L, 271L, 272L, 273L, 274L, 275L, 276L, 277L, 
278L, 279L, 280L, 281L, 282L, 283L, 284L, 285L, 286L, 287L, 288L, 
289L, 290L, 291L, 292L, 293L, 294L, 295L, 296L, 297L, 298L, 299L, 
300L, 302L, 303L, 304L, 305L, 306L, 307L, 308L, 309L, 310L, 311L, 
312L, 313L, 314L, 315L, 316L, 317L, 318L, 320L, 321L, 322L, 323L, 
324L, 325L, 326L, 327L, 328L, 329L, 330L, 331L, 333L, 334L, 335L, 
336L, 337L, 338L, 339L, 340L, 341L, 342L, 343L, 344L, 345L, 346L, 
347L, 348L, 349L, 351L), school = structure(c(2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L), .Label = c("Grant-White", 
"Pasteur"), class = "factor"), x1 = c(3.3333333, 5.3333333, 4.5, 
5.3333333, 4.8333333, 5.3333333, 2.8333333, 5.6666667, 4.5, 3.5, 
3.6666667, 5.8333333, 5.6666667, 6, 5.8333333, 4.6666667, 4.3333333, 
5, 5.6666667, 6.3333333, 5.8333333, 6.6666667, 5, 3.8333333, 
5.6666667, 5.3333333, 5.5, 6, 4.6666667, 5, 3.5, 3, 5, 4.1666667, 
3.3333333, 4.8333333, 5.5, 3.8333333, 6.3333333, 5.8333333, 3.8333333, 
3.1666667, 1.8333333, 4.1666667, 6.3333333, 6, 7.1666667, 3.1666667, 
4.3333333, 4.5, 5.5, 7, 3.8333333, 5.1666667, 5, 5.6666667, 4, 
5.8333333, 3.8333333, 4.1666667, 5.3333333, 4, 5.3333333, 5.3333333, 
3.6666667, 6.5, 4, 4.6666667, 2.8333333, 4.6666667, 0.66666667, 
4.3333333, 5, 5, 4.1666667, 3.8333333, 5.6666667, 1.6666667, 
6.3333333, 4, 4.5, 4.6666667, 4.8333333, 4.8333333, 3, 6.3333333, 
5.5, 5.3333333, 3.3333333, 5.5, 4, 3.8333333, 5.1666667, 4.3333333, 
4.1666667, 5, 4.3333333, 7.5, 5, 5.5, 6.1666667, 6.5, 4.3333333, 
4.6666667, 6.8333333, 4.5, 6.8333333, 5.5, 6.3333333, 4.1666667, 
5.6666667, 6.3333333, 6.1666667, 5.1666667, 4.1666667, 4.1666667, 
5.1666667, 4.3333333, 3.5, 4.6666667, 4.8333333, 5, 7.5, 5.3333333, 
6.1666667, 5.3333333, 4.3333333, 4.8333333, 6, 5, 4.1666667, 
5.6666667, 5, 3.5, 7.3333333, 6.1666667, 4.8333333, 5.6666667, 
5.5, 5.1666667, 3.1666667, 5, 7.1666667, 7.3333333, 2, 3.8333333, 
4.1666667, 4.6666667, 5.8333333, 6.6666667, 6.8333333, 5.3333333, 
4.8333333, 5.1666667, 6.3333333, 4.8333333, 3.8333333, 5.5, 5.6666667, 
4.8333333, 2.6666667, 5, 6, 4.6666667, 5, 3.3333333, 4.5, 5.3333333, 
6.3333333, 2.8333333, 5.6666667, 4.1666667, 5.5, 6.6666667, 5, 
6, 4, 6.6666667, 5, 7, 5.5, 5.3333333, 5.1666667, 4.5, 5.1666667, 
5.1666667, 2.8333333, 5, 4.6666667, 3.1666667, 4.6666667, 6.3333333, 
5.6666667, 3, 2.6666667, 3, 5.3333333, 5.6666667, 3.5, 4.6666667, 
6.5, 5.3333333, 4.6666667, 4, 5.1666667, 4.8333333, 4.8333333, 
2.6666667, 4.1666667, 4.1666667, 6.1666667, 5, 4.8333333, 6.1666667, 
4.6666667, 3.6666667, 6.3333333, 4.8333333, 4.1666667, 4.6666667, 
3.8333333, 1.8333333, 7.5, 3.1666667, 6.8333333, 5.8333333, 5.6666667, 
3.1666667, 4.1666667, 6, 3.1666667, 4.5, 3.5, 4.5, 4.3333333, 
3.3333333, 5.5, 6.3333333, 5.5, 6, 6.3333333, 5.1666667, 6, 5.1666667, 
4.6666667, 6.3333333, 4.8333333, 5.8333333, 5.8333333, 3.8333333, 
5.1666667, 8.5, 5.5, 3.5, 6.1666667, 6.1666667, 6.1666667, 6.5, 
6.5, 3, 4.6666667, 4.8333333, 4.1666667, 4.8333333, 5.3333333, 
5, 5.8333333, 2.6666667, 6.3333333, 4.3333333, 5.5, 4.6666667, 
4.5, 4.3333333, 5, 3.6666667, 5.3333333, 5.1666667, 5.3333333, 
6.6666667, 3.5, 5.1666667, 4, 6.8333333, 5, 5.8333333, 5.6666667, 
4.1666667, 4, 6, 3.3333333, 4.6666667, 5.6666667, 5.6666667, 
5.8333333, 6.1666667, 4, 3, 4.6666667, 4.3333333, 4.3333333), 
    x2 = c(7.75, 5.25, 5.25, 7.75, 4.75, 5, 6, 6.25, 5.75, 5.25, 
    5.75, 6, 4.5, 5.5, 5.75, 4.75, 4.75, 6.75, 5.25, 8.75, 8, 
    8.5, 6.25, 5.5, 5.5, 4, 5.25, 5, 6, 4.5, 5.75, 6, 5.25, 6, 
    3.75, 5.25, 7, 4.5, 4, 7.75, 5.75, 5, 5.25, 5.25, 5.5, 5.5, 
    8.5, 4.75, 5.5, 6.25, 5.75, 6, 7.5, 4.75, 6, 5.25, 4.75, 
    5.25, 6.5, 5.75, 5.75, 6, 6.75, 5, 5.75, 6, 9.25, 5.75, 5, 
    5, 4.5, 9.25, 4.5, 6, 5.25, 5.25, 7, 5.75, 5.5, 5.25, 6, 
    8, 8.25, 6.5, 5.25, 6.25, 7, 6, 7.25, 5, 6.5, 6.25, 4.75, 
    3.5, 5.5, 5.75, 4.75, 7.5, 3.75, 6.25, 3.75, 6.5, 7.75, 5.25, 
    9, 6.5, 6.5, 6.75, 5.5, 4.25, 9.25, 5.25, 6.25, 6.5, 7.5, 
    5.75, 5.75, 5.25, 7.25, 4.75, 6, 5.5, 7, 5.25, 6.75, 8.75, 
    7, 7, 7.5, 4.5, 8, 5.5, 6.25, 5.25, 8, 7.5, 5.25, 6.25, 6.75, 
    5, 3.75, 7.5, 9, 6.75, 5.5, 5.5, 6, 5.75, 4.75, 5.5, 7.25, 
    4.75, 5, 6, 6.75, 5.75, 4.75, 5.5, 6, 5.75, 6.25, 6.25, 8.25, 
    6.25, 6.25, 6.25, 6.5, 5.25, 7.75, 5.25, 7, 7.75, 7.75, 5.75, 
    5.5, 7, 4, 6.5, 6, 6.75, 6.75, 5.5, 5.25, 5.75, 5.75, 8.5, 
    7.5, 5, 4.75, 5.5, 5, 6.25, 7, 5.5, 5, 7.5, 5.25, 5.25, 5.5, 
    5, 6, 6.5, 6, 5, 4.75, 6.5, 7.5, 5.75, 6.25, 6.75, 7.75, 
    6.5, 6, 8.5, 5, 8.5, 5.25, 5.25, 6.25, 6.75, 5, 5, 9.25, 
    3.75, 5.25, 7.5, 6, 5.5, 5.25, 7, 5.75, 5.25, 2.25, 5.5, 
    5.5, 6.75, 5.5, 6.5, 6.25, 7.25, 7.25, 6.25, 7, 6.5, 6, 9.25, 
    8.5, 6, 6.25, 5.25, 7, 6.5, 7, 6.5, 8, 5, 6.5, 6.5, 8.5, 
    4, 7, 5.75, 5, 8.25, 5.25, 5.25, 7, 8.5, 7.5, 6.25, 7, 6.75, 
    5.5, 7.25, 8, 5.75, 6, 5.75, 6.25, 5.25, 5, 6.25, 6, 6.25, 
    5.75, 5, 6.75, 5.5, 6.25, 6.5, 4.75, 6, 5.5, 5.25, 7, 6.5, 
    7, 6, 5.5, 6.75, 6), x3 = c(0.375, 2.125, 1.875, 3, 0.875, 
    2.25, 1, 1.875, 1.5, 0.75, 2, 2.875, 4.125, 1.75, 3.625, 
    2.375, 1.5, 2.25, 4, 3, 2, 4.125, 1.875, 1.625, 1.25, 3.375, 
    4.5, 2.125, 4.25, 0.75, 1.375, 0.25, 1.75, 2.375, 1.5, 0.5, 
    3.5, 2.25, 3.875, 2.5, 1.625, 1.25, 1, 1.875, 2.75, 4.5, 
    4, 1.375, 2.75, 1.125, 3.75, 2.125, 3.25, 1.75, 4.125, 2.125, 
    3.25, 3.875, 2, 1.75, 3.375, 3.625, 1.375, 1.25, 3.625, 2.5, 
    4, 3.625, 0.875, 0.75, 0.75, 3.375, 2.625, 3.25, 2.125, 2.375, 
    2.125, 1.375, 4.125, 2.5, 1.75, 4.25, 2.25, 1.75, 0.625, 
    2.5, 2.875, 2.75, 3.25, 2.25, 2.625, 3.375, 2.125, 0.875, 
    0.875, 4.25, 0.5, 2.125, 1.375, 2.75, 4.375, 1.875, 1.875, 
    1.625, 4.375, 3.125, 0.75, 3.75, 1.625, 2.25, 4.375, 2.625, 
    4.25, 1.75, 3.875, 1, 2.5, 4, 3.375, 1.625, 1.5, 2.375, 4.25, 
    4.125, 4.125, 3.125, 1, 1.125, 3.25, 1.625, 4.375, 3.5, 1.75, 
    2.25, 2.625, 3.625, 4.5, 2.75, 4.5, 2.5, 1.5, 4.5, 3.875, 
    4, 0.625, 1.875, 1.25, 1.625, 2.625, 4.375, 4.25, 2.625, 
    2.375, 2.375, 1.125, 1.25, 0.5, 2.125, 2.75, 1.125, 1.25, 
    2.5, 4.5, 1.125, 1.375, 0.75, 0.75, 1, 1.5, 0.75, 3, 2.25, 
    3.75, 2.5, 2.5, 2.75, 1.75, 3.25, 2.375, 3.375, 2, 1.875, 
    0.625, 0.5, 1.375, 0.375, 1.625, 1.625, 1, 1, 1.75, 1.625, 
    1.25, 0.625, 1, 2.125, 1.125, 1.125, 1, 1.75, 3.125, 1.25, 
    1, 2.75, 1.625, 3.125, 1.875, 0.625, 3.25, 1.875, 2, 2.375, 
    1.25, 2.125, 0.5, 2.375, 2.25, 2.25, 3.75, 1.625, 2.25, 1.125, 
    3.625, 0.875, 1.375, 4.125, 2, 1.5, 1.875, 4.125, 0.75, 0.875, 
    1.75, 4, 1, 1.875, 1.75, 0.875, 4.25, 2, 4.375, 3.5, 1.625, 
    3.625, 0.75, 3.5, 2.75, 1.875, 3.125, 0.375, 3.125, 4.25, 
    2.25, 2.125, 1.375, 2.625, 3.625, 3, 4.125, 0.5, 2.25, 1.125, 
    1.375, 1.375, 1.875, 2.625, 2.375, 4, 4.125, 0.875, 1.5, 
    4, 2, 1.25, 3.25, 1, 1, 1.625, 2.125, 1.875, 1.375, 3.25, 
    2, 1.875, 1.25, 2, 2.125, 1.625, 0.75, 2.125, 0.875, 1.125, 
    1.625, 2.375, 1.25, 3, 1.375, 1.625, 1.875, 0.5, 3.375), 
    x4 = c(2.3333333, 1.6666667, 1, 2.6666667, 2.6666667, 1, 
    3.3333333, 3.6666667, 2.6666667, 2.6666667, 2, 2.6666667, 
    2.6666667, 4.6666667, 5, 2.6666667, 2, 2, 4.3333333, 3.6666667, 
    1.6666667, 2, 3.3333333, 2.6666667, 2.3333333, 1.6666667, 
    2.6666667, 1.6666667, 2, 2.6666667, 2, 1.6666667, 2.6666667, 
    3.3333333, 1.3333333, 1.6666667, 2.6666667, 3, 4, 3, 2.6666667, 
    1.6666667, 1.6666667, 2, 4.6666667, 3, 0.66666667, 2.6666667, 
    2, 3.6666667, 1.6666667, 2.3333333, 0.66666667, 3.3333333, 
    2, 2.6666667, 1.6666667, 2.6666667, 1, 1.3333333, 4, 2.6666667, 
    1.6666667, 3.3333333, 2.6666667, 3.6666667, 3, 2.6666667, 
    3, 4.3333333, 2, 1.3333333, 1.6666667, 3.3333333, 3, 1.6666667, 
    4, 0, 4, 1, 1.6666667, 3, 0.33333333, 2.3333333, 1, 3.3333333, 
    2.6666667, 4, 1.3333333, 3.6666667, 2.6666667, 2, 1, 2.6666667, 
    4.6666667, 4, 3.3333333, 5.3333333, 1.6666667, 3.6666667, 
    2.6666667, 3.6666667, 4.3333333, 3.3333333, 2.6666667, 2.6666667, 
    5.3333333, 2.6666667, 5.6666667, 3.3333333, 3.6666667, 4.3333333, 
    5, 4, 1, 3.6666667, 2.6666667, 3, 2.3333333, 2.6666667, 2.6666667, 
    3.3333333, 4, 2.3333333, 3.3333333, 4.3333333, 4.3333333, 
    3.3333333, 3, 2, 4.3333333, 4, 1.3333333, 2, 2.6666667, 4, 
    4.3333333, 3.3333333, 2.3333333, 3.3333333, 3.3333333, 2, 
    3.6666667, 6, 2.6666667, 3.6666667, 4, 0.66666667, 5.6666667, 
    2.6666667, 5, 3, 3, 1.3333333, 3, 3, 3.3333333, 2.6666667, 
    3.6666667, 3, 2.6666667, 3.3333333, 5.6666667, 3.3333333, 
    3.6666667, 3, 3.3333333, 0.33333333, 3.3333333, 1.6666667, 
    4.6666667, 2.3333333, 3.6666667, 3.3333333, 2.6666667, 4.3333333, 
    1.6666667, 4.3333333, 4.6666667, 2.3333333, 2.6666667, 3, 
    4.3333333, 3, 1.3333333, 3, 3, 3, 3, 1.6666667, 2.6666667, 
    3, 3, 0.66666667, 2, 6.3333333, 5, 3, 3, 2.6666667, 4.6666667, 
    3.6666667, 2, 3.6666667, 2.6666667, 3.3333333, 2.3333333, 
    2.3333333, 1.6666667, 2, 4.6666667, 2.3333333, 3, 6, 2.6666667, 
    3.3333333, 6, 2.3333333, 3, 1.6666667, 2, 4.3333333, 3.3333333, 
    5, 3, 2.6666667, 3, 3.3333333, 2.6666667, 5, 2.3333333, 3, 
    5, 2.6666667, 1.6666667, 3, 4.3333333, 5.3333333, 5, 3, 4.3333333, 
    3.3333333, 3.6666667, 3.6666667, 2.6666667, 3.3333333, 3.3333333, 
    4, 3.6666667, 3, 4, 6, 5, 4, 5.3333333, 2.6666667, 2.3333333, 
    5.6666667, 3.3333333, 3.3333333, 3.3333333, 2, 4, 4.3333333, 
    4.6666667, 4, 6, 4.6666667, 4.6666667, 3.3333333, 4.6666667, 
    3.3333333, 2.6666667, 2.6666667, 2.3333333, 1.3333333, 3, 
    4, 1.6666667, 2.6666667, 2.3333333, 5, 2.6666667, 4, 2.6666667, 
    3.3333333, 4.3333333, 2.3333333, 3, 3.3333333, 2.6666667, 
    1.3333333, 3.3333333, 3.3333333, 3, 3, 2.6666667, 2.3333333, 
    3.6666667, 3.6666667, 3.6666667), x5 = c(5.75, 3, 1.75, 4.5, 
    4, 3, 6, 4.25, 5.75, 5, 3.5, 4.5, 4, 4, 5.5, 4.25, 4, 2.5, 
    5.25, 3.75, 2.5, 3.25, 5.75, 3, 3.75, 3.5, 2.25, 3, 3, 3.25, 
    3.5, 3, 5.25, 4.25, 3, 4.75, 4, 3, 5.25, 5.25, 4.25, 4.25, 
    3.75, 3, 5.5, 4.25, 1, 3.25, 2.25, 5.5, 2.25, 4.5, 1.5, 3.5, 
    4.5, 4.5, 3, 5.5, 2.5, 3, 5.5, 5, 2.5, 4.5, 5, 4.5, 3.75, 
    2.75, 2.5, 4, 3, 2.25, 2.75, 5.75, 3.25, 2.25, 4.75, 1.5, 
    3.25, 1.5, 2, 4.75, 2.25, 3.5, 1.5, 5.75, 4, 5, 1.25, 4.5, 
    2.75, 3.75, 2, 3.25, 4.75, 4.25, 5.75, 6.75, 2.75, 4.5, 3.25, 
    3.75, 4.5, 4.5, 5.75, 5.5, 5.75, 2.75, 5.5, 3.5, 4.75, 4.25, 
    3.5, 5.25, 2, 4.75, 4.75, 2.75, 2.25, 3.25, 4.25, 3, 6.5, 
    3.75, 5, 4, 5.5, 5, 6, 2.75, 6.25, 4.25, 2, 3.75, 5, 6.25, 
    5.75, 5.25, 4.5, 5.75, 5.75, 1.5, 5.25, 6.5, 5, 5, 5, 2.5, 
    5.5, 3.75, 5.75, 5.5, 4.25, 2.25, 2.5, 4.75, 4.25, 4.25, 
    4.75, 4.75, 6.25, 5.75, 6.25, 4.5, 5.25, 5.25, 4, 1.75, 2.75, 
    2.5, 5.5, 3, 5.75, 5.75, 4.25, 6, 4.25, 6, 6.5, 3.25, 4.25, 
    5, 4.75, 2.75, 4, 4.75, 4.25, 5, 3.5, 3, 3.75, 5.75, 5.5, 
    1, 4.5, 6, 5, 5.75, 3, 4.5, 4.25, 5.75, 2.5, 4.75, 5.5, 5.5, 
    4, 5, 1.75, 4, 7, 5.25, 5.25, 6, 4.5, 4.25, 6.5, 4, 4, 1.5, 
    3, 4, 4, 4.75, 4, 3.75, 3.5, 4, 4.25, 6, 3.25, 4.75, 6.25, 
    3.25, 2.75, 4.75, 6.25, 6.25, 5.75, 5.75, 5.5, 6.25, 5.75, 
    6.5, 4.25, 4.75, 5, 5, 6, 5.25, 5.75, 6.5, 5.75, 4, 6.25, 
    4.75, 4.75, 6.5, 4.75, 5.25, 6.25, 6, 4.75, 5.75, 5, 6, 6.75, 
    5.5, 5.5, 3.75, 5.25, 4.75, 4.75, 4.75, 3.75, 3.25, 4.75, 
    4.75, 2.75, 4.5, 4.75, 6, 4.5, 5.25, 3.5, 4, 6.25, 4, 4.75, 
    4.75, 5, 3, 4.25, 4.5, 3.25, 4.25, 4.25, 4, 5.75, 4.5, 5.75
    ), x6 = c(1.2857143, 1.2857143, 0.42857143, 2.4285714, 2.5714286, 
    0.85714286, 2.8571429, 1.2857143, 2.7142857, 2.5714286, 1.5714286, 
    2.7142857, 2.2857143, 1.5714286, 3, 0.71428571, 1.2857143, 
    1.7142857, 3.7142857, 2.5714286, 0.57142857, 2.1428571, 3.1428571, 
    1, 1.2857143, 1.4285714, 0.71428571, 1.1428571, 2, 1.8571429, 
    1.8571429, 1, 2, 1.8571429, 0.85714286, 1.4285714, 2, 1.7142857, 
    3.2857143, 3.1428571, 2.1428571, 1.5714286, 0.57142857, 1.4285714, 
    2, 1.2857143, 1, 1.2857143, 1, 2.1428571, 0.14285714, 1.4285714, 
    0.14285714, 2.1428571, 2.1428571, 1.4285714, 1.7142857, 2.4285714, 
    1.2857143, 1, 2.5714286, 2.1428571, 0.71428571, 2.4285714, 
    1.4285714, 3, 1, 1.8571429, 1.5714286, 2.7142857, 1, 1.1428571, 
    1.5714286, 2.7142857, 0.71428571, 1, 2.2857143, 1.1428571, 
    1.8571429, 1.8571429, 1.5714286, 2.4285714, 0.57142857, 0.71428571, 
    1, 3.7142857, 0.85714286, 2.1428571, 0.57142857, 1.7142857, 
    1.8571429, 0.42857143, 1.5714286, 1.1428571, 2.1428571, 2.4285714, 
    2.8571429, 2.7142857, 1.4285714, 3.2857143, 1, 1.7142857, 
    1.8571429, 2.8571429, 5, 2.1428571, 3.1428571, 2, 4, 1.4285714, 
    3, 1.5714286, 1.8571429, 1.8571429, 0.85714286, 1.8571429, 
    2.7142857, 1, 1.1428571, 2, 1.2857143, 2, 3.5714286, 1.5714286, 
    2.8571429, 1.1428571, 3, 2.2857143, 2.4285714, 0.57142857, 
    3.1428571, 2.2857143, 0.57142857, 1.8571429, 2.4285714, 2.5714286, 
    2.8571429, 2.1428571, 2.5714286, 3, 2.2857143, 0.42857143, 
    4.7142857, 6.1428571, 0.85714286, 3, 2.5714286, 1.4285714, 
    4.1428571, 1, 4.8571429, 2.1428571, 2.4285714, 1.1428571, 
    1.4285714, 2.1428571, 1.4285714, 1.4285714, 2.7142857, 1.5714286, 
    3.4285714, 2.5714286, 5.8571429, 1.5714286, 1.1428571, 2.2857143, 
    1.8571429, 1.5714286, 2, 1.4285714, 3.7142857, 1.5714286, 
    2.5714286, 5, 2.8571429, 5.1428571, 1.5714286, 3.5714286, 
    3.4285714, 0.57142857, 1.8571429, 2.4285714, 2.7142857, 1, 
    2.2857143, 2.4285714, 1.8571429, 3, 2.8571429, 0.57142857, 
    1.1428571, 2.1428571, 2.8571429, 0.28571429, 1.8571429, 4.7142857, 
    3.5714286, 3.4285714, 1.1428571, 1.4285714, 1.5714286, 3.2857143, 
    1.4285714, 2.5714286, 2.4285714, 2.5714286, 1.7142857, 1.8571429, 
    0.71428571, 1.8571429, 2.8571429, 2, 2.4285714, 3.8571429, 
    0.85714286, 2.1428571, 4.4285714, 1, 2.2857143, 0.57142857, 
    0.85714286, 2.1428571, 2.2857143, 2.7142857, 2, 1.7142857, 
    1.4285714, 2.4285714, 2.4285714, 5.5714286, 2.7142857, 3.1428571, 
    4.2857143, 1.8571429, 1.5714286, 1.5714286, 3.7142857, 2.1428571, 
    3, 3.2857143, 2.4285714, 3.8571429, 2.5714286, 1.7142857, 
    2.1428571, 2.7142857, 2.7142857, 3.2857143, 2.7142857, 1.7142857, 
    2.1428571, 5.4285714, 4.2857143, 4.7142857, 4.5714286, 2.7142857, 
    2.7142857, 4.2857143, 2.2857143, 2.7142857, 3.2857143, 1.2857143, 
    2.4285714, 3.7142857, 4.1428571, 3.8571429, 4.7142857, 3.4285714, 
    4.1428571, 1.8571429, 2, 2.4285714, 2.1428571, 1.8571429, 
    1.4285714, 2.4285714, 2, 2.1428571, 2.2857143, 1.8571429, 
    1.1428571, 3.2857143, 1.7142857, 2.7142857, 1.7142857, 1.4285714, 
    4.5714286, 1.5714286, 3.2857143, 1.4285714, 0.85714286, 1.5714286, 
    2, 2, 1.5714286, 2.8571429, 1, 1, 4.2857143, 2, 3.1428571
    ), x7 = c(3.3913043, 3.7826087, 3.2608696, 3, 3.6956522, 
    4.3478261, 4.6956522, 3.3913043, 4.5217391, 4.1304348, 3.7391304, 
    3.6956522, 5.8695652, 5.1304348, 4, 4.0869565, 3.6956522, 
    4, 3.9130435, 3.4782609, 2.6086957, 4.4782609, 3.4782609, 
    5.826087, 4.6956522, 5.7391304, 4.1304348, 2.826087, 5.1304348, 
    4.6521739, 4.826087, 2.0434783, 2.6956522, 5.3913043, 2.7826087, 
    3.1304348, 3.826087, 4.0869565, 5.5217391, 3.2173913, 4.9130435, 
    2.826087, 3.9565217, 3.9565217, 5.3913043, 4.826087, 3.3478261, 
    6.1304348, 5.6521739, 2.5652174, 2.3478261, 3.6521739, 3.2608696, 
    3.4782609, 4.173913, 4.0434783, 4.173913, 3.4347826, 3.3913043, 
    3.5217391, 5.2608696, 4.9130435, 2.6521739, 5.5217391, 3.3043478, 
    5.6956522, 3.6086957, 3.8695652, 6.826087, 4.7391304, 3.6086957, 
    3.7391304, 5.2608696, 3.4782609, 3.4782609, 3.1304348, 5.6521739, 
    5.6956522, 5.5652174, 4.6956522, 5.8695652, 5.1304348, 5.3478261, 
    5.7826087, 4.9565217, 4.6086957, 4.0869565, 5.5217391, 3.5652174, 
    4.173913, 6.6521739, 5.3913043, 3.5652174, 3.6086957, 4.1304348, 
    3.5652174, 5.5217391, 5.1304348, 5.5652174, 4.9130435, 4.2608696, 
    4.5217391, 5.8695652, 3.826087, 3, 6.3478261, 7.4347826, 
    4.173913, 5.6956522, 4.4782609, 3.2608696, 2.8695652, 4.5652174, 
    4.6956522, 3.5652174, 4.826087, 5.3043478, 5.5652174, 3.6956522, 
    4.0869565, 5.826087, 4.6086957, 3.7391304, 3.6521739, 5, 
    4, 5.1304348, 4.9130435, 3.3043478, 4.5217391, 4.5652174, 
    7.2608696, 4.9130435, 4.6956522, 3.4347826, 4.3913043, 3.3478261, 
    4.4347826, 6.2608696, 6.4347826, 6.0434783, 3.3043478, 3.826087, 
    5.3478261, 2.3913043, 5.6086957, 6.9565217, 4.5217391, 3.5652174, 
    5.1304348, 5.6521739, 5.7826087, 5.3913043, 5.3913043, 4.6956522, 
    3.6086957, 3, 2.826087, 2.173913, 4.9565217, 4.8695652, 4.0869565, 
    5.6086957, 4.173913, 4.4782609, 3.8695652, 3.826087, 4.4782609, 
    3.6086957, 4.3043478, 2.1304348, 2.826087, 2.8695652, 3.1304348, 
    4.1304348, 3.5652174, 4.6086957, 3.826087, 4.8695652, 5.4782609, 
    2.9565217, 3.2173913, 4.173913, 2.3913043, 3.5217391, 2.2173913, 
    2.2608696, 4.6956522, 3.5217391, 2.173913, 4.3478261, 2.4347826, 
    3.5217391, 1.3043478, 4.0434783, 4, 4.1304348, 2.5217391, 
    2.826087, 2.6521739, 3.5652174, 2.6956522, 2.6521739, 3.0869565, 
    2.6521739, 3.6956522, 2.7826087, 4.1304348, 2, 3.7391304, 
    3.826087, 3.8695652, 4.3478261, 3.4782609, 3.173913, 3.4782609, 
    5.2173913, 2.9130435, 4.6086957, 4.3913043, 1.8695652, 3.173913, 
    3.173913, 4.3913043, 5.3913043, 3.9130435, 3.8695652, 5.1304348, 
    2.9565217, 3.5217391, 2.9565217, 3.826087, 2.9565217, 2.4347826, 
    3.3043478, 5.6521739, 3.6521739, 4.7391304, 4.7391304, 6.4782609, 
    4.8695652, 4.4782609, 5.826087, 3.6956522, 4, 4.7826087, 
    3.6521739, 4, 5.7391304, 5.7826087, 6.0434783, 5.7826087, 
    4.173913, 5.1304348, 3.4782609, 3.3913043, 4.4347826, 5, 
    4.1304348, 2.9565217, 3.7826087, 4.3478261, 4.826087, 3.3913043, 
    5.0869565, 4.173913, 6.3043478, 4.3043478, 4.2173913, 5, 
    4.6086957, 3, 5.8695652, 3.6956522, 4.7391304, 4, 3.3913043, 
    3.5652174, 4.6086957, 3.6521739, 3.4782609, 2.3913043, 3.3478261, 
    2.7826087, 4.2608696, 3.4782609, 3.3478261, 5.8695652, 5.2173913, 
    2.4347826, 5.826087, 4.173913, 5.4782609, 3.3913043, 4.173913, 
    3.0434783, 5.0869565, 4.6086957, 4, 5.0869565, 4.0869565), 
    x8 = c(5.75, 6.25, 3.9, 5.3, 6.3, 6.65, 6.2, 5.15, 4.65, 
    4.55, 5.7, 5.15, 5.2, 4.7, 4.35, 3.8, 6.65, 5.25, 4.85, 5.35, 
    4.6, 5.45, 4.6, 5.3, 4.6, 6.25, 5.1, 5.55, 5.85, 4.85, 6.95, 
    3.65, 4.3, 4.35, 5.2, 3.75, 4, 3.5, 5.45, 4.5, 5.1, 5.3, 
    4.75, 5.6, 5, 5, 6.35, 8, 5.9, 4.8, 5.05, 5.15, 4.2, 5.1, 
    4.8, 5.7, 5.25, 5.8, 4.55, 6.15, 6.2, 5.35, 3.85, 5.45, 4.5, 
    5.4, 5.75, 6.1, 5.7, 5.95, 5.5, 5.75, 7.35, 4.6, 5.7, 5.75, 
    5.7, 4, 7.35, 5.15, 6.45, 5.75, 5.2, 5.75, 4.85, 6.85, 5.4, 
    4.55, 5.55, 5.95, 7.5, 7.8, 6.4, 4.8, 5.35, 3.6, 5.85, 5.55, 
    6.1, 5.15, 5.3, 4.4, 6.25, 4.85, 6.2, 6.5, 5.7, 6.8, 6.4, 
    4.15, 3.9, 6, 5.35, 4.3, 3.95, 5.85, 6.85, 6.6, 5.2, 5.55, 
    6.4, 6.4, 7.6, 5.35, 6.9, 6.95, 5.65, 5.25, 5.1, 6.25, 7.8, 
    6.35, 6, 5.25, 6, 5.85, 5.1, 5.15, 6.55, 8.3, 5.25, 5.25, 
    7.1, 5.75, 5.6, 4.9, 6.25, 5, 6.3, 4.75, 7.55, 7.9, 7.5, 
    6.15, 5.6, 6.15, 4.1, 4.9, 4.3, 5.15, 6.1, 5.65, 6.95, 4.75, 
    5.7, 5.05, 5.35, 6.8, 5.4, 4.35, 4.2, 4, 5.65, 5.2, 5.5, 
    5.15, 4, 6.05, 5.75, 10, 5.8, 5.9, 6.05, 5.9, 6.4, 4.6, 4.7, 
    5.65, 5.8, 3.6, 7.2, 6.5, 4.8, 3.05, 5.55, 5.4, 4.6, 4.55, 
    4.2, 5, 5.2, 4.15, 3.85, 4.7, 3.5, 6.4, 3.7, 6.4, 3.6, 4.95, 
    5.7, 6.05, 6.05, 5.6, 4.2, 5.05, 5.45, 4.4, 6.35, 5.3, 5, 
    3.65, 5.05, 5.6, 5.95, 6.3, 5.8, 5.8, 4.1, 5.2, 4.85, 5.45, 
    4.55, 3.75, 5.7, 5.25, 6.35, 5.55, 7.15, 7.8, 6.3, 4.9, 5.1, 
    4.95, 4.75, 6.75, 6.6, 6.25, 5.9, 5.55, 5.5, 8.05, 6.1, 5.6, 
    4.75, 5.05, 5.65, 6.8, 5.95, 4, 4.65, 9.1, 6.25, 5.15, 5.4, 
    5.65, 5.85, 5.4, 6, 6.45, 5.3, 6.5, 7.1, 5.45, 7, 4.05, 4.45, 
    5.4, 7.15, 6.3, 5.85, 4, 4.7, 4.7, 6.6, 5.25, 5.95, 6.3, 
    6.55, 5.75, 5.5, 5.75, 6, 5.15, 4.85, 4.25, 5.6, 6.05, 6, 
    6.2, 6.95), x9 = c(6.3611111, 7.9166667, 4.4166667, 4.8611111, 
    5.9166667, 7.5, 4.8611111, 3.6666667, 7.3611111, 4.3611111, 
    4.3055556, 4.1388889, 5.8611111, 4.4444444, 5.8611111, 5.1388889, 
    5.25, 5.4444444, 5.75, 4.9166667, 5.3888889, 7, 5, 6.7777778, 
    4.1388889, 4.3333333, 4.5277778, 4.4166667, 8.6111111, 5.4444444, 
    5.9722222, 3.3611111, 4.8055556, 5.6388889, 4.8333333, 4.9166667, 
    5.3055556, 5.0833333, 5.1111111, 4.8888889, 4.6388889, 4.7777778, 
    2.7777778, 6.6666667, 6.8611111, 5, 6.2777778, 5.4444444, 
    6.0555556, 5.5277778, 5.5, 4.7777778, 5.3888889, 4.1944444, 
    4.4166667, 5.5555556, 5.6388889, 4.75, 4.8333333, 3.9444444, 
    6.1388889, 4.7777778, 5.3333333, 5.8333333, 5.0277778, 6.3055556, 
    6.1944444, 4.25, 3.9166667, 5.4166667, 4.6111111, 5.1666667, 
    5.9722222, 4.2777778, 4.5833333, 4.6666667, 6.4722222, 5.6388889, 
    5.75, 5.3333333, 6.0277778, 5.3055556, 5.7777778, 4.9722222, 
    4.5833333, 5.4722222, 5.9722222, 5.1388889, 4.8888889, 6.6666667, 
    5.4444444, 6.1111111, 5.7222222, 3.9444444, 3.7777778, 5.4444444, 
    4.2222222, 7.2222222, 4.6111111, 5.75, 6.25, 6.5833333, 4.1111111, 
    4.8333333, 6.0277778, 6.1666667, 5.1944444, 7, 7.5277778, 
    3.3611111, 4.8611111, 5.4444444, 5.4444444, 6, 6.3333333, 
    5.4166667, 5, 6.4444444, 5.1666667, 5.4444444, 6.8611111, 
    3.3055556, 6.5, 4.7777778, 6.3888889, 5.6666667, 4.9166667, 
    4.9722222, 5.7777778, 6.3055556, 7, 7.1944444, 4.8055556, 
    3.7222222, 6.1666667, 6.1111111, 5.4444444, 4.5555556, 6.8888889, 
    7.0833333, 6.2222222, 5.7222222, 5.7777778, 6.6111111, 5.9722222, 
    3.8611111, 6.3055556, 4.75, 5.4722222, 5.2222222, 6.1666667, 
    6.9444444, 5.4166667, 5.1944444, 4.1388889, 4.6944444, 4.3333333, 
    5.4166667, 6.3333333, 4, 4.4444444, 5.5833333, 9.25, 4.8333333, 
    5.4722222, 4.9444444, 3.8055556, 4.2777778, 5.5833333, 4.0833333, 
    4.75, 3.4722222, 5.1666667, 6.1666667, 4.4722222, 5.6944444, 
    3.2777778, 6.25, 5.1388889, 6.5555556, 6.0833333, 5.3055556, 
    5, 4.9444444, 5.25, 5.0555556, 3.9722222, 5, 4.4166667, 4.5, 
    5.6944444, 5.5, 5.5277778, 3.1111111, 5.1388889, 4.1111111, 
    4.9444444, 4.8055556, 3.8055556, 5.9444444, 6.7777778, 3.9722222, 
    4.4166667, 4.2777778, 3.3333333, 5.6111111, 4.5833333, 3.8055556, 
    3.3611111, 5.9444444, 6.1944444, 6.1666667, 5.7222222, 6.3611111, 
    4.4722222, 5.1388889, 5.6944444, 5, 4.8611111, 4.7777778, 
    4.0277778, 3.6111111, 6.1111111, 3.2222222, 6.1111111, 6.7222222, 
    6.5555556, 7, 4.7222222, 5.8333333, 4.1388889, 5.3055556, 
    5.0277778, 4.9166667, 4.3333333, 6.6944444, 6.3888889, 5.5555556, 
    6.8333333, 6.8611111, 6.3055556, 4.6666667, 6.2222222, 5.25, 
    5.9722222, 5.5277778, 4.0833333, 6.4444444, 5.1944444, 6.5277778, 
    5.5277778, 6.9166667, 5.8888889, 3.6666667, 4.75, 4.9722222, 
    5.8333333, 7.3888889, 6.6666667, 3.4722222, 4.7222222, 6.3055556, 
    5.7222222, 5.5, 6.5833333, 5.7222222, 6.2777778, 5.9722222, 
    7, 5.0833333, 5.1944444, 5.5, 6.25, 6.1111111, 4.9444444, 
    4.1666667, 5.3055556, 5.0555556, 5.5555556, 5.9722222, 6.5277778, 
    4.2777778, 3.75, 5.25, 6.6666667, 5.6944444, 5.4166667, 5.6666667, 
    5.7222222, 5, 6.7777778, 4.8333333, 4.5, 6.3333333, 5.7777778, 
    5.6666667, 5.25, 6.0833333, 7.6111111, 4.3888889, 5.1666667
    )), row.names = c(NA, -301L), class = "data.frame")
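For cross-checking hand calculations, it may help to fit the usual three-factor model for these data in lavaan and compare the loadings, factor scores, and fit indices against the manual results. The model specification below is the standard tutorial model for HolzingerSwineford1939; treating it as the intended model here is an assumption on my part, not something stated in the question.

## Sketch: reference fit to compare hand calculations against (lavaan)
library(lavaan)

hs_model <- '
  visual  =~ x1 + x2 + x3
  textual =~ x4 + x5 + x6
  speed   =~ x7 + x8 + x9
'

fit <- cfa(hs_model, data = ex_data)                     # ex_data as in the dput() output above
summary(fit, fit.measures = TRUE, standardized = TRUE)   # chi-square, RMSEA, CFI, TLI, SRMR
head(lavPredict(fit))                                    # factor scores
lavInspect(fit, "sampstat")                              # sample moments used in the fitting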


Get this bounty!!!

#StackBounty: #estimation #inference #fisher-information #efficiency Deriving C-R inequality from H-C-R bound

Bounty: 50

As mentioned in the title, I want to derive the Cramér-Rao lower bound from the Hammersley-Chapman-Robbins lower bound for the variance of a statistic $T$.
The statement of the H-C-R lower bound is the following:

Let $\mathbf{X} \sim f_{\theta}(\cdot)$ where $\theta \in \Theta \subseteq \mathbb{R}^k$. Suppose $T(\mathbf{X})$ is an unbiased estimator of $\tau(\theta)$, where $\tau \colon \Theta \to \mathbb{R}$. Then we have
\begin{equation}
\operatorname{Var}_{\theta}(T) \ge \sup_{\Delta \in \mathcal{H}_{\theta}} \frac{[\tau(\theta + \Delta) - \tau(\theta)]^2}{\mathbb{E}_{\theta}\left(\frac{f_{\theta + \Delta}}{f_{\theta}} - 1\right)^2}
\end{equation}

where $\mathcal{H}_{\theta} = \{\alpha \in \Theta \colon \text{support of } f \text{ at } \theta + \alpha \subseteq \text{support of } f \text{ at } \theta\}$.

Now, when $k = 1$ and the regularity conditions hold, taking $\Delta \to 0$ gives the following inequality,
\begin{equation}
\operatorname{Var}_{\theta}(T) \ge \frac{[\tau'(\theta)]^2}{\mathbb{E}_{\theta} \left( \frac{\partial}{\partial \theta} \log f_{\theta}(\mathbf{X}) \right)^2}
\end{equation}

which is exactly the C-R inequality in the univariate case.

However, I want to derive the general form of the C-R inequality from the H-C-R bound, i.e. when $k > 1$, but I have not been able to do it. I did figure out that we would have to use $\mathbf{0} \in \mathbb{R}^k$ instead of $0$ and $\|\Delta\|$ to obtain the derivatives, which was obvious anyway, but I couldn't get to any expression remotely similar to the C-R inequality. One of the difficulties arises when dealing with the squares: in the univariate case we could take the limit inside and thereby obtain the square of the derivative, whereas here we cannot take the limit inside, because the derivative would then be a vector and we would end up with an expression containing the square of a vector, which is absurd.

How can one derive the C-R inequality in the latter (multivariate) case?
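A sketch of the limiting argument for $k > 1$ (my own derivation, offered as a hint rather than an authoritative proof): restrict $\Delta$ to a one-dimensional path $\Delta = t a$ for a fixed direction $a \in \mathbb{R}^k$, let $t \to 0$, and only then optimize over the direction. The numerator behaves like $[\tau(\theta + t a) - \tau(\theta)]^2 = t^2 [a^T \nabla\tau(\theta)]^2 + o(t^2)$ and, under the regularity conditions, the denominator behaves like $t^2\, a^T I(\theta)\, a + o(t^2)$, where $I(\theta)$ is the Fisher information matrix. Hence, for every direction $a$,
$$\operatorname{Var}_{\theta}(T) \ge \lim_{t \to 0} \frac{[\tau(\theta + t a) - \tau(\theta)]^2}{\mathbb{E}_{\theta}\left(\frac{f_{\theta + t a}}{f_{\theta}} - 1\right)^2} = \frac{[a^T \nabla\tau(\theta)]^2}{a^T I(\theta)\, a},$$
and taking the supremum over $a$ (attained at $a \propto I(\theta)^{-1}\nabla\tau(\theta)$, e.g. via the Cauchy-Schwarz inequality in the inner product induced by $I(\theta)$) gives the multivariate C-R bound
$$\operatorname{Var}_{\theta}(T) \ge \nabla\tau(\theta)^T I(\theta)^{-1} \nabla\tau(\theta).$$
This avoids ever squaring a vector: the square is only taken of the scalar directional derivative $a^T \nabla\tau(\theta)$.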


Get this bounty!!!

#StackBounty: #bayesian #estimation #inference #prevalence Optimization of pool size and number of tests for prevalence estimation via …

Bounty: 100

I’m trying to devise a protocol for pooling lab tests from a cohort in order to get prevalence estimates using as few reagents as possible.

Assuming perfect sensitivity and specificity (including them in the answer would be a plus), if I group testing material into pools of size $s$, then given an underlying (I don’t like the term “real”) mean probability $p$ of the disease, the probability of a pool being positive is:

$$p_w = 1 - (1 - p)^s$$

If I run $w$ such pools, the probability of having $k$ positive wells given a certain prevalence is:

$$p(k \mid w, p) = \binom{w}{k} \bigl(1 - (1 - p)^s\bigr)^k \bigl((1 - p)^s\bigr)^{w-k}$$

that is, $k \sim \operatorname{Binom}(w, 1 - (1 - p)^s)$.

To get $p$ I just need to maximize the likelihood $p(k \mid w, p)$, or use the formula $\hat p = 1 - \sqrt[s]{1 - k/w}$ (not really sure about this second one…).

My question is: how do I optimize $s$ (maximize) and $w$ (minimize) according to a prior $p$ in order to have the most precise estimates, below a certain level of error?
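One way to approach the design question (a rough sketch of my own, not the asker's protocol) is to fix a plausible prior value $p_0$, simulate the pooled-testing MLE over a grid of pool sizes $s$ and numbers of pools $w$, and pick the cheapest design whose error falls below the target:

## Sketch: simulate estimation error over a grid of (s, w) designs for a prior prevalence p0
set.seed(1)
p0     <- 0.05                                  # assumed prior prevalence
design <- expand.grid(s = c(5, 10, 20), w = c(10, 20, 40))

rmse_for <- function(s, w, nsim = 5000) {
  k     <- rbinom(nsim, w, 1 - (1 - p0)^s)      # number of positive pools
  p_hat <- 1 - (1 - k / w)^(1 / s)              # MLE of the prevalence
  sqrt(mean((p_hat - p0)^2))
}

design$rmse <- mapply(rmse_for, design$s, design$w)
design[order(design$w, design$rmse), ]          # w is the number of tests (reagent cost)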


Get this bounty!!!

#StackBounty: #regression #hypothesis-testing #multiple-regression #estimation #linear-model Multiple Linear Regression Coefficient Est…

Bounty: 100

A multiple linear regression model is considered. It is assumed that $$Y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \beta_3 x_{i3} + \epsilon_i$$ where the $\epsilon_i$ are independent and have the same normal distribution with zero expectation and unknown variance $\sigma^2$. 100 measurements are made, i.e. $i = 1,2,\ldots,100$. The explanatory variables take the following values: $x_{i1} = 2$ for $1 \leq i \leq 25$ and $0$ otherwise, $x_{i2} = \sqrt{2}$ for $26 \leq i \leq 75$ and $0$ otherwise, $x_{i3} = 2$ for $76 \leq i \leq 100$ and $0$ otherwise.

a) Let $\hat{\beta}_1,\hat{\beta}_2, \hat{\beta}_3$ be the least squares estimators of $\beta_1, \beta_2, \beta_3$. Prove that in the considered case $\hat{\beta}_1,\hat{\beta}_2, \hat{\beta}_3$ are independent and that $$\operatorname{Var}(\hat{\beta}_1) = \operatorname{Var}(\hat{\beta}_2) = \operatorname{Var}(\hat{\beta}_3)$$ Do these properties hold in the general case? If not, give counterexamples.

b) Perform a test of $$H_0: \beta_1 + \beta_3 = 2\beta_2$$ vs. $$H_1: \beta_1 + \beta_3 \neq 2\beta_2$$ The significance level is 0.05. The least squares estimates of $\beta_1, \beta_2$ and $\beta_3$ are $0.9812$, $1.8851$ and $3.4406$, respectively. The unbiased estimate of the variance $\sigma^2$ is $3.27$.

For a), I know the OLS estimator is $\hat{\beta} = (X^TX)^{-1}X^Ty$ and $\operatorname{Var}(\hat{\beta}) = \sigma^2 (X^TX)^{-1}$, but I don't know how to obtain explicit expressions for each of the coefficients from this. Although it seems quite clear that the estimators are independent, for instance $P(\hat{\beta}_3 = \beta_3, \hat{\beta}_1 = 0, \hat{\beta}_2 = 0) = P(\hat{\beta}_3 = \beta_3)$, I don't know how to write a proper proof. I believe the estimators are generally dependent and have unequal variances, but I can't come up with any particular examples.

For b), I'm not sure what test statistic to use (t or F) or how to set it up. I also don't know the standard errors of the coefficients.
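For what it's worth, here is a minimal sketch of the setup (my own illustration, not an authoritative solution): build the design matrix described above, check that $X^TX$ is diagonal with equal diagonal entries (which is what makes the estimators uncorrelated, hence independent under normality, with equal variances), and form a t statistic for the contrast $c = (1, -2, 1)$ in part b).

## Sketch: the design matrix from the question and a t test for the contrast beta1 + beta3 - 2*beta2
x1 <- c(rep(2, 25), rep(0, 75))
x2 <- c(rep(0, 25), rep(sqrt(2), 50), rep(0, 25))
x3 <- c(rep(0, 75), rep(2, 25))
X  <- cbind(x1, x2, x3)

XtX <- t(X) %*% X      # diagonal, all entries 100, so Var(beta_hat) = sigma^2 * I / 100

beta_hat <- c(0.9812, 1.8851, 3.4406)
s2       <- 3.27                          # unbiased estimate of sigma^2
cvec     <- c(1, -2, 1)                   # H0: c' beta = 0
se_c     <- sqrt(s2 * t(cvec) %*% solve(XtX) %*% cvec)
t_stat   <- as.numeric(sum(cvec * beta_hat) / se_c)
df       <- 100 - 3                       # n minus the number of estimated coefficients
p_value  <- 2 * pt(abs(t_stat), df, lower.tail = FALSE)
c(t = t_stat, df = df, p.value = p_value)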


Get this bounty!!!