#StackBounty: #probability #statistical-significance #mathematical-statistics #experiment-design Significance level example from "…

Bounty: 50

The paper reads:

Fisher immediately realized that this argument fails because every
possible result with the 6 pairs has probability (1/2)^6 = 1/64, so
every result is significant at 5%. Fisher avoided this absurdity by
saying that any outcome with just 1 W and 5 R’s, no matter where that
W occurred, is equally suggestive of discriminatory powers and so
should be included. There are 6 such possibilities, including the
actual outcome, so the relevant probability for (a) above is 6(1/2)^6
= 6/64 = .094, so now the result is not significant at 5%.

I do not understand how 1/64 is significant at 5% but 6/64 is not. It makes more sense to me that the bigger of the two numbers would be deemed significant, since it describes something that happens more often.

What is wrong with my reasoning?
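For concreteness, here is a small enumeration sketch of the arithmetic in the quoted passage (assuming its setup of 6 paired trials, each a fair guess between W and R under the null hypothesis of no discriminating ability):

```python
from itertools import product

# Enumerate all 2**6 = 64 equally likely guess sequences under the
# null hypothesis that the taster cannot discriminate, so each of
# the six paired trials is a fair coin flip between "W" and "R".
outcomes = list(product("WR", repeat=6))
p_single = 1 / len(outcomes)                  # (1/2)**6 = 1/64 ~ 0.016

# Fisher's correction: count EVERY outcome as suggestive as the one
# observed -- here, all sequences with exactly one W among five R's.
suggestive = [o for o in outcomes if o.count("W") == 1]
p_value = len(suggestive) * p_single          # 6/64 ~ 0.094

print(f"P(one particular sequence) = {p_single:.4f}")  # below 0.05
print(f"P(any 1-W sequence)        = {p_value:.4f}")   # above 0.05
```

Every individual sequence has the same tiny probability, so 1/64 by itself cannot signal surprise; the quoted passage's point is that the quantity compared to 5% must be the total probability of all outcomes at least as suggestive as the observed one, which is 6/64 ≈ .094.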


Get this bounty!!!

#StackBounty: #self-study #mathematical-statistics Neyman-Pearson test at level $\alpha$

Bounty: 50

Let $P_0=\mathcal{N}(0,1)$ and $P_1=\mathcal{N}(\mu,\sigma^2)$ with $\sigma > 1$.

I would like to show that the Neyman-Pearson test of $P_0$ vs. $P_1$ at level $\alpha$ has the form
$$\varphi_{*}(x)=\mathbb{1}_{[x \notin (\mu/(1-\sigma^2)\pm\delta_{*})]}$$

for some $\delta_{*}=\delta_{*}(\mu,\sigma,\alpha)>0$, and also to determine the special case of $\varphi_{*}$ for $\mu=0$.

What I have tried: Let $f_0$ be the density of the probability distribution $P_0$, so $f_0=\frac{1}{\sqrt{2\pi}}\exp(-\frac{x^2}{2})$, and similarly $f_1=\frac{1}{\sqrt{2\pi}\,\sigma}\exp(-\frac{(x-\mu)^2}{2\sigma^2})$.

Then I computed the density ratio, looking for a monotone likelihood ratio:

$$\frac{f_1}{f_0}=\frac{1}{\sigma}\,\frac{\exp(-\frac{(x-\mu)^2}{2\sigma^2})}{\exp(-\frac{x^2}{2})}=\frac{1}{\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}+\frac{x^2}{2}\right)=g(T(x))$$

with $g(t)=\frac{1}{\sigma}\exp\left(-\frac{(t-\mu)^2}{2\sigma^2}+\frac{t^2}{2}\right)$
and $T(x)=x$.
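One observation that may matter here (my addition, easily checked symbolically): since $\sigma>1$, the exponent in $g$ is a quadratic in $x$ that opens upward with vertex at $\mu/(1-\sigma^2)$, so the likelihood ratio is not monotone in $x$; it grows as $x$ moves away from that point in either direction. A quick check with sympy (assumed available):

```python
import sympy as sp

x, mu = sp.symbols("x mu", real=True)
sigma = sp.symbols("sigma", positive=True)

# log(f1/f0) up to the constant -log(sigma):
log_ratio = x**2 / 2 - (x - mu)**2 / (2 * sigma**2)

# Quadratic coefficient 1/2 - 1/(2*sigma**2) is > 0 when sigma > 1,
# so the parabola opens upward.
print(sp.expand(log_ratio).coeff(x, 2))

# Vertex: x = mu/(1 - sigma**2) (sympy may print -mu/(sigma**2 - 1)),
# matching the centre of the interval in the claimed test.
print(sp.solve(sp.diff(log_ratio, x), x))
```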

Then, let $H: \mathbb{R} \to [0,1]$ be the auxiliary function
$$H(r):=P_{0}(T>r)$$ for any $r \in \mathbb{R}$.
From this, we can determine $k_{\alpha}:=\min\{r \in \mathbb{R} : H(r) \leq \alpha\}$

and $$\gamma_{\alpha}=\frac{\alpha-P_0(T>k_{\alpha})}{P_0(T=k_{\alpha})} \in (0,1).$$
In my lecture notes I have $\varphi_{*}=\gamma_{\alpha}\mathbb{1}_{[T(x)=k_{\alpha}]}+\mathbb{1}_{[T(x)>k_{\alpha}]}$ for the UMP right-sided test, but I have trouble with $\gamma_{\alpha}$ (which in this case seems to be 1), with how to define $k_{\alpha}$, and with what to do about $\mathbb{1}_{[T(x)>k_{\alpha}]}$?
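It may also help to see the claimed two-sided form numerically. A minimal sketch (my own illustration with arbitrary values $\mu=1$, $\sigma=2$, $\alpha=0.05$; scipy assumed available) that chooses $\delta$ so the test $\mathbb{1}_{[x \notin (\mu/(1-\sigma^2)\pm\delta)]}$ has size exactly $\alpha$ under $P_0$:

```python
from scipy.stats import norm
from scipy.optimize import brentq

# Illustrative values (my own choice; any mu and sigma > 1 would do).
mu, sigma, alpha = 1.0, 2.0, 0.05
c = mu / (1 - sigma**2)   # centre of the acceptance interval, here -1/3

def size(delta):
    """P0(reject) = 1 - P0(c - delta < X < c + delta) for X ~ N(0, 1)."""
    return 1 - (norm.cdf(c + delta) - norm.cdf(c - delta))

# size() decreases continuously from 1 to 0 as delta grows, so the
# level constraint size(delta) = alpha has a unique root.
delta = brentq(lambda d: size(d) - alpha, 1e-9, 20.0)
print(f"delta = {delta:.4f}, achieved size = {size(delta):.4f}")

# Special case mu = 0: c = 0, the interval is symmetric about 0, and
# delta reduces to the standard normal quantile z_{1 - alpha/2}.
print(norm.ppf(1 - alpha / 2))  # ~1.96 for alpha = 0.05
```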


Get this bounty!!!

#StackBounty: #mathematical-statistics #ica #example Numerical Example of Independent Component Analysis

Bounty: 50

Can somebody explain ICA (Independent Component Analysis) with a small practical example here?
I have seen a lot of programs and libraries that you can simply apply to your data to find ICA components. One such library is the well-known Python FastICA.

There is a whole Book on ICA, a Tutorial on ICA, a nicely explained PPT on ICA, and a Practical Use of ICA to remove ECG artifacts, but none of those references gives a small practical example that works through the mathematical concepts behind ICA step by step.

I am sure it would be very useful for beginners like me to understand the step-by-step mathematics of ICA, as just applying a library is not enough for a deep understanding of ICA.
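Not a full walkthrough, but the following is the kind of minimal runnable example the question asks for, using scikit-learn's FastICA on two synthetic sources (toy data of my own, not taken from the linked references):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                       # source 1: sinusoid
s2 = np.sign(np.sin(3 * t))              # source 2: square wave
S = np.c_[s1, s2] + 0.02 * rng.standard_normal((2000, 2))

A = np.array([[1.0, 0.5],                # mixing matrix, "unknown" to ICA
              [0.5, 1.0]])
X = S @ A.T                              # the observed mixed signals

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)             # estimated independent sources
A_hat = ica.mixing_                      # estimated mixing matrix

# ICA recovers sources only up to permutation, sign, and scale, so
# compare via absolute correlations rather than direct differences.
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
print(np.round(corr, 2))
```

The printed matrix should be close to a permutation matrix (one entry near 1 in each row), reflecting that ICA identifies sources only up to order, sign, and scale.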


Get this bounty!!!
