Relationship between L2-norm of the noisy observations

by nOp   Last Updated June 12, 2019 06:19 AM

Assume $\mathbf X_1^n$ is a vector of size $n$ whose elements are either $+1$ or $-1$. Then, we define $$\mathbf Y^n=\mathbf X_1^n+\mathbf N^n$$ where $\mathbf N^n$ is additive Gaussian noise with distribution $\mathbf N^n\sim \mathcal{N}(\mathbf 0,\sigma^2\mathbf I)$, i.e., the noise elements are independent and identically distributed (i.i.d.) with zero mean and variance $\sigma^2$.

Now, consider $\mathbf X_2^n$, which is also a vector of $+1$'s and $-1$'s and is the same as $\mathbf X_1^n$ except in $k<n$ positions, where its values are the opposite of those of $\mathbf X_1^n$. For example, for $n=5$ and $k=2$, we could have $$\mathbf X_1^5=[-1,-1,+1,+1,-1];\\ \mathbf X_2^5=[+1,-1,+1,+1,+1];\\ $$ where the two vectors differ in the first and the last positions. My question is: can we obtain a relationship between $||\mathbf Y^n-\mathbf X_1^n||^2_2$ and $||\mathbf Y^n-\mathbf X_2^n||^2_2$? Numerically, I have seen that as $k$ gets large, $||\mathbf Y^n-\mathbf X_2^n||^2_2$ gets larger than $||\mathbf Y^n-\mathbf X_1^n||^2_2$. I am looking for a mathematical expression to relate the two.
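One way to relate the two norms is the following sketch, where $\mathcal S$ denotes the set of $k$ positions in which the vectors differ ($\mathcal S$ is notation introduced here, not in the original). Since $\mathbf Y^n-\mathbf X_1^n=\mathbf N^n$ and $X_{1,i}-X_{2,i}=2X_{1,i}$ for $i\in\mathcal S$ (and $0$ otherwise), expanding the square gives $$||\mathbf Y^n-\mathbf X_2^n||^2_2=\sum_{i\notin\mathcal S}N_i^2+\sum_{i\in\mathcal S}\left(N_i+2X_{1,i}\right)^2=||\mathbf Y^n-\mathbf X_1^n||^2_2+4k+4\sum_{i\in\mathcal S}X_{1,i}N_i.$$ Each term $X_{1,i}N_i$ is $\mathcal N(0,\sigma^2)$, so the difference between the two norms is $4k$ plus a zero-mean Gaussian with variance $16k\sigma^2$; its expectation grows linearly in $k$, which is consistent with the numerical observation, and it is positive with probability $\Phi(\sqrt k/\sigma)$.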


