Suppose we have three random variables $X_1$, $X_2$, and $Y$, where $Y$ and $X_1$ are independent and $Y$ and $X_2$ are independent. Intuitively I would assume that $Y$ and $X_1+X_2$ are also independent. Is this the case, and how can I prove it formally?
Note that $X_1 + X_2$ is a function of $Z = (X_1, X_2)$: taking $$f(u,v) = u+v$$ gives $X_1 + X_2 = f(Z)$.
It is a well-known theorem of probability that if $R_1$ and $R_2$ are independent random variables and $f_1$ and $f_2$ are measurable functions, then $f_1(R_1)$ is independent of $f_2(R_2)$ (Theorem 10.4 of *Probability: A Graduate Course*, 2nd ed., by Allan Gut).
Since $f$ is measurable, if $Y$ is independent of $Z$ then $Y$ is also independent of $f(Z) = X_1 + X_2$: take $f_1$ to be the identity function and $f_2 = f$. Note, however, that this argument requires $Y$ to be independent of the pair $Z$, which is a stronger assumption than the pairwise independence given in the question.
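A quick sanity check of the composition step, using a toy example of my own (not from the answer): if $Y$ is jointly independent of $Z = (X_1, X_2)$, say all three are independent fair coin flips, then exact enumeration confirms that $Y$ and $X_1 + X_2$ factor.

```python
from itertools import product
from fractions import Fraction

# Toy example: Y, X1, X2 jointly independent fair coins,
# so each of the 8 outcomes (y, x1, x2) has mass 1/8.
p_outcome = Fraction(1, 8)
joint = {}            # (y, s) -> P(Y = y, X1 + X2 = s)
marg_y, marg_s = {}, {}
for y, x1, x2 in product((0, 1), repeat=3):
    s = x1 + x2
    joint[(y, s)] = joint.get((y, s), Fraction(0)) + p_outcome
    marg_y[y] = marg_y.get(y, Fraction(0)) + p_outcome
    marg_s[s] = marg_s.get(s, Fraction(0)) + p_outcome

# Independence of Y and X1 + X2: the joint pmf factors into the marginals.
assert all(joint[(y, s)] == marg_y[y] * marg_s[s] for (y, s) in joint)
print("Y is independent of X1 + X2")
```

The exact `Fraction` arithmetic makes the factorization check literal equalities rather than floating-point comparisons.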
It is possible to construct $X_1$, $X_2$, $Y$ such that the above conditions are satisfied, yet $Y$ is a function of $Z = (X_1, X_2)$: let $X_1$ and $X_2$ be i.i.d. uniform on $\{0, 1\}$ and set $Y = X_1 \oplus X_2$ (addition mod 2). Then $Y$ is independent of $X_1$ and independent of $X_2$, but $Y = 1$ exactly when $X_1 + X_2 = 1$, so $Y$ is determined by $X_1 + X_2$. This contradicts the answer's claim that '$Y$ is independent of $Z$'.
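The construction hinted at here is the standard XOR counterexample: $X_1, X_2$ i.i.d. fair coins and $Y = X_1 \oplus X_2$. Exact enumeration verifies both halves of the claim, that $Y$ is independent of each $X_i$ individually, but not of $X_1 + X_2$.

```python
from itertools import product
from fractions import Fraction

p_outcome = Fraction(1, 4)
# X1, X2 i.i.d. fair coins; Y = X1 XOR X2 is determined by (X1, X2).
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

def pmf(values):
    d = {}
    for v in values:
        d[v] = d.get(v, Fraction(0)) + p_outcome
    return d

# Pairwise independence: P(Y = y, Xi = x) = P(Y = y) P(Xi = x) for i = 1, 2.
py = pmf(y for _, _, y in outcomes)
for i in (0, 1):
    pxi = pmf(o[i] for o in outcomes)
    pjoint = pmf((o[2], o[i]) for o in outcomes)
    assert all(pjoint[(y, x)] == py[y] * pxi[x] for (y, x) in pjoint)

# But Y is NOT independent of S = X1 + X2:
# P(Y = 1, S = 1) = 1/2, while P(Y = 1) P(S = 1) = 1/2 * 1/2 = 1/4.
ps = pmf(x1 + x2 for x1, x2, _ in outcomes)
pys = pmf((y, x1 + x2) for x1, x2, y in outcomes)
assert pys[(1, 1)] != py[1] * ps[1]
```

In fact $Y$ here is a deterministic function of $X_1 + X_2$ ($Y = 1$ iff the sum equals $1$), the strongest possible failure of independence.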