I am trying to justify that a (normalized) solution $\phi$ in $L^2(\mathbb{R}^n)$ of:
$-\Delta\phi+f(x)\phi=K\phi$, with $f(x)=0$ in $\Omega$, $f(x)=M$ in $\Omega^c$
has to vanish outside $\Omega$ when $M\to\infty$. To do so, I multiplied both sides of the equation by $\phi$ and integrated over $\mathbb{R}^n$; assuming all the integrals exist, and using the normalization $\int_{\mathbb{R}^n}\phi^2=1$, we get:
$$-\int_{\mathbb{R}^n}\phi\,\Delta\phi+M\int_{\Omega^c}\phi^2=K.$$
What I would like to do is integrate the first term by parts:
$-\int_{\mathbb{R}^n}\phi\Delta\phi=\int_{\mathbb{R}^n}|\nabla\phi|^2+\int_{\partial\mathbb{R}^n}(\nabla\phi\cdot\nu)\phi=\int_{\mathbb{R}^n}|\nabla\phi|^2\geq0$
so as to get $M\int_{\Omega^c}\phi^2 \leq K$, and therefore $\int_{\Omega^c}\phi^2\leq\frac{K}{M}\to0$ as $M\to\infty$ (here I am also implicitly assuming that the eigenvalue $K=K(M)$ stays bounded as $M\to\infty$). Hence, in the limit, $\phi$ vanishes in $\Omega^c$.
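As a numerical sanity check of this bound, here is a rough 1-D finite-difference sketch. The domain $[-2,2]$, the choice $\Omega=(-1,1)$, the grid size, and the values of $M$ are all arbitrary choices for the illustration (and I truncate $\mathbb{R}$ to an interval with Dirichlet endpoints), so this is only meant to be suggestive:

```python
import numpy as np

# 1-D sketch of -phi'' + f(x) phi = K phi on [-2, 2] with Dirichlet ends,
# where f = 0 on Omega = (-1, 1) and f = M outside.
# All parameter values below are arbitrary choices for the illustration.
def ground_state_tail(M, n=1200):
    x = np.linspace(-2.0, 2.0, n)
    h = x[1] - x[0]
    f = np.where(np.abs(x) < 1.0, 0.0, M)          # the potential well
    # Standard 3-point finite-difference Laplacian plus the potential:
    H = (np.diag(2.0 / h**2 + f)
         - np.diag(np.ones(n - 1) / h**2, k=1)
         - np.diag(np.ones(n - 1) / h**2, k=-1))
    eigvals, eigvecs = np.linalg.eigh(H)
    K, v = eigvals[0], eigvecs[:, 0]               # ground state
    phi = v / np.sqrt(h * np.sum(v**2))            # normalize: int phi^2 = 1
    tail = h * np.sum(phi[np.abs(x) >= 1.0]**2)    # int_{Omega^c} phi^2
    return K, tail

for M in (10.0, 100.0, 1000.0):
    K, tail = ground_state_tail(M)
    print(f"M = {M:6.0f}:  K = {K:.4f},  tail = {tail:.3e},  K/M = {K/M:.3e}")
```

In this discrete setting the bound $\int_{\Omega^c}\phi^2\leq K/M$ holds exactly, since the discrete kinetic term $\phi^T T\phi$ is nonnegative, and the tail mass indeed shrinks as $M$ grows.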
Intuitively, since $\int_{\mathbb{R}^n}\phi^2=1$, I'd think that $\phi$ has to vanish at infinity, and therefore that the boundary term $\int_{\partial\mathbb{R}^n}(\nabla\phi\cdot\nu)\phi$ should be zero, but I am not certain, and I feel like maybe I am forgetting something.
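To state the step I am unsure about more precisely: I suppose the rigorous version would be to integrate by parts over balls $B_R$ and send $R\to\infty$, something like
$$-\int_{B_R}\phi\,\Delta\phi=\int_{B_R}|\nabla\phi|^2-\int_{\partial B_R}(\nabla\phi\cdot\nu)\,\phi,$$
so what I would actually need is that the boundary term $\int_{\partial B_R}(\nabla\phi\cdot\nu)\,\phi$ vanishes along some sequence $R_k\to\infty$, and I don't see how to get that from $\phi\in L^2(\mathbb{R}^n)$ alone.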
Any help would be appreciated, thank you in advance!