# Fading absorption in non-linear elliptic equations

### Moshe Marcus

Department of Mathematics, Technion, Haifa, Israel

### Andrey Shishkov

Institute of Appl. Math. and Mech., NAS of Ukraine, Donetsk, Ukraine

## Abstract

We study the equation $-\Delta u + h(x)|u|^{q-1}u = 0$, $q > 1$, in $\mathbb{R}^N_+ = \mathbb{R}^{N-1} \times \mathbb{R}_+$, where $h \in C(\overline{\mathbb{R}^N_+})$, $h \geqslant 0$. Let $(x_1, \ldots, x_N)$ be a coordinate system such that $\mathbb{R}^N_+ = [x_N > 0]$ and denote a point $x \in \mathbb{R}^N$ by $(x', x_N)$. Assume that $h(x', x_N) > 0$ when $x' \neq 0$ but $h(x', x_N) \to 0$ as $|x'| \to 0$. For this class of equations we obtain sharp necessary and sufficient conditions in order that singularities on the boundary do not propagate into the interior.
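The problem treated in the abstract can be displayed compactly as follows (this is only a restatement of the hypotheses above, with the absorption coefficient $h$ vanishing on the line $x' = 0$ of the boundary; no new assumptions are introduced):

```latex
\[
\begin{cases}
-\Delta u + h(x)\,|u|^{q-1}u = 0, \quad q > 1,
  & \text{in } \mathbb{R}^N_+ = \{(x', x_N) : x' \in \mathbb{R}^{N-1},\ x_N > 0\},\\[4pt]
h \in C\bigl(\overline{\mathbb{R}^N_+}\bigr), \quad h \geqslant 0,
  & \\[4pt]
h(x', x_N) > 0 \ \text{for } x' \neq 0,
  \qquad h(x', x_N) \to 0 \ \text{as } |x'| \to 0. &
\end{cases}
\]
```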

## Cite this article

Moshe Marcus, Andrey Shishkov, Fading absorption in non-linear elliptic equations. Ann. Inst. H. Poincaré Anal. Non Linéaire 30 (2013), no. 2, pp. 315–336

DOI 10.1016/j.anihpc.2012.08.002