I have a Mercer kernel, $K\colon X \times X \rightarrow \mathbb{R}$, i.e. it is continuous, symmetric and positive definite on a compact domain $X \subset \mathbb{R}^n$. Also, I have a set of $m$ samples of the form $$ [(x_1,y_1),(x_2,y_2), \cdots,(x_m,y_m)] $$ where $x_i \in X$ and all $y_i \in \mathbb{R}$ are bounded s.t. $|y_i| \leq M$, and the coefficient vector $a$ satisfies $$ \Big(\gamma m \, \text{Id} + K[\textbf{x}] \Big) a = \textbf{y}, $$ where $\gamma > 0$ is a positive constant, $\text{Id}$ is the $m \times m$ identity matrix, and $K[\textbf{x}]$ is the $m \times m$ Gram matrix with entries $K_{ij}[\textbf{x}]=K(x_i,x_j)$.
I've been trying to get an upper bound on $\displaystyle \sum_{i=1}^m |a_i|$ from the above equation.
Can anyone help me with this? (The above equation is part of the well-known representer theorem in machine learning.)
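For concreteness, here is a small numerical instance of the system above. It is only a sketch: I picked the Gaussian (RBF) kernel as an example Mercer kernel, and the sample values, $\gamma$, $m$ and $M$ are all arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, gamma, M = 50, 0.1, 1.0

# Samples x_i in the compact domain X = [-1, 1], targets with |y_i| <= M.
x = rng.uniform(-1.0, 1.0, size=(m, 1))
y = np.clip(np.sin(3.0 * x[:, 0]), -M, M)

# Gram matrix K[x]_{ij} = K(x_i, x_j); the RBF kernel is continuous,
# symmetric and positive definite, hence a Mercer kernel.
sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists)

# Solve (gamma * m * Id + K[x]) a = y for the coefficient vector a.
a = np.linalg.solve(gamma * m * np.eye(m) + K, y)

print(np.sum(np.abs(a)))  # the quantity I want to bound
```

Empirically $\sum_i |a_i|$ stays modest here, which is what makes me hope for a bound in terms of $\gamma$, $m$ and $M$ alone.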