The equation is given by
$$ \sum_{n=1}^N \min(\gamma, \beta a_n) = N $$ where $\beta$ is the unknown with $\beta\in[0,\sqrt\gamma/\min(a_n\mid a_n>0)]$, $\gamma$ is a constant with $1\le\gamma\le N$, and the $a_n$ are nonnegative constants.
I know that the function $f(\beta)=\sum_{n=1}^N \min(\gamma, \beta a_n)-N$ is strictly increasing. Moreover, $f(0)<0$ and $f(\sqrt\gamma/\min(a_n\mid a_n > 0)) > 0$, so the equation must have a unique solution. How can I find it numerically, e.g. using MATLAB? Increasing $\beta$ slowly works, but it is not accurate.

The fastest generic (read: linearly convergent) algorithm given a continuous function and an interval $[a,b]$ with $f(a) < 0$ and $f(b) > 0$ is called bisection (sometimes binary search): evaluate $f$ at the midpoint $\frac{a+b}2$; if the value is negative, replace $a$ by the midpoint, otherwise replace $b$, and repeat.
The guaranteed accuracy (initially $\frac{b-a}2$) is halved in each iteration, so we converge linearly to the solution; in other words, the running time is linear in the number of correct digits required. This algorithm always converges as long as $f$ is continuous, and in this generic setting it is optimal.
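Applied to your $f(\beta)$, a minimal Python sketch (the loop carries over almost line-for-line to MATLAB); the sample values of `a` and `gamma` below are made up for illustration:

```python
def f(beta, a, gamma):
    """f(beta) = sum_n min(gamma, beta*a_n) - N, with N = len(a)."""
    return sum(min(gamma, beta * an) for an in a) - len(a)

def bisect(func, lo, hi, iters=100):
    """Plain bisection; assumes func(lo) < 0 < func(hi)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if func(mid) < 0:
            lo = mid      # root is in the right half
        else:
            hi = mid      # root is in the left half
    return 0.5 * (lo + hi)

# Hypothetical data: a_n >= 0 and 1 <= gamma <= N.
a = [0.5, 1.0, 2.0, 4.0]
gamma = 2.0
upper = gamma**0.5 / min(an for an in a if an > 0)  # sqrt(gamma)/min positive a_n
beta = bisect(lambda b: f(b, a, gamma), 0.0, upper)
# For this data the exact root is 4/7, so beta ≈ 0.5714
```

In MATLAB itself, the built-in `fzero` with the bracketing interval `[0, upper]` solves the same root-finding problem.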
A slight modification prevents numerical problems when a root lies exactly on an interval boundary (i.e. is found exactly) but the returned function value isn't exactly $0$: also return if $|f(\frac{a+b}2)| < \delta$ for some tolerance $\delta > 0$.
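That early-return check can be sketched as follows (Python again; the values of `delta` and the iteration cap are arbitrary choices):

```python
def bisect_tol(func, lo, hi, delta=1e-12, iters=200):
    """Bisection with an early return when |func(mid)| < delta,
    which handles roots that are hit exactly by a midpoint."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        val = func(mid)
        if abs(val) < delta:
            return mid    # close enough to treat as an exact root
        if val < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, `bisect_tol(lambda x: x - 1.0, 0.0, 2.0)` hits the root at the very first midpoint and returns immediately.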