Suppose I have a set of vectors $\{\mathbf s_1, \dots, \mathbf s_N\} \subset \mathbb Z^n$, and I want to find all conic combinations that give the zero vector. In other words, I want to find all vectors $\mathbf x \in \mathbb R^N$ such that $\mathbf x \geq \mathbf 0$ and
\begin{equation} \sum_{i=1}^N x_i \mathbf s_i = \mathbf 0. \end{equation}
Is there an algorithm for solving this problem?
Update: I was a little imprecise in saying that I wanted to find "all vectors...". To clarify: the set of vectors $\mathbf x$ satisfying $\mathbf x \geq \mathbf 0$ and \begin{equation} \sum_{i=1}^N x_i \mathbf s_i = \mathbf 0 \end{equation} is a cone, and I want to find the extreme rays of this cone.
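For concreteness, here is a small instance (the vectors are made up purely for illustration). With $n = N = 2$, $\mathbf s_1 = (1, -1)$, and $\mathbf s_2 = (-1, 1)$:
\begin{equation} x_1 \begin{pmatrix} 1 \\ -1 \end{pmatrix} + x_2 \begin{pmatrix} -1 \\ 1 \end{pmatrix} = \mathbf 0 \quad\Longleftrightarrow\quad x_1 = x_2, \end{equation}
so the solution cone is $\{(t, t) : t \geq 0\}$, with a single extreme ray spanned by $(1, 1)$.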
Ordinary Gaussian elimination will give you a parametric equation for the subspace of all $\mathbf x$ that satisfy your equation. The requirement that $\mathbf x \ge \mathbf 0$ then translates directly into $N$ simultaneous linear inequalities in the parameter space.
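A sketch of that step in Python with NumPy (the vectors $\mathbf s_i$ below are a hypothetical example; here the nullspace basis is extracted via the SVD rather than literal Gaussian elimination, which amounts to the same parametrization):

```python
import numpy as np

# Hypothetical example: columns of S are the vectors s_i in Z^2
S = np.array([[1, -1, 0],
              [0,  1, -1]])

# Nullspace basis via SVD: right singular vectors whose
# singular values are (numerically) zero
_, sv, Vt = np.linalg.svd(S)
rank = int(np.sum(sv > 1e-10))
V = Vt[rank:].T          # columns span {x : S x = 0}, so x = V @ t

# The constraint x >= 0 becomes V @ t >= 0: N linear
# inequalities in the (N - rank)-dimensional parameter space.
print(V.shape)           # here: (3, 1), a one-dimensional nullspace
```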
Arguably, those $N$ simultaneous linear inequalities are the best you can hope for to "find all vectors" -- such a system of linear inequalities is the standard way to describe a convex polyhedron in $\mathbb R^k$ in the first place, for example when giving input to a linear programming solver.
If the parameter space is $2$- or perhaps $3$-dimensional, you can probably understand the solution space "by eye". For higher dimensions it may not be clear whether there are nontrivial solutions at all; a practical approach could be to feed all the inequalities to an off-the-shelf linear programming solver, together with a normalization such as $\sum_i x_i = 1$ to exclude the trivial solution $\mathbf x = \mathbf 0$, and ask it to minimize, say, $x_1$.
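A minimal sketch of that feasibility check, assuming SciPy's `linprog` is available (the vectors $\mathbf s_i$ are again a hypothetical example; the normalization $\sum_i x_i = 1$ rules out the trivial solution):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical example: columns of S are the vectors s_i
S = np.array([[1, -1, 0],
              [0,  1, -1]])
N = S.shape[1]

# Equality constraints: S x = 0 plus the normalization sum(x) = 1;
# bounds=(0, None) enforces x >= 0 componentwise.
A_eq = np.vstack([S, np.ones((1, N))])
b_eq = np.append(np.zeros(S.shape[0]), 1.0)

# Minimize x_1 (any objective works for a pure feasibility check)
res = linprog(c=np.eye(N)[0], A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.success)   # True iff the cone contains a nonzero ray
```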