A regular Fourier Series will turn out to contain only sine terms if the target function is odd, only cosine terms if the target function is even, and both sine terms and cosine terms if the target function is neither even nor odd.
By contrast, my understanding is that Fourier Sine Series and Fourier Cosine Series are different animals: the mathematician can choose to represent their target function using a regular Fourier Series, a Fourier Sine Series, or a Fourier Cosine Series at their discretion by temporarily introducing a periodic extension, odd extension, or even extension, respectively, regardless of whether the target function is actually even or odd. It is perhaps natural to use a regular Fourier Series to represent functions on intervals centered at $0$, and a Fourier Sine Series or Fourier Cosine Series to represent functions on intervals whose left endpoint is $0$, but it should be possible to simply $u$-substitute any function on a finite interval into either of these forms, compute the Fourier Series/Fourier Sine Series/Fourier Cosine Series in $u$, and then revert the substitution.
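To make the claim concrete, here is a small numerical sketch (my own illustration, with $f(x) = x$ on $[0, \pi]$ chosen arbitrarily): the same function is expanded in both a sine series (via its odd extension) and a cosine series (via its even extension), and the partial sums of each are checked at interior points.

```python
import numpy as np

# Hypothetical target for illustration: f(x) = x on [0, pi].
L = np.pi
xs = np.linspace(0.0, L, 20001)
fx = xs.copy()                      # samples of f(x) = x

def trap(y):
    # trapezoid rule on the fixed uniform grid xs
    return float(np.sum(y[1:] + y[:-1]) * (xs[1] - xs[0]) / 2)

def sine_sum(N, t):
    # odd extension -> sine series: b_n = (2/L) int_0^L f(x) sin(n pi x / L) dx
    s = np.zeros_like(t)
    for n in range(1, N + 1):
        b_n = (2 / L) * trap(fx * np.sin(n * np.pi * xs / L))
        s += b_n * np.sin(n * np.pi * t / L)
    return s

def cosine_sum(N, t):
    # even extension -> cosine series: a_n = (2/L) int_0^L f(x) cos(n pi x / L) dx
    a0 = (2 / L) * trap(fx)
    s = np.full_like(t, a0 / 2)
    for n in range(1, N + 1):
        a_n = (2 / L) * trap(fx * np.cos(n * np.pi * xs / L))
        s += a_n * np.cos(n * np.pi * t / L)
    return s

t = np.array([0.5, 1.0, 2.0])       # interior points of (0, pi)
err_sine = float(np.max(np.abs(sine_sum(400, t) - t)))
err_cos = float(np.max(np.abs(cosine_sum(400, t) - t)))
print(err_sine, err_cos)            # both small at interior points
```

Both partial sums do approach $f$ in the interior, though at different rates: the even extension of $x$ is continuous, so the cosine coefficients decay like $1/n^2$, while the odd extension has a jump at $\pm\pi$ and its sine coefficients only decay like $1/n$.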
On the other hand, this would mean that any function that can be represented by a regular Fourier Series (say, any piecewise smooth function) could be represented by a single family of sinusoids, either sines alone or cosines alone, which seems to be at odds with the Linear Algebra terminology often attached to Fourier Series. For example, thinking of the sines and cosines in a Fourier Series as a basis leads me to expect that both families should be required to represent an arbitrary function, not just one or the other.
Is it always possible to represent any function that could be represented by a regular Fourier Series by your choice of Fourier Sine Series or Fourier Cosine Series? If so, how is this consistent with the Linear Algebra explanation of Fourier Series?
This is sort of a complicated question. For starters, the situation really is different from your standard linear algebra setting, because there are infinitely many basis vectors and so convergence comes into play. There are various theorems giving conditions on a function that guarantee convergence in some form. Remember that there are different modes of convergence to consider (pointwise, uniform, $L^2$, $L^p$, ...), if you have encountered these before. So, to answer part of your question: at least in the context of continuous, integrable periodic functions, the Fourier series will always give you a representation. You can also drop the continuity requirement, add in some other conditions, and still get pointwise convergence at various points.
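The difference between modes of convergence is visible numerically. A sketch (my own example, not from the answer above): partial sums of the Fourier series of the square wave $\operatorname{sign}(x)$ on $[-\pi,\pi]$, which is $\sum_{n \text{ odd}} \frac{4}{\pi n}\sin(nx)$, converge pointwise at interior points but not uniformly, because the Gibbs overshoot near the jump does not shrink as $N$ grows.

```python
import numpy as np

def partial_sum(N, t):
    # partial sum of the square-wave series: sum over odd n of (4/(pi n)) sin(n t)
    s = np.zeros_like(t)
    for n in range(1, N + 1, 2):                 # odd harmonics only
        s += (4.0 / (np.pi * n)) * np.sin(n * t)
    return s

near_jump = np.linspace(1e-4, 0.5, 20000)        # fine grid just right of x = 0
err_at_1 = abs(float(partial_sum(1601, np.array([1.0]))[0]) - 1.0)
overshoot_small = float(partial_sum(25, near_jump).max()) - 1.0
overshoot_large = float(partial_sum(1601, near_jump).max()) - 1.0
print(err_at_1)                                  # pointwise error at x = 1: tiny
print(overshoot_small, overshoot_large)          # overshoot persists near 0.18
```

The pointwise error at a fixed interior point goes to $0$, while the maximum overshoot near the jump tends to $\frac{2}{\pi}\operatorname{Si}(\pi) - 1 \approx 0.179$, so the convergence is pointwise but not uniform.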
As for your question about cosine vs. sine series: some functions will just have one or the other. This is analogous to linear algebra in the sense that some vectors simply have no component in certain directions; those components are $0$. So for even functions, the sine components are $0$.
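A quick numeric check of that last claim (my own example): the even function $f(x) = x^2$ on $[-\pi, \pi]$ has every sine coefficient $b_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\sin(nx)\,dx$ equal to $0$, while its cosine coefficients $a_n = \frac{4(-1)^n}{n^2}$ are nonzero.

```python
import numpy as np

# Even function on a symmetric interval: f(x) = x^2 on [-pi, pi].
xs = np.linspace(-np.pi, np.pi, 40001)
fx = xs**2
h = xs[1] - xs[0]

def trap(y):
    return float(np.sum(y[1:] + y[:-1]) * h / 2)   # trapezoid rule

b = [trap(fx * np.sin(n * xs)) / np.pi for n in range(1, 6)]
a = [trap(fx * np.cos(n * xs)) / np.pi for n in range(1, 6)]
print(max(abs(v) for v in b))   # ~ 0: no sine component at all
print(a)                        # close to -4, 1, -4/9, 1/4, -4/25
```

Each sine integrand is odd on a symmetric interval, so the $b_n$ vanish identically; numerically they come out at roundoff level.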
Not sure this answers much of your question, but I guess the point I am trying to make is that Fourier analysis is pretty complicated but very interesting. In fact, it is what led to the development of Lebesgue integration.
Some other things to look at:
- Dirichlet kernel
- Theory of distributions
- Fourier transform