In two days, I am giving a presentation about a search engine I have been building over the past summer. My research involved the use of singular value decompositions, i.e., $A = U \Sigma V^T$. I took a high school course on linear algebra last year, but the course was not very thorough, and though I know how to find the SVD of a matrix, I don't know how to explain what I have in my hands after the matrix has been decomposed.
To someone who has taken linear algebra, I can say that I can decompose a matrix $A$ into a matrix $\Sigma$, whose diagonal holds the singular values, and matrices $U$ and $V$, whose columns are the left and right singular vectors of $A$. But I am not sure how to explain what a singular value is, or what left/right singular vectors are. I can accept it if there is no easy way to explain what this decomposition means, but I always prefer keeping the audience as informed as possible.
Much of linear algebra is about linear operators, that is, linear transformations from a space to itself. A typical result is that by choosing a suitable basis for the space, the operator can be expressed in a simple matrix form, for instance, diagonal. However, not every operator can be diagonalized this way.
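To make this concrete, here is a small NumPy sketch (the matrices are my own illustrative examples): a symmetric operator is diagonalized by an orthonormal eigenbasis, while a shear has a repeated eigenvalue with only one independent eigenvector, so no choice of basis diagonalizes it.

```python
import numpy as np

# A symmetric operator on R^2: an orthonormal eigenbasis diagonalizes it.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, Q = np.linalg.eigh(S)                     # S = Q diag(w) Q^T
assert np.allclose(Q @ np.diag(w) @ Q.T, S)

# A shear is a counterexample: eigenvalue 1 repeated, but only one
# independent eigenvector, so it cannot be diagonalized.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
w2, V = np.linalg.eig(J)
assert np.allclose(w2, [1.0, 1.0])
assert np.isclose(abs(np.linalg.det(V)), 0.0)  # eigenvectors are dependent
```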
The singular value decomposition is the one main result about linear transformations between two different spaces. It says that by choosing suitable bases for the two spaces, the transformation can be expressed in a simple matrix form: a diagonal matrix. And this works for every linear transformation. Moreover, the bases are very nice: orthonormal bases.
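You can check all of this numerically for any matrix; here is a sketch with a made-up $2 \times 3$ matrix (a map from $\mathbb{R}^3$ to $\mathbb{R}^2$), verifying both the decomposition and the orthonormality of the bases:

```python
import numpy as np

# A non-square matrix: a linear map from R^3 to R^2.
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# Compact SVD: U is 2x2, s has 2 singular values, Vt is 2x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The decomposition reconstructs A exactly (up to rounding):
# in the bases given by U and V, the map is just diag(s).
assert np.allclose(A, U @ np.diag(s) @ Vt)

# The columns of U and the rows of Vt are orthonormal.
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))

print(s)  # singular values, in decreasing order
```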
Geometrically, the SVD means that spheres of the proper dimension in the domain are transformed into ellipsoids in the codomain. Since the transformation may not be injective, the dimension of the ellipsoid is at most the dimension of the sphere. So you get some distortion along some axes and some collapsing along other axes. And that is all the transformation does. Every linear transformation.
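The sphere-to-ellipsoid picture can also be checked numerically. In this sketch (the $2 \times 2$ shear-like matrix is an arbitrary example), points on the unit circle map to an ellipse whose semi-axes are the singular values, and each right singular vector $v_i$ maps to $\sigma_i u_i$:

```python
import numpy as np

# An arbitrary 2x2 map; the unit circle maps to an ellipse whose
# semi-axis lengths are the singular values.
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
U, s, Vt = np.linalg.svd(A)

# Sample points on the unit circle and push them through A.
t = np.linspace(0, 2 * np.pi, 400)
circle = np.vstack([np.cos(t), np.sin(t)])
image = A @ circle

# Every image point has length between the smallest and largest
# singular value: distortion along some axes, shrinking along others.
norms = np.linalg.norm(image, axis=0)
assert s[1] - 1e-9 <= norms.min() and norms.max() <= s[0] + 1e-9

# The right singular vectors map onto the ellipse axes: A v_i = s_i u_i.
assert np.allclose(A @ Vt[0], s[0] * U[:, 0])
assert np.allclose(A @ Vt[1], s[1] * U[:, 1])
```

A non-injective map would send one of the singular values to zero, collapsing the corresponding axis of the ellipse entirely.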