Tensors, as I understand them, are a sort of function that contains information on how to transform a set of vectors and dual vectors, and can be represented by a matrix of components. However, what I don't understand is the differing notation used to represent them. For example, I've seen both
$g_{\mu \nu} = \begin{pmatrix}-1 & 0 & 0 & 0\\0 & 1 & 0 & 0\\0 & 0 & 1 & 0\\0 & 0 & 0 & 1 \end{pmatrix} \tag*{}$
i.e., the metric tensor, denoted with both indices down, and other tensors denoted, say, $T^{\mu \nu}$ or $T^{\mu}{}_{\nu}$. These act on different types of arguments: two vectors ($g_{\mu \nu}$), two dual vectors ($T^{\mu \nu}$), or one of each ($T^{\mu}{}_{\nu}$). What I'm unable to understand is how, given a tensor, to rearrange its components to form the related tensors: say, take $g_{\mu \nu}$ and find $g^{\mu}{}_{\nu}$, $g^{\mu \nu}$, or even $g_{\nu}{}^{\mu}$, for example. How would one do this?
The concept you're looking for is 'raising and lowering indices.'
https://en.wikipedia.org/wiki/Raising_and_lowering_indices
There is a little bit of confusion here, I think, although I could be wrong: I have never seen the mixed tensor $g^{\mu}{}_{\nu}$ used for anything relating to the metric. Typically both indices down is just the metric, and both indices up is its matrix inverse. (The mixed version is trivial: $g^{\mu}{}_{\nu} = g^{\mu \lambda} g_{\lambda \nu} = \delta^{\mu}_{\nu}$, the Kronecker delta, which is one reason you don't see it written.)
The article contains an example or two. Be aware that the Einstein summation convention is used throughout.
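If it helps to see the mechanics concretely, here is a minimal NumPy sketch, assuming the Minkowski metric from the question with signature $(-,+,+,+)$; the vector `v_up` is a made-up example, and `np.einsum` plays the role of the Einstein summation convention:

```python
import numpy as np

# Metric g_{mu nu} with signature (-, +, +, +), as in the question
g = np.diag([-1.0, 1.0, 1.0, 1.0])

# The inverse metric g^{mu nu} is the matrix inverse of g_{mu nu}
g_inv = np.linalg.inv(g)

# Lowering an index: v_mu = g_{mu nu} v^nu  (summation over nu)
v_up = np.array([2.0, 1.0, 0.0, 3.0])      # hypothetical contravariant vector
v_down = np.einsum('mn,n->m', g, v_up)     # gives [-2., 1., 0., 3.]

# Raising it back: v^mu = g^{mu nu} v_nu recovers the original vector
v_up_again = np.einsum('mn,n->m', g_inv, v_down)

# The mixed tensor g^{mu}_{nu} = g^{mu lambda} g_{lambda nu} is the identity
mixed = np.einsum('ml,ln->mn', g_inv, g)   # equals np.eye(4)
```

The same pattern, contracting one slot of the tensor with the metric or its inverse, is how any index of any tensor is raised or lowered, e.g. $T^{\mu}{}_{\nu} = g_{\nu \lambda} T^{\mu \lambda}$.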