I'm struggling to understand the purpose of the dummy variable $t$ in the probability generating function.
I know it takes a value between 0 and 1, and I have heard it described as a 'relative ratio'. But what does it mean physically? And why, when this variable takes the value 1, does the function also equal 1?
$G_{X} (t) = E (t^{X})$
$G_{X} (1) = 1$
I assume your question is about the probability generating function of a discrete random variable $X$ taking nonnegative integer values, defined as $$G_X(t) = E(t^X) = \sum_{k \ge 0} \Pr(X = k)\, t^k.$$
Your second question on why $G_X(1) = 1$ is easy to answer: when $t=1$, we have $$G_X(1) = E(1^X) = E(1) = 1$$ as $1^X$ is always $1$ no matter what the value of $X$. Or you can use the fact that $$G_X(1) = \sum_{k \ge 0} \Pr(X = k) = 1$$ as probabilities must add up to $1$.
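A quick numerical sketch of this fact, using a fair six-sided die as a hypothetical example distribution (any valid pmf would do):

```python
from fractions import Fraction

# PGF of a fair six-sided die: G(t) = sum over k = 1..6 of (1/6) * t^k
def pgf_die(t):
    return sum(Fraction(1, 6) * t ** k for k in range(1, 7))

# At t = 1 every term t^k collapses to 1, so G(1) is just the sum of
# the probabilities, which is 1.
print(pgf_die(1))  # 1
```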
For your first question: the variable $t$ is used to encapsulate the entire probability distribution in a single function. Its physical meaning matters less (that, after all, is why you call it a dummy variable) than the fact that it enables a compact representation of the entire probability distribution in one function $G_X$.
All the information about the distribution of $X$ can be recovered from the function $G_X$; for instance $\Pr(X=k)$ can be recovered by asking "what is the coefficient of $t^k$ in $G_X(t)$?"
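As a small illustration of that recovery, here is a sketch using SymPy (with a hypothetical Binomial$(3, 1/2)$ example), where expanding $G_X(t)$ and reading off coefficients gives back the probabilities:

```python
import sympy as sp

t = sp.symbols('t')
# PGF of a Binomial(3, 1/2) random variable: G(t) = ((1 + t)/2)^3
G = ((1 + t) / 2) ** 3

# Pr(X = k) is the coefficient of t^k in the expanded polynomial.
probs = [sp.expand(G).coeff(t, k) for k in range(4)]
print(probs)  # [1/8, 3/8, 3/8, 1/8]
```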
You may like to look at generating functions (see for example the fantastic book generatingfunctionology by Wilf) of which this is a special case. Given any sequence of numbers $a_0, a_1, a_2, a_3, \dots$, their generating function $A(t) = a_0 + a_1t + a_2t^2 + a_3 t^3 + \dots$ gives a (possibly) compact representation of the entire sequence as a single function $A$. As Wilf says in the first sentence on page 1 of the book, "A generating function is a clothesline on which we hang up a sequence of numbers for display". The purpose of the dummy variable $t$ is to provide that clothesline on which we attach the probabilities $\Pr(X=k)$.
A particular value of $t$ has (to the best of my knowledge) no special meaning beyond that; what matters is the entire function (how it varies with varying $t$, etc.). Thus, for instance, differentiating $G_X(t)$ with respect to $t$ and evaluating at $1$ gives the expected value (and similarly higher moments), taking the $k$th derivative at $0$ and dividing by $k!$ gives the coefficient of $t^k$, and it converts sums of independent random variables into products of probability generating functions ($G_{X+Y}(t) = G_X(t)G_Y(t)$ if $X$ and $Y$ are independent). It is, in short, a transform of the probability distribution into a different representation, one that can be more convenient for certain purposes.
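Two of those facts can be sketched concretely by representing a PGF as its coefficient list (so `coeffs[k]` holds $\Pr(X=k)$); the names and the two-coin-flip setup below are hypothetical illustrations, not standard library APIs:

```python
from fractions import Fraction

# PGF of one fair coin flip: G(t) = 1/2 + t/2
coin = [Fraction(1, 2), Fraction(1, 2)]

def pgf_product(p, q):
    """Convolve coefficient lists: G_{X+Y} = G_X * G_Y for independent X, Y."""
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def mean_from_pgf(coeffs):
    """E[X] = G'(1) = sum over k of k * Pr(X = k)."""
    return sum(k * c for k, c in enumerate(coeffs))

# Number of heads in two independent flips: Pr = 1/4, 1/2, 1/4 for 0, 1, 2 heads
G_sum = pgf_product(coin, coin)
print(G_sum)
print(mean_from_pgf(G_sum))  # 1
```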