I came across a definition of the δ-function as a generalized function in Mathematics Handbook for Science and Engineering which made me curious.
They first define a test function $\varphi: \mathbb R \to \mathbb C$ satisfying
1) $\varphi \in \mathcal C^\infty(\mathbb R)$
2) $\displaystyle\lim_{|t|\to\infty} t^p \frac{\mathrm d^q \varphi(t)}{\mathrm dt^q} = 0$ for all $p,q\geq 0$
and denote the class of all test functions by $S$.
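As a quick sanity check (my own illustration, not from the handbook), the Gaussian $\varphi(t) = e^{-t^2}$ is a standard example of a test function in $S$: it is smooth, and its $q$-th derivative is $(-1)^q H_q(t)\,e^{-t^2}$ for the (physicists') Hermite polynomial $H_q$, so $t^p \varphi^{(q)}(t)$ is a polynomial times $e^{-t^2}$ and vanishes as $|t|\to\infty$. A numerical check of this decay:

```python
import numpy as np
from numpy.polynomial.hermite import hermval

def phi_deriv(t, q):
    """q-th derivative of exp(-t^2): (-1)^q * H_q(t) * exp(-t^2)
    by the Rodrigues formula for the physicists' Hermite polynomials."""
    coeffs = [0] * q + [1]          # select the degree-q Hermite polynomial
    return (-1) ** q * hermval(t, coeffs) * np.exp(-t ** 2)

# t^p * phi^(q)(t) should be tiny for large |t|, for every p, q >= 0.
for p in (0, 3, 10):
    for q in (0, 2, 5):
        t = 20.0                    # "large" |t|
        val = abs(t ** p * phi_deriv(t, q))
        assert val < 1e-100, (p, q, val)
```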
They then define a sequence $\displaystyle\{\varphi_k\}_{k=0}^\infty$ with $\varphi_k \in S$ to be a zero sequence in $S$ if and only if
$\displaystyle\lim_{k\to\infty} \max_{t \in \mathbb R} \left| t^p \frac{\mathrm d^q \varphi_k(t)}{\mathrm dt^q}\right| = 0$ for all $p,q\geq 0.$
They then denote the value of a functional $f : S \to \mathbb C$ as $(f|\varphi)$ and define a distribution as a continuous linear functional on $S$, i.e. for all $\varphi, \psi \in S$ and $\alpha, \beta \in \mathbb C$ we have
$(f|\alpha \varphi + \beta \psi) = \alpha(f|\varphi) + \beta(f|\psi)$
and
$\displaystyle\lim_{k\to\infty} (f|\varphi_k) = 0$ for all zero sequences $\displaystyle\{\varphi_k\}_{k=0}^\infty$ in $S$.
They then let $g : \mathbb R \to \mathbb R$ be a piecewise continuous function satisfying the integrability condition
$\displaystyle\int_{\mathbb R} \left( 1+ t^2\right)^{-m} \left|g(t)\right| \; \mathrm dt < \infty$ for some $m \in \mathbb Z.$
Then
$(f|\varphi) = \displaystyle\int_{\mathbb R} g(t) \varphi(t) \; \mathrm dt$
is said to define a regular distribution. Non-regular distributions are called singular.
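A concrete example (my own illustration, not from the handbook): the Heaviside step function $H$ is piecewise continuous and satisfies the integrability condition, so $(H|\varphi) = \int_{\mathbb R} H(t)\varphi(t)\,\mathrm dt = \int_0^\infty \varphi(t)\,\mathrm dt$ is a regular distribution. For the Gaussian test function this can be checked numerically against the exact value $\sqrt\pi/2$:

```python
import numpy as np

# (H|phi) = int_0^inf phi(t) dt for the Heaviside step H,
# evaluated here for the test function phi(t) = exp(-t^2).
phi = lambda t: np.exp(-t ** 2)

dt = 1e-4
t = np.arange(dt / 2, 10.0, dt)   # midpoint grid on [0, 10]; the tail
val = np.sum(phi(t)) * dt         # beyond 10 is ~exp(-100), negligible

# Analytically, int_0^inf exp(-t^2) dt = sqrt(pi)/2 ≈ 0.88623.
assert abs(val - np.sqrt(np.pi) / 2) < 1e-6
```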
Finally, they define the Dirac $\delta$-function as a singular distribution by
$(\delta|\varphi) = \varphi(0)$
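A numerical way to see why this is the natural definition (my own sketch, not from the handbook): the narrow Gaussians $g_\varepsilon(t) = e^{-t^2/\varepsilon^2}/(\varepsilon\sqrt\pi)$ are regular distributions whose action $\int g_\varepsilon(t)\varphi(t)\,\mathrm dt$ approaches $\varphi(0)$ as $\varepsilon \to 0$, even though no single function reproduces $(\delta|\varphi) = \varphi(0)$ exactly:

```python
import numpy as np

# A test function in S with phi(0) = 1:
phi = lambda t: np.exp(-t ** 2) * np.cos(t)

def smeared(eps, n=400001, half_width=20.0):
    """int g_eps(t) phi(t) dt for the unit-mass Gaussian of width eps,
    approximated by a Riemann sum."""
    t = np.linspace(-half_width, half_width, n)
    g = np.exp(-(t / eps) ** 2) / (eps * np.sqrt(np.pi))
    return np.sum(g * phi(t)) * (t[1] - t[0])

# As eps shrinks, the smeared value approaches phi(0) = 1:
errors = [abs(smeared(eps) - phi(0.0)) for eps in (1.0, 0.1, 0.01)]
assert errors[0] > errors[1] > errors[2]
assert errors[-1] < 1e-3
```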
Now, surely one can claim that this is a sound definition, but could one prove the existence of such a functional?
If there were another distribution $\eta$ with $(\eta|\varphi) = \varphi(0)$ for all test functions $\varphi$, then $(\eta-\delta|\varphi) = (\eta|\varphi) - (\delta|\varphi) = \varphi(0) - \varphi(0) = 0$ for every $\varphi \in S$.
Hence $\eta-\delta = 0$, i.e. $\eta = \delta$, which shows uniqueness.
Existence is simply a consequence of the definition. $\delta:S\to\mathbb C,\ \varphi \mapsto \varphi(0)$ *is* a continuous linear functional on $S$: linearity is immediate, and continuity follows because for any zero sequence $\{\varphi_k\}$ we have $|(\delta|\varphi_k)| = |\varphi_k(0)| \leq \max_{t\in\mathbb R}|\varphi_k(t)| \to 0$ (the $p=q=0$ case of the defining condition). Hence $\delta$ exists as an element of the dual space $S^*$.
Remark: Obviously every element of $S^*$ is unique (either $a=b$ or $a\neq b$). My guess is that you have rather confused this with the observation that for regular distributions there may be different functions $f,g$ inducing the same distribution. If $\int \varphi(x) f(x)\, dx = \int \varphi(x) g(x)\, dx$ for all test functions $\varphi$, we can only conclude that $f-g$ is orthogonal to all test functions. Depending on the space of test functions, such $f,g$ may or may not exist. This, however, does not apply to the Dirac delta. The reason we still write $\int \varphi(x)\delta(x)\, dx$ instead of, say, $\delta(\varphi)$ is the Riesz representation theorem.
The theorem states that if $H$ is a Hilbert space, then any continuous linear functional $\psi$ on $H$ takes the form $\psi(\varphi) = (f_\psi,\varphi)$ for some $f_\psi\in H$. Note that $S$ is not a Hilbert space! Since $\delta$ is provably non-regular, you cannot find a function $f_\delta$ such that $\delta(\varphi) = (f_\delta,\varphi)$ for all $\varphi$. But it motivates why we carry on with the notation $\int \varphi(x)\delta(x)\, dx := \delta(\varphi)$ as if the Dirac delta existed as a function.
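To make the Riesz picture concrete (my own finite-dimensional sketch): in the Hilbert space $\mathbb R^n$, every linear functional $\psi(x) = \sum_i c_i x_i$ is represented as an inner product $(f_\psi, x)$ with $f_\psi = c$. It is precisely this representing element that fails to exist as a function for $\delta$ on $S$:

```python
import numpy as np

# In R^n (a Hilbert space), every linear functional psi(x) = c . x
# has the Riesz representer f_psi = c, i.e. psi(x) = (f_psi, x).
rng = np.random.default_rng(0)
c = rng.standard_normal(5)              # defines the functional psi
psi = lambda x: float(np.dot(c, x))

f_psi = c                               # the Riesz representer
for _ in range(3):
    x = rng.standard_normal(5)
    assert abs(psi(x) - np.dot(f_psi, x)) < 1e-12
```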