(I actually thought this up while listening to a lecture on Gödel's [first] Incompleteness Theorem.)
A rudimentary presentation of the diagonal argument:
(I used the set of all binary sequences, but feel free to use any other uncountable set)
Let $\mathbf{f}:\Bbb{N}\to\{0,1\}^{\Bbb N}$ be any function from the naturals to the set of infinite binary sequences. Arrange the sequences $\mathbf{f}(n)$ in a list, ordered by $n$...
$$\begin{matrix} &1&2&3&4&\cdots\\1\mapsto&0&\cdots&\cdots&\cdots&\cdots\\2\mapsto&0&1&\cdots&\cdots&\cdots\\3\mapsto&1&0&1&\cdots&\cdots\\4\mapsto&0&1&1&0&\cdots\\\vdots&\vdots&\vdots&\vdots&\vdots&\ddots\end{matrix}$$
Let $h:\Bbb{N}\to\{0,1\}$, where $h(n)=1-\mathbf{f}(n)(n)$; that is, $h$ flips the $n$th entry of the $n$th sequence. Let $v$ be the sequence $(h(n))_{n=1}^\infty$. By construction, $v$ differs from $\mathbf{f}(n)$ in the $n$th position, so $v$ is not in the above list (diagonal argument). Since $\mathbf{f}$ was arbitrary, no function $\Bbb{N}\to\{0,1\}^{\Bbb N}$ is surjective, and we may conclude that the set of binary sequences $\{0,1\}^{\Bbb N}$ is uncountable.
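The construction can be sketched in a few lines of Python. The particular listing below is an illustrative stand-in (the argument itself works for *any* listing); sequences are modeled as functions from positions $1,2,3,\dots$ to $\{0,1\}$:

```python
# A sketch: each listed binary sequence is modeled as a function from
# positions to {0, 1}, and the listing f maps indices to sequences.

def make_listing():
    """A hypothetical listing f : N -> binary sequences (illustrative only)."""
    return {
        1: lambda k: 0,                  # 0, 0, 0, 0, ...
        2: lambda k: k % 2,              # 1, 0, 1, 0, ...
        3: lambda k: 1,                  # 1, 1, 1, 1, ...
        4: lambda k: 1 if k == 4 else 0, # 0, 0, 0, 1, 0, ...
    }

def antidiagonal(f, n):
    """h(n) = 1 - f(n)(n): flip the n-th entry of the n-th listed sequence."""
    return 1 - f[n](n)

f = make_listing()
# h differs from each listed sequence f(n) at position n, so the sequence
# (h(1), h(2), ...) cannot be any row of the list:
for n in f:
    assert antidiagonal(f, n) != f[n](n)
```

Note that the check succeeds for every row by construction: flipping a bit always produces a different bit.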
Except...
What if we place $v$ before the first entry in our list? The list is still countable, and now it contains $v$. We can then map each element of our new list to a natural number indicating its position in the list...
$$\begin{matrix} &1&2&3&4&\cdots\\1\mapsto&0&\cdots&\cdots&\cdots&\cdots\\2\mapsto&0&1&\cdots&\cdots&\cdots\\3\mapsto&1&0&1&\cdots&\cdots\\4\mapsto&0&1&1&0&\cdots\\\vdots&\vdots&\vdots&\vdots&\vdots&\ddots\\?\mapsto&1&0&0&1&\cdots\end{matrix}\quad{\Huge\to}\quad\begin{matrix} &1&2&3&4&\cdots&\cdots\\?\mapsto&1&0&0&1&\cdots&\mapsto1\\1\mapsto&0&\cdots&\cdots&\cdots&\cdots&\mapsto2\\2\mapsto&0&1&\cdots&\cdots&\cdots&\mapsto3\\3\mapsto&1&0&1&\cdots&\cdots&\mapsto4\\4\mapsto&0&1&1&0&\cdots&\mapsto5\\\vdots&\vdots&\vdots&\vdots&\vdots&\ddots&\vdots\end{matrix}\quad\text{(still countable)}$$
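The prepend-and-reindex move can also be sketched (all names here are hypothetical). The key mechanical point it exposes is that the re-indexed list is again a map $\Bbb{N}\to$ sequences, so the antidiagonal construction applies to it afresh:

```python
# Sketch: prepend the antidiagonal v to a listing by shifting every
# index up by one. Listings are functions N -> (position -> {0,1}).

def antidiag(listing):
    """The antidiagonal of a listing: flips the n-th entry of the n-th row."""
    return lambda n: 1 - listing(n)(n)

def prepend(f, v):
    """New listing g with g(1) = v and g(n) = f(n-1) for n > 1."""
    return lambda n: v if n == 1 else f(n - 1)

# an illustrative original listing f
f = lambda n: (lambda k: (n + k) % 2)

v = antidiag(f)    # v is not anywhere in f's list
g = prepend(f, v)  # new list: still countable, and it contains v ...
w = antidiag(g)    # ... but g has its own antidiagonal w, not in g

# w differs from every row of g on the diagonal, including row 1 (= v):
for n in range(1, 6):
    assert w(n) != g(n)(n)
```

The design choice of representing rows lazily (as functions rather than finite arrays) matches the fact that these are genuinely infinite sequences.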
As long as we do not allow self-referential statements (e.g. we do not regard $(h(n))_{n=1}^\infty$ as referring to the new ordering of the list, which would effectively reduce to an elaborate form of the liar paradox), this seems like a perfectly valid argument. From a technical standpoint, all I've really done is show that $|\omega|=|\omega+1|=\aleph_0$. So why isn't this a counterexample to Cantor's diagonal argument? (I don't believe that it is, but I want to know why it isn't.)
I'm sure the answer is obvious, but I can't seem to figure it out for the life of me.
EDIT: the OP has clarified their stance, which is actually self-consistent and interesting; I've significantly edited my answer to address this.
The context
The "first response" to any argument against Cantor is generally to point out that it's fundamentally no different from how we establish any other universal proposition: by showing that the property in question (here, non-surjectivity) holds for an "arbitrary" witness of the appropriate type (here, function from $\omega$ to $2^\omega$). For example, the irrationality of $\sqrt{2}$ and the non-existence of a largest natural number both fit this pattern.
Generally this is a good point to push back on - many people who take issue with Cantor's argument perceive a difference between it and these basic nonexistence proofs, and so defending the analogy is helpful. The original version of my answer did this, and can be read in the edit history. The key point here is that ultimately any self-consistent approach which accepts the logical rules leading to such basic facts as the above - together with very basic set existence axioms (forming the antidiagonal sequence from a given list only needs quantifier-free comprehension) - must accept Cantor.
So those are the four options: accept Cantor, defang the concept of set to the point of near-meaninglessness, weaken the underlying logic of all of mathematics, or be self-inconsistent. The first is the standard one, and in my experience the second and fourth are what the die-hard anti-Cantorians generally choose between.
However, the OP's response is in my opinion far more interesting. For whatever reason I think some genuinely cool stuff - both mathematical and philosophical - happens if we look at alternative logical systems. And from the comments below, it seems that this is what the OP is doing. So let me forget Cantor entirely and address that: what logical rule underlies non-existence arguments of the "arbitrary candidate" type, and why might we want to consider rejecting it (this could be an argument against it or just an observation that rejecting it opens the door to something cool)?
The rule
Let's recall the type of argument:

1. I want to prove "No object has property $P$."
2. I consider an arbitrary object $a$.
3. I show that $a$ does not have property $P$.
4. From this, I conclude that no object has property $P$.
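In sequent-calculus terms, the generalization step (passing from an arbitrary $a$ to all objects) is the right rule for the universal quantifier, $(\forall R)$; one standard formulation is

$$\frac{\Gamma\vdash\neg P(a)}{\Gamma\vdash\forall x\,\neg P(x)}\,(\forall R)\qquad\text{provided $a$ does not occur free in $\Gamma$.}$$

The side condition is what makes $a$ "arbitrary": the ambient assumptions $\Gamma$ are not allowed to constrain it.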
The key logical step here, from the OP's perspective (and I agree), is (2) - in particular, (4) is really "dual" to (2): to critique it is to take the question "What can an 'arbitrary' object be?" seriously. Now the reaction of the vast majority of the mathematical community, including me, is that (2)/(4) are completely valid. However, it is possible to react differently (and there are those who have), and this begins with the following position:

> An "arbitrary" object can only be drawn from some pre-existing set - indeed, that is what sets are for - and so we can only prove nonexistence results within already-established sets.

On this view, if we don't accept the set-hood of the class $C$ of maps $\omega\rightarrow 2^\omega$, we can't fully instantiate from it. (Note that rejecting the set-hood of $C$ while keeping our underlying logic classical doesn't save us from Cantor, since classical logic doesn't care about quantifier scope; so even though there's an 'axioms of set theory' flavor here, this really is an underlying-logic issue.)
But this doesn't fully save the OP: it prevents us from proving un-scoped propositions at the level of syntax, but semantically we still have "existence failure" - unless we accept the possibility of objects which belong to no set at all, that is, fundamentally non-instantiable objects.
(In the comments below the OP brings definability (and NBG) into the picture, but this is really a red herring. Definability and instantiability need not be related directly in any way - if we're being so skeptical, I see no reason to be surprised by definable non-instantiable objects or instantiable non-definable objects, and definability is a technical concept which should not be abused.)
At this point it's hard to avoid similarities with existing ideas: paraconsistency, relevance logic, dialetheism, Meinongianism, etc. These provide mathematical and philosophical treatments of impossible objects and impossible worlds - and there's definitely going to be a tight connection between impossibility and non-instantiability (the same ideas should be useful for analyzing each). Indeed, I think there is already some literature around non-instantiability but I'm not familiar with it at all.
The rejection
In my opinion, the best justification for taking a weak logical system seriously isn't philosophical but rather mathematical: show me that there are genuinely interesting mathematical objects/situations which that system permits but which classical logic rejects. So: what might an active rejection of $(\forall R)$ look like?
Well, many nonexistence proofs implicitly introduce an idea of an "improvement process." E.g. given an $f:\omega\rightarrow 2^\omega$, we get an $r_f\in 2^\omega\setminus\operatorname{ran}(f)$ ... and this suggests that we append $r_f$ to $f$ (either at the beginning, to get a new genuine list, or at the end, and think about maps from arbitrary countable ordinals; it doesn't really matter here).
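The improvement process can be sketched as follows (a minimal illustration, with hypothetical helper names): at each stage we form the antidiagonal $r_f$ of the current listing and prepend it, and at *every* stage the construction still succeeds in improving the listing.

```python
# Sketch: iterating the "improvement process" on listings of binary
# sequences. Listings are functions N -> (position -> {0,1}).

def antidiag(f):
    """r_f: flips the n-th entry of the n-th row of the listing f."""
    return lambda n: 1 - f(n)(n)

def improve(f):
    """One improvement step: prepend the antidiagonal r_f to f."""
    r = antidiag(f)
    return lambda n: r if n == 1 else f(n - 1)

f = lambda n: (lambda k: 0)  # illustrative starting listing: all-zero rows

for _ in range(3):
    r = antidiag(f)
    # every stage is still "improvable": r differs from each row on the diagonal
    assert all(r(n) != f(n)(n) for n in range(1, 5))
    f = improve(f)
```

The point the code makes concrete is that the process never halts at a "finished" listing: each `improve` step produces a listing that the antidiagonal construction improves again.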
There's a natural (extremely informal) principle which suggests itself now:

> Every improvement process of this kind comes with a "finished object": one which the process does not improve.

(I could have said "completed object," but I think that's overloading the idea - the point isn't that the object is not further alterable, but that the particular process we're looking at doesn't do anything to it. This is a bit of a technical point, though.)
This principle is strongly anti-classical: taken naively, it says for example that every real number is rational (since real numbers can be approximated by rational numbers), that there is a largest natural number (since we can always make a given natural number bigger), and - yes - that the reals are countable (since we can always make better countings). Serious work would have to be done to express it precisely and in such a way that it didn't yield a completely trivial or uninteresting mathematics. In fact, I don't think I've ever seen work treating it seriously, precisely because of how overwhelmingly ridiculous it is. But there might indeed be something interesting there. Of course, this is still not the standard approach, and the OP does need to recognize that.
The summary
Cantor's argument relies on a logical rule about quantification; this rule is implicitly assumed by basically every mathematical theory, so the OP's claim that "most axiomatic systems fail to prevent the existence of certain objects, it is possible, through careful exploitation of axioms, to show that [such] a proof ... does not exist" is incorrect.
Moreover, rejecting that logical rule throws into doubt every non-existence argument.
But this is something the OP recognizes and accepts. So while the OP's claims about classical systems are incorrect, their general stance is self-consistent.
Finally, it's worth noting that rejecting that rule does not take us fully into terra incognita, although it does take us well outside the realm of classical logic. This isn't an issue ignored by mathematics; it's simply one where the mathematical community has adopted, for better or for worse (and personally I'd argue for better), a logical system in which it is resolved in a particular way.