I am looking to understand relativization in non-Turing Machine settings. For example, what is the "natural" relativization of Graph Isomorphism to an oracle A? How about HAMPATH?
Specifically, the claim is that relativization is an obstacle to resolving $P \stackrel{?}{=} NP$ because there are oracles $A$ relative to which $P^A = NP^A$ and oracles $B$ relative to which $P^B \neq NP^B$.
However, consider e.g. HAMPATH and the following oracle $A$: $A$ is a function mapping each number $n$ to some number in $[n!]$, and on a query $(n, x)$ the oracle answers whether $x = A(n)$. So you solve HAMPATH, and whenever you find a Hamiltonian path you pass it to $A$ to check whether it is "valid". That turns HAMPATH into HAMPATH plus an exponential search over $[n!]$, so $HAMPATH^A$ is provably not in $P^A$ (and hence $NP^A \neq P^A$).
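To make the construction concrete, here is a minimal sketch of such an oracle as a black box (the class and function names are my own, purely illustrative): the only operation available is the membership query "is $x = A(n)$?", so a machine that needs to *find* $A(n)$ has no option but to enumerate candidates, which takes up to $n! - 1$ queries in the worst case.

```python
import math
import random

class HiddenValueOracle:
    """Black-box oracle: for each n, a hidden value A(n) in {0, ..., n!-1}.
    The only interface is the yes/no query 'is x == A(n)?'."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.hidden = {}  # n -> A(n), drawn lazily

    def query(self, n, x):
        if n not in self.hidden:
            self.hidden[n] = self.rng.randrange(math.factorial(n))
        return x == self.hidden[n]

def find_hidden(oracle, n):
    """Brute-force search for A(n): with only membership queries,
    no deterministic strategy beats enumerating all n! candidates."""
    for x in range(math.factorial(n)):
        if oracle.query(n, x):
            return x
```

Since every deterministic machine sees identical answers (all "no") until it hits $A(n)$, an adversary argument shows it must make exponentially many queries, which is the source of the separation claimed above.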
Now, why would this be an obstacle to a proof of $P = NP$? If you claim that your algorithm solves HAMPATH in polynomial time, it obviously won't solve $HAMPATH^A$ in polynomial time with polynomially many oracle calls. Relativization would only be an obstacle if your algorithm also ran in polynomial time in situations where $P^A \neq NP^A$, but it obviously doesn't for this oracle. How do we know that there are separating oracles where this is less obvious? Why is relativization a compelling argument?
In general, what does an oracle look like in a graph theory setting?
It's an "obstacle" in the sense that any proof of $P=NP$ or $P\ne NP$ will have to contain at least one core step that is not valid relative to arbitrary oracles.
This means that solutions will necessarily have to engage with more low-level details of actual computation mechanisms than if they could stay very abstract and say, "consider some cost model and any class of problems that is closed under such-and-such operations ..."
It's not clear to me why you would expect this to look any different for graph-related problems specifically.