I am reading Computability Theory by Cooper and I do not understand the notation in the definition on page 230:
Let $\{A^s\}_{s \ge 0}$ be a $\Delta_2$-approximating sequence for $A \in \Delta_2$. Then the computation function $C_A$ for $\{A^s\}_{s \ge 0}$ is defined by
$C_A(x) = \mu s \ge x [A^s \upharpoonright x = A \upharpoonright x ].$
What is $A \upharpoonright x$?
What does $\mu s \ge x$ denote? I know the definition of the $\mu$-operator in $\mu s [g(n,m)=0]$, but I do not know how to extend the definition to $\mu s \ge x$.
The notation $A \upharpoonright x$ refers to the initial segment of $A$ of length $x$. That is, $A \upharpoonright x$ is the sequence consisting of the first $x$ values of $A$, where we interpret the set $A$ as an (infinite) binary sequence via its characteristic function.
The $\geq x$ in $\mu s \geq x$ functions like the $\in T$ in the notation $\forall y \in T$. It restricts the scope of the search: we consider only those values of $s$ which are at least $x$.
Thus, $C_A(x)$ is the least integer $s$ such that both (1) $s\geq x$ and (2) $A^s$ and $A$ agree for their first $x$ values.
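To make the definition concrete, here is a small Python sketch. The set $A$ and the stage-wise approximation `A_stage` below are toy examples of my own choosing (not from Cooper's book): the approximation wrongly believes $1 \in A$ before stage 3 and is correct from stage 3 on. `C_A` then performs the bounded search $\mu s \ge x$ by starting at $s = x$ rather than $s = 0$.

```python
def A(n):
    # Limit set: the characteristic function of the toy set {0, 2}.
    return 1 if n in (0, 2) else 0

def A_stage(s, n):
    # Hypothetical Delta_2 approximation: before stage 3 we wrongly
    # believe that 1 is in A; from stage 3 on, A^s agrees with A.
    if s < 3 and n == 1:
        return 1
    return A(n)

def initial_segment(f, x):
    """f restricted to x: the tuple (f(0), ..., f(x-1))."""
    return tuple(f(n) for n in range(x))

def C_A(x):
    """Least s >= x such that A^s and A agree on their first x values."""
    s = x  # the search starts at s = x, implementing "mu s >= x"
    while initial_segment(lambda n: A_stage(s, n), x) != initial_segment(A, x):
        s += 1
    return s
```

Note that `C_A(4)` returns 4 even though this approximation already agrees with $A$ from stage 3: the constraint $s \ge x$ forces the answer to be at least $x$, which is exactly the role of the scope restriction in $\mu s \ge x$.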