I'm trying to find the flux out of a cube (with side length $2$) centered at the origin. The vector field is $\vec F = -x\vec i$, so the only relevant faces of the cube are the two perpendicular to the $x$-axis (the other faces are parallel to the vector field and so don't affect the flux).
Let's say the cube is oriented outward, so for the face centered at $(1,0,0)$ the orientation vector is $\vec n = \vec i$, and for the face centered at $(-1,0,0)$ it is $\vec n = -\vec i$.
And I know that the area vector of a face is the area of the face (a scalar) multiplied by the orientation vector (a unit vector): $\vec A = \vec n A$. So the two area vectors we have are $\vec i A$ and $-\vec i A$, with $A = 4$ here.
When we set up a flux integral, we consider an infinitesimal portion of the surface, so we work with $\Delta \vec A$. This should equal $\vec i \, \Delta A$, right? But for some reason my textbook says that $\Delta \vec A = \vec i \, \|\vec A\|$, i.e. the magnitude of the infinitesimal area vector equals the area of the ENTIRE face. Can anyone explain why this is?
Since in this case the entire face has the same normal vector $\vec i$, and the field is constant over the face, the book isn't really using $\Delta \vec A$ for an infinitesimal patch: the sum $\sum \vec F \cdot \Delta \vec A$ collapses to a single dot product with the whole area vector, so the book writes $\Delta \vec A$ for just $\vec i \, \|\vec A\|$. Concretely, the flux through the face at $x = 1$ is $\vec F \cdot \vec A = (-\vec i) \cdot (4\vec i) = -4$, and likewise $-4$ through the face at $x = -1$, for a total flux of $-8$.
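To see that this face-by-face reduction gives the right answer, here is a minimal Python sketch (variable names are my own) that computes the flux of $\vec F = -x\vec i$ through the two relevant faces and cross-checks the total against the divergence theorem:

```python
# Flux of F = -x i through the outward-oriented cube [-1, 1]^3.
# Only the two faces perpendicular to the x-axis contribute;
# on the other four faces F is tangent to the surface.

side = 2.0
face_area = side * side          # each square face has area 4

# Face at x = 1: n = +i and F = -1 * i there, so F . n = -1 on the whole face.
flux_right = -1.0 * face_area    # = -4.0

# Face at x = -1: n = -i and F = +1 * i there, so F . n = -1 again.
flux_left = -1.0 * face_area     # = -4.0

total_flux = flux_right + flux_left
print(total_flux)                # -8.0

# Cross-check with the divergence theorem: div F = -1, cube volume = 8.
print(-1.0 * side**3)            # -8.0
```

Because $\vec F \cdot \vec n$ is constant on each face, no actual integration is needed; that is exactly the simplification the book is making.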