I have come across the question below in a sample exam paper. The subject is Computer Graphics. I do not know how to "draw the basis vectors for the x and y axes", and I'm not sure about deducing the 3 × 3 homogeneous coordinate matrix M either. I know what a basis vector is in linear algebra, and I am thinking that before the transformation (the first cup), the basis vectors are x = (1, 0) and y = (0, 1) (as column vectors), so you just have the standard basis for R², but I don't know how that could change after the transformation. I also know what a homogeneous coordinate matrix is: a matrix where the vectors have been augmented, I believe. I can probably figure that part out once I am clear on the first part with the basis vectors. Could anyone help me to understand this? (The word 'draw' has thrown me a bit too; I guess I'm to draw these vectors?)
2026-03-25 23:44
How to find the basis vectors for the x and y axes
743 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.

When you’re working with homogeneous coordinates, you’re really working in a projective rather than a Euclidean space, but it’s convenient to pretend that you’re working in the next-higher dimension Euclidean space instead. You can identify lines through the origin in this space with points in the original projective space, creating an isomorphism between the two. More importantly for the present purpose, affine transformations of the real projective plane correspond to linear transformations of $\mathbb R^3$, which allows you to represent them as matrices. In particular, the affine transformation $A_{2\times2}\mathbf p+\mathbf t$ is represented by the matrix $$M = \left[\begin{array}{c|c}A_{2\times2}&\mathbf t\\ \hline \mathbf 0^T&1\end{array}\right].$$ (If the first two columns have non-zero elements at the bottom, this matrix represents a projective transformation of the plane instead.) This is a homogeneous matrix: it can be scaled by any non-zero amount without changing the transformation it represents because we can scale the homogeneous coordinates of a point in the same way.
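The block structure of $M$ and its scale invariance can be sketched numerically (a minimal NumPy illustration; the particular matrix $A$ and translation $\mathbf t$ below are made up for the example):

```python
import numpy as np

# Hypothetical affine transformation: linear part A and translation t.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # scale x by 2, y by 3
t = np.array([1.0, -1.0])    # then translate by (1, -1)

# Homogeneous 3x3 matrix: [A | t] over [0 0 | 1].
M = np.block([[A, t.reshape(2, 1)],
              [np.zeros((1, 2)), np.ones((1, 1))]])

p = np.array([1.0, 1.0, 1.0])   # the point (1, 1) in homogeneous coordinates
q = M @ p                       # A @ (1,1) + t = (3, 2), homogeneously (3, 2, 1)

# Scaling M by any non-zero factor represents the same transformation,
# because we dehomogenize by dividing by the last coordinate:
q_scaled = (5 * M) @ p
assert np.allclose(q_scaled[:2] / q_scaled[2], q[:2] / q[2])
```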
Recall from linear algebra that the columns of a transformation matrix are the images of the basis vectors. Thus, the columns of $A_{2\times2}$ are the images of the standard basis vectors in the plane. You can determine these by comparing horizontal and vertical line segments in the original image with their counterparts in the transformed image. Once you have those, all you need is the translation $\mathbf t$. An easy way to find this is to apply $A_{2\times2}$ to some vertex in the original image and compare the result to the corresponding vertex in the final image. In this case, the original image includes the coordinate origin, so you can compute the translation directly, without applying the linear part of the transformation.
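Concretely, once you have read the images of the standard basis vectors off the picture, you assemble them as columns and take the displaced origin as the translation. A short sketch (the specific images and translation here are invented numbers, standing in for whatever the exam figure shows):

```python
import numpy as np

# Suppose (hypothetical reading of the figure) e1 = (1,0) maps to (0,1)
# and e2 = (0,1) maps to (-1,0) -- i.e. a 90-degree rotation.
img_e1 = np.array([0.0, 1.0])
img_e2 = np.array([-1.0, 0.0])
A = np.column_stack([img_e1, img_e2])  # columns are the images of the basis

# The original image contains the origin, so its position in the
# transformed image is the translation directly (say it moved to (2, 1)).
t = np.array([2.0, 1.0])

# Check against a vertex: a point p of the original lands at A @ p + t.
p = np.array([1.0, 1.0])
q = A @ p + t
```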
Alternatively, you can apply brute force to compute $M$: Pick three noncollinear points $\mathbf p_1$, $\mathbf p_2$ and $\mathbf p_3$ in the original image and find their counterparts $\mathbf q_1$, $\mathbf q_2$ and $\mathbf q_3$ in the transformed image. Using their homogeneous coordinates, $$M = \begin{bmatrix}\mathbf q_1 & \mathbf q_2 & \mathbf q_3\end{bmatrix}\begin{bmatrix}\mathbf p_1 & \mathbf p_2 & \mathbf p_3\end{bmatrix}^{-1}.$$ Since the $\mathbf p$’s are noncollinear, their homogeneous coordinate vectors will be linearly independent, so the matrix on the right is invertible.
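The brute-force formula is one line of NumPy (the point correspondences below are made-up sample data, not taken from the exam figure):

```python
import numpy as np

# Three noncollinear points of the original image, as homogeneous
# column vectors: (0,0), (1,0) and (0,1).
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

# Their (hypothetical) counterparts in the transformed image:
# (2,1), (2,2) and (1,1).
Q = np.array([[2.0, 2.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 1.0]])

# M maps each p_i to q_i, so M = Q P^{-1}.
M = Q @ np.linalg.inv(P)

# Sanity check: M sends the p's to the q's.
assert np.allclose(M @ P, Q)
```

Here the recovered $M$ has the affine block structure described above: its last row is $(0, 0, 1)$, the top-left $2\times2$ block is the linear part, and the last column holds the translation.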