How would I go about proving the computational bound of an operation where I repeatedly take a dot product between an $n\times n$ square matrix and itself, up to $n$ times? (Is this even the correct place for me to ask this question?)
2026-04-01 14:35:58
How to prove computational cost of taking matrix powers
1.7k Views Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
The computational complexity of a single matrix multiplication is between $O(n^{2.37})$ and $O(n^3)$, depending on how naive an algorithm you use. Let's call it $O(n^k)$.
So if $A\in\mathbb{R}^{n\times n}$ and you compute $A^m$ by naive repeated matrix multiplication, it will cost $O(n^k m)$.
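As a minimal sketch of the repeated-multiplication approach (the function name is illustrative; NumPy's `@` performs a standard $O(n^3)$-style multiplication for moderate $n$):

```python
import numpy as np

def matrix_power_naive(A, m):
    """Compute A^m by m-1 successive multiplications,
    each costing O(n^3), for a total of O(n^3 * m)."""
    result = A.copy()
    for _ in range(m - 1):       # m-1 multiplications in total
        result = result @ A
    return result

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
assert np.allclose(matrix_power_naive(A, 3), np.linalg.matrix_power(A, 3))
```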
If you want to be a little smarter and compute the power by diagonalization, finding the eigendecomposition is generally considered to cost around $O(n^3)$, although strictly speaking it has no finite bound on the number of operations (e.g. see here or here). You can then take the matrix power by raising just $n$ numbers (the eigenvalues) to the $m$th power, which is at worst $O(nm)$ and can be better (e.g. here), plus $2$ matrix multiplications to change basis (negligible). So the cost will be more like $O(n^3 + nm)$, which can be much cheaper even in this naive setup.
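A sketch of the diagonalization route, assuming $A$ is diagonalizable (the function name is illustrative). It uses $A = V\,\mathrm{diag}(w)\,V^{-1}$, so $A^m = V\,\mathrm{diag}(w^m)\,V^{-1}$:

```python
import numpy as np

def matrix_power_eig(A, m):
    """Compute A^m via the eigendecomposition A = V diag(w) V^{-1}.
    One ~O(n^3) eigendecomposition, n scalar powers (O(n m) at worst),
    and two matrix multiplications."""
    w, V = np.linalg.eig(A)
    # V * w**m scales the columns of V, i.e. it equals V @ diag(w**m)
    return (V * w**m) @ np.linalg.inv(V)

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
assert np.allclose(matrix_power_eig(A, 5).real, np.linalg.matrix_power(A, 5))
```

Note that `eig` may return complex eigenvalues even for a real matrix, hence the `.real` when comparing against the real power.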
Edit: here are some details on why naive matrix multiplication requires that many operations.
Let $A\in\mathbb{R}^{n\times m}$, $B\in\mathbb{R}^{m\times k}$. Then $C=AB\in\mathbb{R}^{n\times k}$.
Standard matrix multiplication for one element is given by: $$ C_{ij} = \sum_{p=1}^m A_{ip} B_{pj} $$ How many operations is this (for one element)? It is $m$ additions and $m$ multiplications, i.e. $2m$ floating point operations. But there are $nk$ elements to fill. Thus, naive "dot product" multiplication requires $2nkm$ operations. If $n=k=m$, this is $2n^3 = O(n^3)$.
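The $2nkm$ count can be verified directly with a plain triple-loop implementation instrumented with an operation counter (pure Python; the names are illustrative):

```python
def matmul_counted(A, B):
    """Naive dot-product matrix multiply with an operation counter.
    A is n x m, B is m x k; returns (C, flops), where flops = 2*n*k*m:
    m multiplications and m additions per entry, n*k entries."""
    n, m, k = len(A), len(B), len(B[0])
    C = [[0.0] * k for _ in range(n)]
    flops = 0
    for i in range(n):
        for j in range(k):
            s = 0.0
            for p in range(m):
                s += A[i][p] * B[p][j]   # one multiply + one add
                flops += 2
            C[i][j] = s
    return C, flops

C, flops = matmul_counted([[1, 2], [3, 4]], [[5, 6], [7, 8]])
assert flops == 2 * 2 * 2 * 2            # 2*n*k*m = 16
assert C == [[19.0, 22.0], [43.0, 50.0]]
```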
Now, if $A\in\mathbb{R}^{n\times n}$ and you want to compute $A^w=A\cdots A$ naively, it will take $w-1$ matrix multiplications, each costing $O(n^3)$. Hence, the cost is $O(n^3w)$.