Divide By Vector


In linear Naive Bayes with a multivariate Gaussian distribution, $\mu_i$, $\mu_j$, and $x$ are all vectors of the same dimension. I have a derivation that appears to divide by a vector, and I don't see why it is valid: it factors out the vector $(\mu_i - \mu_j)^T$ and divides it into another term that doesn't contain it.

(image of the derivation step omitted)


Accepted answer:

All of the products here are matrix multiplications in a sense, since the authors went to the trouble of using the transpose. They could have simply said they were working with dot products, but either way it's possible to pick apart what is going on.

For any nonzero column vector $v$, $$ 1 = \frac{|v|^2}{|v|^2} = \frac{v^T v}{v^T v},$$ and they've just multiplied the last term by $1$, which changes nothing, then expanded it using the relationship above with $v = \mu_i - \mu_j$, and then factored out the transpose, which works because matrix multiplication distributes over addition.
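A minimal numerical sketch of this trick, using hypothetical vectors for $\mu_i$, $\mu_j$, and $x$ (the specific values are assumptions, not from the original derivation):

```python
import numpy as np

# hypothetical vectors of the same dimension
mu_i = np.array([1.0, 2.0, 3.0])
mu_j = np.array([0.5, 1.0, -1.0])
x = np.array([2.0, 0.0, 1.0])

v = mu_i - mu_j

# (v^T v) / (v^T v) is exactly 1 for any nonzero v
assert np.isclose((v @ v) / (v @ v), 1.0)

# So any scalar term t can be rewritten as v^T (t v / (v^T v)),
# which lets v^T be factored out alongside terms that already contain it.
t = x @ v                        # some scalar appearing in the derivation
rewritten = v @ (t * v / (v @ v))
assert np.isclose(t, rewritten)
```

Multiplying by $1$ in this form is what makes the "division by a vector" legal: nothing is actually divided by $v$, only by the scalar $v^T v$.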

Hope that helps.

Another answer:

For a nonzero vector $x$, we can define a pseudoinverse as follows: $$x^{+} = \frac{x^*}{x^*x}$$

http://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_pseudoinverse#Vectors