I'm currently taking an artificial intelligence class and have been given a challenging homework problem. My linear algebra isn't quite up to scratch, and the problem is based on simplifying an expression involving vector and matrix multiplication.
Normally we have a model of the form $ y = wx+b $ that performs linear regression.
During a discussion of multilayer perceptrons and neural networks, which take the output of one neuron (function) and feed it into another function, we were given the following question:
Simplify the following: $${w_3[w_2(w_1 \overrightarrow{x} + b_1) + b_2] + b_3}$$
I believe the intent of this question is to show that the nesting does not add any complexity to the model, and that the expression can be reduced to something similar to this:
$$ \overrightarrow{w}\overrightarrow{x} + \overrightarrow{b} $$
Is this assumption correct? If so, what initial steps would I take to simplify this? I'd like to do the work myself, so a tip or a relevant identity from linear algebra would suffice.
Think about the distributive property of matrix multiplication: $A(B + C) = AB + AC$. Apply it starting from the innermost parentheses and work outward.
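To see the distributive property in action without giving away the full derivation, here is a small numerical sketch (the shapes and variable names are my own, chosen arbitrarily for illustration). It checks that distributing one matrix over an inner affine expression, $A(B\vec{x} + \vec{c}) = (AB)\vec{x} + A\vec{c}$, produces a single weight matrix and a single bias vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example shapes: an inner affine map followed by an outer matrix.
A = rng.normal(size=(3, 4))   # outer weight matrix
B = rng.normal(size=(4, 5))   # inner weight matrix
c = rng.normal(size=4)        # inner bias vector
x = rng.normal(size=5)        # input vector

# Nested form: A(Bx + c)
nested = A @ (B @ x + c)

# Distributed form: (AB)x + Ac -- note this is again a single affine map,
# with weight matrix (A @ B) and bias vector (A @ c).
distributed = (A @ B) @ x + A @ c

print(np.allclose(nested, distributed))  # True
```

Repeating this step for each layer of nesting is exactly what collapses the whole expression down to one weight matrix and one bias vector.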