Many neural networks (or multi-layer perceptrons) are set up with weight matrices $W_1, \dots, W_n$, where $n$ is the number of layers minus one. My understanding is that each weight matrix $W_k$ represents the connection weights between layer $k$ and layer $k+1$ (this may not be wholly correct, but it gets the idea across).
My question is: can the weights of a non-standard network, where neurons in layer $k$ can connect to neurons in any other non-input layer, be represented as a matrix or a series of matrices?
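For concreteness, here is a small sketch (all names and numbers are my own, made up for illustration) of the kind of connectivity I mean, encoded as a single adjacency-style weight matrix over all neurons rather than per-layer matrices:

```python
import numpy as np

# Hypothetical tiny network: 2 input, 2 hidden, 1 output neuron (5 total).
# W[i, j] holds the weight from neuron j into neuron i; zero means "no edge".
n = 5
W = np.zeros((n, n))
W[2, 0], W[2, 1] = 0.5, -0.3   # inputs -> hidden neuron 2
W[3, 0], W[3, 1] = 0.8, 0.1    # inputs -> hidden neuron 3
W[4, 2], W[4, 3] = 1.0, -1.0   # hidden -> output neuron 4
W[4, 0] = 0.2                  # "non-standard" edge skipping the hidden layer

def forward(W, x, order=(2, 3, 4)):
    """Evaluate neurons in topological order; identity activations for brevity."""
    a = np.zeros(n)
    a[:2] = x                   # place the inputs
    for i in order:
        a[i] = W[i] @ a         # weighted sum of everything computed so far
    return a[4]

print(forward(W, np.array([1.0, 2.0])))
```

So one $N \times N$ matrix over all $N$ neurons can certainly *store* such weights (the per-layer matrices would then be blocks of it); what I'm unsure about is whether this is a standard or sensible representation.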