Matrix multiplication in "linear" categories (Exercise 23.1 in Conceptual Mathematics)


This is a follow-up to my question here. In Lawvere's Conceptual Mathematics, linear categories (apparently called additive categories elsewhere) are defined as seen in this paragraph:

[Image: Linear category definition]

Lawvere proceeds to define a 'product' of two matrices in a linear category as follows:

[Image: Matrix product definition]

Addition of maps $f+g : A \to B$ is defined as the entry $h$ in the matrix product $\pmatrix{1_{AA}&f\\0_{BA}&1_{BB}} \cdot \pmatrix{1_{AA}&g\\0_{BA}&1_{BB}} = \pmatrix{1_{AA}&h\\0_{BA}&1_{BB}}$, where $f,g : A \to B$.
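Since this definition is purely equational, it can be sanity-checked in the most familiar linear category: finite-dimensional vector spaces, where a map $f : A \to B$ is a matrix and the $2\times2$ array above is a literal block matrix. The sketch below is mine, not Lawvere's: the dimensions and random matrices are hypothetical choices, and composition "$f$ then $g$" is modelled as `F @ G` (row-vector convention), matching the row/column layout of the matrices above.

```python
import numpy as np

# Concrete model: finite-dimensional vector spaces.  A map f : A -> B is a
# (dim A x dim B) matrix acting on row vectors, so "f then g" is F @ G and
# the 2x2 matrix of maps is a literal block matrix.
rng = np.random.default_rng(0)
dim_A, dim_B = 2, 3
F = rng.integers(-5, 6, size=(dim_A, dim_B))   # f : A -> B
G = rng.integers(-5, 6, size=(dim_A, dim_B))   # g : A -> B

I_A, I_B = np.eye(dim_A), np.eye(dim_B)
Z_BA = np.zeros((dim_B, dim_A))                # 0_{BA} : B -> A

def M(X):
    """The triangular matrix (1_A  X / 0_BA  1_B) as a block matrix."""
    return np.block([[I_A, X], [Z_BA, I_B]])

# The product of the two triangular matrices is again triangular,
# with h = f + g in the upper-right entry.
assert np.array_equal(M(F) @ M(G), M(F + G))
```

In this model the entry $h$ is literally $F + G$; the point of the exercise is that the same equational bookkeeping works in any linear category.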

The exercise by Lawvere in this section asks us to prove that matrix multiplication 'works' in linear categories:

[Image: Matrix multiplication]

Problem is, even after having received an answer to my previous question, I am still at a complete loss as to how one might prove this. How does one prove this identity?

There are 2 answers below.

Accepted answer

(Note: I write compositions from left to right.)

  1. Since we know $A+B\cong A\times B$ and that products and coproducts are determined only up to isomorphism, we may choose the same object for both.
    More specifically, suppose we are given a choice of product and coproduct objects, together with their projections and inclusions; we can then adjust, e.g., the inclusions and use the chosen product object $A\times B$ to serve as the coproduct.
    By the conditions, it follows that the maps arising from the product rule $(1,\,0):A\to A\times B$ and $(0,\,1):B\to A\times B$ will be the corresponding injections, and then $\pmatrix{1_A\\0_{BA}}=\pi_A$ and $\pmatrix{0_{AB}\\1_B}=\pi_B$.
    With this setup, $\pmatrix{1_A&0_{AB}\\0_{BA}&1_B}=1_{A\times B}=\alpha$.

  2. Obviously, the matrix entries can be recovered by composing with the appropriate inclusions and projections.
    So, on the one hand, it's enough to prove the base case $$\pmatrix{f_{AX}& f_{AY}}\pmatrix{g_{XU}\\g_{YU}}\ =\ f_{AX}\,g_{XU}\,+\,f_{AY}\,g_{YU}\,;$$ on the other hand, for $f,g:A\to B$, the definition of $f+g$ boils down to $$f+g\ :=\ \pmatrix{1_A& f}\pmatrix{g\\1_B}$$ Well, this is not the convenient order of addition for our purpose (and we haven't yet proved commutativity). But, applying the natural swap $A\times B\to B\times A$, we arrive at $$\pmatrix{1_A& f}\pmatrix{g\\1_B}\ =\ \pmatrix{f&1_A}\pmatrix{1_B\\g}$$

  3. Putting it together, we want to prove $$\pmatrix{a&b}\pmatrix{c\\d}=\pmatrix{ac&1}\pmatrix{1\\bd}$$ provided $a:A\to X,\ b:A\to Y,\ c:X\to U,\ d:Y\to U$.

  4. This can be done in two steps (which are dual to each other): $$\underset{A\ \to\ X\times Y\ \to\ U}{\pmatrix{a&b}\pmatrix{c\\d}}\ =\ \underset{A\ \to\ U\times Y\ \to\ U}{\pmatrix{ac&b}\pmatrix{1_U\\d}}\ =\ \underset{A\ \to\ U\times A\ \to\ U}{\pmatrix{ac&1_A}\pmatrix{1_U\\bd}}$$ For the first equation, consider the map $\vartheta:X\times Y\to U\times Y$, induced by $c:X\to U$ and $1_Y$ according to the product structures.
    (Observe that effectively $\vartheta=\pmatrix{c&0\\0&1_Y}$, and consequently it coincides with the map induced by $c$ and $1_Y$ for the coproduct structures.)
    Then verify that we have $\pmatrix{a&b}\,\vartheta=\pmatrix{ac&b}$ and $\vartheta\,\pmatrix{1_U\\d}=\pmatrix{c\\d}$, so that it's just the two associations of the triple composition of morphisms $$\pmatrix{a&b}\pmatrix{c&0\\0&1}\pmatrix{1\\d}$$ The other equation can be proven similarly.
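The two associations of the triple composition in step 4 can be checked numerically in the concrete model of finite-dimensional vector spaces. This is only a sketch with hypothetical maps $a, b, c, d$, using matrices on row vectors so that the left-to-right composite $a\,c$ is `a @ c`.

```python
import numpy as np

# Maps as matrices on row vectors; the left-to-right composite "a c" is a @ c.
rng = np.random.default_rng(2)
a = rng.integers(-3, 4, size=(2, 3))   # a : A -> X
b = rng.integers(-3, 4, size=(2, 4))   # b : A -> Y
c = rng.integers(-3, 4, size=(3, 5))   # c : X -> U
d = rng.integers(-3, 4, size=(4, 5))   # d : Y -> U

# theta = (c 0 / 0 1_Y) : X x Y -> U x Y, as in the answer above.
theta = np.block([[c, np.zeros((3, 4))],
                  [np.zeros((4, 5)), np.eye(4)]])
col = np.vstack([np.eye(5), d])        # (1_U / d) : U + Y -> U

# The two factorisations claimed in the answer:
assert np.array_equal(np.hstack([a, b]) @ theta, np.hstack([a @ c, b]))
assert np.array_equal(theta @ col, np.vstack([c, d]))

# The two associations of (a  b) theta (1_U / d) agree, and both equal
# the base-case sum of composites:
left = (np.hstack([a, b]) @ theta) @ col
right = np.hstack([a, b]) @ (theta @ col)
assert np.array_equal(left, right)
assert np.array_equal(left, a @ c + b @ d)
```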
Second answer

$\def\summap#1{{\begin{cases} #1 \end{cases} }}$$\def\productmap#1{\left\langle #1 \right\rangle}$$\def\matrix#1{\begin{bmatrix} #1 \end{bmatrix}}$$\def\projection#1#2{\pi_{#1 \times #2}}$This is a complement to the accepted answer which fills in details and uses notation more similar to that found in the book from which this question originated. The accepted answer was helpful but, because I am stupid, it still took me several hours to figure out the details based on the answer.

The idea of using matrix-based intuition to guide the search for identities was very helpful, as was fully exploiting the properties of zero maps to derive useful ones.

Throughout, repeated use will be made of the following identities:

$$h \circ \summap{f \\g} = \summap{ h \circ f \\ h \circ g} \,, \quad \langle f, g \rangle \circ h = \langle f \circ h, g \circ h \rangle$$
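These two identities become block-matrix facts in the concrete model of finite-dimensional vector spaces, where (with maps as matrices acting on row vectors, so that $g \circ f$, "$f$ then $g$", is `F @ G`) a sum map stacks its components vertically and a product map stacks them horizontally. A minimal sketch with hypothetical matrices:

```python
import numpy as np

# Maps as matrices acting on row vectors: the composite g o f is F @ G.
# A sum map {f; g} out of A + B stacks its components vertically; a product
# map <f, g> into A x B stacks them horizontally.
rng = np.random.default_rng(1)
F = rng.integers(-3, 4, size=(2, 4))   # f : A -> C
G = rng.integers(-3, 4, size=(3, 4))   # g : B -> C
H = rng.integers(-3, 4, size=(4, 5))   # h : C -> D

# h o {f; g} = {h o f; h o g}
assert np.array_equal(np.vstack([F, G]) @ H, np.vstack([F @ H, G @ H]))

K = rng.integers(-3, 4, size=(5, 2))   # f : C -> A
L = rng.integers(-3, 4, size=(5, 3))   # g : C -> B
P = rng.integers(-3, 4, size=(4, 5))   # h : D -> C

# <f, g> o h = <f o h, g o h>
assert np.array_equal(P @ np.hstack([K, L]), np.hstack([P @ K, P @ L]))
```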

One also has the following identities (see here for a proof) involving the map $A \times B \overset{\alpha_{AB}}{\to} A + B$, the inverse of the identity matrix from $A + B$ to $A \times B$:

$$ \alpha_{AB} \circ \langle 1_A, 0_{AB} \rangle = j_{A+B}^1 \,, \quad \alpha_{AB} \circ \langle 0_{BA}, 1_B \rangle = j_{A+B}^2 \,, \quad \begin{cases} 1_A \\ 0_{BA} \end{cases} \circ \alpha_{AB} = \pi_{A \times B}^1 \,, \quad \begin{cases} 0_{AB} \\ 1_B \end{cases} \circ \alpha_{AB} = \pi_{A \times B}^2 \,. $$
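In the vector-space model these four identities are transparent: $A+B$ and $A\times B$ are the same object (the direct sum), so $\alpha_{AB}$ is the identity matrix, and the identities just record the block form of the injections and projections. A sketch with hypothetical dimensions:

```python
import numpy as np

# Direct-sum model: A + B and A x B coincide, so alpha_{AB} is the identity.
# Injections are the blocks [1 0], [0 1] read as (wide) maps out of A and B;
# projections are the same blocks read as (tall) maps into A and B.
# Composition g o f ("f then g") is modelled as F @ G on row vectors.
dim_A, dim_B = 2, 3
alpha = np.eye(dim_A + dim_B)                              # alpha_{AB}
j1 = np.hstack([np.eye(dim_A), np.zeros((dim_A, dim_B))])  # j^1 : A -> A + B
j2 = np.hstack([np.zeros((dim_B, dim_A)), np.eye(dim_B)])  # j^2 : B -> A + B
p1 = np.vstack([np.eye(dim_A), np.zeros((dim_B, dim_A))])  # pi^1 : A x B -> A
p2 = np.vstack([np.zeros((dim_A, dim_B)), np.eye(dim_B)])  # pi^2 : A x B -> B

# alpha o <1_A, 0_{AB}> = j^1 :
assert np.array_equal(np.hstack([np.eye(dim_A), np.zeros((dim_A, dim_B))]) @ alpha, j1)
# {1_A; 0_{BA}} o alpha = pi^1 :
assert np.array_equal(alpha @ np.vstack([np.eye(dim_A), np.zeros((dim_B, dim_A))]), p1)
# and the usual laws hold, e.g. pi^1 after j^1 is 1_A:
assert np.array_equal(j1 @ p1, np.eye(dim_A))
assert np.array_equal(j2 @ p2, np.eye(dim_B))
```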

As the accepted answer explains (see also here for a proof of the explicit form of the map $f+g$), what we would like to prove amounts to showing that:

$$\summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \productmap{f_{AX} , f_{AY}} = \summap{ g_{YU} \circ f_{AY} \\ 1_U} \circ \alpha_{AU} \circ \productmap{ 1_A, g_{XU} \circ f_{AX}} \,.$$

It will initially suffice, however, to show the slightly different statement that:

$$\summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \productmap{f_{AX} , f_{AY}} = \summap{1_U \\ g_{YU} \circ f_{AY}} \circ \alpha_{UA} \circ \productmap{ g_{XU} \circ f_{AX}, 1_A}$$

Intuitively, if we think of sum maps as column vectors and product maps as row vectors, and function composition as (reverse-ordered, non-commutative) scalar multiplication, what we will show is similar to:

$$\matrix{f_{AX} & f_{AY} } \matrix{g_{XU} \\ g_{YU}} = \matrix{ f_{AX} g_{XU} & 1_A } \matrix{ 1_U \\ f_{AY}g_{YU}} = f_{AX} g_{XU} + f_{AY}g_{YU} \,.$$
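Before diving into the derivation, this claimed identity can be spot-checked in the vector-space model (hypothetical dimensions and maps; the composite $g \circ f$ is `F @ G` on row vectors, matching the reverse-ordered multiplication above):

```python
import numpy as np

# Maps as matrices on row vectors, so the juxtaposition "f g" (f then g)
# is F @ G; product maps are horizontal stacks, sum maps vertical stacks.
rng = np.random.default_rng(3)
f_AX = rng.integers(-3, 4, size=(2, 3))   # f_AX : A -> X
f_AY = rng.integers(-3, 4, size=(2, 4))   # f_AY : A -> Y
g_XU = rng.integers(-3, 4, size=(3, 5))   # g_XU : X -> U
g_YU = rng.integers(-3, 4, size=(4, 5))   # g_YU : Y -> U

lhs = np.hstack([f_AX, f_AY]) @ np.vstack([g_XU, g_YU])
mid = np.hstack([f_AX @ g_XU, np.eye(2)]) @ np.vstack([np.eye(5), f_AY @ g_YU])
rhs = f_AX @ g_XU + f_AY @ g_YU
assert np.array_equal(lhs, rhs)
assert np.array_equal(mid, rhs)
```

Of course this only confirms the identity in one particular linear category; the argument below establishes it from the axioms alone.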

1. Corresponding to finding the $\vartheta$ from the accepted answer, the intuition for what we want to show is:

$$\matrix{f_{AX} & f_{AY} } \matrix{g_{XU} \\ g_{YU}} = \matrix{f_{AX} & f_{AY}} \matrix{g_{XU} & 0_{XY} \\ 0_{YU} & 1_Y} \matrix{1_U \\ g_{YU}} $$

So first we just show that

$$ \matrix{g_{XU} \\ g_{YU}} = \matrix{g_{XU} & 0_{XY} \\ 0_{YU} & 1_Y} \matrix{1_U \\ g_{YU}} $$

which, as one can deduce by comparing domains and codomains (noting where the product is required, where the coproduct is required, etc.), corresponds to the actual equation:

$$ \summap{g_{XU} \\ g_{YU}} = \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \summap{ \productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } $$

We show this by using the universal mapping property of sum, i.e. by showing that the right hand side has the correct (unique) values when pre-composed with the relevant injections, denoted $j_{X+Y}^1, j_{X+Y}^2$:

$$\begin{array}{rcl} \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \summap{ \productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } \circ j_{X+Y}^1 & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{g_{XU}, 0_{XY}} \\ & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{1_U \circ g_{XU}, 0_{UY} \circ g_{XU} } \\ & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{1_U, 0_{UY} }\circ g_{XU} \\ & = & \summap{1_U \\ g_{YU}} \circ j_{U+Y}^1 \circ g_{XU} = 1_U \circ g_{XU} = g_{XU} \end{array} $$

$$\begin{array}{rcl} \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \summap{ \productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } \circ j_{X+Y}^2 & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{0_{YU}, 1_Y} \\ & = & \summap{1_U \\ g_{YU}} \circ j_{U+Y}^2 = g_{YU} \end{array} $$

Therefore we can conclude:

$$ \summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \productmap{f_{AX} , f_{AY}} = \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \summap{\productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } \circ \alpha_{XY} \circ \productmap{f_{AX}, f_{AY}} $$

2. Now intuitively what we want to show is that

$$\matrix{f_{AX} & f_{AY}} \matrix{g_{XU} & 0_{XY} \\ 0_{YU} & 1_Y} = \matrix{f_{AX} g_{XU} & f_{AY}} $$

which corresponds to the actual equation:

$$ \summap{ \productmap{g_{XU} , 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } \circ \alpha_{XY} \circ \langle f_{AX}, f_{AY} \rangle = \productmap{g_{XU} \circ f_{AX}, f_{AY}} $$

Observe that, with $\projection{X}{Y}^1, \projection{X}{Y}^2$ denoting the corresponding projections:

$$\productmap{g_{XU} \circ f_{AX}, f_{AY}} = \productmap{g_{XU} \circ \projection{X}{Y}^1, \projection{X}{Y}^2} \circ \productmap{f_{AX}, f_{AY}} $$

So it suffices to prove the somewhat more concise identity that

$$ \summap{ \productmap{g_{XU} , 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } \circ \alpha_{XY} = \productmap{g_{XU} \circ \projection{X}{Y}^1, \projection{X}{Y}^2} $$

We show this holds by showing that post-composing the left-hand side by the relevant projections gives the unique values required in order to equal the right hand side (due to the universal property of products):

$$\begin{array}{rcl} \projection{X}{Y}^1 \circ \summap{ \productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y}} \circ \alpha_{XY} & = & \summap{ \projection{X}{Y}^1 \circ \productmap{g_{XU}, 0_{XY}} \\ \projection{X}{Y}^1 \circ \productmap{0_{YU}, 1_Y}} \circ \alpha_{XY} \\ & = & \summap{g_{XU} \\ 0_{YU}} \circ \alpha_{XY} \\ & = & g_{XU} \circ \summap{1_X \\ 0_{YX}} \circ \alpha_{XY} = g_{XU} \circ \projection{X}{Y}^1 \end{array} $$

$$\begin{array}{rcl} \projection{X}{Y}^2 \circ \summap{ \productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y}} \circ \alpha_{XY} & = & \summap{0_{XY} \\ 1_Y } \circ \alpha_{XY} \\ & = & \projection{X}{Y}^2 \end{array}$$

Therefore, what we have shown so far (combining both parts above) is intuitively:

$$\matrix{f_{AX} & f_{AY} } \matrix{g_{XU} \\ g_{YU}} = \matrix{f_{AX} & f_{AY}} \matrix{g_{XU} & 0_{XY} \\ 0_{YU} & 1_Y} \matrix{1_U \\ g_{YU}} = \matrix{f_{AX} g_{XU} & f_{AY}} \matrix{1_U \\ g_{YU}} $$

which corresponds to the actual equations:

$$\begin{array}{rcl}\summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \productmap{f_{AX} , f_{AY}} &=& \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \summap{\productmap{g_{XU}, 0_{XY}} \\ \productmap{0_{YU}, 1_Y} } \circ \alpha_{XY} \circ \productmap{f_{AX}, f_{AY}} \\ &=& \summap{1_U \\ g_{YU}} \circ\alpha_{UY} \circ \productmap{g_{XU} \circ f_{AX}, f_{AY}} \end{array} $$

3. What we show now corresponds intuitively to:

$$\matrix{f_{AX} g_{XU} & f_{AY}} = \matrix{f_{AX} g_{XU} & 1_A} \matrix{1 & 0 \\ 0 & f_{AY}} $$

where the last matrix represents some map $U \times A \overset{k}{\to} U \times Y$, since the actual equation this corresponds to is:

$$\productmap{g_{XU} \circ f_{AX}, f_{AY}} = k \circ \productmap{g_{XU} \circ f_{AX}, 1_A} $$

We will improve its representation later to make it more matrix-like, but the most obvious map to think of which might satisfy these conditions is:

$$ k = \productmap{\projection{U}{A}^1, f_{AY} \circ \projection{U}{A}^2}$$

One has that this map does indeed satisfy the required properties:

$$\begin{array}{rcl} \productmap{\projection{U}{A}^1, f_{AY} \circ \projection{U}{A}^2} \circ \productmap{g_{XU} \circ f_{AX}, 1_A} & = & \productmap{\projection{U}{A}^1 \circ \productmap{g_{XU} \circ f_{AX}, 1_A}, f_{AY} \circ \projection{U}{A}^2 \circ \productmap{g_{XU} \circ f_{AX}, 1_A}} \\ & = & \productmap{g_{XU} \circ f_{AX}, f_{AY} \circ 1_A} = \productmap{g_{XU} \circ f_{AX}, f_{AY} } \end{array} $$

Thus the totality of what has been shown is:

$$ \productmap{g_{XU} \circ f_{AX}, f_{AY}} = \productmap{\projection{U}{A}^1, f_{AY} \circ \projection{U}{A}^2} \circ \productmap{g_{XU} \circ f_{AX}, 1_A} $$

4. Now what we want to show is intuitively:

$$ \matrix{1 & 0 \\ 0 & f_{AY}} \matrix{1_U \\ g_{YU}} = \matrix{1_U \\ f_{AY} g_{YU}} $$

which, comparing domains and codomains, likely corresponds to the actual equation:

$$\summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{\projection{U}{A}^1, f_{AY} \circ \projection{U}{A}^2 } = \summap{1_U \\ g_{YU} \circ f_{AY} } \circ \alpha_{UA}$$

since $\alpha_{UA}$ is the most straightforward map $U \times A \to U+A$ to think of (the $\alpha$'s are introduced specifically for the purpose of mapping from the products to the coproducts, opposite the "usual order"). We must first address the problem of $\productmap{\projection{U}{A}^1, f_{AY} \circ \projection{U}{A}^2 }$ "not looking like a matrix". Specifically:

$$\begin{array}{rcl} \productmap{\projection{U}{A}^1, f_{AY} \circ \projection{U}{A}^2 } & = & \productmap{ \summap{1_U \\ 0_{AU}} \circ \alpha_{UA} , f_{AY} \circ \summap{0_{UA} \\ 1_A} \circ \alpha_{UA} } \\ & = & \productmap{ \summap{1_U \\ 0_{AU}} , \summap{0_{UY} \\ f_{AY}} } \circ \alpha_{UA} \end{array}$$

So our equation above becomes:

$$ \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{ \summap{1_U \\ 0_{AU}} , \summap{0_{UY} \\ f_{AY}} } \circ \alpha_{UA} = \summap{1_U \\ g_{YU} \circ f_{AY} } \circ \alpha_{UA}$$

Since $\alpha_{UA}$ is an isomorphism (so it can be cancelled on the right), it suffices to prove the simpler identity:

$$ \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{ \summap{1_U \\ 0_{AU}} , \summap{0_{UY} \\ f_{AY}} } = \summap{1_U \\ g_{YU} \circ f_{AY} } $$

which we can prove via the universal property of coproducts, i.e. by pre-composing the left-hand side by the relevant injections and showing that we get the correct results. Specifically:

$$\begin{array}{rcl} \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{ \summap{1_U \\ 0_{AU}} , \summap{0_{UY} \\ f_{AY}} } \circ j_{U+A}^1 & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{ \summap{1_U \\ 0_{AU}}\circ j_{U+A}^1 , \summap{0_{UY} \\ f_{AY}} \circ j_{U+A}^1 } \\ & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{1_U, 0_{UY} } \\ & = & \summap{1_U \\ g_{YU}} \circ j_{U+Y}^1 = 1_U \end{array} $$

$$\begin{array}{rcl} \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{ \summap{1_U \\ 0_{AU}} , \summap{0_{UY} \\ f_{AY}} } \circ j_{U+A}^2 & = & \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{ \summap{1_U \\ 0_{AU}}\circ j_{U+A}^2 , \summap{0_{UY} \\ f_{AY}} \circ j_{U+A}^2}\\ & = &\summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{0_{AU}, f_{AY}}\\ &=& \summap{1_U \\ g_{YU}} \circ \alpha_{UY} \circ \productmap{0_{YU}, 1_Y } \circ f_{AY} \\ & = &\summap{1_U \\ g_{YU}} \circ j_{U+Y}^2 \circ f_{AY} = g_{YU} \circ f_{AY} \end{array} $$

So, to recap, what we have shown intuitively in this step is:

$$ \matrix{1_U & 0_{UY} \\ 0_{AU} & f_{AY}} \matrix{1_U \\ g_{YU}} = \matrix{1_U \\ f_{AY} g_{YU}} $$

5. Putting everything together, we have now shown the initial statement:

$$\summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \productmap{f_{AX}, f_{AY}} = \summap{1_U \\ g_{YU} \circ f_{AY} } \circ \alpha_{UA} \circ\productmap{g_{XU} \circ f_{AX}, 1_A} $$

which corresponds in terms of intuition to the matrix equation:

$$\matrix{f_{AX} & f_{AY}} \matrix{g_{XU} \\ g_{YU}} = \matrix{ f_{AX} g_{XU} & 1_A } \matrix{1_U \\ f_{AY} g_{YU}} \,.$$

Now how does one show the following? (Rhetorical question)

$$\begin{array}{rcl}\summap{1_U \\ g_{YU} \circ f_{AY} } \circ \alpha_{UA} \circ\productmap{g_{XU} \circ f_{AX}, 1_A} &=& \summap{ g_{YU} \circ f_{AY} \\ 1_U } \circ \alpha_{AU} \circ\productmap{1_A, g_{XU} \circ f_{AX}}\\ &=:& g_{XU} \circ f_{AX} + g_{YU} \circ f_{AY} \end{array} $$

The key is in describing the relationship between $\alpha_{AU}$ and $\alpha_{UA}$. Let $$I_{AU} := \summap{\productmap{1_A, 0_{AU}} \\ \productmap{0_{UA}, 1_U}} = \alpha_{AU}^{-1} \,, \quad I_{UA} := \summap{\productmap{1_U, 0_{UA}} \\ \productmap{0_{AU}, 1_A} } = \alpha_{UA}^{-1} \,.$$

Now note the following identities:

$$ I_{AU} = \productmap{\projection{U}{A}^2, \projection{U}{A}^1} \circ I_{UA} \circ \summap{j_{U+A}^2 \\ j_{U+A}^1} \,, \quad I_{UA} = \productmap{\projection{A}{U}^2, \projection{A}{U}^1} \circ I_{AU} \circ \summap{j_{A+U}^2 \\ j_{A+U}^1} $$

$$\productmap{\projection{U}{A}^2, \projection{U}{A}^1} \circ \productmap{\projection{A}{U}^2, \projection{A}{U}^1} = \productmap{\projection{A}{U}^1, \projection{A}{U}^2} = id_{A\times U}$$ $$\productmap{\projection{A}{U}^2, \projection{A}{U}^1} \circ \productmap{\projection{U}{A}^2, \projection{U}{A}^1} = \productmap{\projection{U}{A}^1, \projection{U}{A}^2} = id_{U \times A}$$

$$ \summap{j_{A+U}^2 \\ j_{A+U}^1} \circ \summap{j_{U+A}^2 \\ j_{U+A}^1} = \summap{j_{A+U}^1 \\ j_{A+U}^2} = id_{A+U} $$ $$\summap{j_{U+A}^2 \\ j_{U+A}^1} \circ \summap{j_{A+U}^2 \\ j_{A+U}^1}= \summap{j_{U+A}^1 \\ j_{U+A}^2} = id_{U+A} $$

Therefore it follows that:

$$\begin{array}{rcl}\alpha_{AU} &= & I_{AU}^{-1}\\ & = &\left(\productmap{\projection{U}{A}^2, \projection{U}{A}^1} \circ I_{UA} \circ \summap{j_{U+A}^2 \\ j_{U+A}^1} \right)^{-1} \\ &=& \left(\summap{j_{U+A}^2 \\ j_{U+A}^1} \right)^{-1} \circ I_{UA}^{-1} \circ \left(\productmap{\projection{U}{A}^2, \projection{U}{A}^1} \right)^{-1} \\ & = & \summap{j_{A+U}^2 \\ j_{A+U}^1} \circ \alpha_{UA} \circ \productmap{\projection{A}{U}^2, \projection{A}{U}^1} \end{array}$$

Similarly,

$$\alpha_{UA} =\summap{j_{U+A}^2 \\ j_{U+A}^1} \circ \alpha_{AU} \circ \productmap{\projection{U}{A}^2, \projection{U}{A}^1} \,. $$

Substituting this into the above, we get what we wanted to show:

$$\begin{array}{rcl} \summap{1_U \\ g_{YU} \circ f_{AY} } \circ \alpha_{UA} \circ\productmap{g_{XU} \circ f_{AX}, 1_A} & = & \left(\summap{1_U \\ g_{YU} \circ f_{AY} } \circ \summap{j_{U+A}^2 \\ j_{U+A}^1} \right) \circ \alpha_{AU} \circ \left( \productmap{\projection{U}{A}^2, \projection{U}{A}^1} \circ \productmap{g_{XU} \circ f_{AX}, 1_A} \right) \\ & = & \summap{ g_{YU} \circ f_{AY} \\ 1_U } \circ \alpha_{AU} \circ\productmap{1_A, g_{XU} \circ f_{AX}} \\ & =: & g_{XU} \circ f_{AX} + g_{YU} \circ f_{AY} \,. \end{array} $$

This completes the proof that the matrix multiplication formula given for so-called linear categories is valid, which was Exercise 1 of this section in Lawvere's book.

Note: As a bonus, we can apply the result above

$$\summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \langle f_{AX}, f_{AY} \rangle = \summap{1_U \\ g_{YU} \circ f_{AY}} \circ \alpha_{UA} \circ \langle g_{XU} \circ f_{AX}, 1_A \rangle \,, $$

as well as the formulas relating $\alpha_{UA}$ to $\alpha_{AU}$, to prove commutativity of addition of maps, i.e. $h+k = k+h$ for $A \overset{h,k}{\to} B$. We take $U=X=Y=B$. Then one has:

$$\begin{array}{rcl} h+k &:= & \summap{k \\ 1_B} \circ \alpha_{AB} \circ \productmap{1_A, h} \\ & =& \summap{1_B \\ k} \circ \alpha_{BA} \circ \productmap{h, 1_A} \\ &= & \summap{1_B \\ 1_B \circ k } \circ \alpha_{BA} \circ \langle 1_B \circ h, 1_A \rangle \\ & =: & \summap{1_U \\ g_{YU} \circ f_{AY}} \circ \alpha_{UA} \circ \langle g_{XU} \circ f_{AX}, 1_A \rangle \\ & = & \summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \langle f_{AX}, f_{AY} \rangle \\ & = & \summap{1_B \\ 1_B} \circ \alpha_{BB} \circ \langle h, k \rangle\\ & = & \summap{1_B \\ 1_B} \circ \summap{j_{B+B}^2 \\ j_{B+B}^1} \circ \alpha_{BB} \circ \langle \projection{B}{B}^2, \projection{B}{B}^1 \rangle \circ \langle h , k \rangle \\ & = & \summap{1_B \\ 1_B} \circ \alpha_{BB} \circ \langle k, h \rangle \\ & =:& \summap{g_{XU} \\ g_{YU}} \circ \alpha_{XY} \circ \langle f_{AX}, f_{AY} \rangle \\ & = & \summap{1_U \\ g_{YU} \circ f_{AY}} \circ \alpha_{UA} \circ \langle g_{XU} \circ f_{AX}, 1_A \rangle \\ & = & \summap{1_B \\ 1_B \circ h} \circ \alpha_{BA} \circ \langle 1_B \circ k , 1_A \rangle \\ & = & \summap{1_B \\ h} \circ \alpha_{BA} \circ \langle k, 1_A \rangle \\ & = & \summap{h \\ 1_B} \circ \alpha_{AB} \circ \langle 1_A, k \rangle \\ & =:& k + h \,. \end{array}$$
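The swap argument in this chain can be mirrored in the vector-space model: $h + k$ is $\langle h, k \rangle$ followed by the codiagonal (the sum map whose components are both $1_B$), and inserting the swap map of $B \times B$ in the middle exchanges $h$ and $k$ without changing the composite. A sketch with hypothetical maps, again using the row-vector convention:

```python
import numpy as np

# h + k as <h, k> followed by the codiagonal {1_B; 1_B}; the swap of B x B
# inserted in between exchanges h and k but not the composite.
rng = np.random.default_rng(4)
h = rng.integers(-3, 4, size=(2, 3))   # h : A -> B
k = rng.integers(-3, 4, size=(2, 3))   # k : A -> B

n = 3                                  # dim B
swap = np.block([[np.zeros((n, n)), np.eye(n)],
                 [np.eye(n), np.zeros((n, n))]])   # B x B -> B x B
codiag = np.vstack([np.eye(n), np.eye(n)])         # {1_B; 1_B} : B + B -> B

h_plus_k = np.hstack([h, k]) @ codiag              # <h, k> then codiagonal
k_plus_h = np.hstack([h, k]) @ swap @ codiag       # swap inserted: <k, h> then codiagonal
assert np.array_equal(h_plus_k, h + k)
assert np.array_equal(k_plus_h, k + h)
assert np.array_equal(h_plus_k, k_plus_h)
```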