Proof of associativity of polynomials product (infinite variables)


The product of polynomials in $R[X_i]_{i\in I}$, where $I$ is not necessarily finite and $R$ is a commutative ring, is associative, but I can't find any detailed proof of this fact: it is either left as an exercise, or declared "obvious".

The question is essentially the same (I guess...) as what Bourbaki asserts to be obvious in Algebra A.III.27 §10, the associativity of the bilinear product in the total algebra...

So either it really is obvious and I am just missing a few words of explanation ;), or some manipulation of sums and indices is required to be sure that it is obvious...

Can anyone give me details, or a reference to a book or notes where this is worked out carefully, please?

To be more precise, I want to prove that the product in $R^{(\mathbb{N}^{(I)})}$, that is $((a_\alpha)_{\alpha\in \mathbb{N}^{(I)}}, (b_\beta)_{\beta\in \mathbb{N}^{(I)}})\mapsto \left( \sum_{\alpha + \beta = \gamma} a_\alpha b_\beta\right)_{\gamma\in \mathbb{N}^{(I)}}$, is associative, and directly, not via some isomorphism.
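This convolution formula can be sketched concretely. Below is a minimal Python model (all names are illustrative, not from any standard library): a multi-index in $\mathbb{N}^{(I)}$ is a sorted tuple of (variable, exponent) pairs with nonzero exponents, and an element of $R^{(\mathbb{N}^{(I)})}$ is a dict from multi-indices to nonzero coefficients.

```python
from collections import defaultdict

def add_indices(alpha, beta):
    """Componentwise sum of two multi-indices in N^(I)."""
    total = defaultdict(int)
    for v, e in alpha + beta:
        total[v] += e
    return tuple(sorted(total.items()))

def product(a, b):
    """The coefficient of gamma in a*b is the sum of a_alpha*b_beta over alpha+beta=gamma."""
    res = defaultdict(int)
    for alpha, ca in a.items():
        for beta, cb in b.items():
            res[add_indices(alpha, beta)] += ca * cb
    return {g: v for g, v in res.items() if v != 0}

# Example: (1 + X_0)*(1 + X_0) = 1 + 2*X_0 + X_0^2
p = {(): 1, (("X0", 1),): 1}
assert product(p, p) == {(): 1, (("X0", 1),): 2, (("X0", 2),): 1}
```

Only finitely many keys ever appear, which mirrors the finite-support condition on $\mathbb{N}^{(I)}$ and on the coefficient family.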


I assume you're using the definition that $R[X_i]_{i\in I}=\bigcup_{F}R[X_i]_{i\in F}$ where the union is taken over all finite subsets $F \subset I$.

Let $f_1,f_2,f_3\in R[X_i]_{i\in I}$. Then there are finite subsets $F_1,F_2,F_3$ of $I$ such that $f_k\in R[X_i]_{i\in F_k}$ for $k=1,2,3$. Setting $F=F_1\cup F_2 \cup F_3$, the set $F$ is a finite subset of $I$, and the product $(f_1f_2)f_3$ in the whole ring $R[X_i]_{i\in I}$ is simply defined to be the product of the polynomials in the polynomial ring $R[X_i]_{i\in F}$ in finitely many variables. I'm sure you already know that this product is associative.
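The finiteness step of this reduction can be illustrated in code. Assuming the same hypothetical dict encoding as above (multi-indices as sorted tuples of (variable, exponent) pairs), the variables occurring in any finite family of polynomials form a finite set $F$, even when $I$ is infinite:

```python
def variables(f):
    """The finite set of indices i such that some X_i actually occurs in f."""
    return {v for alpha in f for (v, e) in alpha}

f1 = {(("X0", 1),): 2}                 # 2*X_0
f2 = {(("X1", 1), ("X2", 3)): 1}       # X_1 * X_2^3
f3 = {(): 5, (("X0", 2),): 1}          # 5 + X_0^2

# F = F_1 ∪ F_2 ∪ F_3 is finite, so all products take place in R[X_i]_{i in F}.
F = variables(f1) | variables(f2) | variables(f3)
assert F == {"X0", "X1", "X2"}
```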


Here is my "solution". Please correct and comment! ;)

Let $a=(a_\alpha)_{\alpha\in \mathbb{N}^{(I)}}$, $b=(b_\beta)_{\beta\in \mathbb{N}^{(I)}}$ and $c=(c_\gamma)_{\gamma\in \mathbb{N}^{(I)}}$ be three elements of $R^{(\mathbb{N}^{(I)})}$. We prove the equality $a(bc)=(ab)c$ by showing that each side is equal to the element $d=(d_\delta)_{\delta\in \mathbb{N}^{(I)}}$ where $d_\delta=\sum\limits_{\alpha+\beta+\gamma=\delta} a_\alpha b_\beta c_\gamma$ for all $\delta\in \mathbb{N}^{(I)}$.

Computing with the formula defining the product, one easily finds that $(ab)c = (e_\epsilon)_{\epsilon\in \mathbb{N}^{(I)}}$ with $e_\epsilon = \sum\limits_{\rho+\gamma=\epsilon}\left(\sum\limits_{\alpha+\beta=\rho}a_\alpha b_\beta\right) c_\gamma$ for all $\epsilon\in\mathbb{N}^{(I)}$, and $a(bc) = (f_\eta)_{\eta\in \mathbb{N}^{(I)}}$ with $f_\eta = \sum\limits_{\alpha+\tau=\eta}a_\alpha\left(\sum\limits_{\beta+\gamma=\tau}b_\beta c_\gamma\right) $ for all $\eta\in\mathbb{N}^{(I)}$. What matters here is to see clearly over which elements each sum is taken.

In each case, it is the inner sums that drive everything. Let $A=\text{Supp}(a)$, $B=\text{Supp}(b)$ and $C=\text{Supp}(c)$. Let us begin by counting the distinct elements $\rho$ of $\mathbb{N}^{(I)}$ that can be written as a sum $\alpha+\beta$ with $(\alpha,\beta)\in A\times B$; their number $m$ satisfies $1\le m\le \text{Card}(A)\times \text{Card}(B)$ (when $A$ and $B$ are nonempty). Then, for each of these finitely many $\rho$, each sum $\rho+\gamma$ with $\gamma\in C$ gives an $\epsilon$, and it is among these $\epsilon$ that the support of the product lies (there are at most $m\times \text{Card}(C)$ of them, since distinct pairs $(\rho,\gamma)$ may give the same sum, and coefficients may cancel).

Now consider, for $\delta$ in $\mathbb{N}^{(I)}$, the set $E_\delta=\{(\alpha,\beta,\gamma)\in A\times B\times C \,/\, \alpha+\beta+\gamma =\delta\}$. Since $A\times B\times C$ is finite and $\delta\in \text{Supp}(d)$ implies $E_\delta\neq \emptyset$, we have $E_\delta=\emptyset$ for all but finitely many $\delta$. Let $\delta$ be such that $E_\delta \neq \emptyset$. The idea is to partition $E_\delta$ in a product-like form, so as to recover both sums appearing in $(ab)c$ (the same method applies to $a(bc)$).

Let $E'=\{\gamma\in C \,/\, \exists \lambda\in \mathbb{N}^{(I)},\ \gamma + \lambda = \delta\}$, and for $\gamma\in E'$, let $E'_\gamma= \{(\alpha,\beta)\in A\times B \,/\, (\alpha,\beta,\gamma)\in E_\delta\}$. If $\gamma_1$ and $\gamma_2$ are two distinct elements of $E'$, then $E'_{\gamma_1}\cap E'_{\gamma_2}=\emptyset$. Indeed, if $(\alpha,\beta)\in E'_{\gamma_1}$, then $\alpha+\beta+\gamma_1= \delta$, so $\alpha+\beta=\delta-\gamma_1\neq \delta-\gamma_2$, and hence $\alpha+\beta +\gamma_2\neq \delta$, which means $(\alpha,\beta)\notin E'_{\gamma_2}$. Moreover, if $\gamma\in E'$, there exists a unique $\lambda\in \mathbb{N}^{(I)}$ such that $\gamma+\lambda = \delta$, and then $\lambda= \alpha+\beta$ holds for every $(\alpha,\beta) \in E'_\gamma$.

One can thus write $E_\delta$ as $\bigcup\limits_{\gamma\in E'} F_\gamma$ where $F_\gamma=\{(\alpha, \beta,\gamma) \,/\, (\alpha,\beta)\in E'_\gamma\}$, and this union is disjoint since the $E'_\gamma$ are. Then we have (cf. explanations below) \begin{align*} \sum\limits_{\alpha+\beta+\gamma=\delta}a_\alpha b_\beta c_\gamma = \sum\limits_{(\alpha,\beta,\gamma)\in E_\delta} a_\alpha b_\beta c_\gamma & = \sum\limits_{\gamma\in E'} \sum\limits_{(\alpha,\beta,\gamma)\in F_\gamma} a_\alpha b_\beta c_\gamma \tag{1}\\ & = \sum\limits_{\gamma\in E'} \left(\sum\limits_{(\alpha,\beta)\in E'_\gamma} a_\alpha b_\beta \right)c_\gamma \tag{2}\\ & = \sum\limits_{\gamma+\lambda = \delta} \left(\sum\limits_{\alpha+\beta=\lambda} a_\alpha b_\beta \right) c_\gamma. \tag{3} \end{align*}

(1) follows from the fact that the union over $E'$ is disjoint. In (2), with $\gamma$ fixed, we factor $c_\gamma$ out of the inner sum, which runs over all $(\alpha,\beta)$ in $E'_\gamma$; this produces the parenthesis. In (3), one uses the initial remark about how the inner sum drives the outer one: the inner and outer sums are linked through the sum $\alpha+\beta$, which is called $\lambda$, as in the definition of $E'$.
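The partition argument can also be checked by brute force on a small example: enumerate the fibers $E_\delta$ directly to compute $d$, and compare with both iterated products. This is only a sketch; the dict encoding and the helper names `add_indices` and `product` are assumptions of this illustration, not part of the proof above.

```python
from collections import defaultdict
from itertools import product as cartesian

def add_indices(*indices):
    """Componentwise sum of multi-indices (sorted tuples of (variable, exponent) pairs)."""
    total = defaultdict(int)
    for idx in indices:
        for v, e in idx:
            total[v] += e
    return tuple(sorted(total.items()))

def product(a, b):
    """Convolution product: the coefficient of gamma is the sum over alpha+beta=gamma."""
    res = defaultdict(int)
    for alpha, ca in a.items():
        for beta, cb in b.items():
            res[add_indices(alpha, beta)] += ca * cb
    return {g: v for g, v in res.items() if v != 0}

a = {(("X0", 1),): 1, (("X1", 1),): 2}   # X_0 + 2*X_1
b = {(): 3, (("X0", 2),): 1}             # 3 + X_0^2
c = {(("X1", 1),): 1, (("X2", 1),): -1}  # X_1 - X_2

# d_delta = sum over alpha+beta+gamma = delta of a_alpha*b_beta*c_gamma,
# computed by enumerating A x B x C, i.e. the union of the fibers E_delta.
d = defaultdict(int)
for alpha, beta, gamma in cartesian(a, b, c):
    d[add_indices(alpha, beta, gamma)] += a[alpha] * b[beta] * c[gamma]
d = {k: v for k, v in d.items() if v != 0}

assert product(product(a, b), c) == d == product(a, product(b, c))
```

The single loop over $A\times B\times C$ plays the role of the sum over $E_\delta$, fibered over $\delta$ by the dict keys; the reindexing steps (1)–(3) are exactly what makes the final assertion hold.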

We have established the equality between $(ab)c$ and $d$; one shows the equality between $a(bc)$ and $d$ in the same way.