This was already asked by someone else here:
They answered their own question there, but I'm not satisfied with their answer to part (a). I'll reproduce the question from that link here:
Consider $\mathbb{P}^2_k$, and fix a degree $d$. For each multi-index $\alpha = (a_0,a_1,a_2)$ with $a_0+a_1+a_2 = d$ let $x_\alpha$ be an indeterminate. Set $R = k[\{x_\alpha\}]$, and let $S$ be the $R$-algebra: $$ S = \frac{R[y_0,y_1,y_2]}{\left(\sum_\alpha x_\alpha y_0^{a_0}y_1^{a_1}y_2^{a_2}\right)}.$$ Show that:
$(a)$ if we invert any $x_\alpha$, then the family becomes flat; that is, show that the $R[x_\alpha^{-1}]$-algebra $S[x_\alpha^{-1}]$ is flat by showing that it is a free module (it is not finitely generated).
$(b)$ $S$ is an integral domain, and moreover it contains $R$, so that it is torsion-free as an $R$-module.
$(c)$ $S$ is actually not flat over $R$, by proving and using the following two facts:
If $S$ is a flat module over a ring $R$ and $R \to T$ is any map of rings, then $S\otimes_R T$ is flat over $T$.
There is a map of rings $R = k[\{x_\alpha\}] \to k[t] = T$, such that $$T \otimes_R S = \frac{k[t,y_0,y_1,y_2]}{(ty_0^d)}$$ is not a flat $T$-algebra.
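(Side remark, not part of the exercise as quoted: the point of the second fact is that in this quotient the class of $y_0^d$ is nonzero but killed by $t$,
$$ t\cdot\overline{y_0^{\,d}}=0 \quad\text{in}\quad \frac{k[t,y_0,y_1,y_2]}{(ty_0^{d})}, \qquad \overline{y_0^{\,d}}\neq 0, $$
so $T\otimes_R S$ has $t$-torsion, while a flat module over the PID $k[t]$ must be torsion-free.)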
First, the question in Eisenbud asks to show that it is a free module over $R$. Next, in the linked answer they say that the basis is given by the monomials with $y^\alpha$ removed. But the relation $y^\alpha = -\sum_{\mu \neq \alpha} (x_\mu/x_\alpha)\, y^\mu$ isn't a relation over $R$ but over $R[x_\alpha^{-1}]$. I don't see how this should give a basis as intended.
Can you help me with this? How do I show freeness over $R$?
EDIT: Actually, I would like some help with part (b) as well, to show irreducibility of the polynomial.
Regarding part (a): Claim: the monomials $\{y^{\beta}\}$, where $\beta$ varies over all multi-indices with $y^{\alpha}\nmid y^{\beta}$ (not just $\beta\neq\alpha$), form a basis for $S[x_{\alpha}^{-1}]$ over $R[x_{\alpha}^{-1}]$. Here is the linear independence part.
If there is a relation $$\sum_{\beta} \dfrac{f_{\beta}}{x_{\alpha}^n}y^{\beta}=0, \qquad f_{\beta}\in R,$$ with $\beta$ ranging over the multi-indices with $y^{\alpha}\nmid y^{\beta}$ (only finitely many $f_{\beta}$ nonzero), then after clearing denominators we get $\sum_{\beta} f_{\beta}y^{\beta}=0$ in $S$. Hence $\sum_{\beta} f_{\beta}y^{\beta}=h\cdot \sum_{\mu} x_{\mu}y^{\mu}$ in $R[y_0, y_1, y_2]$ for some $h\in R[y_0,y_1,y_2]$.
Comparing the degree-$d$ components (with respect to the $y_i$'s) we get $$\sum_{\beta\neq \alpha,\ |\beta|=d} f_{\beta}y^{\beta}=h_0\cdot \sum_{\mu} x_{\mu}y^{\mu},$$ where $h_0\in R$ is the degree-$0$ part of $h$. (In degree $d$ the conditions $y^{\alpha}\nmid y^{\beta}$ and $\beta\neq\alpha$ agree.)
Now comparing the coefficients of $y^{\alpha}$ on both sides, the LHS gives $0$ and the RHS gives $h_0x_{\alpha}$, hence $h_0=0$; comparing the remaining coefficients then gives $f_{\beta}=0$ for all $|\beta|=d$.
For the higher components of $h$ we argue similarly: in degree $d+m$ we have $\sum_{|\beta|=d+m,\ y^{\alpha}\nmid y^{\beta}} f_{\beta}y^{\beta}=h_m\cdot\sum_{\mu}x_{\mu}y^{\mu}$, with $h_m$ the degree-$m$ part of $h$. Comparing the coefficients of the monomials $y^{\gamma+\alpha}$ with $|\gamma|=m$ (these are divisible by $y^{\alpha}$, so they do not occur on the LHS) and looking at the highest power of $x_{\alpha}$ that appears, we get $h_m=0$, and hence $f_{\beta}=0$ in each degree.
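As a sanity check on the claim (my own illustration, not part of the argument above), take $d=1$ and $\alpha=(1,0,0)$. Then the defining relation can be solved for $y_0$:
$$ y_0=-\frac{x_{(0,1,0)}}{x_{(1,0,0)}}\,y_1-\frac{x_{(0,0,1)}}{x_{(1,0,0)}}\,y_2, $$
so $S[x_{\alpha}^{-1}]\cong R[x_{\alpha}^{-1}][y_1,y_2]$, which is visibly free with basis the monomials in $y_1,y_2$, i.e. exactly the monomials not divisible by $y^{\alpha}=y_0$.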
Regarding part (b): Let us write $p(y)=\sum_{\alpha}x_{\alpha}y^{\alpha}$. Assume $$p(y)=f(y)g(y)$$ for some polynomials $f$ and $g$ in $R[y_0, y_1, y_2]$.
We can assume that $f(y)$ and $g(y)$ are homogeneous polynomials (in the $y_i$'s; factors of a homogeneous polynomial are homogeneous) of degrees $r$ and $s$ respectively, so that $r+s=d$.
Writing $B=k[y_0, y_1, y_2]$, we can think of $R[y_0, y_1, y_2]$ as $B[\{x_{\alpha}\}]$.
Now view the equation $p(y)=f(y)g(y)$ in $B[\{x_{\alpha}\}]$, and note that $p(y)$ is a polynomial of degree $1$ over $B$ in the $x_{\alpha}$'s.
So WLOG $\deg f(y)=1$ and $\deg g(y)=0$, where the degrees are taken with respect to the $x_{\alpha}$'s.
Since $\deg g(y)=0$ in the $x_{\alpha}$'s, $g(y)\in B=k[y_0,y_1,y_2]$ is homogeneous of degree $s$ in the $y_i$'s,
and $f(y) = \sum_{\nu} x_{\nu}h_{\nu}(y)$, where $|\nu|=d$ and each $h_{\nu}(y)\in k[y_0, y_1, y_2]$ is homogeneous of degree $r$.
Comparing the coefficients of $x_{\nu}$ on both sides of $p(y)=f(y)g(y)$ gives $$y^{\nu}=h_{\nu}(y)\,g(y)\qquad\text{for every }\nu\text{ with }|\nu|=d.$$
So $g(y)$ divides every monomial $y^{\nu}$ of degree $d$; since a divisor of a monomial in $k[y_0,y_1,y_2]$ is a scalar times a monomial, we may write $g(y)=c\,y^{\beta}$ with $c\in k^{*}$ and $|\beta|=s$.
If $y^{\beta}=y_0^{b_0}y_1^{b_1}y_2^{b_2}$, we claim that $\beta=(b_0, b_1, b_2)=(0, 0, 0)$.
Indeed, taking $\nu=(0,d,0)$ above shows $y^{\beta}\mid y_1^{d}$, so $b_0=b_2=0$; taking $\nu=(d,0,0)$ shows $y^{\beta}\mid y_0^{d}$, so $b_1=0$ as well. Hence $g$ is a unit, the factorization is trivial, and $p(y)$ is irreducible in $R[y_0,y_1,y_2]$. Since $R[y_0,y_1,y_2]$ is a UFD, $(p)$ is prime and $S$ is an integral domain.
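For concreteness (just an illustration of the last step, taking $d=2$): here
$$ p(y)=x_{(2,0,0)}y_0^2+x_{(1,1,0)}y_0y_1+x_{(1,0,1)}y_0y_2+x_{(0,2,0)}y_1^2+x_{(0,1,1)}y_1y_2+x_{(0,0,2)}y_2^2, $$
and if $p=f\cdot y^{\beta}$, then comparing the coefficients of $x_{(2,0,0)}$ and $x_{(0,2,0)}$ forces $y^{\beta}\mid y_0^2$ and $y^{\beta}\mid y_1^2$, which already gives $\beta=(0,0,0)$.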