The following photo shows the definition of some sets and Zaremba's conjecture. Am I missing something obvious, or does the conjecture not simply fail, since $d=1$ will never be in the set $\mathfrak{D}_{\{1,\ldots,A\}}$?
There is no option for a numerator $b$ with $d=1$ that satisfies $0<b<1$.

Your quoted text appears to come from *On Zaremba's conjecture* by Jean Bourgain and Alex Kontorovich.
The authors consider continued fraction representations of elements of $(0,1)$, which, I guess, is to allow them to assume the form $$ \frac{1}{a_1+\frac{1}{a_2+\frac{1}{a_3+\frac{1}{\ldots}}}}, $$ that is, a form with integer term $a_0$ equal to $0$ (as they specify just prior to the quoted passage). Neglecting $a_0$ enables them to avoid having to allow a $0$ into their alphabet that will only ever be used as the first partial quotient.
As you noticed, with this setup their statement fails for denominator $1$. The easiest fix is simply to modify the conjecture to state that every integer greater than $1$ occurs as the denominator of an element of $\mathfrak{R}_\mathcal{A}$. An alternative fix would be to allow $b=0$ (which would correspond to the sequence $[a_1,a_2,\ldots,a_k]$ being empty, that is, to $k=0$). Then the conjecture can be left as is.
For more information about this problem, see A141822 in the Online Encyclopedia of Integer Sequences (OEIS). Note that the OEIS sequence starts with denominator $2$; they don't bother worrying about denominator $1$.