I'm trying to fit my data to the following equation: $$ Y = A(1-2e^{bx}) $$ What I tried to do was transform the equation to a linear form via the following steps: \begin{align*} & A-Y = 2Ae^{bx}\\ & \ln(A-Y) = \ln(2Ae^{bx}) \\ & \ln(A-Y) - \ln(2) = \ln(A) + bx \\ & y = a + bx \end{align*} where $ y = \ln(A-Y) - \ln(2)$ and $ a = \ln(A)$.
Then I did a least-squares fit on my dataset to find both $a$ and $b$. Plugging those values into my original equation, I generated points on the line.
I must be missing a step somewhere, or my math is incorrect.

The model $$y = a\,(1-2e^{bx})$$ is nonlinear with respect to its parameters $a,b$ and nonlinear regression should be used.
The problem is that you need rather good starting estimates and that, as already said in the comments, you cannot linearize the model (the unknown $A$ appears inside the logarithm on the left-hand side). So, how do we get estimates?
First approach
Suppose that you fix $b$ at a given value. Then define $z=1-2e^{bx}$ and the model simplifies to $y=a z$ for which a linear regression (with no intercept) will give you $a$ and the sum of squares $SSQ$; both quantities depend on the value assigned to $b$ and what you look for is an approximate value of $b$ which minimizes $SSQ(b)$.
Try a few values for $b$, plot the result $SSQ(b)$ as a function of $b$ and you will quickly have an approximate solution. Now, you can safely start the nonlinear regression (it will converge in very few iterations).
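The profile-over-$b$ idea can be sketched in a few lines. This is my own Python/NumPy illustration, not part of the original answer; the dataset is regenerated from the example given further down:

```python
import numpy as np

# Synthetic data with the same shape as the answer's example:
# x_i = 2 + i, y_i = 456*(1 - 2*exp(-0.234*x_i)) + 5*(-1)**i
x = np.arange(3, 15, dtype=float)
y = 456.0 * (1.0 - 2.0 * np.exp(-0.234 * x)) + 5.0 * (-1.0) ** np.arange(1, 13)

def ssq_for_b(b, x, y):
    """For fixed b, fit y = a*z (no intercept) with z = 1 - 2*exp(b*x);
    return the best a and the residual sum of squares."""
    z = 1.0 - 2.0 * np.exp(b * x)
    a = np.dot(z, y) / np.dot(z, z)  # least-squares slope through the origin
    r = y - a * z
    return a, np.dot(r, r)

# Scan a coarse grid of b values and keep the minimizer of SSQ(b)
grid = np.linspace(-1.0, -0.01, 200)
best_b = min(grid, key=lambda b: ssq_for_b(b, x, y)[1])
best_a, _ = ssq_for_b(best_b, x, y)
print(best_a, best_b)
```

The grid resolution only needs to be good enough to start the nonlinear solver; here it already lands close to the generating values.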
Second approach
If you do not want to use this approach, take two data points corresponding, say, to $x_1$ and $x_2\approx 2x_1$. Now $$\alpha=\frac{y_2}{y_1}=\frac{a\,(1-2e^{bx_2})}{a\,(1-2e^{bx_1})}=\frac{1-2e^{2bx_1}}{1-2e^{bx_1}}$$ Setting $p=e^{bx_1}$, you then have $$\alpha=\frac{1-2p^2}{1-2p}$$ which is a quadratic equation in $p$, namely $2 p^2-2 \alpha p+(\alpha-1)=0$. Discard any negative root (since $p=e^{bx_1}>0$); if both roots are positive, the shape of the data decides which one to keep (as in the example below). From $p$ and $x_1$ compute $b=\frac{\log(p)}{x_1}$. Now, from $y_1=a\,(1-2e^{bx_1})$, compute $a$.
You have your estimates and you can start the nonlinear regression.
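The two-point recipe is short enough to write out directly. A minimal Python sketch (mine, not the answer's; the data pair is the one chosen in the worked example below):

```python
import math

# Two points with x2 ≈ 2*x1, taken from the worked example
x1, y1 = 5.0, 167.945
x2, y2 = 10.0, 373.149

alpha = y2 / y1
# Solve 2*p**2 - 2*alpha*p + (alpha - 1) = 0 for p = exp(b*x1)
disc = math.sqrt(alpha * alpha - 2.0 * (alpha - 1.0))
roots = [(alpha + disc) / 2.0, (alpha - disc) / 2.0]
# Both roots are positive here; the data are increasing, which requires
# b < 0, i.e. p = exp(b*x1) < 1, so keep the smaller root
p = min(roots)
b = math.log(p) / x1
a = y1 / (1.0 - 2.0 * math.exp(b * x1))
print(b, a)
```

These are only starting values, so small rounding differences (e.g. $a\approx 470.4$ here versus $470.229$ with the rounded $b=-0.227$) do not matter.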
Example for illustration purposes
I generated $12$ data points $x_i=2+i$, ($i=1,2,\cdots,12$), and the corresponding $y$'s according to $$y_i=456(1-2e^{-0.234x_i})+5(-1)^i$$ the last term corresponding to quite large errors.
For using the last described method, I arbitrarily chose among the data points $x_1=5$, $y_1=167.945$, $x_2=10$, $y_2=373.149$. This makes $\alpha=2.22185$ and then the roots for $p$ are $0.321476$ and $1.90037$, for which the corresponding values of $b$ are $-0.2270$ and $0.1284$; the latter value must be discarded since the data points show an increasing trend, which requires $b<0$. So, using $b=-0.227$ and $167.945=a(1-2e^{-0.227 \times 5})$ we get $a=470.229$; as you can see, in spite of the noise, the estimates are quite close to the values used to generate the data.
Starting with these estimates, the nonlinear regression solves in a couple of iterations and the results are $$\begin{array}{clclclclc} \text{} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ a & 457.503 & 3.49725 & \{449.711,465.296\} \\ b & -0.233121 & 0.00221737 & \{-0.238062, -0.228181\} \\ \end{array}$$ and the fit is almost perfect ($R^2=0.999752$).
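Any standard nonlinear least-squares routine will do for this last step. A sketch with SciPy's `curve_fit` (my choice of tool, not the answer's), fed the two-point starting values:

```python
import numpy as np
from scipy.optimize import curve_fit

# The illustration data: x_i = 2 + i, i = 1..12, with alternating +/-5 "noise"
x = np.arange(3, 15, dtype=float)
y = 456.0 * (1.0 - 2.0 * np.exp(-0.234 * x)) + 5.0 * (-1.0) ** np.arange(1, 13)

def model(x, a, b):
    return a * (1.0 - 2.0 * np.exp(b * x))

# Start from the rough two-point estimates a ≈ 470, b ≈ -0.227
popt, pcov = curve_fit(model, x, y, p0=[470.0, -0.227])
a_hat, b_hat = popt
print(a_hat, b_hat)  # close to the table: a ≈ 457.5, b ≈ -0.2331
```

With good starting values the solver converges immediately; with poor ones it can stall or wander off, which is the whole point of the estimation step.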
Edit
If you have, as you said in a comment, a good estimate for parameter $a$, you can get almost immediately a good estimate for parameter $b$. Rewrite the model as $$\log\left(\frac{1}{2}\Big(1-\frac y a\Big)\right)=bx$$ which gives, by least squares through the origin, $$b=\frac{\sum_{i=1}^n x_i \log\left(\frac{1}{2}\big(1-\frac {y_i} a\big)\right)}{\sum_{i=1}^n x_i^2}$$ and start the nonlinear regression with these guesses.
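This one-parameter formula is a single dot product. A Python sketch (mine; for illustration I plug in the generating value $a=456$ from the example above as the "known" $a$):

```python
import numpy as np

# Same illustration data as in the example above
x = np.arange(3, 15, dtype=float)
y = 456.0 * (1.0 - 2.0 * np.exp(-0.234 * x)) + 5.0 * (-1.0) ** np.arange(1, 13)

# Assume a is known; rearranging y = a*(1 - 2*exp(b*x)) gives
# log((1 - y/a)/2) = b*x, a line through the origin in x
a = 456.0
t = np.log((1.0 - y / a) / 2.0)
b = np.dot(x, t) / np.dot(x, x)  # least-squares slope through the origin
print(b)
```

Note that this requires $y_i/a < 1$ for every point so the logarithm is defined, which holds as long as the noise is not too large near the plateau.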