Using the linear change of coordinates
$$y = h(x) = a + bx,$$
show that $f(x) = \alpha x(1-x)$ on $[0,1]$ is equivalent to $g(y) = \beta - y^2$ on an interval $[c,d]$, for suitably chosen $a, b, c, d$.
In particular, what is the relationship between $\alpha$ and $\beta$ for equivalence?
So it's a fairly simple question: the maps are equivalent (conjugate) when $h(f(x)) = g(h(x))$ for all $x$, so I computed both sides and compared them, which goes like this:
$$h(f(x)) = a + b\alpha x(1-x) = a + b\alpha x - b\alpha x^2$$
$$g(h(x)) = \beta - (a+bx)^2 = \beta - a^2 - 2abx - b^2x^2$$
Hence $a + b\alpha x - b\alpha x^2 = \beta - a^2 - 2abx - b^2x^2$ must hold for all $x$, and equating the coefficients of $1$, $x$ and $x^2$ gives
$$a = \beta - a^2, \qquad b\alpha = -2ab, \qquad b\alpha = b^2,$$
meaning $b = \alpha$, $a = -\frac{\alpha}{2}$ and $\beta = \frac{\alpha}{2}\left(\frac{\alpha}{2} - 1\right)$.
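As a sanity check on the algebra, the conjugacy identity can be verified symbolically. Here is a minimal sketch using sympy (the variable names are my own, not from the problem):

```python
# Symbolic check that h(f(x)) = g(h(x)) with the values found above.
import sympy as sp

x, alpha = sp.symbols('x alpha')

a = -alpha / 2                          # from the x coefficient
b = alpha                               # from the x^2 coefficient
beta = (alpha / 2) * (alpha / 2 - 1)    # from the constant term

h = a + b * x                           # linear change of coordinates
f = alpha * x * (1 - x)                 # logistic map
g = lambda y: beta - y**2               # quadratic map

# The difference h(f(x)) - g(h(x)) should simplify to 0 identically.
diff = sp.simplify(h.subs(x, f) - g(h))
print(diff)  # 0
```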
Is this correct? Sorry, I don't have solutions and this is the first time I'm solving a conjugacy problem. The difficult bit is: on what suitable interval $[c,d]$ are these maps equivalent?
Thanks
What you've done looks correct to me. The interval $[c,d]$ should be the image of the interval $[0,1]$ under the map $h$, so it's the interval $[a,a+b]=[-\alpha/2,\alpha/2]$ (if $\alpha>0$; if $\alpha<0$, then it's $[a+b,a]=[\alpha/2,-\alpha/2]$).
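One can also check this numerically: if the maps are conjugate via $h$, then $h$ should carry orbits of $f$ on $[0,1]$ onto orbits of $g$ on $[-\alpha/2,\alpha/2]$. A quick sketch (the choices $\alpha = 3.7$ and starting point $0.3$ are arbitrary):

```python
# Numerical check: iterate f and g in parallel and confirm that
# h maps the f-orbit onto the g-orbit at every step.
alpha = 3.7
a, b = -alpha / 2, alpha
beta = (alpha / 2) * (alpha / 2 - 1)

f = lambda x: alpha * x * (1 - x)   # on [0, 1]
g = lambda y: beta - y**2           # on [-alpha/2, alpha/2]
h = lambda x: a + b * x             # the conjugacy

x = 0.3        # arbitrary point in [0, 1]
y = h(x)       # its image in [-alpha/2, alpha/2]
for _ in range(20):
    x, y = f(x), g(y)
    assert abs(h(x) - y) < 1e-6     # orbits stay matched under h
print("orbits agree under h")
```

Note that the endpoints confirm the answer above: $h(0) = a = -\alpha/2$ and $h(1) = a + b = \alpha/2$, so $h([0,1]) = [-\alpha/2, \alpha/2]$ when $\alpha > 0$.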