Let $D$ denote the differentiation operator. It is a classic result that $e^{cD}f(a)=f(a+c)$ when $f$ is analytic and the radius of convergence of the Taylor series of $f$ at $x=a$ is greater than $|c|$. More generally, if $T$ is a linear operator on an algebra satisfying the Leibniz rule $T(ab)=T(a)b+aT(b)$ (i.e., a derivation) and $S=e^T$ is well defined (e.g., if $T$ is locally nilpotent, or if the algebra is normed and $T$ is bounded), then $S(ab)=S(a)S(b)$. This implies that if $f$ is analytic, and if everything actually makes sense, then $S(f(a))=f(S(a))$.
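As a quick numerical sanity check of the classic shift identity, here is a sketch that sums the truncated series $\sum_n \frac{c^n}{n!}f^{(n)}(a)$ for $f=\exp$ (chosen because every derivative of $\exp$ is again $\exp$, so no symbolic differentiation is needed); the function name is mine, not standard.

```python
import math

def exp_shift(a, c, terms=60):
    """Truncated (e^{cD} exp)(a): every derivative of exp equals exp,
    so the series collapses to exp(a) * sum(c^n / n!)."""
    return math.exp(a) * sum(c**n / math.factorial(n) for n in range(terms))

a, c = 0.3, 1.2
lhs = exp_shift(a, c)    # (e^{cD} f)(a), truncated series
rhs = math.exp(a + c)    # f(a + c)
assert abs(lhs - rhs) < 1e-9
```

Since $\exp$ is entire, the radius-of-convergence hypothesis holds for every $c$, and the truncation error at 60 terms is astronomically small.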
In particular, if $T=h(x)D$ and $S=e^{h(x)D}$, then the function $g(x)=S(x)$ (the left-hand side is a function of $x$; the right-hand side is the operator applied to the identity function) satisfies $S(f(x))=f(g(x))$, at least when things work out and everything is well defined.
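To make this concrete, take $h(x)=x$, for which the flow is classically $g(x)=e^t x$ (so $e^{t\,xD}f(x)=f(e^t x)$). Since $(x\frac{d}{dx})x^k=k\,x^k$, the operator series can be summed exactly on each monomial; the sketch below (my own helper names, polynomials stored as `{power: coeff}` dicts) checks the prediction on a test polynomial.

```python
import math

def apply_exp_xD(coeffs, t, terms=50):
    """Apply e^{t x d/dx} to a polynomial given as {power: coeff}.
    Since (x d/dx) x^k = k x^k, each monomial is scaled by e^{t k},
    computed here by summing the operator series term by term."""
    out = {}
    for k, c in coeffs.items():
        # sum_n (t k)^n / n!  converges to  e^{t k}
        scale = sum((t * k) ** n / math.factorial(n) for n in range(terms))
        out[k] = c * scale
    return out

# f(x) = 3x^2 + x; prediction: (e^{t xD} f)(x) = f(e^t x)
t, x = 0.4, 1.7
res = apply_exp_xD({2: 3.0, 1: 1.0}, t)
lhs = sum(c * x**k for k, c in res.items())
g = math.exp(t) * x
rhs = 3 * g**2 + g
assert abs(lhs - rhs) < 1e-9
```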
Question: What conditions on $h$ or $f$ make things work out, i.e., turn this formal argument into a rigorous argument?
Some observations. If $T(f)=\lambda f$, then $e^T(f)=e^{\lambda}f$; and if $h(x)f'(x)=\lambda f(x)$, then $f(x)=e^{\lambda \int 1/h(x)\, dx}$. If we set $H(x)=\int 1/h(x)\, dx$, then applying $S$ to the eigenfunction $e^{\lambda H(x)}$ gives $e^{\lambda H(g(x))}=e^{\lambda}e^{\lambda H(x)}$, which implies $H(g(x))=1+H(x)$. If this functional equation has no solution, or if the solution is not analytic, that would show that $e^{hD}$ is not a well-defined operator; but short of that, I don't see a good approach to the problem. Trying to get an explicit formula just seems too messy.
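For the $h(x)=x$ example above, the functional equation can be checked directly: $H(x)=\int dx/x=\log x$ and $g(x)=e\cdot x$ (the time-1 flow), so $H(g(x))=\log x+1$. A minimal check, assuming that choice of $h$:

```python
import math

# For h(x) = x: H(x) = ∫ dx/x = log x, and the time-1 flow is g(x) = e·x
# (from the h(x) = x example, e^{xD} f(x) = f(e x)).
def H(x):
    return math.log(x)

def g(x):
    return math.e * x

# Verify the functional equation H(g(x)) = 1 + H(x) at a few points x > 0.
for x in (0.5, 1.0, 3.7):
    assert abs(H(g(x)) - (1 + H(x))) < 1e-12
```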
Thanks to @BrevanEllefsen for pointing me to an idea from a paper of Lie. We can locally perform a change of variables so that, with respect to the new variable, $h(x)\,d/dx$ is just differentiation with respect to the new variable. There are a number of small technical issues to work out, but outside of these issues it reduces the problem to the case of $h$ constant.
Define $H(x)=\int 1/h(x)\, dx$. This can be defined away from the roots of $h(x)$, and by the inverse function theorem it will be locally invertible with smooth inverse away from the poles of $h(x)$. Let $F(x)=f(H^{-1}(x))$, so that $f(x)=F(H(x))$. By the chain rule,
$$h(x)f'(x)=h(x)F'(H(x))H'(x)=F'(H(x))=\frac{dF}{dH},$$ and so $h(x)\frac{d}{dx}=\frac{d}{dH}$. Fix $a$ and set $b=H(a)$. We will have a Taylor series for $F(H)$ in some neighborhood of $b$, and so we can view $e^{\frac{d}{dH}}$ as an operator acting on germs of functions at $H=b$. $(e^{t\frac{d}{dH}}F)(H(a))$ becomes
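The chain-rule identity $h(x)f'(x)=\frac{dF}{dH}\big|_{H(x)}$ can be checked numerically for a concrete choice. Below I take $h(x)=x^2$, so $H(x)=-1/x$ and $H^{-1}(u)=-1/u$ (constants of integration dropped), and $f=\sin$; the finite-difference helper is illustrative, not part of the argument.

```python
import math

h = lambda x: x * x
H = lambda x: -1.0 / x          # ∫ dx/x^2 = -1/x (constant dropped)
Hinv = lambda u: -1.0 / u
f = lambda x: math.sin(x)
F = lambda u: f(Hinv(u))        # F = f ∘ H^{-1}, so f = F ∘ H

def dF(u, eps=1e-6):
    # central finite difference approximating F'(u)
    return (F(u + eps) - F(u - eps)) / (2 * eps)

x = 1.3
lhs = h(x) * math.cos(x)        # h(x) f'(x), with f'(x) = cos(x)
rhs = dF(H(x))                  # dF/dH evaluated at H(x)
assert abs(lhs - rhs) < 1e-6
```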
$$ \sum_{n=0}^{\infty} \frac{F^{(n)}(H(a))}{n!}t^n=\sum_{n=0}^{\infty} \frac{F^{(n)}(b)}{n!}t^n .$$
The sum is convergent if $|t|$ is less than the radius of convergence of $F$ at $b$, which is the distance from $b$ to the nearest singularity of $F$. If the series does converge, then we get
$$(e^{th(x)\frac{d}{dx}}f)(x)=(e^{t\frac{d}{dH}}F)(H(x))=F(H(x)+t)$$
and if $H^{-1}$ is defined on a large enough region, we can rewrite things in terms of $f$, namely $$F(H(x)+t)=f(H^{-1}(H(x)+t)).$$ Without these two conditions, either the power series defining the operator will not converge when applied to $F$ or we will not be able to write our result back in terms of $f$. I am certain more could be said on when these conditions hold, but I am at least satisfied that the problem can be answered for specific $h$ and $f$.
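To close the loop, the final formula $(e^{th D}f)(x)=f(H^{-1}(H(x)+t))$ can be tested for $h(x)=x^2$: there $H(x)=-1/x$, so $H^{-1}(H(x)+t)=x/(1-tx)$. Since $(x^2\frac{d}{dx})^n x = n!\,x^{n+1}$, the operator series on $f(x)=x$ is exactly computable with polynomial coefficient lists; a sketch (my own helper names) under the assumption $|tx|<1$:

```python
import math

def x2D(coeffs):
    """Apply x^2 · d/dx to a polynomial {power: coeff}: c x^k -> k c x^{k+1}."""
    return {k + 1: k * c for k, c in coeffs.items() if k != 0}

def exp_tx2D(coeffs, t, terms=40):
    """Truncated series sum_n t^n (x^2 D)^n / n! applied to the polynomial."""
    out = dict(coeffs)
    cur = dict(coeffs)
    for n in range(1, terms):
        cur = x2D(cur)
        for k, c in cur.items():
            out[k] = out.get(k, 0.0) + c * t**n / math.factorial(n)
    return out

# h(x) = x^2: H(x) = -1/x, H^{-1}(u) = -1/u, so the flow should be
# g_t(x) = H^{-1}(H(x) + t) = x / (1 - t x).  Test on f(x) = x with |t x| < 1.
t, x = 0.3, 0.8
res = exp_tx2D({1: 1.0}, t)
lhs = sum(c * x**k for k, c in res.items())
rhs = x / (1 - t * x)   # f(H^{-1}(H(x) + t)) with f the identity
assert abs(lhs - rhs) < 1e-9
```

The restriction $|tx|<1$ is exactly the convergence condition discussed above: $F(u)=-1/u$ has a singularity at $u=0$, at distance $1/x$ from $H(x)=-1/x$.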