There is a nice $\exp$-like function, which can be defined on $\mathbb{R}$ by its Taylor series or by an integral:
$$p(x)=\sum_{k=0}^\infty \frac{x^k}{(k+1)^{k+1}}=\int_0^1 u^{-u x}du$$
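As a quick numerical sanity check of the two representations (a sketch in Python; note the integrand $u^{-ux}=e^{-x\,u\ln u}$ extends continuously to $u=0$ with value $1$):

```python
import math

def p_series(x, terms=80):
    # p(x) = sum_{k>=0} x^k / (k+1)^(k+1)
    return sum(x**k / (k + 1)**(k + 1) for k in range(terms))

def p_integral(x, n=20000):
    # Composite Simpson rule for \int_0^1 exp(-x u ln u) du; n must be even.
    def f(u):
        return 1.0 if u == 0.0 else math.exp(-x * u * math.log(u))
    h = 1.0 / n
    s = f(0.0) + f(1.0)
    for i in range(1, n):
        s += (4.0 if i % 2 else 2.0) * f(i * h)
    return s * h / 3.0
```

Both evaluations agree to well below $10^{-6}$ for moderate $x$; at $x=1$ this is the classical "sophomore's dream" integral $\int_0^1 u^{-u}\,du$.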
It occurred to me to define $\cos$-like and $\sin$-like functions in the same way that $\cos$ and $\sin$ are obtained from the exponential:
$$pc(x)=\frac{1}{2}(p(ix)+p(-ix))=\sum_{k=0}^\infty \frac{(-1)^k x^{2k}}{(2k+1)^{2k+1}}=\int_0^1 \cos (x ~u \ln u)du$$
$$ps(x)=\frac{1}{2i}(p(ix)-p(-ix))=\sum_{k=0}^\infty \frac{(-1)^k x^{2k+1}}{(2k+2)^{2k+2}}=-\int_0^1 \sin (x~ u \ln u)du$$
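The series and integral forms of $pc$ and $ps$ can likewise be cross-checked against each other (a sketch; Simpson's rule in double precision, adequate for moderate $x$; each series term is written as a power of a ratio so large powers underflow harmlessly rather than overflow):

```python
import math

def simpson(f, n=20000):
    # Composite Simpson rule on [0, 1]; n must be even.
    h = 1.0 / n
    s = f(0.0) + f(1.0)
    for i in range(1, n):
        s += (4.0 if i % 2 else 2.0) * f(i * h)
    return s * h / 3.0

def pc_series(x, terms=60):
    # pc(x) = sum_{k>=0} (-1)^k x^(2k) / (2k+1)^(2k+1),
    # each term rewritten as (x/(2k+1))^(2k+1) / x for x != 0.
    if x == 0.0:
        return 1.0
    return sum((-1)**k * (x / (2*k + 1))**(2*k + 1) / x for k in range(terms))

def ps_series(x, terms=60):
    # ps(x) = sum_{k>=0} (-1)^k x^(2k+1) / (2k+2)^(2k+2),
    # each term rewritten as (x/(2k+2))^(2k+2) / x for x != 0.
    if x == 0.0:
        return 0.0
    return sum((-1)**k * (x / (2*k + 2))**(2*k + 2) / x for k in range(terms))

def pc_integral(x):
    # pc(x) = \int_0^1 cos(x u ln u) du; the integrand -> 1 as u -> 0.
    return simpson(lambda u: math.cos(x * u * math.log(u)) if u > 0.0 else 1.0)

def ps_integral(x):
    # ps(x) = -\int_0^1 sin(x u ln u) du; the integrand -> 0 as u -> 0.
    return simpson(lambda u: -math.sin(x * u * math.log(u)) if u > 0.0 else 0.0)
```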
I'm not sure how to use either of these definitions to find the zeroes (or the maxima/minima, etc) of $pc(x)$ and $ps(x)$.
I suspect that, as with Bessel functions, there are no closed-form expressions for these zeroes. But how can they be found numerically in an efficient way, without trial and error and without evaluating the series or the integral many times?


Note this isn't intended to be a complete answer, but rather a record of the progress I've made so far. Hopefully others can build on this and improve the estimation method.
To calculate the roots of these functions efficiently, Newton's method seems advisable. However, this method requires an estimate of the location of each root to use as a seed value: if the estimate is not sufficiently close to the desired root, the iteration will converge to some other zero of the function.
Informally, an integral is a sort of continuous summation. Using this as a heuristic, I decided to think of the integral $-\int_0^1 \sin(x\, u\ln(u))\,du$ as a superposition of sine waves, the component at $u$ having frequency $|u\ln(u)|$. The maximum of these frequencies is $1/e$, attained at $u=1/e$. From this I concluded that consecutive even roots, which approximately correspond to one period of a sinusoid, shouldn't be closer together than $2\pi e$. Since the first even root occurs at $x=0$, I estimated that the next three even roots should occur near $\lbrace 17.1,\,34.2,\,51.2 \rbrace$.
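The seeding-plus-Newton procedure can be sketched as follows (my own Python with hypothetical helper names; the series is summed in double precision, which suffices for the first several roots, though cancellation among the large alternating terms erodes accuracy as $x$ grows):

```python
import math

def ps(x, terms=80):
    # ps(x) = sum_{k>=0} (-1)^k x^(2k+1) / (2k+2)^(2k+2); each term is
    # written as (x/(2k+2))^(2k+2) / x so large powers underflow to 0
    # instead of overflowing.
    if x == 0.0:
        return 0.0
    return sum((-1)**k * (x / (2*k + 2))**(2*k + 2) / x for k in range(terms))

def dps(x, terms=80):
    # Term-by-term derivative of the series for ps.
    if x == 0.0:
        return 0.25  # the k = 0 term, 1/2^2
    return sum((-1)**k * (2*k + 1) * (x / (2*k + 2))**(2*k + 2) / x**2
               for k in range(terms))

def even_root(n, max_iter=25):
    # Seed the n-th nonzero even root of ps with n * 2*pi*e, then refine
    # with Newton's method.
    x = n * 2.0 * math.pi * math.e
    for _ in range(max_iter):
        step = ps(x) / dps(x)
        x -= step
        if abs(step) < 1e-12 * abs(x):
            break
    return x
```

For example, `even_root(1)` reproduces the $18.7055\ldots$ value found below, in a handful of iterations.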
Using this method I obtained the following results for the first six nonzero even roots. In each case except the last, I stopped the iterations once $ps(r)$ was on the order of $10^{-15}$. I haven't yet tested the method thoroughly for higher roots, which is a possible point of failure. The roots below are for $ps(x)$ in particular.
| Root | Guess | Final result | Iterations |
|------|-------|--------------|------------|
| $r_2$ | $17.07946844$ | $18.70552649220349$ | $4$ |
| $r_4$ | $34.15893689$ | $35.93348821178866$ | $4$ |
| $r_6$ | $51.23840533$ | $53.08048121487314$ | $4$ |
| $r_8$ | $68.31787378$ | $70.20033529768552$ | $4$ |
| $r_{10}$ | $85.39734222$ | $87.3073325217411$ | $4$ |
| $r_{12}$ | $102.4768107$ | $104.407075063793409825332380293$ | $4$ |

For the last root, $ps(r)\approx 10^{-18}$ using $200$ terms in the series.
I discovered an interesting result. $u\ln(u)$ has an extremum at $u=1/e$. Near this point the integral receives its most significant contribution when $|\sin(x/e)|=1$; this occurs when $x=\pi e/2 + n\pi e$. For those fixed values of $x$ the integrand is very flat and boxy near the critical point, but highly oscillatory elsewhere in the interval.
The Taylor series of $\sin(x\,u\ln u)$ (the integrand, up to its overall sign) centered at $u=1/e$ is
$$-(-1)^n+\frac{1}{32}(-1)^n\pi^2e^4(1+2n)^2(u-1/e)^4+\cdots$$
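For reference, this expansion follows from the stationary point of $g(u)=u\ln u$: writing $\delta=u-1/e$ and taking $x=(2n+1)\pi e/2$, we have
$$g(1/e)=-\tfrac{1}{e},\qquad g'(1/e)=\ln(1/e)+1=0,\qquad g''(1/e)=e,$$
so that $x\,g(u)=-\tfrac{(2n+1)\pi}{2}+\tfrac{xe}{2}\delta^2+O(\delta^3)$ and
$$\sin(x\,g(u))=-(-1)^n\cos\!\Big(\tfrac{xe}{2}\delta^2+\cdots\Big)=-(-1)^n+(-1)^n\,\frac{x^2e^2}{8}\,\delta^4+\cdots,$$
where $\tfrac{x^2e^2}{8}=\tfrac{1}{32}\pi^2e^4(1+2n)^2$.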
To estimate the integral for large odd $n$ we can perform a sort of saddle-point approximation, treating the integrand as
$$ f(u)=\exp\left(-\frac{1}{32}\pi^2e^4(1+2n)^2(u-1/e)^4\right) + \text{(highly oscillatory terms)}$$
Clearly a change of variables could be performed in the integral to pull out the $n$-dependence; this would produce a prefactor on the order of $1/\sqrt{n}$, which explains why the peaks seem to fall off roughly as $1/\sqrt{x}$.
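Concretely, with $b=\tfrac{1}{32}\pi^2e^4(1+2n)^2$ and the integration range extended to the whole line,
$$\int_{-\infty}^{\infty} e^{-b\,\delta^4}\,d\delta=\frac{2\,\Gamma(5/4)}{b^{1/4}}=2\,\Gamma(5/4)\left(\frac{32}{\pi^2e^4}\right)^{1/4}\frac{1}{\sqrt{1+2n}},$$
which makes the $1/\sqrt{n}$ decay explicit.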
I will do some numerical experiments later to see how well this can be used to predict the locations of the extrema.