Consider two independent linear Brownian motions $B'=(B'_t)_{t\geqslant0}$ and $B''=(B''_t)_{t\geqslant0}$, starting from $B'_0=B''_0=0$, and the process $X=(X_t)_{t\geqslant0}$ defined by $$X_t=\max\{B'_t,B''_t\}$$
What is known about the distribution of the process $X$?
The question is admittedly a little vague, hence we present a few remarks to help narrow it.
1. For each positive $t$, the PDF $f_t$ of $X_t$ is $$f_t(x)=2\varphi_t(x)\Phi_t(x)$$ where $\varphi_t$ and $\Phi_t$ are the centered normal PDF and CDF with variance $t$. Equivalently, $$f_t(x)=\frac2{\sqrt{t}}\varphi\left(\frac{x}{\sqrt{t}}\right)\Phi\left(\frac{x}{\sqrt{t}}\right)$$ where $\varphi$ and $\Phi$ are the standard normal PDF and CDF. In particular, $X$ is not a Brownian motion.
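This formula is easy to sanity-check numerically (a sketch assuming `numpy`/`scipy`; the horizon $t=2$ and the sample size are arbitrary choices). Since $X_t\leqslant x$ exactly when both coordinates are $\leqslant x$, the CDF of $X_t$ is $\Phi(x/\sqrt t)^2$, whose derivative is the $f_t$ above, so a Kolmogorov–Smirnov comparison against that CDF tests the density:

```python
# Monte Carlo check that max(B'_t, B''_t) has CDF Phi(x/sqrt(t))^2,
# i.e. the density f_t above; t = 2.0 and the sample size are arbitrary.
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(0)
t = 2.0
x = np.maximum(rng.normal(0.0, np.sqrt(t), 100_000),
               rng.normal(0.0, np.sqrt(t), 100_000))

stat, _ = kstest(x, lambda u: norm.cdf(u / np.sqrt(t)) ** 2)
print(stat < 0.01)  # small KS distance: the empirical law matches f_t
```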
2. The process $X$ is a submartingale.
To show this in an elementary way, introduce the notation $B=(B',B'')$, and $\mathcal F^Y_t=\sigma(Y_s;s\leqslant t)$ for every time $t$ and every process $Y=(Y_t)_{t\geqslant0}$. Then $X_t\geqslant B'_t$ almost surely, hence, for every fixed $s<t$, $$E(X_t\mid \mathcal F^B_s)\geqslant E(B'_t\mid \mathcal F^B_s)=E(B'_t\mid \mathcal F^{B'}_s)=B'_s$$ where the middle equality uses the independence of $B'$ and $B''$. By symmetry, $E(X_t\mid \mathcal F^B_s)\geqslant B''_s$, hence $E(X_t\mid \mathcal F^B_s)\geqslant X_s$. Finally, $\mathcal F^X_s\subseteq\mathcal F^B_s$, hence $$E(X_t\mid \mathcal F^X_s)=E(E(X_t\mid \mathcal F^B_s)\mid\mathcal F^X_s)\geqslant X_s$$ as desired.
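One cheap consequence of the submartingale property is that $t\mapsto E(X_t)$ is nondecreasing, which is easy to check by simulation (a sketch assuming `numpy`; the horizon, grid and sample size are arbitrary choices):

```python
# The submartingale property implies t -> E[X_t] is nondecreasing;
# a quick Monte Carlo check on a grid of times (parameters arbitrary).
import numpy as np

rng = np.random.default_rng(1)
n, steps, T = 100_000, 50, 4.0
dt = T / steps
# two independent Brownian motions built from Gaussian increments
b1 = np.cumsum(rng.normal(0, np.sqrt(dt), (n, steps)), axis=1)
b2 = np.cumsum(rng.normal(0, np.sqrt(dt), (n, steps)), axis=1)
mean_X = np.maximum(b1, b2).mean(axis=0)  # estimate of E[X_t] on the grid

print(np.all(np.diff(mean_X) > 0))  # nondecreasing, up to MC noise
```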
3. The process $X$ is recurrent, in the sense that, for every $s$, almost surely, $$\sup_{t\geqslant s}X_t=+\infty\qquad\inf_{t\geqslant s}X_t=-\infty$$ Note that this implies that, for every nonnegative time $s$ and real number $x$, the sets of times $\{t\geqslant s\mid X_t=x\}$, $\{t\geqslant s\mid X_t\geqslant x\}$ and $\{t\geqslant s\mid X_t\leqslant x\}$ are all almost surely unbounded.
4. The process $X$ is (most probably) not Markov.
We did not write a full proof of this, but the idea is that applying a (many-to-one) functional to a Markov process (the result is often called a hidden Markov model) usually destroys the Markov property. One should beware, however, that counterexamples exist: for instance, $|B'|$ is Markov...
So, to begin with a precise question:
What would be a simple argument that $X$ is not a Markov process?
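Not a proof, but here is a numerical hint that the Markov property indeed fails (a sketch assuming `numpy`; the times $r<s<t$, the conditioning band and the sample sizes are arbitrary choices). Freeze $X_s$ in a narrow band around $0$ and split the paths according to the earlier value $X_r$: if $X$ were Markov, the conditional mean of $X_t$ could not depend on $X_r$. Heuristically, a low $X_r$ suggests a large gap between the two coordinates at time $s$, hence a smaller future boost of the maximum.

```python
# Monte Carlo hint that X is not Markov: condition on X_s near 0,
# then split the paths by the earlier value X_r; a Markov process
# would give the same conditional mean of X_t in both groups.
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
r, s, t = 0.5, 1.0, 2.0
b1r, b2r = rng.normal(0, np.sqrt(r), (2, n))
b1s = b1r + rng.normal(0, np.sqrt(s - r), n)
b2s = b2r + rng.normal(0, np.sqrt(s - r), n)
b1t = b1s + rng.normal(0, np.sqrt(t - s), n)
b2t = b2s + rng.normal(0, np.sqrt(t - s), n)
Xr, Xs, Xt = np.maximum(b1r, b2r), np.maximum(b1s, b2s), np.maximum(b1t, b2t)

band = np.abs(Xs) < 0.15                    # condition on X_s ~ 0
cut = np.median(Xr[band])
low = band & (Xr < cut)                     # past maximum was low ...
high = band & (Xr >= cut)                   # ... or high
print(Xt[high].mean() - Xt[low].mean())     # visibly nonzero difference
```

Of course this only gives evidence against the Markov property, not an argument.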
Some contributions.
MARGINAL DISTRIBUTION, MOMENTS and CONNECTIONS
Per clarification comments,
$$f_t(x)=2\phi_t(x)\Phi_t(x)=\frac 2 {\sqrt{t}}\phi(x/\sqrt{t})\Phi(x/\sqrt{t}) \tag{1}$$
This is the PDF of a Skew Normal distribution with location parameter $\xi=0$, scale parameter $\sqrt{t}$ and shape (or "skew" or "slant") parameter $\alpha=1$.
Regarding moments, we have
$$\mathbb E(X_t) = \sqrt{\frac t{\pi}}, \;\;\; \text{Var}(X_t)= \left(1-\frac {1}{\pi}\right) \cdot t,\;\;\; \mathbb E(X_t^2) = t \tag{2}$$
while the skewness and kurtosis coefficients are the same for all $t$.
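The formulas in $(2)$, and the $t$-independence of the skewness and kurtosis, can be verified against `scipy`'s skew-normal implementation with shape $a=1$ and scale $\sqrt t$ (a sketch; the values of $t$ are arbitrary):

```python
# Check of the moment formulas in (2), and of the t-independence of
# skewness/kurtosis, using scipy's skew-normal with shape a=1, scale sqrt(t).
import numpy as np
from scipy.stats import skewnorm

for t in (0.5, 1.0, 3.0):
    m, v = skewnorm.stats(1, loc=0, scale=np.sqrt(t), moments="mv")
    assert np.isclose(m, np.sqrt(t / np.pi))    # E(X_t) = sqrt(t/pi)
    assert np.isclose(v, (1 - 1 / np.pi) * t)   # Var(X_t) = (1 - 1/pi) t

# the shape parameter is fixed, so skewness and kurtosis cannot depend on t
s1, k1 = skewnorm.stats(1, scale=1.0, moments="sk")
s2, k2 = skewnorm.stats(1, scale=7.0, moments="sk")
print(np.isclose(s1, s2) and np.isclose(k1, k2))
```

Note that $\mathbb E(X_t^2)=t$ then follows from $m^2+v = t/\pi + t - t/\pi = t$.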
When $\alpha=1$, the CDF of the Skew Normal equals the square of the standard normal CDF,
$$P[X_t\leq x]=F_t(x) = [\Phi(x/\sqrt{t})]^2 \tag{3}$$
which for example gives us that $P[X_t\leq 0] = 1/4,\;\forall t$. So as the process $\{X_t\}$ evolves in time, although its mean and variance increase, the allocation of probability mass on the two sides of zero remains the same.
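The identity $(3)$, and the value $P[X_t\leq 0]=1/4$, can be checked directly against `scipy`'s skew-normal CDF (a sketch; $t$ and the grid are arbitrary choices):

```python
# Numerical check of (3): the skew-normal CDF with shape a=1 equals
# the squared standard-normal CDF, and its value at 0 is 1/4.
import numpy as np
from scipy.stats import norm, skewnorm

t = 2.0
xs = np.linspace(-3.0, 3.0, 13)
lhs = skewnorm.cdf(xs, 1, loc=0, scale=np.sqrt(t))
rhs = norm.cdf(xs / np.sqrt(t)) ** 2
print(np.allclose(lhs, rhs), skewnorm.cdf(0, 1))
```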
An interesting property of a Skew Normal r.v. with zero location parameter (not zero-mean) is its relation to the half-normal and chi-square distributions,
$$|X_t/\sqrt{t}| \sim HN(1),\;\;\; X_t^2/t \sim \chi^2_1 \tag{4}$$
..."as if" $X_t$ was a normal r.v. with zero-mean and variance $t$.
ALTERNATIVE REPRESENTATION for the MARGINAL DISTRIBUTION
It is a known result that a Skew Normal random variable can be represented as the sum (or the difference, depending on the sign of the shape parameter) of a normal random variable and an independent half-normal r.v. For the specific parameter values of our case, let two i.i.d. normals $N_t, Z_t \sim N(0, \sigma^2=t)$. Then defining the random variable
$$Y_t = \frac 1{\sqrt{2}}\cdot (N_t + |Z_t|) \tag{5}$$
we have that $Y_t \stackrel{d}{=} X_t$ (this identity in law holds for each fixed $t$; it does not automatically extend to the joint distributions along the index, of course).
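The representation $(5)$ can be tested by comparing samples of $Y_t$ with the CDF $\Phi(x/\sqrt t)^2$ of $X_t$ (a sketch; $t$ and the sample size are arbitrary choices):

```python
# Monte Carlo check of (5): (N_t + |Z_t|)/sqrt(2) with N_t, Z_t iid N(0, t)
# should have the same law as X_t, i.e. the CDF Phi(x/sqrt(t))^2.
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(4)
t = 1.5
nrm, z = rng.normal(0, np.sqrt(t), (2, 100_000))
y = (nrm + np.abs(z)) / np.sqrt(2)
stat, _ = kstest(y, lambda u: norm.cdf(u / np.sqrt(t)) ** 2)
print(stat < 0.01)  # the empirical law of Y_t matches that of X_t
```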
JOINT DISTRIBUTION of $(X_s, X_t)$
For $s<t$ we have
$$P[X_s\leq x_s, X_t\leq x_t] = P[\max\{B'_s,B''_s\}\leq x_s,\max\{B'_t,B''_t\}\leq x_t]$$
$$=P[B'_s\leq x_s,B''_s\leq x_s,B'_t\leq x_t,B''_t\leq x_t]$$
$$=P[B'_s\leq x_s,B'_t\leq x_t, B''_s\leq x_s,B''_t\leq x_t]$$
$$=P[B'_s\leq x_s,B'_t\leq x_t]\cdot P[B''_s\leq x_s,B''_t\leq x_t]$$ where the last equality uses the independence of $B'$ and $B''$.
Denoting $\Phi_2(u_1,u_2;\rho)$ the CDF of the bivariate standard normal distribution with correlation coefficient $\rho$, we obtain
$$P[X_s\leq x_s, X_t\leq x_t] = F_{s,t}(x_s, x_t)=\left [\Phi_2\left(\frac {x_s}{\sqrt{s}},\frac {x_t}{\sqrt{t}};\sqrt{\frac st}\right)\right]^2 \tag{6}$$
One can see that the joint CDF reduces to the marginal CDF $(3)$ as it should. The rather complicated joint PDF corresponding to $(6)$ appears not to be particularly useful. Still, having the CDF available, a widely studied, tabulated and software-implemented function, permits us to calculate various threshold probabilities of interest, such as $P[X_s> 0, X_t> 0]$.
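For instance, formula $(6)$ can be evaluated with `scipy`'s bivariate normal CDF and compared with a direct Monte Carlo estimate (a sketch; $s$, $t$ and the evaluation point are arbitrary choices):

```python
# Check of the joint CDF formula (6) against a Monte Carlo estimate.
import numpy as np
from scipy.stats import multivariate_normal

def joint_cdf(a, b, s, t):
    """Formula (6): squared bivariate normal CDF, correlation sqrt(s/t)."""
    rho = np.sqrt(s / t)
    biv = multivariate_normal(cov=[[1.0, rho], [rho, 1.0]])
    return biv.cdf([a / np.sqrt(s), b / np.sqrt(t)]) ** 2

rng = np.random.default_rng(5)
n, s, t = 200_000, 1.0, 2.0
b1s, b2s = rng.normal(0, np.sqrt(s), (2, n))
b1t = b1s + rng.normal(0, np.sqrt(t - s), n)
b2t = b2s + rng.normal(0, np.sqrt(t - s), n)
Xs, Xt = np.maximum(b1s, b2s), np.maximum(b1t, b2t)

emp = np.mean((Xs <= 0.5) & (Xt <= 1.0))
print(abs(emp - joint_cdf(0.5, 1.0, s, t)) < 0.01)
```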
CONDITIONAL DISTRIBUTION
Using the definition of conditional probability together with $(3)$ and $(6)$ we obtain
$$P[X_t\leq x_t\mid X_s\leq x_s]=\frac {P[X_s\leq x_s, X_t\leq x_t]}{P[X_s\leq x_s]} = \left(\frac{\Phi_2\left(x_s/\sqrt{s},x_t/\sqrt{t};\sqrt{s/t}\right)}{\Phi(x_s/\sqrt{s})}\right)^2 \tag{7}$$
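Formula $(7)$ can likewise be checked against the empirical frequency of $\{X_t\leq x_t\}$ among simulated paths with $X_s\leq x_s$ (a sketch; the times and thresholds are arbitrary choices):

```python
# Check of the conditional formula (7) against the empirical frequency
# of {X_t <= b} among paths with X_s <= a.
import numpy as np
from scipy.stats import norm, multivariate_normal

s, t, a, b = 1.0, 3.0, 0.0, 1.0
rho = np.sqrt(s / t)
num = multivariate_normal(cov=[[1.0, rho], [rho, 1.0]]).cdf(
    [a / np.sqrt(s), b / np.sqrt(t)])
formula = (num / norm.cdf(a / np.sqrt(s))) ** 2   # right-hand side of (7)

rng = np.random.default_rng(6)
n = 300_000
b1s, b2s = rng.normal(0, np.sqrt(s), (2, n))
b1t = b1s + rng.normal(0, np.sqrt(t - s), n)
b2t = b2s + rng.normal(0, np.sqrt(t - s), n)
Xs, Xt = np.maximum(b1s, b2s), np.maximum(b1t, b2t)

emp = np.mean(Xt[Xs <= a] <= b)
print(abs(emp - formula) < 0.02)
```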