Inductive Proof of a property of String concatenation


I'm trying to prove that for any two strings $\alpha, \beta \in T^\ast$, where $T^\ast$ is the set of all strings over the alphabet $T$, the following holds:

$\alpha \cdot \beta^l = (\alpha\cdot\beta)^l$

Where for any string $\alpha$:

  • $\alpha^l$ denotes the longest proper prefix (i.e., $\alpha$ with its last letter removed)
  • $\alpha^r$ denotes the shortest suffix
  • $|\alpha|$ denotes the length of the string
  • $\alpha(n)$, with $n$ a natural number, denotes the $n$th letter of the string (indexed from $0$)
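To make the notation concrete, here is a small Python sketch. The helper names `lp` and `cat` are my own, not from the question, and I am modeling strings as Python `str` with $\alpha^l$ as "drop the last letter":

```python
# Hypothetical helpers modeling the question's notation on Python strings.

def lp(s: str) -> str:
    """Longest proper prefix, alpha^l: alpha with its last letter dropped."""
    return s[:-1]  # for the empty string this is still the empty string

def cat(a: str, b: str) -> str:
    """Concatenation, alpha . beta."""
    return a + b

# alpha(n) is simply s[n], and |alpha| is len(s).
alpha, beta = "abc", "de"
print(cat(alpha, lp(beta)))   # alpha . beta^l        -> "abcd"
print(lp(cat(alpha, beta)))   # (alpha . beta)^l      -> "abcd"
```

Both lines print `abcd`, matching the identity to be proved; note that the identity fails when $\beta$ is empty, which is why the induction below starts at $|\beta| = 1$.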

I'm trying to do it by induction on $|\beta|$, but I'm having a hard time. Here's the base case:

Suppose $|\beta| = 1$

$RHS = (\alpha \cdot \beta)^l = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha\cdot\beta| - 2)\rangle = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha| + |\beta| - 2)\rangle = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha| + 1 - 2)\rangle = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha| - 1)\rangle$

Then $(\alpha\cdot\beta)(i) = \alpha(i)$ if $i < |\alpha|$, and $(\alpha\cdot\beta)(|\alpha|) = \beta(0)$. I don't think this is right, though.
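That case analysis can be sanity-checked directly. Assuming $0$-based indexing and my own choice of sample strings:

```python
# Check of the base-case indexing claim, assuming 0-based indexing:
# (alpha . beta)(i) = alpha(i) for i < |alpha|, and beta(i - |alpha|) otherwise.
alpha, beta = "xyz", "q"   # |beta| = 1, as in the base case
combined = alpha + beta

for i in range(len(combined)):
    expected = alpha[i] if i < len(alpha) else beta[i - len(alpha)]
    assert combined[i] == expected

# Hence <(alpha.beta)(0), ..., (alpha.beta)(|alpha| - 1)> is exactly alpha,
# which equals alpha . beta^l, since beta^l is empty when |beta| = 1.
assert combined[:len(alpha)] == alpha == alpha + beta[:-1]
print("base case checks out")
```

So the case analysis above is in fact correct, and it closes the base case: the first $|\alpha|$ letters of $\alpha\cdot\beta$ are exactly $\alpha = \alpha\cdot\beta^l$.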

For my Inductive Case:

Inductive Hypothesis: If $|\beta| = n$, then $\alpha\cdot\beta^l = (\alpha\cdot\beta)^l$.

Suppose $|\beta| = n + 1$. Then

$RHS = (\alpha\cdot\beta)^l = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha\cdot\beta| - 2)\rangle = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha| + n + 1 - 2)\rangle = \langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha| + n - 1)\rangle$

Which I'm also pretty sure is wrong. I was stuck for quite a while on this step:

$\langle(\alpha\cdot\beta)(0), \dots, (\alpha\cdot\beta)(|\alpha| + n + 1 - 2)\rangle$

I couldn't think of a way to properly apply my inductive hypothesis here.

Is there an easier way to prove this inductively? I feel like I'm missing something that would make this much more intuitive.


There are 2 best solutions below


I like your thinking on this. We should use induction.

Suppose $\alpha, \beta$ are both $\epsilon$, the empty string. Then the claim clearly holds, since $\epsilon \epsilon^l = \epsilon = (\epsilon \epsilon)^l$. Now assume that it's true for $|\alpha| = m$, $|\beta| = n$. Then if $|\alpha'| = m + 1$, we can write $\alpha' = a\alpha$ for some $a \in \Sigma$, our alphabet. Then $\alpha' \beta^l = a(\alpha \beta^l) = a(\alpha \beta)^l = (a\alpha\beta)^l$, where the last step needs a lemma:

If $|\alpha| = 1$, then $\alpha \beta^l = (\alpha \beta)^l$ for any nonempty string $\beta \in \Sigma^*$. Note that it fails for $\beta = \epsilon$: then $\alpha \epsilon^l = \alpha \epsilon = \alpha$, while $(\alpha \epsilon)^l = \alpha^l \neq \alpha$, so we must assume $|\beta| > 0$. For the base case $|\beta| = 1$, write $\beta = b$ with $b \in \Sigma$; then $\alpha b^l = \alpha \epsilon = \alpha = (\alpha b)^l$. Now assume the lemma is true for $|\beta| = r$. Then if $|\beta'| = r + 1$, we can write $\beta' = \beta b$ for some $b \in \Sigma$, so that we have: $$\alpha \beta'^l = \alpha \beta = (\alpha \beta b)^l = (\alpha \beta')^l.$$
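The lemma (and its failure at $\beta = \epsilon$) can be confirmed by brute force over short words. This is just a sanity check, not part of the proof; the two-letter alphabet and the modeling of $w^l$ as dropping the last letter of a Python string are my own choices:

```python
from itertools import product

# Brute-force check of the lemma: for |alpha| = 1 and nonempty beta,
# alpha . beta^l == (alpha . beta)^l, where s[:-1] models the ^l operator.
SIGMA = "ab"

for a in SIGMA:                              # |alpha| = 1
    for n in range(1, 5):                    # nonempty beta, up to length 4
        for beta in map("".join, product(SIGMA, repeat=n)):
            assert a + beta[:-1] == (a + beta)[:-1]

# The beta = empty-string case really does fail, as noted above:
a = "a"
assert a + "" != (a + "")[:-1]               # 'a' vs ''
print("lemma verified on all short words")
```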

So, as you can see it can be proven by two separate inductions. One for $n$ and one for $m$. I'll let you prove the $m$ one.


Don't take it the wrong way, but the main reason you are not finding an intuitive proof is your notation. So let me first reformulate your question with a different notation (mostly taken from Lothaire's book Combinatorics on Words).

Let $A$ be an alphabet and let $A^*$ be the set of all words on $A$. Note that $A^*$ is a monoid for the concatenation product. The identity of this product is the empty word $1$.

According to your definition, the longest prefix $LP(a_1 \dotsm a_n)$ of $a_1 \dotsm a_n$ is $a_1 \dotsm a_{n-1}$ if $n > 0$, and is $1$ if $n = 0$. I claim that for every word $u$ and every nonempty word $v$, one has $uLP(v) = LP(uv)$. (For $v = 1$ the claim fails, since $uLP(1) = u$ while $LP(u) \neq u$ whenever $u \neq 1$.) Setting $u = a_1 \dotsm a_n$ and $v = b_1 \dotsm b_m$ (with $m > 0$), one gets $$ uLP(v) = a_1 \dotsm a_nb_1 \dotsm b_{m-1} = LP(uv) $$ This is a very intuitive proof that does not require any induction. The proof for the suffixes is similar.
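The identity $uLP(v) = LP(uv)$ for $v \neq 1$ can likewise be checked exhaustively on short words. A sketch, again assuming a two-letter alphabet of my own choosing and modeling $LP$ as dropping the last letter:

```python
from itertools import product

def LP(w: str) -> str:
    """Longest prefix in the answer's sense: w with its last letter dropped."""
    return w[:-1]

# Exhaustive check of u . LP(v) == LP(u . v) for all short u and nonempty v.
A = "ab"
words = ["".join(p) for n in range(4) for p in product(A, repeat=n)]

for u in words:
    for v in words:
        if v:  # the identity is only claimed for v != 1 (nonempty v)
            assert u + LP(v) == LP(u + v)
print("u.LP(v) == LP(u.v) holds for all nonempty v")
```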