Recall that a right-linear grammar is a grammar that consists of rules of the form $A\to uB$, where $A$ and $B$ are non-terminals and $u$ is a (possibly empty) word of terminals. Similarly for left-linear grammars. What are the proper names (if any) for the following kinds of grammars:
- Grammars that consist of rules of the form $A\to uA$, for various non-terminals $A$ and terminal words $u$ (notice that the non-terminal in the "antecedent" and the "succedent" of each rule is the same; but of course, different rules may have different non-terminals). Uniform? Identical?
- Grammars that consist of rules of the form $A\to uA$ and $B\to Bv$, for various non-terminals $A,B$ and terminal words $u,v$ (i.e., a mix of left and right rules, but each rule is of the kind mentioned in item 1 above, with the same non-terminal on both sides).
Note that any grammar of either of the above two kinds produces a regular language — in the sense that the set of all words (in the alphabet of all symbols, both terminal and non-terminal) derivable from a given non-terminal symbol is a regular language.
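To make the regularity claim concrete, here is a minimal sketch for a hypothetical two-rule grammar of the first kind, with rules $A \to abA$ and $A \to cA$: every sentential form derivable from $A$ matches the regular expression $(ab|c)^*A$, which a brute-force enumeration of derivations can check.

```python
import re

# Hypothetical grammar of the kind in item 1: every rule rewrites A to uA.
# Rules: A -> "ab" A  and  A -> "c" A  (u a terminal word, A the sole non-terminal).
RULES = ["ab", "c"]

def sentential_forms(max_steps):
    """Enumerate all sentential forms derivable from A in at most max_steps steps."""
    forms = {"A"}
    frontier = {"A"}
    for _ in range(max_steps):
        # Each form ends in A; apply every rule by replacing that trailing A with uA.
        frontier = {form[:-1] + u + "A" for form in frontier for u in RULES}
        forms |= frontier
    return forms

# Every derivable sentential form matches (ab|c)*A, witnessing regularity.
pattern = re.compile(r"(ab|c)*A\Z")
assert all(pattern.match(f) for f in sentential_forms(5))
```

The same pattern generalizes: with several non-terminals, the forms derivable from $A$ are $(u_1|\dots|u_k)^*A$ where the $u_i$ are the terminal words of the rules for $A$.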
A grammar with productions of type
$A \rightarrow uA$
is called right-recursive. Right recursion is the form used in top-down parsers (e.g., $LL(1)$ parsers), since a top-down parser cannot handle left recursion without first transforming it away. If the terminal word is on the right, as in $A \to Av$, the production is called left-recursive; left recursion is the form preferred in bottom-up parsers (e.g., $LR(1)$ parsers). More precisely, as discussed in the comments above, any such production has that name, and grammars containing such productions exhibit left (respectively, right) recursion. This terminology comes from the compilers literature at least (see the Dragon Book or Programming Language Pragmatics), but presumably the same applies to the general theory of pushdown automata.
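To illustrate why top-down parsers favor right recursion (a minimal sketch with a hypothetical grammar, not part of the answer above): the right-recursive rule $E \to n\,{+}\,E \mid n$ maps directly onto a terminating recursive-descent procedure, because each call consumes input before recursing. The left-recursive variant $E \to E\,{+}\,n \mid n$ would make the same procedure call itself at the same position before consuming anything, and so loop forever; an $LR$ parser handles that form without trouble.

```python
# Recursive-descent parser for the right-recursive grammar
#   E -> n '+' E | n        (n is a single digit token)
# Each call consumes at least one token before recursing, so it terminates.

def parse_E(tokens, pos=0):
    """Parse an E starting at tokens[pos]; return the position after the match."""
    if pos >= len(tokens) or not tokens[pos].isdigit():
        raise SyntaxError(f"expected a digit at position {pos}")
    pos += 1                                # consume n
    if pos < len(tokens) and tokens[pos] == "+":
        return parse_E(tokens, pos + 1)     # consume '+', recurse on the rest
    return pos

# A procedure for the left-recursive variant E -> E '+' n would instead begin
# by calling itself at the same position, recursing without consuming input.

tokens = list("1+2+3")
assert parse_E(tokens) == len(tokens)       # the whole input parses as one E
```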