Neural encoding converts a single word, or a sentence of words, into a vector of real numbers. This encoding, while often useful, is not reversible. Are there methods (or at least research efforts to create them) that convert mathematical expressions into analog, real-valued vectors that can be decoded back into mathematical expressions with minimal loss of information? By mathematical expressions I mean the set of logical expressions that all involve some single word: the real-valued vector is generated for that word, and the vector should represent its distributional semantics.
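To make the question concrete, here is a toy sketch of the kind of reversible encoding I have in mind (my own construction for illustration, in the spirit of Vector Symbolic Architectures / random indexing, not an established method): a set of expression identifiers is superposed into one real-valued vector, and the set is recovered by correlating the vector against each candidate's code. All names and parameters below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 1024                                       # dimensionality of the analog vector
expressions = [f"expr_{i}" for i in range(50)]   # hypothetical candidate logical expressions

# One fixed random unit vector ("code") per candidate expression.
codes = rng.standard_normal((len(expressions), dim))
codes /= np.linalg.norm(codes, axis=1, keepdims=True)

def encode(members):
    """Superpose the codes of the expressions that hold for the word."""
    idx = [expressions.index(m) for m in members]
    return codes[idx].sum(axis=0)

def decode(vector, threshold=0.5):
    """Recover the encoded set: member codes correlate near 1, others near 0."""
    scores = codes @ vector
    return {expressions[i] for i in np.flatnonzero(scores > threshold)}

encoded = encode({"expr_3", "expr_17", "expr_42"})
print(decode(encoded))   # recovers the original set (with high probability)
```

This is lossy in general (capacity grows only with the dimension), which is why I am asking whether there is more principled work on such encodings.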
Note: it is possible to recover the word from the output vector of neural machine translation. But in my case I would like to recover from the vector the full set of logical expressions that were encoded initially.
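By "recover the word from the output vector" I mean something like the following nearest-neighbor lookup against an embedding table (the table and vocabulary here are random placeholders of my own, just to illustrate the mechanism):

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = ["cat", "dog", "tree", "run", "blue"]     # toy vocabulary
emb = rng.standard_normal((len(vocab), 64))       # stand-in embedding matrix
emb /= np.linalg.norm(emb, axis=1, keepdims=True) # unit-norm rows

def recover_word(vector):
    """Return the vocabulary word whose embedding has the highest cosine similarity."""
    sims = emb @ (vector / np.linalg.norm(vector))
    return vocab[int(np.argmax(sims))]

noisy = emb[2] + 0.1 * rng.standard_normal(64)    # perturbed "tree" vector
print(recover_word(noisy))                        # -> "tree"
```

Recovering a single word this way is easy; recovering a whole set of logical expressions from one vector is the part I have not found methods for.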
There is an old tradition of digital-to-analog conversion, but I am looking for something different: 1) that tradition is concerned with the design of integrated circuits, whereas I would like to read about more mathematical methods; 2) that tradition is mostly about binary data, whereas I would like to read about non-binary data as well.