Thinking about the bits you need to store the data of a bicubic graph $G$, you have several ways to do it:
- the adjacency matrix $A_G$
- the edge list $L_G$
- ...
In the case of Hamiltonian graphs, you additionally have LCF notation. Which of them is the most efficient way to store $G$, and does hamiltonicity save some bits?
Yes, hamiltonicity helps. As a general heuristic, a notation that encodes more assumptions about the graph tends to be more efficient.
I will assume that the graph has $n$ vertices. I will assume that we only care about the graph up to isomorphism type, which is what we're implicitly doing at least with LCF notation.
Adjacency matrices
An adjacency matrix has $n^2$ entries from $\{0,1\}$, for a total of $n^2$ bits. In principle, only $\binom n2$ bits are necessary due to the symmetry.
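To illustrate, here is a minimal Python sketch of storing only the strict upper triangle. The $4$-cycle used as input is an assumption for brevity (it is not cubic, but the packing is the same for any simple graph):

```python
# Sketch: exploit symmetry by storing only the strict upper triangle
# of the adjacency matrix, using C(n,2) bits instead of n^2.

n = 4
# Adjacency matrix of the 4-cycle 1-2-3-4-1 (illustrative assumption).
A = [[0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0]]

# One bit per unordered pair {i, j} with i < j.
triangle_bits = [A[i][j] for i in range(n) for j in range(i + 1, n)]
assert len(triangle_bits) == n * (n - 1) // 2  # C(4,2) = 6 bits, not 16
```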
We could use the biadjacency matrix to further reduce this to $\frac14 n^2$ bits: this is an $\frac n2 \times \frac n2$ matrix where we put a $1$ or $0$ in the $(i,j)$ position if vertex $i$ in one part is adjacent to vertex $j$ in the other.
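A sketch of the biadjacency packing, using $K_{3,3}$ (the smallest bicubic graph) as an assumed example:

```python
n = 6          # K_{3,3}: the smallest bicubic graph (assumed example)
half = n // 2

# biadjacency[i][j] == 1 iff left vertex i is adjacent to right vertex j
biadjacency = [[1, 1, 1],
               [1, 1, 1],
               [1, 1, 1]]

# Pack the (n/2) x (n/2) matrix row-major into an integer, one bit per entry.
packed = 0
for i in range(half):
    for j in range(half):
        packed = (packed << 1) | biadjacency[i][j]

bits_used = half * half  # n^2/4 = 9 bits, versus n^2 = 36 for the full matrix
```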
Edge lists
The edge list of a cubic graph consists of $\frac32 n$ entries, each of which is a pair of elements of $\{1,2,\dots,n\}$. Each edge therefore takes $2 \log_2 n$ bits to represent naively, for a total of $3 n \log_2 n$ bits.
In a bipartite graph, we only have $\frac n2$ possibilities for each endpoint, so we could save a tiny amount of space by using only $\log_2 \frac n2 = \log_2 n - 1$ bits per vertex, but this doesn't save much.
In a bicubic graph, we can go through the vertices of one part in a fixed order, and for each one, write down the three neighbors in the other part. (We don't need to write down both halves of an edge: the three edges in positions $3i-2, 3i-1, 3i$ have the $i^{\text{th}}$ vertex in the first part as an endpoint.) This requires recording $\frac32 n$ numbers from $\{1, \dots, \frac n2\}$, which can be done using $\frac32 n(\log_2 n - 1)$ bits.
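As a sketch of this scheme (the graph and the neighbour lists are illustrative assumptions, again using $K_{3,3}$):

```python
from math import ceil, log2

n = 6          # K_{3,3} again (assumed example)
half = n // 2

# neighbours[i] lists the three right-side neighbours of left vertex i+1,
# so both endpoints of every edge are implied by position alone.
neighbours = [(1, 2, 3), (1, 2, 3), (1, 2, 3)]

bits_per_entry = ceil(log2(half))   # roughly log2(n) - 1
bitstring = "".join(format(v - 1, f"0{bits_per_entry}b")
                    for triple in neighbours for v in triple)

total_bits = len(bitstring)         # (3/2)n entries of log2(n/2) bits each
```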
LCF notation
Here, we only need to record $n$ values from $\{1,2,\dots,n\}$ (excluding $1$, $n-1$, and $n$, actually, but that doesn't make a huge difference), for a total of $n \log_2 n$ bits.
We can actually be more efficient here: given a graph in LCF notation such as $$ [5,7,-7,7,-7,\color{red}{-5}, 5,7,\color{red}{-7},7,\color{red}{-7},\color{red}{-5}, 5,\color{red}{7},\color{red}{-7},\color{red}{7},\color{red}{-7},\color{red}{-5}] $$ half the entries are redundant (here, the entries in red). If you see a $k$ in position $i$, then you should see $-k$ (or $n-k$) in position $i+k$, working modulo $n$: both entries describe the same chord from its two endpoints. So we can skip those entries and leave them implied, and only have to store $\frac12 n \log_2 n$ bits.
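This redundancy can be checked mechanically; here is a sketch using the $n = 18$ example above, which is $[5,7,-7,7,-7,-5]^3$ (the Pappus graph, if I'm not mistaken):

```python
# The n = 18 LCF sequence from the example above.
lcf = [5, 7, -7, 7, -7, -5] * 3
n = len(lcf)

# If position i holds k, then position (i + k) mod n must hold -k (mod n),
# since both entries describe the same chord of the Hamiltonian cycle.
for i, k in enumerate(lcf):
    partner = (i + k) % n
    assert (lcf[partner] + k) % n == 0

# So the entries at the "partner" positions can be left implied,
# halving the storage to (1/2) n log2(n) bits.
```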
We could also go further by observing that in a bipartite graph, all the values will be odd, so we can skip writing the last bit, which will always be $1$. Then we only need $\log_2 n - 1$ bits per entry: $\frac12 n (\log_2 n - 1)$ bits total.
A lot of "nice" graphs have highly repetitive LCF notations, which lets us save space, but almost all cubic graphs have LCF notations with no repetition, so we can't rely on this.