I'm $\DeclareMathOperator{\parr}{\unicode{8523}}$ trying to wrap my mind around the $\parr$ ("par") operator of linear logic.
The other connectives have simple resource interpretations ($A\otimes B$ is "you have both $A$ and $B$", $A\&B$ is "you can have either $A$ or $B$" etc.).
But despite the inference rules being simple, I've only seen it mentioned that $A\parr B$ is "difficult to describe" in a resource interpretation, apart from the special case $A\multimap B \;\equiv\; A^\bot \parr B$ ("you can convert $A$ into $B$").
Any way I can understand it better?
Please see Linear logic on Wikipedia: the symbol ⅋ ("par") denotes multiplicative disjunction, whereas ⊗ denotes multiplicative conjunction; the two are duals of each other. The article also has some discussion of this connective.
Alternatively, the "par" operation is sometimes denoted by the symbol $\sqcup$, which the paper (PDF) "Linear Logic Displayed" describes as "join-like."
The logical rules for ⊗ and ⅋ are as follows:
If Γ, A, B, Δ ⊢ Θ, then Γ, A⊗B, Δ ⊢ Θ; conversely, if Γ ⊢ Δ, A and Γ′ ⊢ B, Θ, then Γ, Γ′ ⊢ Δ, A⊗B, Θ.
Dually, if Γ ⊢ Δ, A, B, Θ, then Γ ⊢ Δ, A⅋B, Θ; conversely, if A, Γ ⊢ Δ and B, Γ′ ⊢ Δ′, then A⅋B, Γ, Γ′ ⊢ Δ, Δ′.
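Written out as inference figures (a sketch in ordinary two-sided sequent notation; the primed contexts $\Gamma'$, $\Delta'$ are just my names for the second premise's side formulas), these rules are:

$$\frac{\Gamma, A, B, \Delta \vdash \Theta}{\Gamma, A\otimes B, \Delta \vdash \Theta}\,(\otimes L)
\qquad
\frac{\Gamma \vdash \Delta, A \quad\quad \Gamma' \vdash B, \Theta}{\Gamma, \Gamma' \vdash \Delta, A\otimes B, \Theta}\,(\otimes R)$$

$$\frac{\Gamma \vdash \Delta, A, B, \Theta}{\Gamma \vdash \Delta, A\parr B, \Theta}\,(\parr R)
\qquad
\frac{A, \Gamma \vdash \Delta \quad\quad B, \Gamma' \vdash \Delta'}{A\parr B, \Gamma, \Gamma' \vdash \Delta, \Delta'}\,(\parr L)$$

Notice that $\parr R$ keeps $A$ and $B$ together in a single sequent, while $\otimes R$ splits the surrounding context between two separate premises; that is the precise sense in which ⅋ places $A$ and $B$ "side by side" on the right.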
A multiplicative connective distributes over an additive one when one is a conjunction and the other is a disjunction:
A⊗(B⊕C) ≡ (A⊗B)⊕(A⊗C) (and on the other side);
(multiplicative conjunction distributes over additive disjunction)
A⅋(B&C) ≡ (A⅋B)&(A⅋C) (and on the other side);
(multiplicative disjunction distributes over additive conjunction)
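To make the first equivalence concrete, here is a sketch of one direction, $A\otimes(B\oplus C) \vdash (A\otimes B)\oplus(A\otimes C)$, using the multiplicative rules above together with the usual additive rules $\oplus L$, $\oplus R_1$, $\oplus R_2$:

$$\begin{array}{rll}
1. & A, B \vdash A\otimes B & \otimes R \text{ on the axioms } A\vdash A \text{ and } B\vdash B\\
2. & A, B \vdash (A\otimes B)\oplus(A\otimes C) & \oplus R_1 \text{ on } 1\\
3. & A, C \vdash A\otimes C & \otimes R \text{ on the axioms } A\vdash A \text{ and } C\vdash C\\
4. & A, C \vdash (A\otimes B)\oplus(A\otimes C) & \oplus R_2 \text{ on } 3\\
5. & A, B\oplus C \vdash (A\otimes B)\oplus(A\otimes C) & \oplus L \text{ on } 2 \text{ and } 4\\
6. & A\otimes(B\oplus C) \vdash (A\otimes B)\oplus(A\otimes C) & \otimes L \text{ on } 5
\end{array}$$

The converse direction goes through similarly, starting with $\oplus L$ on the outer $\oplus$.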
Also:
Linear implication $A\multimap B$ corresponds to the internal hom, which can be defined as $(A\otimes B^\bot)^\bot$. There is a de Morgan dual of the tensor called ‘par’, with $A\parr B = (A^\bot\otimes B^\bot)^\bot$. Tensor and par are the ‘multiplicative’ connectives, which roughly speaking represent the parallel availability of resources.
The ‘additive’ connectives & and ⊕, which correspond in another way to traditional conjunction and disjunction, are modeled as usual by products and coproducts.
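From those two definitions, plus the fact that linear negation is involutive ($A^{\bot\bot} \equiv A$), the equivalence mentioned in the question falls out directly:

$$A \multimap B \;=\; (A\otimes B^\bot)^\bot \;\equiv\; (A^{\bot\bot}\otimes B^\bot)^\bot \;=\; A^\bot \parr B,$$

where the last step is just the definition $X \parr Y = (X^\bot\otimes Y^\bot)^\bot$ instantiated with $X = A^\bot$ and $Y = B$.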
For a nice exploration of linear logic, see this: