Let $X$ be a random variable and let $f$ be a function. Then $$ \begin{align} I(X;f(X)) &= H(X)-H(X|f(X)) \\ &= H(f(X))-\underbrace{H(f(X)|X)}_{=0}, \\ \text{so}\quad H(f(X)) &= H(X)-H(X|f(X)) \leq H(X). \end{align} $$ So applying a function can only decrease (never increase) entropy.
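As a quick numerical sanity check of $H(f(X)) \leq H(X)$, here is a small Python sketch with a uniform $X$ on $\{1,2,3,4\}$ and the (many-to-one) parity map, both my own illustrative choices:

```python
from math import log2

def entropy(dist):
    # Shannon entropy in bits; dist maps outcomes to probabilities
    return sum(-p * log2(p) for p in dist.values() if p > 0)

def pushforward(dist, f):
    # Distribution of f(X) when X ~ dist
    out = {}
    for x, p in dist.items():
        y = f(x)
        out[y] = out.get(y, 0.0) + p
    return out

p_X = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
p_fX = pushforward(p_X, lambda x: x % 2)  # many-to-one: parity

print(entropy(p_X))   # 2.0 bits
print(entropy(p_fX))  # 1.0 bit <= H(X)
```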
Let $T$ be a shuffling operation and $X$ be a permutation of a poker deck, with $T$ chosen independently of $X$. Then $$ \begin{align} H(TX) &\geq H(TX|T) \\ &= H(T^{-1}TX|T) \\ &= H(X|T) \\ &= H(X), \end{align} $$ where the last equality uses the independence of $T$ and $X$. So shuffling cannot decrease entropy.
Are they contradictory? The second derivation is Exercise 4.3 in Thomas M. Cover & Joy A. Thomas, Elements of Information Theory, Chapter 4 (Entropy Rates of a Stochastic Process).
There is no Contradiction.
Let us say we have a list of Cards, some of which are White & some of which are Maroon.
Let the Cards be numbered too.
E.g. a list may be $L=[W1,W2,M4,M8]$.
The Colours, the Numbers & the Order all matter.
(A) Consider some one-to-one function $f_1(list)$ which gives a new list with the colours toggled.
$f_1([W1,W2,M4,M8])=[M1,M2,W4,W8]$
$f_1([W1,M2,W4,M8])=[M1,W2,M4,W8]$
Then there is no Change in Entropy when we use this function.
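A small Python sketch of (A), taking $X$ uniform over all $2^4=16$ colourings of the cards numbered 1, 2, 4, 8 (the distribution is my own illustrative choice):

```python
from itertools import product
from math import log2

def entropy(dist):
    # Shannon entropy in bits; dist maps outcomes to probabilities
    return sum(-p * log2(p) for p in dist.values() if p > 0)

def toggle(card):
    # 'W1' -> 'M1', 'M4' -> 'W4'
    return ('M' if card[0] == 'W' else 'W') + card[1:]

def f1(lst):
    # One-to-one: toggle every colour
    return tuple(toggle(c) for c in lst)

# X uniform over all colourings of the numbered cards 1, 2, 4, 8
lists = [tuple(c + n for c, n in zip(cols, ('1', '2', '4', '8')))
         for cols in product('WM', repeat=4)]
p_X = {l: 1 / len(lists) for l in lists}

# Push the distribution through the one-to-one f1
p_f1X = {}
for l, p in p_X.items():
    y = f1(l)
    p_f1X[y] = p_f1X.get(y, 0.0) + p

print(entropy(p_X), entropy(p_f1X))  # 4.0 4.0 — no Change
```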
(B) Consider some many-to-one function $f_2(list)$ which gives a new list with all the colours set to W.
$f_2([W1,W2,M4,M8])=f_2([W1,M2,M4,M8])=f_2([M1,W2,M4,M8])=[W1,W2,W4,W8]$
Here we see that many Inputs give the same Output. No matter what the Input, Maroon will not occur in the Output.
Then there is a Decrease in Entropy when we use this function.
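A sketch of (B), letting $X$ range uniformly over all orderings & colourings of the cards 1, 2, 4, 8 (the uniform distribution is my own illustrative choice, so the order carries information too):

```python
from itertools import permutations, product
from math import log2

def entropy(dist):
    # Shannon entropy in bits; dist maps outcomes to probabilities
    return sum(-p * log2(p) for p in dist.values() if p > 0)

def f2(lst):
    # Many-to-one: set every colour to W, keep numbers & order
    return tuple('W' + card[1:] for card in lst)

# X uniform over all orderings AND colourings of the cards 1, 2, 4, 8
lists = [tuple(c + str(n) for c, n in zip(cols, nums))
         for nums in permutations((1, 2, 4, 8))
         for cols in product('WM', repeat=4)]
p_X = {l: 1 / len(lists) for l in lists}

p_f2X = {}
for l, p in p_X.items():
    y = f2(l)
    p_f2X[y] = p_f2X.get(y, 0.0) + p

print(entropy(p_X))    # log2(24 * 16) ≈ 8.585 bits
print(entropy(p_f2X))  # log2(24)      ≈ 4.585 bits — the colour information is gone
```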
(C) Consider some many-to-one function $f_3(list)$ which gives a new list with the numbers set to 1 & Colour set to W.
$f_3([W1,W2,M4,M8])=f_3([W3,W5,M7,M9])=f_3([W2,W4,M6,M8])=[W1,W1,W1,W1]$
Here we see that a large number of Inputs give the same Output. Output is a Constant.
Then there is a great Decrease in Entropy.
(D) Coming to your Shuffle :
Shuffling takes a list , randomizes it and gives a new list.
It is not an ordinary function, because it is one-to-many. Still, let us write it with that notation like this:
$f_4([W1,W2,M4,M8])$ may give $[W1,W2,M8,M4]$, $[W2,W1,M4,M8]$, $[W1,M4,W2,M8]$, $\ldots$
In general, we see that there are $4!=24$ Possible Outputs for every Input.
It is this IMPLICIT randomness that will Increase the Entropy.
If we instead condition on the IMPLICIT randomness (i.e. we know exactly which permutation was applied), the Shuffle is Deterministic & the Entropy will not Increase: $H(TX|T)=H(X)$.
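A minimal Python sketch of the Shuffle, assuming $T$ is a uniformly random permutation (my choice) applied to a fixed starting deck:

```python
from itertools import permutations
from math import log2

def entropy(dist):
    # Shannon entropy in bits; dist maps outcomes to probabilities
    return sum(-p * log2(p) for p in dist.values() if p > 0)

deck = ('W1', 'W2', 'M4', 'M8')       # X is a fixed deck here, so H(X) = 0
perms = list(permutations(range(4)))  # T uniform over the 4! = 24 permutations

# Distribution of TX when T is uniform & independent of X
p_TX = {}
for t in perms:
    y = tuple(deck[i] for i in t)
    p_TX[y] = p_TX.get(y, 0.0) + 1 / len(perms)

print(entropy(p_TX))  # log2(24) ≈ 4.585 bits, up from H(X) = 0
# Given T = t the Output is determined, so H(TX|T) = H(X) = 0
```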
(E) Better notation (still not normal function) to Indicate your Shuffle might be like this :
$f_5(list,random\ number)$ may give a list based on the Input list & the given random number.
Every Input may have $4!=24$ Possible Outputs which will get selected by the random number.
It is this EXPLICIT random number that will Increase the Entropy.
If we instead condition on the EXPLICIT random number, $f_5$ becomes an ordinary Deterministic function of the list & the Entropy will not Increase.
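A sketch of $f_5$ in the same spirit (indexing the $4!=24$ permutations by the random number $r$ is my own convention):

```python
from itertools import permutations

PERMS = list(permutations(range(4)))  # the 4! = 24 possible shuffles

def f5(lst, r):
    # Deterministic in BOTH arguments: apply the r-th permutation to lst
    t = PERMS[r % len(PERMS)]
    return tuple(lst[i] for i in t)

deck = ('W1', 'W2', 'M4', 'M8')
# For a fixed r, f5 is an ordinary Deterministic function of the list:
print(f5(deck, 0))   # ('W1', 'W2', 'M4', 'M8') — permutation 0 is the identity
# Across r = 0..23, the same Input yields 24 different Outputs:
outs = {f5(deck, r) for r in range(24)}
print(len(outs))     # 24
```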
Deterministic : Here $f_1,f_2,f_3$ are Deterministic functions, which can not Increase the Entropy.
Non-Deterministic : The Shuffles $f_4,f_5$ are Non-Deterministic (not ordinary functions), involving Randomness, which can Increase the Entropy.
There is no Contradiction.