The definition of Shannon entropy for a random variable with two outcomes (from https://en.wikipedia.org/wiki/Entropy_(information_theory)) is:
$$H(p) = -p\log_2 p - (1-p)\log_2(1-p)$$
Which produces a symmetric graph like this:
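For reference, here is a minimal sketch (plain Python, standard library only, not part of the original question) that evaluates the binary entropy above and confirms the symmetry $H(p) = H(1-p)$ and the peak of 1 bit at $p = 0.5$:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a two-outcome variable with P(outcome 1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # 0 * log(0) is taken as 0 by convention
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))                        # peak: 1.0 bit
print(binary_entropy(0.2), binary_entropy(0.8))   # symmetric pair, equal values
```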
However, I have a use case where this uncertainty is treated as importance, and I want to skew it so that higher probabilities carry more importance. What formula can I use to skew the entropy/uncertainty curve to the right or left?
Something which will produce a graph like this (similar to Skewed Normal Distribution)?
Note: I still want the entropy range to be 0 to 1. I can't use a weighting between $p$ and $1-p$, since that doesn't keep the range 0 to 1.



Here is a solution:
Consider the curves $C_{a,b}$ with parametric equations:
$$x(t)=t+abt(1-t), \ \ \ y(t)=at(1-t)$$
where $0 \le a,b \le 1$.
Fig. 1: Curves $C_{a,b}$ plotted for $a = 0.6, 0.8, 1$ and $b = 0, 0.2, 0.4, 0.6, 0.8, 1$. The red curve corresponds to the case $a = 1$, $b = 0.8$.
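A quick way to see the effect numerically (a NumPy sketch, not from the original answer, assuming $0 \le a, b \le 1$ as above): the endpoints stay pinned at $x = 0$ and $x = 1$, while the peak moves from $x = 1/2$ to $x = 1/2 + ab/4$, i.e., to the right for $b > 0$.

```python
import numpy as np

def skewed_curve(a: float, b: float, n: int = 1001):
    """Sample points on C_{a,b}: x(t) = t + a*b*t*(1-t), y(t) = a*t*(1-t)."""
    t = np.linspace(0.0, 1.0, n)
    x = t + a * b * t * (1 - t)
    y = a * t * (1 - t)
    return x, y

x, y = skewed_curve(a=1.0, b=0.8)   # the red curve in Fig. 1
peak = np.argmax(y)
print(x[0], x[-1])   # endpoints remain at 0 and 1
print(x[peak])       # peak sits near 1/2 + a*b/4 = 0.7, right of 1/2
```

Note that the peak height here is $a/4$; if the 0-to-1 range from the question is required, rescaling to $y(t) = 4at(1-t)$ restores a maximum of 1 without moving the endpoints.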
Explanation: I have skewed the original curve by applying a "skewing matrix" (a horizontal shear) to it in this way:
$$\begin{pmatrix}1&b\\0&1\end{pmatrix}\begin{pmatrix}t\\at(1-t)\end{pmatrix}=\begin{pmatrix}t+abt(1-t)\\at(1-t)\end{pmatrix}$$
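As a sanity check (a small NumPy sketch, not part of the original answer), applying the horizontal shear matrix to sampled points of the base curve $(t, at(1-t))$ reproduces the parametric equations of $C_{a,b}$ given above:

```python
import numpy as np

a, b = 1.0, 0.8
t = np.linspace(0.0, 1.0, 101)

# Base (symmetric) curve as a 2 x N array of column vectors (t, a*t*(1-t)).
base = np.vstack([t, a * t * (1 - t)])

# Horizontal shear: maps (x, y) to (x + b*y, y).
shear = np.array([[1.0, b],
                  [0.0, 1.0]])

sheared = shear @ base

# Compare against the closed-form parametrization of C_{a,b}.
x_expected = t + a * b * t * (1 - t)
y_expected = a * t * (1 - t)
print(np.allclose(sheared[0], x_expected))  # True
print(np.allclose(sheared[1], y_expected))  # True
```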