Matrix representation from a linear function


I'm studying an opinion formation model [1] where the main rule is:

$p^{(t)}_i = \frac{p^{(0)}_i + \sum_{j \in N(i)} w_{i,j} p^{(t-1)}_j}{1 + \sum_{j \in N(i)} w_{i,j}}$

Now, I'd like to represent that equation in matrix notation:

$\mathbf{p}^{(t)} = \mathbf{A} \mathbf{p}^{(t-1)}$

where:

$A_{i,j} = \frac{p_i^{(0)}}{\sum_{k \in N(i)} w_{i,k}}$, if $i=j$

$A_{i,j} = \frac{w_{i,j}}{\sum_{k \in N(i)} w_{i,k}}$, if $j \in N(i)$

$A_{i,j} = 0$, otherwise.

Is this right? I ask because when I run the iteration $\mathbf{p}^{(t)} = \mathbf{A} \mathbf{p}^{(t-1)}$, I get a $\mathbf{p}^{(t)}$ of all zeros, which is certainly wrong.
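For comparison, the elementwise rule itself can be iterated directly. This is a minimal sketch with a hypothetical 3-agent symmetric weight matrix (the values are made up for illustration); it serves as a ground truth for any matrix form:

```python
import numpy as np

# Hypothetical symmetric weights: w_ij > 0 iff j is a neighbor of i
W = np.array([[0.0, 0.5, 0.2],
              [0.5, 0.0, 0.3],
              [0.2, 0.3, 0.0]])
p0 = np.array([0.1, 0.9, 0.4])  # initial opinions p^{(0)}

# Iterate p_i^{(t)} = (p_i^{(0)} + sum_{j in N(i)} w_ij p_j^{(t-1)}) / (1 + sum_{j in N(i)} w_ij)
p = p0.copy()
for _ in range(50):
    p = (p0 + W @ p) / (1.0 + W.sum(axis=1))
```

Since the non-neighbor entries of `W` are zero, the matrix-vector product `W @ p` computes exactly the neighborhood sums $\sum_{j \in N(i)} w_{i,j} p^{(t-1)}_j$.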

1 Answer

I've found that my first representation was wrong. The correct one is the following:

$\mathbf{p}^{(t)} = \mathbf{A} \mathbf{p}^{(0)} + \mathbf{B} \mathbf{p}^{(t-1)}$

where:

$\mathbf{A} = \operatorname{diag}(\alpha_1, \ldots, \alpha_n)$ with $\alpha_i = \frac{1}{1 + \sum_{j \in N(i)} w_{i,j}}$,

$B_{i,j} = \frac{w_{i,j}}{1 + \sum_{k \in N(i)} w_{i,k}}$, if $j \in N(i)$

$B_{i,j} = 0$, otherwise.
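The corrected form can be checked numerically. This is a sketch under assumed data (a hypothetical 3-agent symmetric weight matrix); it builds $\mathbf{A}$ and $\mathbf{B}$ as defined above and verifies that one step of $\mathbf{A}\mathbf{p}^{(0)} + \mathbf{B}\mathbf{p}^{(t-1)}$ matches the original elementwise rule:

```python
import numpy as np

# Hypothetical symmetric weights: w_ij > 0 iff j is a neighbor of i
W = np.array([[0.0, 0.5, 0.2],
              [0.5, 0.0, 0.3],
              [0.2, 0.3, 0.0]])
p0 = np.array([0.1, 0.9, 0.4])  # initial opinions p^{(0)}

s = W.sum(axis=1)               # sum of weights over each agent's neighborhood
A = np.diag(1.0 / (1.0 + s))    # alpha_i = 1 / (1 + sum_{j in N(i)} w_ij)
B = W / (1.0 + s)[:, None]      # B_ij = w_ij / (1 + sum_{k in N(i)} w_ik)

# One step of the matrix form vs. the elementwise rule, starting from p^{(0)}
p1_matrix = A @ p0 + B @ p0
p1_scalar = (p0 + W @ p0) / (1.0 + s)
print(np.allclose(p1_matrix, p1_scalar))  # True

# Iterate the matrix form; note the constant A p0 term at every step
p = p0.copy()
for _ in range(100):
    p = A @ p0 + B @ p
```

Note that the constant term $\mathbf{A}\mathbf{p}^{(0)}$ is exactly what a pure $\mathbf{p}^{(t)} = \mathbf{A}\mathbf{p}^{(t-1)}$ formulation cannot capture, since $p^{(0)}_i$ does not scale with $\mathbf{p}^{(t-1)}$. Each row of $\mathbf{B}$ sums to $\frac{\sum_j w_{i,j}}{1 + \sum_j w_{i,j}} < 1$, so the iteration converges to the fixed point $(\mathbf{I} - \mathbf{B})^{-1} \mathbf{A} \mathbf{p}^{(0)}$.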