I am trying to simulate a memristor, a non-linear resistive electrical device, in Python. The device's conductivity changes as an electric field is applied, so the relationship between the applied voltage and the current through the device is hysteretic and non-linear. The system is described by the equations
\begin{equation} M(x) = R_\text{ON} \cdot x + R_\text{off} \cdot (1-x) \end{equation} \begin{equation} \frac{dx}{dt} = \frac{\mu_V \cdot R_\text{ON}}{D^2}\cdot i_\text{tran}(t) \end{equation} When a voltage is applied, the charged region changes in thickness, so $x$ is used here as the internal state describing the width of the charged region relative to the total thickness of the layer. \begin{equation} x = \frac{w}{D}, \hspace{10pt} 0 \le x \le 1 \end{equation} I have tried to set this up in Python, but my lack of understanding of differential equations and little practical experience with simulation in Python make it difficult to spot my mistakes. I am pretty sure I have some of the basics wrong. The code looks like this:
from scipy.integrate import odeint
import numpy as np
import matplotlib.pyplot as plt
R_ON = 100 #Resistance maximum conducting state [ohm]
R_off = 16e3 #Resistance minimum conducting state [ohm]
W = 2*np.pi #Angular frequency [rad/s]
V_0 = 5 #Volts
t0 = 0 #Time array start
tf = 1 #Time array stop
N = 50 #Array length
mu_V = 1e-14 #Oxygen vacancy mobility
D = 10e-9 #Titanium layer thickness
X0 = 0.76 #Normed charge region (initial value)
def M_tran(x):
    return R_ON*x + R_off*(1-x)
def i_tran(t, x):
    return (V_0*np.sin(W*t))/M_tran(x)
def odefun2(t, x):
    c = (mu_V*R_ON)/(D**2)
    return c*i_tran(t, x)
t = np.linspace(t0,tf,N+1)
T = odeint(odefun2, X0, t, tfirst=True)
U = np.multiply(i_tran(t, X0), M_tran(T[:,0]))
plt.plot(i_tran(t, X0), U)
Do I need a window function for x perhaps?
I'll attach the plot I get, in case it is of any help, although it makes no sense to me.
[Plot generated by this code]
For completeness, the analytic solution to this question is as follows. Your ODE can be written as
$$\frac{dx}{dt}=\frac{f(t)}{x+b(1-x)}$$
where $f(t)$ is a given forcing (which contains constants as well as the actual input voltage) and $b = R_{off}/R_{on}$; note that I have absorbed your $R_{on}$ into $f$ as well. This equation is separable and can be solved as
$$\int_{x_0}^x \left( y+b(1-y) \right) dy = \int_0^t f(s)\, ds.$$
Thus
$$\frac{(1-b)x(t)^2}{2} + b x(t) - \left ( \frac{(1-b) x_0^2}{2} + bx_0 \right ) = \int_0^t f(s) ds.$$
This is just a quadratic equation that you can solve for $x(t)$ for each $t$; the coefficients are actually structured such that a solution is guaranteed to exist at least for a short time no matter what $b$ or $x_0$ is, which I find quite neat.
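This quadratic solve is easy to carry out numerically. Below is a sketch using the question's parameter values, assuming the forcing works out to $f(t) = \mu_V V_0 \sin(Wt)/D^2$ after the $R_{on}$ factors cancel, with $b = R_{off}/R_{on}$; the root is chosen so that $x(0) = x_0$. As noted above, the formula is only meaningful while the discriminant stays non-negative (and physically while $0 \le x \le 1$):

```python
import numpy as np

# Parameter values from the question
R_ON, R_off = 100.0, 16e3
mu_V, D = 1e-14, 10e-9
V_0, W = 5.0, 2*np.pi
x0 = 0.76

b = R_off / R_ON  # resistance ratio appearing in the quadratic

def F(t):
    """Integral of f(s) = mu_V*V_0*sin(W*s)/D**2 from 0 to t."""
    return mu_V*V_0*(1 - np.cos(W*t)) / (W*D**2)

def x_analytic(t):
    """Solve (1-b)/2 * x^2 + b*x = C(t) for x, choosing the root
    with x(0) = x0.  Valid only while the discriminant is >= 0
    (and while 0 <= x <= 1 for the physical device)."""
    C = (1 - b)*x0**2/2 + b*x0 + F(t)
    disc = b**2 + 2*(1 - b)*C
    return (-b + np.sqrt(disc)) / (1 - b)
```

For instance, `x_analytic(0.0)` reproduces the initial state, and for small $t$ the state grows as the sinusoidal drive pushes charge through the device. With these particular parameter values $x(t)$ reaches the boundary $x = 1$ early in the cycle, which is one reason the window function you ask about can matter physically.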
Numerically, it looks like the only real problem with your implementation is that in the call to i_tran inside the plotting code, you need to use the time-dependent values of $x$, not the initial value.
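Concretely, the fix is a one-line change at the end of the script: feed the integrated state `T[:, 0]` into `i_tran` instead of `X0`. A sketch using the question's own names (the `plt.plot` call is left as a comment so the script runs headless; note also that with these parameter values $x$ can still drift outside $[0, 1]$ unless you add the window/clamp you mention):

```python
from scipy.integrate import odeint
import numpy as np

R_ON, R_off = 100.0, 16e3   # on/off resistances [ohm]
W, V_0 = 2*np.pi, 5.0       # angular frequency [rad/s], amplitude [V]
mu_V, D = 1e-14, 10e-9      # vacancy mobility, layer thickness
X0 = 0.76                   # initial normalized state

def M_tran(x):
    return R_ON*x + R_off*(1 - x)

def i_tran(t, x):
    return V_0*np.sin(W*t) / M_tran(x)

def odefun2(t, x):
    return (mu_V*R_ON)/(D**2) * i_tran(t, x)

t = np.linspace(0, 1, 51)
T = odeint(odefun2, X0, t, tfirst=True)

x = T[:, 0]          # time-dependent state, NOT the constant X0
I = i_tran(t, x)     # current computed with the evolving state
U = I * M_tran(x)    # voltage across the device

# plt.plot(I, U)  # now traces the hysteretic I-V loop
```

A quick sanity check on the fix: since $U = i(t)\,M(x(t))$ and $i(t) = V(t)/M(x(t))$, the recovered voltage should simply equal the applied voltage $V_0 \sin(Wt)$, which the original `X0`-based version fails to satisfy.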