Perceptrons that recognize AND, OR, NOT


I'm trying to figure out how to create a set of perceptron weights: one for AND, one for OR, one for NOT. I'm not sure where to begin, but any hints are greatly appreciated!

BEST ANSWER

The Wikipedia page on the perceptron suggests that weights are usually found through a somewhat involved "learning process," in which training cases are examined in order to find a perceptron that is optimal in some sense.

However, using just the definition of a perceptron as a function $f$ on binary vectors $x$, given a weight vector $w$ and a real-valued bias $b$, of the form $$f(x)=w \cdot x +b\tag{1}$$ where $f(x)$ returns $1$ if the right-hand side of $(1)$ is greater than $0$ and $0$ otherwise, it is fairly simple to obtain at least one "perceptron" for each of AND, OR, and NOT.
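This definition can be sketched directly in code. A minimal sketch, assuming nothing beyond $(1)$; the names `perceptron` and `f` are illustrative, not from the answer:

```python
def perceptron(w, b):
    """Return the thresholded function f(x) = 1 if w . x + b > 0, else 0."""
    def f(x):
        # Dot product of weight vector and input, plus bias, thresholded at 0.
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
    return f
```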

For both AND and OR one can use the weight vector $w=(1,1).$ To get a perceptron for OR, take bias $b=0$; then with $x=(p,q)$ the formula $f(p,q)=(1,1)\cdot (p,q)+0=p+q$ has the desired property: it is positive when $p$ OR $q$ is true, and not positive otherwise.
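The OR perceptron can be checked against the full truth table. A sketch with an illustrative name `f_or`:

```python
# OR perceptron from the text: w = (1, 1), b = 0.
def f_or(p, q):
    return 1 if 1 * p + 1 * q + 0 > 0 else 0

# Truth-table check: f_or agrees with logical OR on all binary inputs.
for p in (0, 1):
    for q in (0, 1):
        assert f_or(p, q) == (1 if (p or q) else 0)
```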

The same idea works for $p$ AND $q$ with the same $w=(1,1)$, but this time with bias $b=-1$: the formula $f(p,q)=p+q-1$ is positive exactly when both $p$ and $q$ are true, so it gives a working perceptron for AND.
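The same truth-table check works for AND; again the name `f_and` is illustrative:

```python
# AND perceptron from the text: w = (1, 1), b = -1.
def f_and(p, q):
    return 1 if p + q - 1 > 0 else 0

# p + q - 1 exceeds 0 only when p = q = 1.
for p in (0, 1):
    for q in (0, 1):
        assert f_and(p, q) == (1 if (p and q) else 0)
```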

In the case of NOT, the weight vector $w$ has only one component and may be taken as $(-1)$; with bias $b=+1$, the formula $-p+1$ captures NOT $p$.
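A quick check of the NOT formula, under the same sketch conventions:

```python
# NOT perceptron from the text: w = (-1,), b = +1.
def f_not(p):
    return 1 if -p + 1 > 0 else 0

# -0 + 1 = 1 > 0, so f_not(0) = 1; -1 + 1 = 0 is not > 0, so f_not(1) = 0.
assert f_not(0) == 1
assert f_not(1) == 0
```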