Experimental Hardware Neurons

Updated 6/16/03

 

Faceplate of the experimental device. 4.5" x 5.25" Frac-Rac faceplate.
Banana jacks used for stacking with parallel 3.5mm outputs.
Left side of symbol contains 3 position "threshold" switch.
Right side of symbol (white half) contains an LED which flashes when the "cell" fires.
Blue jacks are "excitatory" inputs, black jacks are "inhibitory" inputs, red jacks are 10 volt pulse outputs.

 

A type of logic element based on pulses, rather than levels, is presented. In electronic music terms, this would be equivalent to logic elements for "triggers", rather than logic elements for "gates". It is an interesting and inexpensive way to make complex patterns of pulses. The reader will have to judge for themselves how "musical" the results are.

Below are some MP3 samples made using both types of pattern generators and 4 neurons to trigger a Roland Handsonic. Triggering is sent via MIDI using a Roland OP-8m CV to MIDI interface in "Whole" mode. Neurons 1-4 are routed to gate inputs 1-4, and potentiometers set the input voltages for the channel 1-4 CV inputs.

Style      Kit          Sample
Modern     Techey       Techno1.mp3
Modern     Techey       Techno2.mp3
Modern     Techey       Techno3.mp3
Modern     Industrial   Electro1.mp3
Modern     Industrial   Electro2.mp3
Ethnic     Tabla I      tabla1.mp3
Ethnic     Tabla I      tabla2.mp3
Standard   Standard     standard.mp3
Standard   JazzBrush    brush.mp3
Standard   TR-808       analog.mp3

These elements can then be used to process patterns generated by another pattern generator. The use of current differencing amplifiers produces a design with very low parts count and expense. Two "neurons" can be realized from each LM3900 package. Fans of John Simonton will notice the unit delay is produced by a single stage from the PAIA 4780 sequencer. With the component values shown, the cells output a synchronous 10 millisecond pulse at a variable rate (tempo). A "Run/Stop" switch resets the entire array to a known state for startup (intro).

The pulse width can be made adjustable by using a pot for R3. Make multiple copies of sheet 3 of the schematic using the common nets. The experimental device produces 4 neurons with a chip count of 1 x LM555, 3 x LM3900, 1 x CD4024 and 1 x CD4081.

The background introduction is excerpted from Marvin Minsky's 1967 book "Computation: Finite and Infinite Machines". The book is long out of print, but worth looking for in a library. Warren S. McCulloch's book "Embodiments of Mind" was mentioned in the "Whole Earth Catalog". Additional research led to the Minsky book and this device. A long strange trip indeed.

 
START OF MINSKY EXCERPT
 

3.1 THE "CELLS" OF McCULLOCH AND PITTS

Our machines will be assembled by interconnecting certain basic elements called "cells." Each cell will be represented in our diagrams by a circular figure:

From each cell leads a single line or wire, called the "output fiber" of the cell. This output fiber may branch out after leaving the cell; each branch must ultimately terminate as an input connection to another (or perhaps the same) cell. Two types of termination are allowed. One, called an excitatory input, is represented by a little black arrow:
The other, called an inhibitory input, is represented by a little outline circle:
We permit any number of input connections to a cell. Figure 3.1-1 shows a few typical cell configurations, with names whose meaning will become evident later in the chapter.
 

We build up more complex machines by interconnecting cells to form "neural networks." The only restriction on the formation of such nets is that, while we permit output fibers to branch out to form arbitrarily many input terminations, we do not allow output fibers from different cells to fuse together. A glance at the illustrations in this chapter will show what is permitted.

Each cell is a finite-state machine and accordingly operates in discrete moments. The moments are assumed synchronous among all cells. At each moment a cell is either firing or quiet; these are the two possible states of the cell. For each state there is an associated output signal, transmitted along the cell's fiber branches. It is convenient to imagine these signals as short waves or pulses transmitted very quickly along the fiber. Since each cell has only two possible states, it is even more convenient to think of the firing state as producing a pulse while the quiet state yields no pulse. (One may think of "no pulse" as the name of the signal associated with the quiet state.)

Cells change state as a consequence of the pulses received at their inputs. In the center of the circle representing each cell there is written a number, called the threshold of that cell. This threshold determines the state-transition properties of a cell C in the following manner.

At any moment t, there will be a certain distribution of activity on the fibers terminating upon C. We ignore all fibers which are quiet at this time, and look to see if any inhibitory inputs are firing. If one or more of the inhibitors are firing, then C will not fire at time t + 1.

Otherwise if no inhibitor is firing we count up the number of excitatory inputs that are firing; if this number is equal to or greater than the threshold of C (the number in the circle), then C will fire at time t + 1.

In other words: a cell will fire at time t + 1 if and only if at time t, the number of active excitatory inputs equals or exceeds the threshold, and no inhibitor is active. A cell with threshold 1 will fire if any excitatory fiber is fired and no inhibitor is. A cell with threshold 2 requires at least two excitations (and no inhibition) if it is to fire at the next moment. A cell with threshold 0 will fire at any time unless an inhibitor prevents this.
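To make the rule concrete, here is a minimal Python sketch of the firing rule described above (names are illustrative, not taken from the excerpt or the schematics):

def cell_fires(threshold, excitatory, inhibitory):
    """McCulloch-Pitts update: given the inputs active at time t,
    decide whether the cell fires at time t + 1."""
    # Any active inhibitor blocks firing outright.
    if any(inhibitory):
        return False
    # Otherwise fire if the count of active excitatory inputs
    # meets or exceeds the threshold written in the circle.
    return sum(excitatory) >= threshold

# A threshold-2 ("AND") cell with two excitatory inputs:
print(cell_fires(2, [True, True], []))   # True  -> fires at t + 1
print(cell_fires(2, [True, False], []))  # False -> stays quiet
# A threshold-0 cell fires unless inhibited:
print(cell_fires(0, [], [True]))         # False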

Fig. 3.1-2 illustrates the behavior of a number of different cells.

 

REMARKS

(1) The reader will note that the state of a cell at t + 1 doesn't depend on its state at time t. These are very simple "neurons" indeed. One notable property of real neurons is that once having fired, there is an interval during which they can't be fired again (called the refractory interval), and this illustrates a real dependency on the previous state. However, we will see (section 3.6) that it is easy to "simulate" such behavior with groups of McCulloch-Pitts cells.

(2) Our "inhibition" is here absolute, in that a single inhibitory signal can block response of a cell to any amount of excitation. We might equally well have used a different system in which a cell fires if the difference between the amounts of excitation and inhibition exceeds the threshold. This is equivalent to having the inhibition signals increase the threshold. McCulloch, himself, [1960] has adopted this model for certain uses.

(3) Delays. We assume a standard delay between input and output for all our cells. In more painstaking analyses, such as those of Burks and Wang [1957] and Copi, Elgot, and Wright [1958], it has been found useful to separate the time-dependency from the other "logical" features of the cells and to introduce special time-delay cells along with instantaneous logical cells. The use of instantaneous logical cells forces one to restrict the ways in which the elements can be connected (lest paradoxical nets be drawn). We will avoid them except briefly in section 4.4.1.

(4) Mathematical Notations. In the original McCulloch-Pitts paper [1943], the properties of cells and their interconnections were represented by a mathematical notation employing expressions from a "temporal propositional calculus." Because most of our arguments can be based on commonsense reasoning about diagrams, it did not seem necessary here to bring in this mathematical apparatus. Of course, anyone who intends to design efficient complex nets or computers will have to learn to use the appropriate modern mathematical representations. For reasons discussed in note 3 of chapter 4, the original McCulloch-Pitts paper is recommended not so much for its notation as for its content, philosophical as well as technical.

 

3.2 MACHINES MADE UP OF McCULLOCH-PITTS NEURONS

In this section we show how one can go about constructing some useful, more complicated machines, using a few useful kinds of neural nets. Our goal is to collect enough devices to make a general-purpose computer!

 

3.2.1 Delays

Examples 1, 3, and 4 of section 2.3 give the state-transition structures for machines which remember, and read out, the last one, two, and three binary signals they have received. The network machines in Fig. 3.2-1 do precisely the same things. (Compare with Figs. 2.3-1, -3, and -4.)

The behaviors of these nets are described by the equations R(t) = S(t - 1), R(t) = S(t - 2), and R(t) = S(t - 3), respectively.

Note that while it requires 2^n states to remember n digits (see 2.3), it takes only n cells. Of course, the network delay machine with n cells does indeed have 2^n total states. For each cell can have 2 states, and one must multiply these numbers together for all the cells. Remember precisely why the state diagram (Fig. 2.3-4) of the 3-delay net needs fully 8 (2 x 2 x 2) states.
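As an illustration, the 3-delay net can be modeled by shifting a short list once per moment; a minimal Python sketch (names illustrative):

def step_delay_chain(state, s_t):
    """Advance a chain of threshold-1 cells by one moment.
    'state' holds the firing/quiet value of each cell; the input
    signal S(t) enters the first cell and each cell copies its
    predecessor, so the last cell gives R(t) = S(t - n) for n cells."""
    return [s_t] + state[:-1]

state = [0, 0, 0]            # three quiet cells
inputs = [1, 0, 1, 1, 0, 0]
for s in inputs:
    state = step_delay_chain(state, s)
    print(s, state)          # the last element is S delayed by 3 moments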

This example already shows two reasons why we turn from state diagrams to network machines made of parts, as we start to consider complicated machines. First, the state diagrams become too large, as they grow exponentially with the number of distinct items or signals the machine must remember. Second, the division into parts rather than states makes the diagrams more meaningful, at least to the extent that different aspects of the behavior are localized in, or accounted for, by different physical parts of the machine.

 

3.2.2 Gates and switches.

Control of the flow of information. Suppose that it is desired to control the flow of signals or information from one place to another (Fig. 3.2-2).

The information is traveling along an already established path, and we wish to control its flow by signals under our own control. We introduce (Fig. 3.2-3) cells of threshold 2 ("AND" cells) into each fiber: in the figure we use three parallel fibers just to show that we can handle any number at once. Now, during any interval in which we want information to flow, we can send a string of pulses along the "control" or "gating" fiber. So long as the control pulses are present, the output fibers will carry the same signals that come into the (upper) inputs; if the control pulses are absent there will be no output.
We can use the inhibitory effect to control information flow, within the family of cells we are allowed to use, by using a network like that of Fig. 3.2-5. Note that here, information flows through only when the control fiber does not fire. In the "facilitation" (to use the neurophysiologist's word) gate of Fig. 3.2-3, information flows only when the control fiber does fire.
Now consider the network obtained when we use both "inhibition" and "facilitation" gates (Fig. 3.2-6)!
When the control fiber is quiet, the signals fire the threshold 1 (OR) cells, and the information flows to Receiver 1. The threshold 2 (AND) cells do not fire. When the control fiber is active, the AND cells transmit the signals to Receiver 2, just as in Fig. 3.2-3. Also, the OR cells are now all "inhibited," and no signals can get to Receiver 1. Thus the activation of the control fiber has the effect of flipping a double-throw (three-pole) switch between the transmitter and the two receivers.
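A minimal Python sketch of one fiber of this switch net, using the threshold rules from section 3.1 (names illustrative; the AND cell feeds Receiver 2, the inhibited OR cell feeds Receiver 1):

def switch_net(signal, control):
    """One moment of the Fig. 3.2-6 'double-throw switch' net for a
    single information fiber.  The threshold-2 (AND) cell passes the
    signal to Receiver 2 only when the control fiber fires; the
    threshold-1 (OR) cell passes it to Receiver 1 unless inhibited
    by the control.  Returns what each receiver sees one moment later."""
    to_receiver_2 = (signal + control) >= 2           # AND cell
    to_receiver_1 = (signal >= 1) and (control == 0)  # OR cell, inhibited by control
    return to_receiver_1, to_receiver_2

print(switch_net(1, 0))  # (True, False): control quiet  -> Receiver 1
print(switch_net(1, 1))  # (False, True): control active -> Receiver 2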

 

3.2.3 Memory

The control system just described is not very convenient because its operator has to continue to send control pulses throughout the interval during which he wants signal flow. The net in Fig. 3.2-7 permits the operator to initiate signal transmission with a single signal, and to terminate it with another; no attention is required during the intervening period of transmission.

The trick here is to use a "feedback" fiber that runs from the output fiber of a cell back to an excitatory termination at the input of the very same cell. Once this cell has been fired (by a signal from the "start" fiber) it will continue to fire at all successive moments, until it is halted by a signal on the inhibitory "stop" fiber. Throughout this interval of activity it will send pulses to the "gate" cell and permit the passage of information. This little net has some memory. It remembers whether it has received a stop signal since it last received a "start" signal.
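A minimal Python sketch of this start/stop memory cell, again using the threshold rules from section 3.1 (names illustrative):

class StartStopCell:
    """Self-exciting memory cell of Fig. 3.2-7: a threshold-1 cell whose
    output feeds back to its own excitatory input.  'start' sets it
    firing, the inhibitory 'stop' silences it, and in between the
    feedback keeps it firing with no further attention."""
    def __init__(self):
        self.firing = False

    def step(self, start, stop):
        # Inhibition is absolute; otherwise one excitation (start or
        # the cell's own feedback) is enough to reach the threshold of 1.
        self.firing = (not stop) and (start or self.firing)
        return self.firing

cell = StartStopCell()
for start, stop in [(1, 0), (0, 0), (0, 0), (0, 1), (0, 0)]:
    print(cell.step(start, stop))   # True, True, True, False, False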

 
END OF EXCERPT
 
Those familiar with Buchla sequencers will notice the above figure describes the logic of the Buchla clock sources. This similarity originally suggested the possible utility of such cells for rhythm generation and synthesizer clocking control.
 

Schematics of the Experimental Hardware Neurons

 

Synchronous clock source for all neurons. Uses 1 x 555 timer and 1 x LM3900.
Note that off-sheet references are labeled as inputs or outputs for that sheet.

CLK1 is a positive going ~10 millisecond pulse at a rate set by R4 and C3.
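If the 555 is wired in the textbook astable configuration (an assumption; the schematic is not reproduced in this text), the standard datasheet formulas give the rate. A small Python sketch with made-up placeholder values for the timing parts:

# Generic 555 astable timing formulas from the standard datasheet,
# offered only as a rough guide.  Whether CLK1 here uses the textbook
# astable wiring is an assumption, and the values below are placeholders,
# not the R4/C3 values from the schematic.
R_A = 10e3      # ohms (hypothetical)
R_B = 100e3     # ohms (hypothetical)
C   = 1e-6      # farads (hypothetical)

freq      = 1.44 / ((R_A + 2 * R_B) * C)   # astable frequency
high_time = 0.693 * (R_A + R_B) * C        # seconds the output spends high
low_time  = 0.693 * R_B * C                # seconds the output spends low
print(f"{freq:.1f} Hz, high {high_time*1e3:.1f} ms, low {low_time*1e3:.1f} ms")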


CLK2 is a negative going ~300 microsecond pulse triggered by the rising edge of CLK1.
It is used to transfer data from the first LM3900 SR latch to the second.
NOTCLK2 is used to reset the first SR Latch.


RESET is a positive going ~300 microsecond pulse triggered by the falling edge of CLK1.
RESET is used to reset the second LM3900 SR latch.

 

Simple pattern generator for neural elements. Uses 1 x CD4024 and 1 x CD4081.
Produces pulses at half time intervals. Outputs 10 volt 10 millisecond pulses with ~1K output impedance.

MANR is used to reset all latches to a known state (low).
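
For intuition, the CD4024 is a 7-stage ripple binary counter, so each output runs at half the rate of the one before it. A small Python sketch of that half-time relationship (it models only the divider timing, not the CD4081 gating on the sheet; names illustrative):

# Print which CD4024-style divider outputs are high on each clock.
# Q1 toggles every clock, Q2 every 2 clocks, Q3 every 4, and so on;
# each stage runs at half the rate of the previous one.
STAGES = 4   # show only the first few of the counter's 7 stages
for clock in range(16):
    bits = [(clock >> n) & 1 for n in range(STAGES)]
    print(clock, bits)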

 

A more complex pattern generator for neural elements. Uses 2 x CD4089.
Produces pulses at intervals set by DIP switches. Outputs 10 volt 10 millisecond pulses with ~1K output impedance.

MANR is used to reset all latches to a known state (low).

Pulse patterns produced by the CD4089 for various DIP switch settings.
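
For reference, the CD4089 is a binary rate multiplier: the 4-bit rate word set on the DIP switches selects how many of every 16 clock pulses reach the output. The sketch below illustrates only that pulse-density property using a simple accumulator placement; the real chip's pulse positions follow its internal gating and will differ:

def rate_multiplier_pattern(rate, frame=16):
    """Emit 'rate' pulses out of every 'frame' clocks.  This uses a
    phase-accumulator placement for illustration; it is NOT the
    CD4089's exact gating, only the same overall pulse density."""
    acc, pattern = 0, []
    for _ in range(frame):
        acc += rate
        if acc >= frame:
            acc -= frame
            pattern.append(1)
        else:
            pattern.append(0)
    return pattern

for rate in (3, 5, 8):
    print(rate, rate_multiplier_pattern(rate))   # 3, 5, 8 pulses per 16 clocks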

 

Schematic of 2 actual neuron elements.

Outputs 10 volt 10 millisecond pulses with ~1K output impedance.

Switches S3 and S4 are SPDT Center OFF types which set the cell threshold to 0 (NOT), 1 (OR) or 2 (AND).

This is basically two SR latches in series formed by the positive feedback of R53 and R48.
CLK2 transfers data from the first latch to the output as in the PAIA 4780.
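
A rough behavioral sketch of why the two cascaded latches keep the array synchronous (names and structure are illustrative, not read from the schematic): the input decision is sampled into the first latch, then CLK2 copies it to the output latch, so cross-connected cells cannot race.

class TwoStageCell:
    """Behavioral model of one neuron element's two cascaded latches.
    The first latch samples the thresholded input decision; CLK2 then
    copies it to the output latch, so all cells in the array change
    state together even when their outputs feed back to other inputs."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.first = False    # first SR latch
        self.output = False   # second SR latch (drives the LED / red jack)

    def sample(self, excitatory, inhibitory):
        # Input stage: decide the next state from the current inputs.
        self.first = (not any(inhibitory)) and sum(excitatory) >= self.threshold

    def clk2(self):
        # CLK2 edge: transfer the first latch to the output latch.
        self.output = self.first
        return self.output

# Two cross-connected threshold-1 cells ping-pong a single pulse back
# and forth without racing, because sampling and transfer are separate.
a, b = TwoStageCell(1), TwoStageCell(1)
a.output = True
for _ in range(4):
    a.sample([b.output], []); b.sample([a.output], [])
    print(a.clk2(), b.clk2())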

 

Back of the Neural Pulser showing clock generator, pattern generators, and 4 neurons built on a Radio Shack experimenters board.

 


 
Portions Copyright 2003 Wiard Synthesizer Company