Pattern Engine


The lower right part of the logo may be built from digital memory units and a simple voting/threshold unit (figure: Commutative Diagram). Each memory unit stores a map from a finite discrete space Xi into a finite space Y (figure: Memory Unit). Additionally, there are two special points: ⊥ ∈ Y, called "undefined", and ⊥ ∈ Xi, called "unknown". Obviously, ⊥ ∈ Xi at "In" forces "Out" ∈ Y to ⊥. Initially, all of Xi is mapped to ⊥: the memory unit is empty. When the boolean input "Update" is false, the memory contents are fixed. When it is true, the value at the "Set" input is copied to "Out". The association between the value at "In" ∈ Xi and "Out" ∈ Y remains stored when "Update" returns to false.
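The behaviour of a single memory unit can be sketched as follows. This is a minimal illustration, not the author's implementation; the names `MemoryUnit`, `step` and `BOTTOM` are invented for the example, with `BOTTOM` standing in for ⊥.

```python
# Sketch of one memory unit: "In" is the address, "Set"/"Update"
# drive writes, "Out" is the returned value. BOTTOM stands for ⊥.

BOTTOM = None  # the special point ⊥ ("undefined" / "unknown")

class MemoryUnit:
    """Stores a map from a finite address space Xi into Y.

    Initially every address maps to ⊥, i.e. the unit is empty.
    """

    def __init__(self):
        self.table = {}  # addresses absent from the table map to ⊥

    def step(self, addr, set_value=BOTTOM, update=False):
        # ⊥ at "In" forces "Out" to ⊥, regardless of "Update".
        if addr is BOTTOM:
            return BOTTOM
        if update:
            # Copy "Set" to "Out"; the association stays stored
            # after "Update" returns to false.
            self.table[addr] = set_value
        return self.table.get(addr, BOTTOM)
```

For example, an empty unit answers ⊥ everywhere; after one write cycle with `update=True`, the stored association persists.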

The Pattern Engine

The memory units store the functions fi of the logo. Their address spaces are the small discrete spaces Xi. The voting unit implements the majority decision μ. The special point ⊥ is treated as an abstention. The misalignment between the diagonal and the image of the fi in Yn is measured by majority and veto. If the majority is high (> α, the majority threshold) and the veto is low (< β, the veto threshold), i.e. if the fit is good, a feedback path stores the majority result back into the memory cells. This makes the diagram commute.
(We may replace majority and veto by Shannon entropy. The entropy of a point in Yn measures how well the diagram of our logo commutes. The feedback path is activated whenever the entropy is below a threshold.)
The voting unit has two additional inputs, "Set" and "Update". The boolean input "Update" forces majority and veto to 100% and 0%, respectively. We can use "Set" and "Update" to store reference examples. In this respect, the entire pattern engine behaves like a single memory unit with an unusual addressing scheme. The exact values of α and β are not important. A low veto threshold (zero) may improve semi-supervised learning, but it is not indispensable.
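The voting step can be sketched as below, under the reading that majority + veto + undecided = 1: "majority" is the fraction of memory outputs equal to the winning value, "veto" is the fraction voting for some other value, and ⊥ outputs count as abstentions. The function name `vote` and the threshold values are assumptions for illustration.

```python
# Sketch of the voting unit with majority/veto thresholds and
# the "Set"/"Update" override described in the text.

from collections import Counter

BOTTOM = None            # stands in for ⊥ (abstention)
ALPHA, BETA = 0.5, 0.2   # illustrative thresholds; exact values unimportant

def vote(outputs, set_value=BOTTOM, update=False):
    """Return (winner, majority, veto, feedback_active)."""
    if update:
        # "Update" fakes majority and veto to 100% and 0%,
        # so reference examples at "Set" are always stored.
        return set_value, 1.0, 0.0, True
    votes = Counter(y for y in outputs if y is not BOTTOM)
    if not votes:
        return BOTTOM, 0.0, 0.0, False
    winner, top = votes.most_common(1)[0]
    majority = top / len(outputs)
    veto = (sum(votes.values()) - top) / len(outputs)
    # Feedback path: store the majority result back into the cells.
    feedback = majority > ALPHA and veto < BETA
    return winner, majority, veto, feedback
```

With three votes for "a" and one abstention, majority is 0.75, veto is 0, and feedback fires; with a 3:1 split between two symbols, the veto blocks it.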

A small extension allows for unsupervised learning of a clustering task. Set the majority threshold to 1 and the veto threshold to zero, effectively disabling the feedback path. All memory cells are empty (⊥), and the space Y consists of just a single point ⊥. Majority is zero. In this situation, we add a new point to Y and activate the "Set" input. Majority jumps to 1. If the input pattern changes slowly, not all address inputs will change at the same time. Step by step, majority drops from 1 to zero. Whenever the output changes to ⊥, a new symbol in Y is created.
As the memory fills up, the rate of symbol creation drops. Those input patterns that correspond to symbol creation will have a majority close to one, but most input patterns will give a low majority and a high veto. We have planted many different germs of functions into Y, but they do not grow.
Now we lower the majority threshold and raise the veto threshold. The symbol populations begin to compete against each other under the influence of the continuously varying input. As a result, the area covered by some symbols grows while other symbols go extinct.
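The symbol-creation step can be sketched as follows. The helper name `observe`, the dict-based memory cells and the symbol generator are invented for this example; only the rule itself (when the vote falls to ⊥, plant a fresh symbol via "Set"/"Update") comes from the text.

```python
# Sketch of unsupervised symbol creation: feedback is disabled,
# and a new symbol is stored whenever the voted output is ⊥.

from collections import Counter
from itertools import count

BOTTOM = None
new_symbol = (f"y{k}" for k in count())  # the growing space Y

def observe(cells, addresses):
    """Process one input pattern; create a symbol if the vote is ⊥."""
    outputs = [c.get(a, BOTTOM) for c, a in zip(cells, addresses)]
    votes = Counter(y for y in outputs if y is not BOTTOM)
    winner = votes.most_common(1)[0][0] if votes else BOTTOM
    if winner is BOTTOM:
        # Output fell to ⊥: plant a new germ via "Set"/"Update".
        winner = next(new_symbol)
        for c, a in zip(cells, addresses):
            c[a] = winner
    return winner

cells = [dict(), dict(), dict()]
s1 = observe(cells, ["a", "b", "c"])  # empty memory: a new symbol is created
s2 = observe(cells, ["a", "b", "c"])  # same pattern: the same symbol wins
s3 = observe(cells, ["a", "b", "d"])  # slow change: the old symbol still wins
s4 = observe(cells, ["p", "q", "r"])  # all addresses new: vote is ⊥, new symbol
```

Note how a slowly changing pattern keeps its symbol, while a completely new pattern drops the vote to ⊥ and triggers creation.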

Some Modifications

The voting process and the feedback rule that control the growth of the germs allow for some variations on the general theme. Some of these variants are listed below.

The simple feedback rule (majority > α ∧ veto < β) may be replaced by (majority > α ∧ veto < 1 − majority − veto). In effect, the undecided voters take the side of the majority. Its main advantage is a self-stabilization of category boundaries: if there are no more undecided memory cells, the majority wins (of course), but it can no longer toggle its opponents. Without such a rule, small minorities die out and their contribution is lost forever.

Other obvious modifications include a weighting of votes. If a memory cell is toggled too often, its weight decreases. We might be tempted to store an entire probability distribution in each cell. This is not necessary, because there are many address input values that identify the class of the input; in this case, the elaborate Bayesian calculus gives the same result as a crude approximation.

More interesting are explicit time dependencies of the weights. The weight of a vote starts at 100% with a new address input and decreases while the address input remains constant. Such behaviour is common in sensory neurons. In a pattern engine it will lead to oscillations and perceptual bistability. In a hierarchical system, the oscillations will be coupled across hierarchy boundaries, leading to complex patterns and ripples.
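The two feedback rules can be compared directly. This sketch again assumes majority + veto + undecided = 1, with ⊥ outputs as the undecided voters; the function names and threshold value are illustrative.

```python
# Sketch comparing the simple feedback rule with the
# self-stabilizing variant from the text.

ALPHA = 0.5  # illustrative majority threshold

def simple_rule(majority, veto, beta=0.2):
    return majority > ALPHA and veto < beta

def self_stabilizing_rule(majority, veto):
    # Undecided voters side with the majority. Once no cells are
    # undecided (majority + veto == 1), feedback can no longer fire,
    # so the majority cannot toggle its opponents.
    undecided = 1.0 - majority - veto
    return majority > ALPHA and veto < undecided

# With no undecided cells, a 70/30 split wins but cannot
# overwrite the 30% minority under the self-stabilizing rule:
print(self_stabilizing_rule(0.7, 0.3))   # False: minority survives
print(simple_rule(0.7, 0.3, beta=0.4))   # True: minority is toggled
```

The minority's contribution is preserved exactly in the situation where the simple rule would erase it.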

The weight may also depend on the total size of the fibres. Calculate the weight of an entry y = fi(t) at address t = hi(x) as a decreasing function of the fibre size, such as 1 / #fi⁻¹(y) or max(0, #X/#Y + n0 − #fi⁻¹(y)).
A newly created germ has a high weight and tends to grow rapidly. Unlimited growth, however, is checked by the decreasing weight: as a fibre is compressed, it shows increasing resistance to further compression, and if a fibre becomes too large, further expansion is checked.
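Both weight functions from the text can be written down directly. Here `fibre_size` is #fi⁻¹(y), the number of addresses currently mapped to y; the values chosen for #X, #Y and n0 are illustrative.

```python
# Sketch of the two fibre-size weights: 1/#fi⁻¹(y) and
# max(0, #X/#Y + n0 − #fi⁻¹(y)).

X_SIZE, Y_SIZE, N0 = 100, 10, 2  # illustrative #X, #Y and n0

def weight_reciprocal(fibre_size):
    return 1.0 / fibre_size

def weight_linear(fibre_size):
    # Positive while the fibre is below its "fair share" #X/#Y + n0;
    # zero once it grows past it, so unlimited growth is checked.
    return max(0.0, X_SIZE / Y_SIZE + N0 - fibre_size)

# A new germ (fibre size 1) votes with a high weight; a fibre at
# its fair share of 12 addresses has weight 0 and stops expanding.
print(weight_linear(1))    # 11.0
print(weight_linear(12))   # 0.0
```

Either choice gives a new germ a strong initial vote and throttles fibres that have already absorbed their share of the address space.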
This simple approach works amazingly well. It is used in my current attempts to build an OCR engine. The following graph shows the sizes of the largest fibre, the smallest fibre, and the median. At the left of the diagram, all fibres are created with equal size; the feedback process keeps changing them.


Date Posted: 29 Mar 2009 @ 06:29 PM
Last Modified: 16 Apr 2011 @ 03:18 PM
Posted By: Hardy
