V. I. Varshavsky, Professor
Rafail A. Lashevsky, Professor
Vyacheslav B. Marakhovsky, Professor
V. V. Smolensky, Research Associate
In 2000, the research themes of the laboratory were On-Chip Learning Artificial
Neuron, Weight Refreshing in On-Chip Learning ANN, and Logical Timing and
Decentralized Control in Massively Parallel Computing Systems.
Under the first two themes, the problem of designing on-chip learning
artificial neural networks based on floating-gate transistors was studied. A
design methodology for such structures was created; it can be used for circuit
optimization that takes deviations of element parameters and dimensions into
account. An especially important problem is tolerance to supply voltage
deviations, because the supply voltage may differ between learning time and
working time. As a result of the investigation, the main parts of the
structure and the methodology for their design were developed. The proposed
structure is fully tolerant to any kind of deviation and is especially
efficient in environments where the supply voltage varies greatly. This
tolerance was achieved by using deviations in one part of the structure to
compensate deviations in another part. Such an approach is important because
process technology is moving toward smaller design rules and lower supply
voltages. It was also shown that a step voltage comparator based on
floating-gate transistors can be used to refresh synapse weights represented
as voltages in dynamic memory.
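The refresh idea can be modeled behaviorally. The following Python sketch is a
minimal model under our own assumptions, not the circuit itself: the synapse
weight is taken to be a voltage on a leaky capacitor, and the comparator is
assumed to restore the decayed voltage to the nearest discrete step level
above it, so the stored weight stays on its original level as long as it
decays by less than one step between refresh cycles. All names and numeric
values are illustrative.

  import math

  STEP = 0.05           # assumed comparator step size [V]
  TAU = 10.0            # assumed leak time constant of the dynamic memory [s]
  REFRESH_PERIOD = 0.1  # assumed interval between refresh cycles [s]

  def leak(v, dt, tau=TAU):
      """Exponential decay of the stored weight voltage (memory leak)."""
      return v * math.exp(-dt / tau)

  def refresh(v, step=STEP):
      """Snap the decayed voltage back up to the nearest step level above,
      mimicking the step voltage comparator restoring the weight."""
      return math.ceil(v / step) * step

  # A weight written as 0.60 V survives many leak/refresh cycles intact.
  v = 0.60
  for _ in range(1000):
      v = refresh(leak(v, REFRESH_PERIOD))
  print(f"weight after 1000 cycles: {v:.3f} V")  # stays at the 0.60 V level

The model makes the timing constraint explicit: the refresh period must be
short enough that the decay per cycle is smaller than one comparator step,
otherwise the weight drifts down to a lower level.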
In the third theme, we studied the functional power of beta-driven CMOS
artificial neurons, along with ways of increasing their capability and
implementability. The following results were obtained:
- A special class of threshold functions (Horner's functions) was suggested
as test functions for neuron learning. It was shown that Horner's functions
have complexity close to the maximum and require the minimum length of
teaching sequence for a given threshold, which is very important for
experiments with neuron learning (illustrated in the sketch after this list).
- A beta-driven artificial neuron learnable to non-isotonous threshold
functions was studied. We suggested a neuron whose synapses can form both the
weight and the type (excitatory or inhibitory) of an input during learning,
using only increment and decrement signals. A neuron with such synapses can
be taught an arbitrary threshold function of a certain number of variables.
- For this kind of neuron, synapse circuits with one and with two memory
elements for storing positive and negative input weights were suggested. The
results of SPICE simulation show that the problem of teaching a neuron
non-isotonous threshold functions has stable solutions.
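The first two results above can be illustrated in software. The sketch below
is a minimal Python model under our own assumptions, not the beta-CMOS
circuit: it takes the nested-product (Horner scheme) form of the test
functions, verifies by exhaustive enumeration that it coincides with a
threshold function whose weights are Fibonacci numbers, and then teaches a
plain threshold element using only unit increment and decrement signals, so
that the sign (excitatory or inhibitory) of each weight emerges during
learning.

  import itertools

  def horner(x):
      """Horner's function H(x1,...,xn) = x1 or x2*(x3 or x4*(...))."""
      if len(x) == 0:
          return 1              # empty tail of the nesting
      if len(x) == 1:
          return x[0]
      return x[0] | (x[1] & horner(x[2:]))

  # Exhaustive check: H equals a threshold function with Fibonacci weights
  # w1 = Fib(n), ..., w_{n-1} = w_n = 1 and threshold T = Fib(n).
  for n in range(1, 8):
      fib = [1, 1]
      while len(fib) < n:
          fib.append(fib[-1] + fib[-2])
      w_star = list(reversed(fib[:n]))
      T = w_star[0]
      for x in itertools.product((0, 1), repeat=n):
          assert horner(x) == int(sum(w * v for w, v in zip(w_star, x)) >= T)

  def teach(target, n, epochs=2000):
      """Perceptron-style teaching with unit increment/decrement steps only.
      Weights start at 0, so the type of every input is formed by learning."""
      w, theta = [0] * n, 0
      for _ in range(epochs):
          stable = True
          for x in itertools.product((0, 1), repeat=n):
              y = int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)
              t = target(x)
              if y != t:
                  stable = False
                  d = 1 if t else -1               # increment/decrement signal
                  w = [wi + d * xi for wi, xi in zip(w, x)]
                  theta -= d                       # the usual bias trick
          if stable:
              return w, theta
      raise RuntimeError("did not converge")

  w, theta = teach(horner, 5)   # a Horner's function as the test function

  # A non-isotonous target f = x1 or (x2 and not x3): at convergence the
  # weight of x3 is necessarily negative, i.e. the input became inhibitory.
  w, theta = teach(lambda x: int(x[0] or (x[1] and not x[2])), 3)
  print(w, theta)

Convergence of such unit-step teaching is guaranteed by the perceptron
convergence theorem whenever the target is a threshold function, which is
consistent with the stable solutions observed in the SPICE simulations.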
Within the framework of automata theory, we obtained a rather interesting
result: we proved the functional equivalence of bilateral linear cellular
automata arrays and cellular arrays with an arbitrary unilateral connection
graph. The task considered is building a cellular automaton such that an
array of automata of this type with an arbitrary unilateral bivalent
connection graph can solve the same problem as a bilateral linear cellular
automata array. It is presumed that the complexity of the cellular automaton
does not depend on the number of automata in the array and, possibly, depends
in some regular way on the rank of the respective graph vertex.
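For concreteness, the following Python sketch only illustrates the two kinds
of arrays being compared, not the equivalence construction itself, which is
the substance of the result. The transition rule (XOR over the visible
neighborhood) is an arbitrary placeholder; what matters is which cells each
automaton can see.

  def step_bilateral(cells):
      """One synchronous step of a bilateral linear array: cell i reads
      itself and both neighbors i-1 and i+1 (fixed 0 beyond the ends)."""
      n = len(cells)
      nb = lambda i: cells[i] if 0 <= i < n else 0
      return [nb(i - 1) ^ cells[i] ^ nb(i + 1) for i in range(n)]

  def step_unilateral(cells, in_edges):
      """One synchronous step of a unilateral array: cell i reads itself and
      only the cells its directed in-edges come from."""
      return [cells[i] ^ (sum(cells[j] for j in in_edges[i]) % 2)
              for i in range(len(cells))]

  # A unilateral ring: each cell sees only its predecessor on a directed cycle.
  n = 6
  ring = [[(i - 1) % n] for i in range(n)]
  state = [1, 0, 0, 0, 0, 0]
  for _ in range(3):
      state = step_unilateral(state, ring)
  print(state)                   # [1, 1, 1, 1, 0, 0]
  print(step_bilateral(state))   # here information spreads in both directions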
Following a top-down approach to education, undergraduate and graduate
students are involved in all steps of the research. They learn the real style
of design in the System-Circuit-Process environment, come to understand the
importance of a design methodology for circuit optimization, and prepare
themselves for working in a time of very fast change in microelectronics and
computer science.
Refereed Journal Papers
-
Rafail Lashevsky and Yohey Sato.
Deviation-Tolerant Floating Gate Structures
as a Way to Design an On-Chip Learning Neural Networks.
Soft Computing - A Fusion of Foundations,
Methodologies and Applications, Springer-Verlag, 2001.
Hardware implementation of artificial neural networks (ANNs) based on
transistors with floating gates is discussed. Choosing the analog approach to
weight storage rather than the digital one improves learning accuracy and
minimizes chip area and power dissipation. One of the problems in on-chip
learning ANNs with analog weight voltages is tolerance to supply voltage
deviations, because the supply voltage may differ between learning and
working times. The proposed floating-gate-based neuron structure can
compensate all kinds of deviations. A design methodology for such a structure
is developed.
Refereed Proceeding Papers
-
Varshavsky, V. and Marakhovsky, V.,
Non-Isotonous Beta-Driven Artificial Neuron.
SPIE's 14th Annual International Symposium
on Aerospace/Defense Sensing, Simulation,
and Controls. Applications and Science of
Computational Intelligence III,
pp. 250--257, SPIE, Orlando, Florida, April 2000.
In the paper we discuss variants of digital-analog CMOS implementation of an
artificial neuron taught logical threshold functions. The implementation is
based on the earlier suggested beta-driven neuron circuit consisting of
synapses with excitatory inputs, a beta-comparator, and three output
amplifiers. Such a circuit can be taught only threshold functions with
positive weights of variables, which belong to the class of isotonous Boolean
functions. However, most problems solved by artificial neural networks
require inhibitory inputs as well. If the input type (excitatory or
inhibitory) is known beforehand, the problem of inverting the weight sign is
solved trivially by inverting the respective variable. Otherwise, the neuron
should have synapses capable of forming the weight and type of the input
during learning, using only increment and decrement signals. A neuron with
such synapses can learn an arbitrary threshold function of a certain number
of variables. Synapse circuits with one or two memory elements for storing
positive and negative input weights are suggested. The results of SPICE
simulation prove that the problem of teaching non-isotonous threshold
functions to a neuron has stable solutions.
-
Varshavsky, V. and Marakhovsky, V.,
Some Remarks about Functional Equivalence
of Bilateral Linear Cellular Arrays and Cellular
Arrays with Arbitrary Unilateral Connection Graph.
Proceedings of 4th International Meeting
VECPAR 2000, Vector and Parallel Processing, Part II,
pp. 399--406, Springer, Porto, Portugal, June 2000.
The task considered is building a cellular automaton such that an array of
automata of this type with an arbitrary unilateral bivalent connection graph
can solve the same problem as a bilateral linear cellular automata array. It
is presumed that the complexity of the cellular automaton does not depend on
the number of automata in the array and, possibly, depends in some regular
way on the rank of the respective graph vertex.
Academic Activities
-
Rafail A. Lashevsky.
Organizer of the University of Aizu International Workshop on ``Education
for Information Technology in the 21st Century,'' June 13-14, 2000.
-
Vyacheslav B. Marakhovsky.
Member of ACM.
-
Vyacheslav B. Marakhovsky.
Member of IEEE.
Others
-
Yutaka Nemoto.
Graduation Thesis: Comparator for ANN with Dynamic Analog Memory.
University of Aizu, 2000, Thesis Adviser: Rafail A. Lashevsky.
-
Noriko Oosuka.
Graduation Thesis: Floating Gate Transistor Based ANN with Digital
and Analog Input Weights Memory Comparison.
University of Aizu, 2000, Thesis Adviser: Rafail A. Lashevsky.
-
Takaaki Asakura.
Graduation Thesis: Deviation-Tolerant Floating Gate Structures
for On-Chip Learning Artificial Network.
University of Aizu, 2000, Thesis Adviser: Rafail A. Lashevsky.
-
Hirotoshi Takagi.
Graduation Thesis: Step-Voltage Comparator with Capacitor Divider
for Artificial Neural Networks with Analog Voltage Memory.
University of Aizu, 2000, Thesis Adviser: Rafail A. Lashevsky.
-
Takanobu Asano.
Graduation Thesis: Beta-CMOS Artificial Neuron with Alternative Learning Step.
University of Aizu, 2000, Thesis Adviser: Vyacheslav B. Marakhovsky.
-
Yohey Sato.
Master Thesis: Deviation-Tolerant Floating Gate Structures as a Way to
Design an On-Chip Learning Neural Networks.
University of Aizu Graduate School, 2000, Thesis Adviser: Rafail A. Lashevsky.
-
Jun Ono.
Master Thesis: Analyses of Weight and Threshold Deviation Influence on the
On-Chip Learning ANN Parameters.
University of Aizu Graduate School, 2000, Thesis Adviser: Rafail A. Lashevsky.