

ICNN'97 PANELS

Monday, June 9, 16:20 - 18:20: Panel on "Connectionist Learning: Is it Time to Reconsider the Foundations?"

Tuesday, June 10, 16:20 - 18:20: Panel on "Brain Imaging and Modelling"

Wednesday, June 11, 16:20 - 18:20: Panel on "Modeling the Creative Process"



ICNN'97 Panel on "Connectionist Learning: Is it Time to Reconsider the Foundations?"

Chair: Asim Roy, Arizona State University, USA


A panel will discuss the above question at ICNN'97 on Monday afternoon (June 9). Below is the abstract for the panel discussion, broadly outlining the questions to be addressed, followed by a slightly modified version of a subsequent note sent to the panelists. The issues are very broad and the questions are simple. The questions are not tied to any specific "algorithm," "network architecture," or "task to be performed." However, the answers to these simple questions may have an enormous effect on the nature of the algorithms we would call "brain-like," and on the design and construction of autonomous learning systems and robots. These questions also have a bearing on other brain-related sciences such as neuroscience, neurobiology, and cognitive science.

Please send any comments on these issues directly to: asim.roy@asu.edu. The collection of responses will be posted to the newsgroups in a few weeks. All comments/criticisms/suggestions are welcome. All good science depends on vigorous debate.

ABSTRACT

Classical connectionist learning is based on two key ideas. First, no training examples are to be stored by the learning algorithm in its memory (memoryless learning). The algorithm can perform whatever computations are needed on any particular training example, but must forget that example before examining others. The idea is to obviate the need for large amounts of memory to store a large number of training examples.
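To make the memoryless constraint concrete, here is a minimal sketch in Python. It is illustrative only: the function names and the perceptron-style update rule are assumptions, not an algorithm discussed by the panel. It contrasts a learner that updates on each example and then discards it with one that stores its examples.

```python
# Minimal sketch of memoryless (online) learning, using a perceptron-style
# update as the illustrative learner. Each example is used once for an
# update and then forgotten; no training example is ever stored.

def memoryless_learn(example_stream, n_features, lr=0.1):
    w = [0.0] * n_features
    b = 0.0
    for x, y in example_stream:              # y is +1 or -1
        activation = sum(wi * xi for wi, xi in zip(w, x)) + b
        if y * activation <= 0:              # misclassified: update
            w = [wi + lr * y * xi for wi, xi in zip(w, x)]
            b += lr * y
        # (x, y) goes out of scope here: the example is forgotten
    return w, b

# By contrast, a memory-based learner stores the examples and may
# revisit them as often as it likes (e.g. to size a network first):

def memory_based_learn(example_stream):
    stored = list(example_stream)            # remember everything
    # ... examine `stored` before and during training ...
    return stored
```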

The second key idea is that of local learning: the nodes of a network are autonomous learners. Local learning embodies the viewpoint that simple, autonomous learners, such as the single nodes of a network, can produce complex behavior in a collective fashion. This second idea, in its purest form, implies that a predefined net is provided to the learning algorithm, as in multilayer perceptrons.
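As a minimal illustration of local learning (a sketch, not any panelist's algorithm), a plain Hebbian rule lets each node update its own weights using purely local quantities, namely its inputs and its own output:

```python
# Sketch of a purely local update: the node sees only its own inputs
# and output, with no global error signal and no coordinator.

def hebbian_node_update(w, x, lr=0.01):
    y = sum(wi * xi for wi, xi in zip(w, x))           # the node's own output
    return [wi + lr * y * xi for wi, xi in zip(w, x)]  # dw_i = lr * y * x_i
```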

Recently, some questions have been raised about the validity of these classical ideas. The arguments against them are simple and compelling. For example, it is commonly observed that humans do remember and recall information that is provided to them as part of learning. And the task of learning is considerably easier when one remembers relevant facts and information than when one doesn't.

Second, strict local learning (e.g. back-propagation-type learning) is not a feasible idea for any system, biological or otherwise. It implies that the system predefines a network without having seen a single training example and without any knowledge of the complexity of the problem. No system can do that in a meaningful way. The other fallacy of the local learning idea is that it presupposes a "master" system that provides the design so that the autonomous learners can learn.

Recent work has shown that much better learning algorithms, in terms of computational properties (e.g. designing and training a network in polynomial time), can be developed if we don't constrain them with the restrictions of classical learning. It is, therefore, perhaps time to reexamine the ideas of what we call "brain-like learning."

This panel will attempt to address some of the following questions on classical connectionist learning:

1. Should memory be used for learning? Is memoryless learning an unnecessary restriction on learning algorithms?

2. Is local learning a sensible idea? Can better learning algorithms be developed without this restriction?

3. Who designs the network inside an autonomous learning system such as the brain?

PANEL MEMBERS

  1. Igor Aleksander
  2. Shunichi Amari
  3. Eric Baum
  4. Jim Bezdek
  5. Rolf Eckmiller
  6. Lee Giles
  7. Geoffrey Hinton
  8. Dan Levine
  9. Robert Marks
  10. Jean Jacques Slotine
  11. John G. Taylor
  12. David Waltz
  13. Paul Werbos
  14. Nicolaos Karayiannis (Panel Moderator, ICNN'97 General Chair)
  15. Asim Roy (Chair and Organizer)

A SUBSEQUENT NOTE SENT TO THE PANELISTS

The panel abstract was written to question the two pillars of classical connectionist learning: memoryless learning and pure local learning. With regard to memoryless learning, the basic argument against it is that humans do store information (remember facts/information) in order to learn. So memoryless learning, as far as I understand, cannot be justified by any behavioral or biological observations/facts. That does not mean that humans store any and all information provided to them. They are definitely selective and parsimonious in the choice of information/facts to collect and store.

We have been arguing that it is the "combination" of memoryless learning and pure local learning that is not feasible for any system, biological or otherwise. Pure local learning, in this context, implies that the system somehow puts together a set of "local learners" that start learning from the very first training example given to them (e.g. in back propagation), without having seen a single training example before and without knowing anything about the complexity of the problem. Such a system can be demonstrated to do well in some cases, but would not work in general.

Note that not all existing neural network algorithms are of this pure local learning type. For example, if I understand correctly, in constructive algorithms such as ART, RBF, RCE/hypersphere and others, the "decision" to create a new node is made by a "global decision-maker" based on evidence about the performance of the existing system. So there is quite a bit of global coordination and "decision-making" in those algorithms beyond simple "local learning".

Anyway, if we "accept" the idea that memory can indeed be used for the purpose of learning (Paul Werbos indicated so in one of his notes), the terms of the debate/discussion change dramatically. We then open the door to the development of far more robust and reliable learning algorithms with much nicer properties than before. We can then start to develop algorithms that are closer to "normal human learning processes." Normal human learning includes processes such as (1) collection and storage of information about a problem, (2) examination of the information at hand to determine the complexity of the problem, (3) development of trial solutions (nets) for the problem, (4) testing of trial solutions (nets), (5) discarding such trial solutions (nets) if they are not good enough, and (6) repetition of these processes until an acceptable solution is found. These learning processes are implemented within the brain, without doubt, using local computing mechanisms of different types. But they cannot exist without allowing for storage of information about the problem.
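Read as an outer loop, these six processes might be sketched as follows. This is a hypothetical illustration: the function names, the complexity heuristic, and the acceptance test are all placeholders, not a published algorithm.

```python
import random

# Hypothetical sketch of the six-step learning loop described above.

def learn_with_memory(examples, max_trials=10, target_accuracy=0.9):
    stored = list(examples)                    # (1) collect and store
    complexity = estimate_complexity(stored)   # (2) examine the data
    for _ in range(max_trials):                # (6) repeat until done
        net = propose_trial_net(complexity)    # (3) develop a trial net
        score = evaluate(net, stored)          # (4) test the trial net
        if score >= target_accuracy:
            return net                         # acceptable solution found
        # (5) otherwise the trial net is discarded and we try again
    return None

def estimate_complexity(stored):
    # Placeholder heuristic: allow more capacity for more data.
    return max(1, len(stored) // 10)

def propose_trial_net(complexity):
    # Placeholder "net": a random threshold on the first feature.
    return {"size": complexity, "threshold": random.uniform(-1.0, 1.0)}

def evaluate(net, stored):
    # Placeholder test: fraction of (x, y) pairs classified correctly,
    # where y > 0 means the positive class.
    correct = sum(1 for x, y in stored
                  if (x[0] > net["threshold"]) == (y > 0))
    return correct / len(stored)
```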

One of the "large" missing pieces in the neural network field is the definition or characterization of an autonomous learning system such as the brain. We have never defined the external behavioral characteristics of our learning algorithms. We have largely pursued algorithm development from an "internal mechanisms" point of view (local learning, memoryless learning) rather than from the point of view of the "external behavior or characteristics" of the resulting algorithms. Some of these external characteristics might be: (1) the capability to design the net on its own, (2) polynomial time complexity in designing and training the net, (3) generalization capability, and (4) learning from as few examples as possible (quickness in learning).

It is perhaps time to define a set of desirable external characteristics for our learning algorithms. We need to define characteristics that are independent of: (1) a particular architecture, (2) the problem to be solved (function approximation, classification, memory, etc.), (3) local/global learning issues, and (4) whether or not memory is used for learning. We should argue about these external properties rather than about issues of global/local learning and memoryless learning.




ICNN'97 Panel on "Brain Imaging and Modelling" (June 10)

Chair: Professor John G. Taylor


1. ICNN'97.

The topics of the Conference Sessions cover the highly relevant areas of image and signal processing and data analysis, as well as neurobiology and neurocognition. The Panel should thus be of great interest to many of the attendees. This is especially true since non-invasive instruments have recently exploded into the arena of brain science and are making significant contributions to the subject.

The Panel itself is scheduled for the afternoon of Tuesday, June 10, from 16:20 to 18:20.

2. Brain Imaging.

Enormous strides have been made in the last few years in probing the activities of the brain as it processes information. A battery of machines - PET, fMRI, MEG & EEG, optically sensitive dyes - now makes it possible to follow neuronal activity as it circulates around the coupled networks dedicated to solving different tasks. These networks are now becoming delineated, as are the ways in which the different modules interact with each other across both space and time. It is these advances, and their implications for brain science, that are the topics to be discussed at the Panel.

3. The Panel

The theme of the Panel is

"The methods and results of brain imaging, and the implication of the results for models of the brain".

The program is as follows:

16:20: Dr A Ioannides (IME, KFA)

"Magnetoencephalography: Probing Event Ordering from Milliseconds to Seconds"

16:45: Dr K Friston (Wellcome Neurological Institute, London, UK)

"Linear and nonlinear models of brain activation and interactions"

17:10: Prof P Fox (San Antonio)

"Metanalytic Modelling of Brain Functional Areas and Systems"

17:35: Prof JG Taylor (Convenor; IME, Research Centre-Juelich/Centre for Neural Networks, King's College London, UK)

18:00: Panel Discussion




ICNN'97 Panel on "Modeling the Creative Process" (June 11)

Chair: Dan Levine, University of Texas at Arlington, USA


"Abduction and Creativity: A Connectionist Approach"

Jean-Daniel Kant, INRIA (France) and University of Texas at Arlington

Abduction is a cognitive process that consists in "inferring the best or most plausible explanations for a given set of facts" (Peng & Reggia, 1990). I will discuss how such abductive processes may be related to creativity. Then, I will suggest some basics for a connectionist implementation of abduction for categorization and decision-making tasks. Finally, I will examine what such an architecture might be able to tell us about creativity.

"Good Art, Bad Art"

Frank Larkey, University of Houston

Recognition, classification, and critical evaluation of visual art (paintings, drawings, prints, etc.) are performed routinely for exhibitions, galleries, museums, and private collections. The potential for modeling this process will be presented. Interviews with art critics and visual artists will provide a basis for exploring the relationship between knowledge base and the categorization of visual art as a creative product.

"Creative Cooperation: Exploiting Ignorance and Empathy"

Sam Leven, Scientific Cybernetics, Inc., and FOR A NEW SOCIAL SCIENCE

Group members can share much more than their individual pools of information -- in creative groups, they share their fantasies (Bormann, 1996), their reflective wisdom (Stohl, 1996), and the ignorance which allows production of new combinations of knowledge (Burt, 1982). A new model of creativity derived from psychobiological and genetic research (Leven, 1996 and in press) and an older model of bargaining (Elsberry et al., 1988; Leven and Smith, 1996) allow explanation of emergent images in group interaction and their molding by shared envisioning (Langer, 1997).

"Novel Rule Making and Analogy Formation"

Daniel S. Levine, University of Texas at Arlington

A complex network architecture, combining adaptive resonance with various modulatory influences, has been developed that can learn rules inductively and switch from one level of rule complexity to another (e.g., spatial to temporal) based on reinforcement. A variant of this network also seems to be able to learn simple proportional analogies, such as A is to B as C is to ?, or apple is to red as banana is to ?. Both these studies suggest methods by which existing schemata can be combined in novel and creative ways.

"The Role of Circular Reactions in Cognitive Development and Creativity"

Haluk Ogmen, University of Houston

In this talk, we will present a model for primary, secondary, and tertiary circular reactions. We will discuss how a developmental progression of circular reactions can transform a chance discovery process to a state where new means are invented by mental equilibration of schemes.

"On Creativity"

Karl H. Pribram, Radford University

Thomas Edison: Creativity is 1 percent inspiration and 99 percent perspiration.

"Simulating the Old in New Ideas"

Cynthia M. Sifonis and Thomas B. Ward, Texas A&M University

Recent research has examined the structure of the novel ideas people develop through imaginative thought. Across many domains, including imaginary animals, tools, toys, carpets, fruit, and restaurants, people project the characteristic properties of existing concepts onto their novel creations, and when they introduce differences, those differences tend to be alignable ones. Although a number of simulation programs have been developed that produce novel ideas, none explicitly simulate precisely how the structure of known concepts influences the structure of those novel creations. A full account of creative functioning demands that models be able to do so.

"Incubation and Recovery from Mental Blocks"

Steven Smith, Texas A&M University

Traditional conceptions of incubation in creative thinking and problem solving rely on the operation of autonomous implicit cognitive processes. Evidence for such autonomous processes is either lacking or flawed. An alternative theory of incubation views the phenomenon as recovery from mental blocks induced by context-dependent fluctuations in one's mental set. Both empirical and historical evidence fit this new view of incubation.

"Creativity and Consciousness"

John Taylor, King's College London and KFA Juelich (Germany)

Creativity will be considered as part of the analysis of the difference between conscious and non-conscious processing, and some of the neural modules and their models will be briefly surveyed.



Web Site Author: Mary Lou Padgett (m.padgett@ieee.org)
URL: http://www.mindspring.com/~pci-inc/ICNN97/panels.htm
(Last Modified: 28-May-1997)