Breckenridge, Colorado, December 5, 1997


Paolo Frasconi, Dipartimento di Sistemi e Informatica, Università di Firenze (Italy)
Alessandro Sperduti, Dipartimento di Informatica, Università di Pisa (Italy)



Algorithms that manipulate symbolic information are capable of dealing with highly structured data. On the other hand, many well-known learning systems, such as feedforward neural nets and mixture models, are limited to domains in which instances are organized into static data structures, like records or fixed-size arrays. Restricted classes of dynamical models have been studied in connectionism. For example, recurrent neural nets and (input/output) HMMs generalize feedforward nets and mixture models to sequences, a particular case of dynamically structured data. However, the range of useful dynamical data structures is clearly not limited to sequences (i.e., linear chains, from a graphical perspective).
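The generalization from static inputs to sequences can be sketched as follows. In this minimal, hypothetical example (the function and weights are illustrative, not a model discussed at the workshop), a recurrent update folds a sequence into a hidden state, so the unrolled computation graph is a linear chain:

```python
import math

def recurrent_encode(sequence, w_h=0.5, w_x=1.0):
    """Fold a sequence of numbers into a single scalar hidden state.

    Each step updates the state from the previous state and the
    current input, which is what lets the model handle sequences
    of arbitrary length, unlike a feedforward net on a fixed array.
    """
    h = 0.0
    for x in sequence:
        h = math.tanh(w_h * h + w_x * x)  # state depends on previous state
    return h
```

Because the state is threaded through the whole chain, the encoding is sensitive to the order of the inputs, not just their values.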

 Examples of domains in which instances have a rich structure are quite numerous. Data in multimedia applications or in earth sciences have temporal and spatial dimensions, generalizing sequences to regular multidimensional grids. Compounds in chemistry and molecular biology are naturally represented by undirected graphs. Complex graphical structures (such as labeled trees and webs) are very common in syntactic pattern recognition. Other domains such as automated reasoning, software engineering, or the World Wide Web also yield instances that are represented by directed graphs.

 Unfortunately, connectionist models capable of naturally dealing with dynamic data structures more general than sequences received relatively little attention in the literature until recently. A few exceptions are recursive neural networks and hidden recursive models, recent extensions of recurrent nets and HMMs, respectively, that can learn over directed acyclic graphs.
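The recursive extension replaces the chain-structured state update with one driven by the input structure itself. The sketch below is a hypothetical illustration (the representation and weights are assumptions, not a specific model from the workshop): the state at each node of a tree is computed from the node's label and the states of its children, so a chain is recovered as the special case where every node has at most one child.

```python
import math

def recursive_encode(tree, w_c=0.5, w_x=1.0):
    """Compute a scalar state for the root of a tree.

    A tree is represented as (label, [child_trees]).  The state at a
    node combines the node label with the summed states of its
    children, mirroring how a recurrent net combines the current
    input with the previous state.
    """
    label, children = tree
    child_state = sum(recursive_encode(c, w_c, w_x) for c in children)
    return math.tanh(w_x * label + w_c * child_state)
```

A leaf `(x, [])` is encoded exactly like a one-element sequence, while branching nodes aggregate several child states, which is what lets the same machinery process tree-structured instances.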

 The aim of the workshop is to reach a unified view of formalisms and tools for dealing with rich data representations, covering issues such as the computational power of recursive neural networks, probabilistic graphical models for learning data structures, methods for learning with cyclic and infinite graphs, and methods for learning transductions from graphs to graphs.

Target audience.

The workshop aims to bring together researchers actively involved in this and related areas. The discussion should address problems and novel potential achievements related to algorithms and adaptive architectures for dynamical data structures. Strong interest and lively discussion are expected from people who have worked in the temporal domain, since sequences are just a particular case of graphs. The organizers believe there is a lot of room for research aiming to extend methods and theoretical results from sequential spaces to more general graphical spaces.


The workshop will be opened by one or two review talks by the organizers, introducing the problem of learning data structures, with emphasis on some well-established models such as recursive neural networks. Then about ten short talks focusing on specific aspects of dynamical processing of sequences and data structures will be presented, followed by open discussion among the participants. The amount of time for presentations and group discussion will be evenly balanced.


Schedule and abstracts.
Adaptive processing of data structures. The page (still under construction) collects information about connectionist models for learning structured information.
The 2nd International summer school on Neural Networks has lectures related to the topics covered in this workshop.
ECAI '96 Workshop on Neural Networks and Structured Knowledge (NNSK), held on August 12, 1996 in Budapest, Hungary, covered similar topics.
NIPS*97 Homepage

Paolo Frasconi

Last modified: Thu Nov 6 19:26:45 MET 1997