Dialnet


Abstract: The inner and outer approaches to the design of recursive neural architectures

Pierre Baldi

  • Feedforward neural network architectures work well for numerical data of fixed size, such as images. For variable-size, structured data, such as sequences, d-dimensional grids, trees, and other graphs, recursive architectures must be used. We distinguish two general approaches to the design of recursive architectures in deep learning: the inner and the outer approach. The inner approach uses neural networks recursively inside the data graphs, essentially to “crawl” the edges of the graphs in order to compute the final output. It requires acyclic orientations of the underlying graphs. The outer approach uses neural networks recursively outside the data graphs, regardless of their orientation. These neural networks operate orthogonally to the data graph and progressively “fold” or aggregate the input structure to produce the final output. The distinction is illustrated using several examples from the fields of natural language processing, chemoinformatics, and bioinformatics, and applied to the problem of learning from variable-size sets.
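The contrast in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the paper's implementation: `inner_forward`, `outer_forward`, and the toy sum-based `combine`/`aggregate` functions are hypothetical names chosen here. The inner approach sweeps an acyclic orientation of the graph, computing each node's state from its feature and its predecessors' states; the outer approach ignores edge orientation entirely and folds the node features with a shared aggregator (as in set learning).

```python
# Illustrative sketch (assumption, not the paper's code): "inner" crawls an
# acyclic orientation of the data graph; "outer" folds node features with a
# shared, orientation-independent aggregator.

def inner_forward(nodes, edges, combine):
    """Inner approach: propagate states along an acyclic orientation.

    nodes: {id: feature}; edges: list of (src, dst) pairs forming a DAG;
    combine: shared function (feature, [predecessor states]) -> state.
    """
    preds = {v: [] for v in nodes}
    for s, d in edges:
        preds[d].append(s)
    state, pending = {}, list(nodes)
    # Topological sweep: a node fires once all its predecessors have states.
    while pending:
        for v in list(pending):
            if all(p in state for p in preds[v]):
                state[v] = combine(nodes[v], [state[p] for p in preds[v]])
                pending.remove(v)
    return state

def outer_forward(nodes, aggregate):
    """Outer approach: fold the node set, regardless of edge orientation."""
    return aggregate([nodes[v] for v in nodes])

# Toy example: additive combine on a 3-node chain a -> b -> c.
feats = {"a": 1.0, "b": 2.0, "c": 3.0}
states = inner_forward(feats, [("a", "b"), ("b", "c")],
                       combine=lambda x, ps: x + sum(ps))
folded = outer_forward(feats, aggregate=sum)  # invariant to orientation
```

In a real model, `combine` and `aggregate` would be trainable neural networks shared across the structure; the sums here only stand in for them to make the control flow of the two approaches explicit.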

