Evolving Networks: Using the Genetic Algorithm with Connectionist Learning

R. K. Belew, J. McInerney, and N. N. Schraudolph. Evolving Networks: Using the Genetic Algorithm with Connectionist Learning. In C. G. Langton, C. Taylor, J. D. Farmer, and S. Rasmussen, editors, Artificial Life II, SFI Studies in the Sciences of Complexity: Proceedings, pp. 511–547, Addison-Wesley, Redwood City, CA, 1992.

Download

pdf (540.5 kB)   djvu (301.1 kB)   ps.gz (212.0 kB)

Abstract

It is appealing to consider hybrids of neural network learning algorithms with evolutionary search procedures simply because nature has successfully done so, suggesting that such hybrids may be more efficient than either technique applied in isolation. We survey recent work in this area and report our own experiments on using the GA to search the space of initial conditions for backpropagation networks. We find that use of the GA provides much greater confidence in the face of the dependence on initial conditions that plagues gradient techniques, and allows a reduction of individual training time by as much as two orders of magnitude. We conclude that the GA's global sampling characteristics complement connectionist local search techniques well, leading to efficient and robust hybrids.
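The hybrid the abstract describes — a GA searching over initial weights, with each candidate scored by a short run of gradient-based training from that starting point — can be sketched roughly as follows. This is an illustrative toy (a single sigmoid unit on the OR task, simple truncation selection, uniform crossover, Gaussian mutation); all function names, parameters, and the task itself are assumptions for the sketch, not the paper's actual experimental setup.

```python
import math
import random

random.seed(0)

# Toy task: learn y = x1 OR x2 with a single sigmoid unit.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def train_error(w_init, epochs=5, lr=0.5):
    """Run a few epochs of gradient descent from the given initial
    weights [w1, w2, bias]; return the final sum of squared errors.
    This plays the role of the 'individual training' fitness measure."""
    w = list(w_init)
    for _ in range(epochs):
        for (x1, x2), y in DATA:
            out = 1.0 / (1.0 + math.exp(-(w[0]*x1 + w[1]*x2 + w[2])))
            delta = (y - out) * out * (1.0 - out)   # delta rule
            w[0] += lr * delta * x1
            w[1] += lr * delta * x2
            w[2] += lr * delta
    return sum((y - 1.0 / (1.0 + math.exp(-(w[0]*x1 + w[1]*x2 + w[2]))))**2
               for (x1, x2), y in DATA)

def evolve(pop_size=20, gens=10):
    """GA over initial-weight vectors: the genotype is the starting
    point for learning, and fitness is error after training."""
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=train_error)               # lower error = fitter
        parents = pop[:pop_size // 2]           # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            # uniform crossover plus Gaussian mutation
            children.append([(ai if random.random() < 0.5 else bi)
                             + random.gauss(0, 0.1)
                             for ai, bi in zip(a, b)])
        pop = parents + children
    return min(pop, key=train_error)

best = evolve()
```

The point of the sketch is the division of labor the paper argues for: the GA does coarse global sampling over starting points, while gradient descent does the fine local search from each one.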

BibTeX Entry

@incollection{BelMcISch92,
     author = {Richard K. Belew and John McInerney and Nicol N. Schraudolph},
      title = {\href{http://nic.schraudolph.org/pubs/BelMcISch92.pdf}{
               Evolving Networks: Using the Genetic Algorithm
               with Connectionist Learning}},
     editor = {Christopher G. Langton and Charles Taylor and J. Doyne Farmer
               and Steen Rasmussen},
  booktitle = {Artificial Life II},
     series = {SFI Studies in the Sciences of Complexity: Proceedings},
     volume =  10,
      pages = {511--547},
  publisher = {Addison-Wesley},
    address = {Redwood City, CA},
       year =  1992,
   b2h_type = {Book Chapters},
  b2h_topic = {Evolutionary Algorithms},
   abstract = {
    It is appealing to consider hybrids of neural network learning algorithms
    with evolutionary search procedures simply because nature has successfully
    done so, suggesting that such hybrids may be more efficient than either
    technique applied in isolation.  We survey recent work in this area and
    report our own experiments on using the GA to search the space of initial
    conditions for backpropagation networks.
    We find that use of the GA provides much greater confidence in the face of
    the dependence on initial conditions that plagues gradient techniques, and
    allows a reduction of individual training time by as much as two orders of
    magnitude.  We conclude that the GA's {\em global sampling}
    characteristics complement connectionist {\em local search}
    techniques well, leading to efficient and robust hybrids.
}}
