Deep optimisation: multi-scale evolution by inducing and searching in deep representations

Jamie Caldwell*, Joshua Knowles, Christoph Thies, Filip Kubacki, Richard Watson

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

The ability of evolutionary processes to innovate and scale up over long periods of time, observed in nature, remains a central mystery in evolutionary biology and a challenge for algorithm designers to emulate and explain in evolutionary computation (EC). The Major Transitions in Evolution is a compelling theory that explains evolvability through a multi-scale process in which individuality (and hence selection and variation) is continually revised by the formation of associations between formerly independent entities, a process still not fully explored in EC. Deep Optimisation (DO) is a new type of model-building optimisation algorithm (MBOA) that exploits deep learning methods to enable multi-scale optimisation. DO uses an autoencoder model to induce a multi-level representation of solutions, capturing the relationships between the lower-level units that contribute to the quality of a solution. Variation and selection are then performed within the induced representations, causing model-informed changes to multiple solution variables simultaneously. Here, we first show that DO performs impressively compared with other leading MBOAs (and other rival methods) on multiple knapsack problems, a standard combinatorial optimisation problem of general interest. Going deeper, we then carry out a detailed investigation of the differences between DO and other MBOAs, identifying key problem characteristics for which other MBOAs suffer exponential running times while DO does not. This study concretises our understanding of the Major Transitions theory and why it leads to evolvability, and provides strong motivation for further investigation of deep learning methods in optimisation.
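
The loop the abstract describes (induce a representation with an autoencoder, then apply variation and selection within that representation) can be made concrete with a short sketch. The Python below is a minimal illustration under stated assumptions, not the authors' implementation: it substitutes a toy penalised-knapsack objective, a single-hidden-layer NumPy autoencoder in place of a deep one, and simple bit-flip hill climbing for the solution-level search; every name and parameter here (fitness, latent_move, N, H, POP, learning rates) is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(0)

    N, H, POP = 32, 8, 64  # bits per solution, latent units, population size

    # Toy knapsack-style objective (an assumption for this demo, not the paper's
    # MKP setup): maximise value under a weight budget, with a linear penalty
    # for exceeding it.
    values = rng.random(N)
    weights = rng.random(N)
    capacity = 0.25 * N

    def fitness(x):
        w = weights @ x
        return values @ x - (10.0 * (w - capacity) if w > capacity else 0.0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def hill_climb(x, steps=200):
        """Solution-level search: keep single-bit flips that do not hurt fitness."""
        x, f = x.copy(), fitness(x)
        for _ in range(steps):
            j = rng.integers(N)
            x[j] = 1.0 - x[j]
            f2 = fitness(x)
            if f2 >= f:
                f = f2
            else:
                x[j] = 1.0 - x[j]  # revert a worsening flip
        return x

    def train_autoencoder(X, epochs=300, lr=0.5):
        """Induce a compressed representation of the population with a
        one-hidden-layer autoencoder trained by batch gradient descent."""
        W1 = rng.normal(0.0, 0.1, (N, H)); b1 = np.zeros(H)
        W2 = rng.normal(0.0, 0.1, (H, N)); b2 = np.zeros(N)
        m = len(X)
        for _ in range(epochs):
            Z = sigmoid(X @ W1 + b1)         # encode
            Y = sigmoid(Z @ W2 + b2)         # decode (reconstruction)
            dY = (Y - X) * Y * (1.0 - Y)     # backprop of squared error
            dZ = (dY @ W2.T) * Z * (1.0 - Z)
            W2 -= lr * (Z.T @ dY) / m; b2 -= lr * dY.mean(axis=0)
            W1 -= lr * (X.T @ dZ) / m; b1 -= lr * dZ.mean(axis=0)
        return W1, b1, W2, b2

    def latent_move(x, params):
        """Model-informed variation: flip one latent unit and decode, so many
        solution bits change together in a combination the model has learned."""
        W1, b1, W2, b2 = params
        z = sigmoid(x @ W1 + b1)
        j = rng.integers(H)
        z[j] = 1.0 - z[j]
        return (sigmoid(z @ W2 + b2) > 0.5).astype(float)

    # Seed the population with solution-level local optima.
    pop = np.array([hill_climb(rng.integers(0, 2, N).astype(float))
                    for _ in range(POP)])

    for _ in range(3):                      # a few induce-then-search cycles
        params = train_autoencoder(pop)     # (re)learn the representation
        for i in range(POP):                # select on decoded latent moves
            cand = hill_climb(latent_move(pop[i], params), steps=50)
            if fitness(cand) >= fitness(pop[i]):
                pop[i] = cand

    best = max(pop, key=fitness)
    print("best fitness:", round(float(fitness(best)), 3))

The detail this sketch preserves is the one that matters for multi-scale search: a single move in the induced representation changes many solution variables at once, in combinations learned from previously good solutions, rather than one variable at a time.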

Original language: English
Title of host publication: Applications of Evolutionary Computation
Subtitle of host publication: 24th International Conference, EvoApplications 2021, Held as Part of EvoStar 2021, Virtual Event, April 7–9, 2021, Proceedings
Editors: Pedro A. Castillo, Juan Luis Jiménez Laredo
Publisher: Springer
Pages: 506–521
Number of pages: 16
Edition: 1
ISBN (Electronic): 9783030726997
ISBN (Print): 9783030726980
Publication status: Published - 1 Apr 2021
Event: 24th International Conference on the Applications of Evolutionary Computation, EvoApplications 2021, held as part of EvoStar 2021 - Virtual, Online
Duration: 7 Apr 2021 – 9 Apr 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12694 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on the Applications of Evolutionary Computation, EvoApplications 2021, held as part of EvoStar 2021
City: Virtual, Online
Period: 7/04/21 – 9/04/21

Bibliographical note

Funding Information:
Acknowledgements: We acknowledge financial support from the EPSRC Centre for Doctoral Training in Next Generation Computational Modelling, grant EP/L015382/1.

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.

Keywords

  • Deep autoencoder
  • Model-Building Optimisation Algorithms
  • Multi-scale search
  • Problem structure

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
