9th Workshop on Evolutionary Computation for the Automated Design of Algorithms
July 13-17, 2019 @ GECCO 2019 in Prague, Czech Republic
“It may be turtles all the way down, but the turtles get smaller.” -- Anonymous @ ECADA 2017
- April 3, 2019: Workshop paper submission deadline
- April 17, 2019: Notification of acceptance
- April 24, 2019: Camera-ready deadline
- April 24, 2019: Author registration deadline
Workshop Schedule & Venue
Time: Sunday, July 14, 2019, 10:40-12:30
Venue: Club C (1F)
- ECADA Workshop Intro - all co-chairs
- Erik Andreas Meulman and Peter Alexander Nicolaas Bosman, "Toward Self-Learning Model-Based EAs"
- Mohamed Amine EL Majdouli, "SAPIAS: Towards an independent Self-Adaptive Per-Instance Algorithm Selection for Metaheuristics"
- Michael Adam Lones, "Instruction-Level Design of Local Optimisers using Push GP"
- Adam Harter, Aaron Scott Pope, Daniel R. Tauritz and Chris Rawlings, "Empirical Evidence of the Effectiveness of Primitive Granularity Control for Hyper-Heuristics"
- Aaron Scott Pope, Daniel R. Tauritz and Chris Rawlings, "Automated Design of Random Dynamic Graph Models"
- Samuel N. Richter, Michael G. Schoen and Daniel R. Tauritz, "Evolving Mean-Update Selection Methods for CMA-ES"
- Concluding Remarks - all co-chairs
The main objective of this workshop is to discuss hyper-heuristics and related methods, including but not limited to evolutionary computation methods, for generating and improving algorithms with the goal of producing solutions (algorithms) that are applicable to multiple instances of a problem domain. The areas of application of these methods include optimization, data mining and machine learning [1-18,23].
Automatically generating and improving algorithms by means of other algorithms has been the goal of several research fields, including Artificial Intelligence in the early 1950s, Genetic Programming in the early 1990s, and more recently automated algorithm configuration and hyper-heuristics. The term hyper-heuristics generally describes meta-heuristics applied to a space of algorithms. While Genetic Programming has most famously been used to this end, other evolutionary algorithms and meta-heuristics have successfully been used to automatically design novel (components of) algorithms. Automated algorithm configuration grew from the necessity of tuning the parameter settings of meta-heuristics, and it has produced several powerful (hyper-heuristic) methods capable of designing new algorithms by either selecting components from a flexible algorithmic framework [3,4] or recombining them following a grammar description.
Although most Evolutionary Computation techniques are designed to generate specific solutions to a given instance of a problem, one of the defining goals of hyper-heuristics is to produce solutions that solve more generic problems. For instance, while there are many examples of Evolutionary Algorithms for evolving classification models in data mining and machine learning, prior work employed a hyper-heuristic using Genetic Programming to create a generic classification algorithm which in turn generates a specific classification model for any given classification dataset, in any given application domain. In other words, the hyper-heuristic operates at a higher level of abstraction than most search methodologies are currently employed at; i.e., it searches the space of algorithms as opposed to directly searching the problem solution space, raising the level of generality of the solutions produced by the hyper-heuristic evolutionary algorithm. In contrast to standard Genetic Programming, which attempts to build programs from scratch from a typically small set of atomic functions, hyper-heuristic methods specify an appropriate set of primitives (e.g., algorithmic components) and allow evolution to combine them in novel ways as appropriate for the targeted problem class. While this allows searches in constrained search spaces based on problem knowledge, it does not in any way limit the generality of the approach, as the primitive set can be selected to be Turing-complete. Typically, however, the initial algorithmic primitive set is composed of primitive components of existing high-performing algorithms for the problems being targeted; this more targeted approach significantly reduces the initial search space, making the approach practical rather than a mere theoretical curiosity. Iterative refinement of the primitives allows for gradual and directed enlargement of the search space until convergence.
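To make the idea of searching the algorithm space concrete, here is a minimal, purely illustrative sketch (not drawn from any of the cited works): candidate algorithms are assembled from a small, hypothetical primitive set, and each candidate is scored by its mean performance across several problem instances, so selection favours generality over single-instance performance.

```python
import random

# Hypothetical primitive set: perturbation steps for a simple
# first-improvement hill climber on bit strings (names are illustrative).
def flip_one(s):
    i = random.randrange(len(s))
    return s[:i] + [1 - s[i]] + s[i + 1:]

def flip_two(s):
    return flip_one(flip_one(s))

def restart(s):
    return [random.randint(0, 1) for _ in s]

PRIMITIVES = [flip_one, flip_two, restart]

def onemax(s):
    return sum(s)

def run_algorithm(steps, instance_len, budget=200):
    """Execute one candidate algorithm: a sequence of primitives applied
    cyclically, keeping any non-worsening move."""
    s = [random.randint(0, 1) for _ in range(instance_len)]
    best = onemax(s)
    for t in range(budget):
        cand = steps[t % len(steps)](s)
        if onemax(cand) >= best:
            s, best = cand, onemax(cand)
    return best

def hyper_heuristic(n_candidates=30, instances=(20, 40, 60)):
    """Outer search over the algorithm space: sample candidate primitive
    sequences and keep the one with the best MEAN performance across
    several problem instances, i.e. select for generality."""
    best_algo, best_score = None, float("-inf")
    for _ in range(n_candidates):
        steps = [random.choice(PRIMITIVES) for _ in range(random.randint(1, 4))]
        score = sum(run_algorithm(steps, n) for n in instances) / len(instances)
        if score > best_score:
            best_algo, best_score = steps, score
    return best_algo, best_score
```

In a real hyper-heuristic the outer loop would be an evolutionary algorithm rather than random sampling, and the primitives would be components of existing high-performing algorithms for the target domain, as described above.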
As meta-heuristics are themselves a type of algorithm, they too can be automatically designed employing hyper-heuristics. For instance, in 2007, Genetic Programming was used to evolve mate selection in evolutionary algorithms; in 2011, Linear Genetic Programming was used to evolve crossover operators; more recently, Genetic Programming was used to evolve complete black-box search algorithms [13,14,16], SAT solvers, and FuzzyART category functions. Moreover, hyper-heuristics may be applied before deploying an algorithm (offline), while problems are being solved (online), or even continuously learn by solving new problems (life-long). Offline and life-long hyper-heuristics are particularly useful for real-world problem solving where one can afford a large amount of a priori computational time to subsequently solve many problem instances drawn from a specified problem domain, thus amortizing the a priori computational time over repeated problem solving. Recently, the design of Multi-Objective Evolutionary Algorithm components was also automated.
There is considerable scope to learn more about the factors that influence the quality of algorithms generated by hyper-heuristics, such as the impact of the meta-heuristic exploring the algorithm space. An initial study compared the performance of algorithms generated by hyper-heuristics powered by five major types of Genetic Programming. Another avenue for research is investigating the potential performance improvements obtained through the use of asynchronous parallel evolution to exploit the typically large variation in fitness evaluation times when executing hyper-heuristics.
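The potential gain from asynchrony can be illustrated with a small steady-state sketch (illustrative only; `evaluate` is a stand-in with artificially variable runtime): as soon as any evaluation finishes, the just-evaluated individual is mutated and resubmitted, so fast evaluations are never held up behind slow ones by a generation barrier.

```python
import concurrent.futures
import random
import time

def evaluate(candidate):
    # Stand-in for an expensive fitness evaluation whose runtime varies
    # widely between candidates, as when each individual is itself an
    # algorithm that must be run to be scored.
    time.sleep(random.uniform(0.0, 0.02))
    return sum(candidate)

def async_steady_state(pop_size=8, total_evals=32, genome_len=10, workers=4):
    """Steady-state asynchronous EA: whenever any evaluation completes,
    mutate that individual and submit the child immediately."""
    pop = [tuple(random.randint(0, 1) for _ in range(genome_len))
           for _ in range(pop_size)]
    best = float("-inf")
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as ex:
        pending = {ex.submit(evaluate, p): p for p in pop}
        finished = 0
        while finished < total_evals and pending:
            done, _ = concurrent.futures.wait(
                pending, return_when=concurrent.futures.FIRST_COMPLETED)
            for fut in done:
                parent = pending.pop(fut)
                best = max(best, fut.result())
                finished += 1
                if finished + len(pending) < total_evals:
                    # One-bit mutation; idle workers pick this up at once,
                    # with no barrier waiting for slow evaluations.
                    i = random.randrange(genome_len)
                    child = parent[:i] + (1 - parent[i],) + parent[i + 1:]
                    pending[ex.submit(evaluate, child)] = child
    return best
```

A synchronous EA with the same budget would idle all workers until the slowest evaluation of each generation finished; the steady-state loop above keeps every worker busy for the whole run.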
[1] E. K. Burke, M. Gendreau, M. Hyde, G. Kendall, G. Ochoa, E. Özcan, and R. Qu. Hyper-heuristics: A survey of the state of the art. Journal of the Operational Research Society, 64(12):1695-1724, 2013.
[2] H. H. Hoos. Automated algorithm configuration and parameter tuning. In Autonomous Search, pages 37-71. Springer Berlin Heidelberg, 2012.
[3] A. R. KhudaBukhsh, L. Xu, H. H. Hoos, and K. Leyton-Brown. SATenstein: Automatically Building Local Search SAT Solvers from Components. In Proceedings of IJCAI 2009, pages 517-524.
[4] M. López-Ibáñez and T. Stützle. The automatic design of multiobjective ant colony optimization algorithms. IEEE Transactions on Evolutionary Computation, 16(6):861-875, 2012.
[5] F. Mascia, M. López-Ibáñez, J. Dubois-Lacoste, and T. Stützle. Grammar-based generation of stochastic local search heuristics through automatic algorithm configuration tools. Computers & Operations Research, 51:190-199, 2014.
[6] W. B. Langdon and M. Harman. Genetically Improving 50000 Lines of C++. Research Note RN/12/09, Department of Computer Science, University College London, Gower Street, London WC1E 6BT, UK, 2012.
[7] J. Petke, M. Harman, W. B. Langdon, and W. Weimer. Using Genetic Improvement & Code Transplants to Specialise a C++ Program to a Problem Class. In Proceedings of the 17th European Conference on Genetic Programming (EuroGP 2014), Granada, Spain, 2014. Springer Verlag.
[8] G. L. Pappa and A. A. Freitas. Automating the Design of Data Mining Algorithms: An Evolutionary Computation Approach. Springer, Natural Computing Series, 2010.
[9] E. K. Burke, M. Hyde, G. Kendall, and J. Woodward. A genetic programming hyper-heuristic approach for evolving 2-D strip packing heuristics. IEEE Transactions on Evolutionary Computation, 14(6):942-958, December 2010.
[10] M. Oltean and D. Dumitrescu. Evolving TSP heuristics using multi expression programming. In Computational Science - ICCS 2004, Lecture Notes in Computer Science 3037, pages 670-673. Springer, 2004.
[11] E. A. Smorodkina and D. R. Tauritz. Toward Automating EA Configuration: the Parent Selection Stage. In Proceedings of CEC 2007 - IEEE Congress on Evolutionary Computation, pages 63-70, Singapore, September 25-28, 2007.
[12] B. W. Goldman and D. R. Tauritz. Self-Configuring Crossover. In Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '11), pages 575-582, Dublin, Ireland, July 12-16, 2011.
[13] M. A. Martin and D. R. Tauritz. Evolving Black-Box Search Algorithms Employing Genetic Programming. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), pages 1497-1504, Amsterdam, The Netherlands, July 6-10, 2013.
[14] M. A. Martin and D. R. Tauritz. A Problem Configuration Study of the Robustness of a Black-Box Search Algorithm Hyper-Heuristic. In Proceedings of the 16th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '14), pages 1389-1396, Vancouver, BC, Canada, July 12-16, 2014.
[15] J. R. Woodward and J. Swan. The automatic generation of mutation operators for genetic algorithms. In Proceedings of the 14th International Conference on Genetic and Evolutionary Computation.
[16] M. A. Martin and D. R. Tauritz. Hyper-Heuristics: A Study On Increasing Primitive-Space. In Proceedings of the 17th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '15), pages 1051-1058, Madrid, Spain, July 11-15, 2015.
[17] S. Nguyen, M. Zhang, M. Johnston, and K. C. Tan. Automatic Design of Scheduling Policies for Dynamic Multi-objective Job Shop Scheduling via Cooperative Coevolution Genetic Programming. IEEE Transactions on Evolutionary Computation, 18(2):193-208, April 2014.
[18] S. Harris, T. Bueter, and D. R. Tauritz. A Comparison of Genetic Programming Variants for Hyper-Heuristics. In Proceedings of the 17th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '15), pages 1043-1050, Madrid, Spain, July 11-15, 2015.
[19] K. Sim, E. Hart, and B. Paechter. A Lifelong Learning Hyper-Heuristic for Bin-Packing. Evolutionary Computation, 23(1):37-67, March 2015.
[20] A. R. Bertels and D. R. Tauritz. Why Asynchronous Parallel Evolution is the Future of Hyper-heuristics: A CDCL SAT Solver Case Study. In Proceedings of the 18th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '16), pages 1359-1365, Denver, Colorado, U.S.A., July 20-24, 2016.
[21] L. C. T. Bezerra, M. López-Ibáñez, and T. Stützle. Automatic Component-Wise Design of Multi-Objective Evolutionary Algorithms. IEEE Transactions on Evolutionary Computation, 20(3):403-417, 2016.
[22] M. Illetskova, A. R. Bertels, J. M. Tuggle, A. Harter, S. Richter, D. R. Tauritz, S. Mulder, D. Bueno, M. Leger, and W. M. Siever. Improving Performance of CDCL SAT Solvers by Automated Design of Variable Selection Heuristics. In Proceedings of the 2017 IEEE Symposium Series on Computational Intelligence (SSCI 2017), Honolulu, Hawaii, U.S.A., November 27 - December 1, 2017.
[23] I. Elnabarawy, D. R. Tauritz, and D. C. Wunsch. Evolutionary Computation for the Automated Design of Category Functions for Fuzzy ART: An Initial Exploration. In Proceedings of the 19th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '17), pages 1133-1140, Berlin, Germany, July 15-19, 2017.
Call for Papers
We welcome original submissions on all aspects of Evolutionary Computation for the Automated Design of Algorithms, in particular, evolutionary computation methods and other hyper-heuristics for the automated design, generation or improvement of algorithms that can be applied to any instance of a target problem domain. Relevant methods include methods that evolve whole algorithms given some initial components as well as methods that take an existing algorithm and improve it or adapt it to a specific domain.
Another important aspect in automated algorithm design is the definition of the primitives that constitute the search space of hyper-heuristics. These primitives should capture the knowledge of human experts about useful algorithmic components (such as selection, mutation and recombination operators, local searches, etc.) and, at the same time, allow the generation of new algorithm variants. Examples of the application of hyper-heuristics, including genetic programming and automatic configuration methods, to such frameworks of algorithmic components are of interest to this workshop, as well as the (possibly automatic) design of the algorithmic components themselves and the overall architecture of metaheuristics. Therefore, relevant topics include (but are not limited to):
- Applications of hyper-heuristics, including general-purpose automatic algorithm configuration methods for the design of metaheuristics, in particular evolutionary algorithms, and other algorithms for application domains such as optimization, data mining, machine learning, image processing, engineering, cyber security, critical infrastructure protection, and bioinformatics.
- Novel hyper-heuristics, including but not limited to genetic programming based approaches, automatic configuration methods, and online, offline and life-long hyper-heuristics, with the stated goal of designing or improving the design of algorithms.
- Empirical comparison of hyper-heuristics.
- Theoretical analyses of hyper-heuristics.
- Studies on primitives (algorithmic components) that may be used by hyper-heuristics as the search space when automatically designing algorithms.
- Automatic selection/creation of algorithm primitives as a preprocessing step for the use of hyper-heuristics.
- Analysis of the trade-off between generality and effectiveness of different hyper-heuristics or of algorithms produced by a hyper-heuristic.
- Analysis of the most effective representations for hyper-heuristics (e.g., Koza style Genetic Programming versus Cartesian Genetic Programming).
- Asynchronous parallel evolution of hyper-heuristics.
Workshop papers must be submitted using the GECCO submission system. After login, the authors need to select the "Workshop Paper" submission form. In the form, the authors must select the workshop they are submitting to. To see a sample of the "Workshop Paper" submission form, go to GECCO's submission site and choose "Sample Submission Forms". Submitted papers must not exceed 8 pages (excluding references) and are required to be in compliance with the GECCO 2019 Papers Submission Instructions. It is recommended to use the same templates as the papers submitted to the main tracks. It is not required to remove the author information. All accepted papers will be presented at the corresponding workshop and appear in the GECCO Conference Companion Proceedings. By submitting a paper, the author(s) agree that, if their paper is accepted, they will:
- Submit a final, revised, camera-ready version to the publisher on or before the camera-ready deadline
- Register at least one author before April 24, 2019 to attend the conference
- Attend the conference (at least one author)
- Present the accepted paper at the conference
Co-Chairs (in alphabetical order)
Emma Hart gained a 1st Class Honours Degree in Chemistry from the University of Oxford, followed by an MSc in Artificial Intelligence from the University of Edinburgh. Her PhD, also from the University of Edinburgh, explored the use of immunology as an inspiration for computing, examining a range of techniques applied to optimisation and data classification problems. She moved to Edinburgh Napier University in 2000 as a lecturer, and was promoted to a Chair in Natural Computation in 2008. She is active worldwide in the field of Evolutionary Computation, for example as General Chair of PPSN 2016 and as a Track Chair at GECCO for several years. She has given keynotes at EURO 2016 and UKCI 2015, as well as invited talks and tutorials at many universities and international conferences. She has been Editor-in-Chief of Evolutionary Computation (MIT Press) since January 2016 and is an elected member of the ACM SIGEVO Executive Board. She is also a member of the UK Operations Research Society Research Panel.
Daniel R. Tauritz is an Associate Professor and Associate Chair for Undergraduate Studies in the Department of Computer Science at the Missouri University of Science and Technology (S&T), a contract scientist for Sandia National Laboratories, a university collaboration scientist at Los Alamos National Laboratory (LANL), the founding director of S&T's Natural Computation Laboratory, and founding academic director of the LANL/S&T Cyber Security Sciences Institute. He received his Ph.D. in 2002 from Leiden University for Adaptive Information Filtering employing a novel type of evolutionary algorithm. He has served on many EC related conference program committees, including the GECCO GA track program committee, the Congress on Evolutionary Computation program committee, the PPSN program committee, and a variety of other international conference program committees. His research interests include the design of hyper-heuristics and self-configuring evolutionary algorithms and the application of computational intelligence techniques in cyber security, critical infrastructure protection, and program understanding. He was granted a US patent for an artificially intelligent rule-based system to assist teams in becoming more effective by improving the communication process between team members.
John R. Woodward is a Lecturer in Computer Science at Queen Mary University of London, within the School of Electronic Engineering and Computer Science, and for the previous four years was a lecturer at the University of Nottingham. He holds a BSc in Theoretical Physics, an MSc in Cognitive Science and a PhD in Computer Science, all from the University of Birmingham. His research interests are Operational Research, Optimization, and Genetic Programming. He has over 50 publications in Computer Science, Operations Research and Engineering, which include both theoretical and empirical contributions, and has given over 100 talks at international conferences and as an invited speaker at universities. He has worked in industrial, military, educational and academic settings, and has been employed by EDS, CERN, the RAF and three UK universities.
Previous ECADA workshops
- 8th ECADA Workshop @GECCO 2018 - Kyoto, Japan
- 7th ECADA Workshop @GECCO 2017 - Berlin, Germany
- 6th ECADA Workshop @GECCO 2016 - Denver, Colorado
- 5th ECADA Workshop @GECCO 2015 - Madrid, Spain
- 4th ECADA Workshop @GECCO 2014 - Vancouver, BC, Canada
- 3rd ECADA Workshop @GECCO 2013 - Amsterdam, The Netherlands
- 2nd ECADA Workshop @GECCO 2012 - Philadelphia, PA, USA
- 1st ECADA Workshop @GECCO 2011 - Dublin, Ireland