Results from the Cooperation Between University of British Columbia and Cologne University of Applied Sciences

Steven Ramage from the University of British Columbia (UBC) describes the SPO development and the results from the cooperation between UBC and the SPOTSeven Lab (Cologne University of Applied Sciences) as follows:

Experimental Methods

Results from the cooperation between UBC and SPOTSeven are published in the book “Experimental Methods for the Analysis of Optimization Algorithms”.

“Bartz-Beielstein et al. [2005] adapted EGO to the optimization of algorithms, with their Sequential Parameter Optimization (SPO) method. SPO uses the same acquisition functions as EGO, but a slightly different model, which includes a second order polynomial fit as well as the standard Gaussian process model. Unlike EGO, their approach is able to deal with random response values through a continual resampling of the best observed points using a doubling strategy, which allows the estimate to converge to the true value over time. Finally, as opposed to fitting the model with each sample point individually, as done by SKO, SPO merges the samples for each point into a better estimate of the objective at that point, and then fits the model on these merged estimates.
Hutter et al. [2009a] directly compared SPO and SKO and their suitability for algorithm configuration. They found that SPO in general outperformed SKO on the algorithms they studied. They also introduced SPO+, which introduced some modifications to the original algorithm….”
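The doubling resampling strategy described in the quote can be illustrated with a minimal sketch. This is not the original SPO implementation; the function name and data structures are hypothetical, and only the core idea is shown: each time the incumbent point is revisited, its number of samples is doubled, and the samples are merged into a single estimate (here, the sample mean) that would be fed to the surrogate model.

```python
import statistics

def spo_intensify(objective, incumbent, history, max_evals):
    """Sketch of SPO's doubling strategy (hypothetical helper, not the
    original code): double the samples at the incumbent so its merged
    estimate converges to the true objective value over time."""
    samples = history.setdefault(incumbent, [])
    target = max(1, 2 * len(samples))      # doubling strategy
    budget = min(target, max_evals)        # respect the evaluation budget
    while len(samples) < budget:
        samples.append(objective(incumbent))
    # Merged estimate: one value per design point, used to fit the model,
    # rather than fitting on each noisy sample individually (as SKO does).
    return statistics.mean(samples)
```

With a noisy objective, repeated calls accumulate 1, 2, 4, 8, ... samples at the incumbent, so the merged estimate stabilizes as the sample size grows.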

Stephen Edward Andrew Ramage’s master’s thesis “Advances in Meta-algorithmic Software Libraries for Distributed Automated Algorithm Configuration” is available online.
The cooperation between UBC and the SPOTSeven Lab resulted in a publication, which can be found in the book “Experimental Methods for the Analysis of Optimization Algorithms”. Here are some reviews:

“This book belongs on the shelf of anyone interested in carrying out experimental research on algorithms and heuristics for optimization problems. … Don’t keep this book on the shelf: read it, and apply the techniques and tools contained herein to your own algorithmic research project. Your experiments will become more efficient and more trustworthy, and your experimental data will lead to clearer and deeper insights about performance.” (Catherine C. McGeoch, Amherst College)
“Here you will find aspects that are treated scientifically by the experts in this exciting domain offering their up-to-date know-how and even leading into philosophical domains.” (Hans-Paul Schwefel, Technische Universität Dortmund)
“[This] book … is a solid and comprehensive step forward in the right direction. [It] not only covers adequate comparison of methodologies but also the tools aimed at helping in algorithm design and understanding, something that is being recently referred to as ‘Algorithm Engineering’. [It] is of interest to two distinct audiences. First and foremost, it is targeted at the whole operations research and management science, artificial intelligence and computer science communities with a loud and clear cry for attention. Strong, sound and reliable tools should be employed for the comparison and assessment of algorithms and also for more structured algorithm engineering. Given the level of detail of some other chapters however, a second potential audience could be made up of those researchers interested in the core topic of algorithm assessment. The long list of contributors to this book includes top notch and experienced researchers that, together, set the trend in the field. As a result, those interested in this specific area of analysis of optimization algorithms should not miss this book under any circumstance. … The careful, sound, detailed and comprehensive assessment of optimization algorithms is a necessity that requires attention and care. As a result, my opinion is that this book should be followed and that it should be at the top of every experimenter’s table.” (Rubén Ruiz, European Journal of Operational Research, 2011, 214(2):453-456)