Università Ca' Foscari di Venezia

Invited talks and Tutorials

Invited talks will take place on the central days of the conference (Wednesday 20 and Thursday 21). The invited speakers are:

Professor Xin Yao, The University of Birmingham, UK
Title: "How Efficient Are Evolutionary Algorithms?"

Xin Yao is a professor of computer science in the School of Computer Science at the University of Birmingham and the Director of the Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA). He is also a Fellow of IEEE, a Distinguished Lecturer of the IEEE Computational Intelligence Society, and a Distinguished Visiting Professor at the Nature Inspired Computation and Applications Laboratory (NICAL) of University of Science and Technology of China, Hefei, China.

Professor David Wolfe Corne, Heriot-Watt University, UK
Title: "Super-Heuristics: Evolving Problem Solvers"

A small number of independent strands of research since the 1960s have explored ways to automatically combine individual heuristics to help solve problem instances. The current buzzword "hyper-heuristics" arises from this activity, and is largely concerned with manipulating the ordering and parametrisation of execution of individual heuristics (e.g. dispatch rules in scheduling) in search of the best solutions to a given problem instance. Often, classifier systems or other learning methods are used to do the manipulation. A small portion of this overall activity is, in some contrast, concerned with learning new algorithms -- i.e. combinations of low-level heuristics that not only solve a given instance, but are reusable for a (perhaps wide) class of instances. To distinguish this type of activity, which I believe is a particularly promising and exciting research area at the interface of learning and optimization, I call it Super-heuristics. Super-heuristics has had some intriguing successes so far, and could lead to enormously more efficient and effective optimisation in certain areas of industry. In this talk I will try to characterise the state of the art in super-heuristics, and set out several potentially promising ways forward.
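The distinction the abstract draws can be made concrete with a toy sketch (the dispatch rules, fitness function, and search scheme below are invented for illustration, not taken from the talk): a super-heuristic searches for a reusable combination of low-level heuristics, scoring each candidate across a whole class of instances rather than on a single one.

```python
import random

# Low-level dispatch rules for one-machine scheduling; a job is (processing_time, due_date).
RULES = {
    "SPT": lambda jobs: min(jobs, key=lambda j: j[0]),  # shortest processing time first
    "EDD": lambda jobs: min(jobs, key=lambda j: j[1]),  # earliest due date first
    "LPT": lambda jobs: max(jobs, key=lambda j: j[0]),  # longest processing time first
}

def total_tardiness(rule_sequence, jobs):
    """Schedule jobs by cycling through the given sequence of rule names."""
    jobs, t, tardiness, i = list(jobs), 0, 0, 0
    while jobs:
        rule = RULES[rule_sequence[i % len(rule_sequence)]]
        p, d = rule(jobs)
        jobs.remove((p, d))
        t += p
        tardiness += max(0, t - d)
        i += 1
    return tardiness

def random_instance(rng, n=8):
    return [(rng.randint(1, 9), rng.randint(5, 40)) for _ in range(n)]

def super_heuristic_search(n_instances=20, n_candidates=200, seed=0):
    """Random search over rule sequences, scored over a CLASS of instances."""
    rng = random.Random(seed)
    instances = [random_instance(rng) for _ in range(n_instances)]
    best, best_score = None, float("inf")
    for _ in range(n_candidates):
        cand = [rng.choice(list(RULES)) for _ in range(3)]
        score = sum(total_tardiness(cand, inst) for inst in instances)
        if score < best_score:
            best, best_score = cand, score
    return best, best_score
```

Because each candidate is evaluated on many instances, the winning rule combination is selected for reusability across the class, which is the point of contrast with instance-specific hyper-heuristics.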
David is Director of Research for the School of Mathematics and Computer Sciences (MACS) at Heriot-Watt University, Edinburgh, UK. MACS is pre-eminent in Scotland in the Mathematical and Computer Sciences, comprising part of the Maxwell Institute for Mathematical Sciences (a joint institute with the University of Edinburgh), and a number of pioneering research centres. He also leads the Intelligent Systems Laboratory, which maintains a portfolio of substantial achievements that range through fundamental models of computation, computational systems biology, computational neuroscience, and advanced methods for design, optimisation and data mining. Following several years as a research associate in the Department of Artificial Intelligence at the University of Edinburgh, which led to the extremely successful problem solving approach now called `hyper-heuristics', David was a Lecturer (1997) and then Reader (2003) at the University of Reading. He then took up a Chair in Computer Science at the University of Exeter in 2004, and moved to his current post in 2006. His continuing research agenda concerns novel methods for optimisation, data mining and machine learning, as well as strategies for solving large scale problems, with particular interests in multicriteria problems. He serves on the editorial boards of many prestigious journals, including Natural Computation, Theoretical Computer Science (C), the AI Review Journal, the IEEE/ACM Trans on Computational Biology and Bioinformatics and the IEEE Trans on Evolutionary Computation.

Tutorials will take place at the beginning of the conference (Monday 18 and Tuesday 19).

The following researchers will present advanced tutorials at LION4:

Holger Hoos, University of British Columbia (UBC), Vancouver, BC (Canada)

Computer-aided design of high-performance algorithms: Principled procedures for building better solvers (2 hours)

High-performance algorithms play an important role in many areas of computer science and are core components of many software systems used in real-world applications. Traditionally, the creation of these algorithms requires considerable expertise and experience, often in combination with a substantial amount of trial and error. Here, we outline a new approach to the process of designing high-performance algorithms that is based on the use of automated procedures for exploring potentially very large spaces of candidate designs. We contrast this computer-aided design approach with the traditional approach and discuss why it can be expected to yield better performing, yet simpler algorithms. We give an overview of existing work in this area, including algorithm configuration, selection and portfolio approaches. We also discuss several recent applications of computer-aided algorithm design that allowed our group to achieve substantial improvements in the state of the art in solving a broad range of SAT, mixed integer programming and course timetabling problems.
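The algorithm-configuration side of this approach can be sketched in a few lines (the toy solver and parameter space below are invented for illustration and are not tied to any particular configurator): treat the solver's parameters as a search space, and keep the configuration with the best aggregate performance on a set of training instances.

```python
import random

def solve(instance, noise, restarts, rng):
    """Toy randomized local search: hill-climb a bitstring towards all ones.
    'noise' is the probability of accepting a worsening flip."""
    n = len(instance)
    best = 0
    for _ in range(restarts):
        x = list(instance)
        for _ in range(50):
            i = rng.randrange(n)
            if x[i] == 0 or rng.random() < noise:
                x[i] ^= 1  # improving flip if bit was 0, noisy flip otherwise
        best = max(best, sum(x))
    return best

def configure(configs, instances, seed=0):
    """Return the configuration with the best total training performance."""
    rng = random.Random(seed)
    def score(cfg):
        return sum(solve(inst, cfg["noise"], cfg["restarts"], rng)
                   for inst in instances)
    return max(configs, key=score)

# Train on a handful of random instances over a small configuration grid.
rng = random.Random(1)
instances = [[rng.randint(0, 1) for _ in range(10)] for _ in range(5)]
configs = [{"noise": nz, "restarts": r} for nz in (0.0, 0.5, 0.9) for r in (1, 3)]
best_cfg = configure(configs, instances)
```

Real configurators search far larger, structured design spaces and manage the evaluation budget carefully, but the training-instances-plus-candidate-designs loop is the same.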
Holger H. Hoos is an Associate Professor at the Computer Science Department of the University of British Columbia (Canada). His main research areas span empirical algorithmics, artificial intelligence, bioinformatics and computer music, and he is one of the world's leading experts on stochastic local search methods and on the automated design of high-performance algorithms. He is a co-author of the book "Stochastic Local Search: Foundations and Applications", and his research has been published in numerous book chapters, journals, and at major conferences in artificial intelligence, operations research, molecular biology and computer music. Holger is a Faculty Associate of the Peter Wall Institute for Advanced Studies and currently serves as President of the Canadian Artificial Intelligence Association (CAIAC). (For further information, see Holger's web page at http://www.cs.ubc.ca/~hoos.)

Raffaele Giancarlo, Dipartimento di Matematica ed Applicazioni, University of Palermo, Italy

Data Driven Class Discovery in Microarray Data: Algorithmic Paradigms and Fast Heuristics (2 hours)

Microarrays, along with ChIP-Seq technologies, are a de facto standard for large scale genomic and proteomic studies. However, most of their success is intimately connected to the effectiveness of the computational techniques one uses to infer meaningful structure in a microarray dataset. Some of the Information Sciences areas connected to microarray data analysis are classic: Clustering and Statistical Validation Measures for the assessment of cluster quality. Unfortunately, microarrays offer special challenges, e.g., high data dimensionality and noise. The aim of this tutorial is to present some of the basic ideas and techniques that have been designed in the past ten years for Clustering and Statistical Validation Measures and that try to address the challenges posed by microarrays. The focus will be on paradigms, rather than single techniques, and particular attention will be given to the experimental and algorithm engineering aspects of this area which, although important, are largely neglected in the specialist literature.

Recommended reading:
Terry Speed (ed.), Statistical Analysis of Gene Expression Microarray Data, CRC Press, 2009.
Raffaele Giancarlo, Davide Scaturro and Filippo Utro, "Statistical indices for computational and data driven class discovery in microarray data", in Biological Data Mining, J. Y. Chen and S. Lonardi (eds.), Taylor and Francis, 2009.
J. Handl, J. Knowles and D. B. Kell, "Computational cluster validation in post-genomic data analysis", Bioinformatics, 21:3201-3212, 2005.
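As a concrete instance of the statistical validation measures discussed in the abstract, a silhouette-style index (one standard internal measure, shown here in plain Python purely as an illustration of the idea) scores how well a clustering separates the data; comparing its value across candidate clusterings is one data-driven way to choose among them.

```python
import math

def dist(a, b):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def silhouette(points, labels):
    """Mean silhouette width; assumes at least two clusters.
    Near +1: tight, well separated clusters; near 0 or negative: poor structure."""
    scores = []
    for i, p in enumerate(points):
        same = [dist(p, q) for j, q in enumerate(points)
                if labels[j] == labels[i] and j != i]
        a = sum(same) / len(same) if same else 0.0        # cohesion
        b = min(                                          # separation
            sum(dist(p, q) for j, q in enumerate(points) if labels[j] == c)
            / sum(1 for l in labels if l == c)
            for c in set(labels) if c != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)
```

On a toy dataset of two well-separated groups, the correct two-cluster labelling receives a much higher silhouette than a scrambled one; on real microarray data the high dimensionality and noise discussed in the abstract make such assessments far more delicate.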

Ilya Safro, Mathematics and Computer Science Division, Argonne National Laboratory, USA

Multilevel/multiscale/multigrid algorithms for optimization problems, with a special focus on combinatorial problems (2 hours)

Outline of the tutorial:
1) Historical notes
2) Smoothing and Relaxation
3) General scheme of a multilevel algorithm; Different shapes of cycles
4) Coarsening of the original problem; Relation between fine and coarse problems; Is it easy to design a parallel multilevel framework?
5) Linear complexity of a multilevel framework
6) Multilevel algorithms for graphs and hypergraphs
7) Coarsening and learning the structure of your graph: edge matching, weighted aggregation, algebraic distance.
8) Uncoarsening with examples from linear ordering and partitioning
9) Uncoarsening: adding stochasticity
10) Real life examples; Experimental results for some NP-hard problems; Comparison with other fast heuristics
11) Another example: Multilevel scheme for constrained problems (VLSI placement and graph visualization)
Recommended survey: A. Brandt and D. Ron, "Multigrid Solvers and Multilevel Optimization Strategies".
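The V-cycle structure behind items 3, 4, 7 and 8 of the outline can be compressed into a short sketch (illustrative code, not from the tutorial): coarsen a graph by greedy edge matching, bisect the coarsest graph directly, then uncoarsen by projecting the coarse partition to the finer level and refining it locally.

```python
def match_and_coarsen(edges, nodes):
    """Greedy edge matching; matched endpoints merge into one coarse node."""
    coarse_of, next_id = {}, 0
    for u, v in edges:
        if u not in coarse_of and v not in coarse_of:
            coarse_of[u] = coarse_of[v] = next_id
            next_id += 1
    for u in nodes:                      # unmatched nodes survive unchanged
        if u not in coarse_of:
            coarse_of[u] = next_id
            next_id += 1
    coarse_edges = [(coarse_of[u], coarse_of[v]) for u, v in edges
                    if coarse_of[u] != coarse_of[v]]
    return coarse_of, coarse_edges, next_id

def cut(edges, part):
    """Number of edges crossing the bisection."""
    return sum(1 for u, v in edges if part[u] != part[v])

def refine(edges, nodes, part):
    """One sweep of greedy improving moves, taken only from the larger side
    so the bisection stays roughly balanced."""
    for u in sorted(nodes):
        sizes = [sum(1 for n in nodes if part[n] == s) for s in (0, 1)]
        if sizes[part[u]] <= sizes[1 - part[u]]:
            continue
        before = cut(edges, part)
        part[u] ^= 1
        if cut(edges, part) >= before:
            part[u] ^= 1                 # undo a non-improving move
    return part

def multilevel_bisect(edges, nodes, coarsest=4):
    coarse_of, c_edges, c_n = match_and_coarsen(edges, nodes)
    if len(nodes) <= coarsest or c_n == len(nodes):
        order = sorted(nodes)            # coarsest level: solve directly
        part = {u: int(i >= len(order) // 2) for i, u in enumerate(order)}
        return refine(edges, nodes, part)
    c_part = multilevel_bisect(c_edges, set(range(c_n)), coarsest)
    part = {u: c_part[coarse_of[u]] for u in nodes}   # uncoarsening: project
    return refine(edges, nodes, part)                 # ... then refine locally
```

Production multilevel partitioners add node/edge weights during coarsening, stronger refinement (e.g. Kernighan-Lin/Fiduccia-Mattheyses passes), and stochastic elements at uncoarsening, as the outline indicates; the recursive coarsen-solve-project-refine skeleton is the same.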

Roberto Battiti, University of Trento, Italy

Reactive Search Optimization and Intelligent Optimization: from algorithms to software (2 hours)

Reactive Search Optimization (RSO) advocates the integration of sub-symbolic machine learning techniques into heuristics for solving complex optimization problems. The word "reactive" hints at a ready response to events during the problem solving process through an internal online feedback loop for the self-tuning of critical parameters. Methodologies of interest for RSO include machine learning and statistics, in particular reinforcement learning, active or query learning, neural networks, and computational intelligence.
Intelligent optimization, a superset of Reactive Search Optimization, refers to a more extended area of research, including online and offline schemes based on the use of memory, adaptation, incremental development of models, experimental algorithmics applied to optimization, and intelligent tuning and design of heuristics. The tutorial focuses on the main methods and the corresponding software tools.
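The online feedback loop described above can be sketched with a small example (a simplified illustration of the general idea, not the exact scheme presented in the tutorial): a tabu search over bitstrings whose tabu tenure is self-tuned during the run, growing when previously visited solutions recur and shrinking otherwise.

```python
import random

def reactive_tabu_search(fitness, n_bits=20, iters=200, seed=0):
    """Maximize fitness over bitstrings with a reactively tuned tabu tenure."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best, best_f = x[:], fitness(x)
    tenure = 1
    tabu_until = [0] * n_bits            # iteration until which each flip is tabu
    visited = set()
    for t in range(1, iters + 1):
        key = tuple(x)
        if key in visited:
            tenure = min(n_bits - 1, tenure + 1)   # repetition: diversify more
        else:
            visited.add(key)
            tenure = max(1, tenure - 1)            # new ground: restrict less
        # pick the best admissible (non-tabu) single-bit flip
        moves = [i for i in range(n_bits) if tabu_until[i] < t]
        def f_after(i):
            x[i] ^= 1
            v = fitness(x)
            x[i] ^= 1
            return v
        i = max(moves, key=f_after)
        x[i] ^= 1
        tabu_until[i] = t + tenure
        f = fitness(x)
        if f > best_f:
            best, best_f = x[:], f
    return best, best_f
```

The history of visited solutions is the "memory", and the tenure update is the internal feedback loop: the parameter is tuned by the run itself rather than set by hand beforehand, which is the core point of the reactive approach.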
Prof. Roberto Battiti received the Laurea degree in Physics from the University of Trento, Italy, in 1985 and the Ph.D. degree from the California Institute of Technology (Caltech), USA, in 1990. He is now full professor of Computer Science at the University of Trento, deputy director of the DISI Department (Electrical Engineering and Computer Science), and director of the LION lab (machine Learning and Intelligent OptimizatioN). His main research interests are heuristic algorithms for optimization problems, in particular reactive search optimization algorithms for discrete optimization problems. R. Battiti is a fellow of the IEEE. Full details about his interests, research activities and scientific production can be found on the web: http://lion.dit.unitn.it/~battiti/ , http://reactive-search.org/ .