Schedule for: 22w5199 - Derivative-Free Optimization: Linking Algorithms and Applications
Beginning on Sunday, July 17, and ending on Friday, July 22, 2022
All times are in UBC Okanagan, Canada local time, PDT (UTC-7).
Sunday, July 17 | |
---|---|
16:00 - 23:00 | Check-in begins at 16:00 on Sunday and is open 24 hours (Front Desk Nechako Residence) |
Monday, July 18 | |
---|---|
08:00 - 08:45 | Breakfast (Sunshine/ADM) |
08:45 - 09:00 | Introduction and Welcome by the UBCO BIRS Staff (Arts Building room 386) |
09:00 - 09:30 |
Charles Audet: Monotonic grey box direct search optimization ↓ We are interested in blackbox optimization for which the user is aware of monotonic behaviour of some constraints defining the problem. That is, when increasing a variable, the user is able to predict if a function increases or decreases, but is unable to quantify the amount by which it varies. We refer to this type of problem as "monotonic grey box" optimization problems. Our objective is to develop an algorithmic mechanism that exploits this monotonic information to find a feasible solution as quickly as possible. With this goal in mind, we have built a theoretical foundation through a thorough study of monotonicity on cones of multivariate functions. We introduce a trend matrix and a trend direction to guide the Mesh Adaptive Direct Search (Mads) algorithm when optimizing a monotonic grey box optimization problem. Different strategies are tested on some analytical test problems, and on a real hydroelectric dam optimization problem. (Arts Building room 386) |
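As a rough illustration of the trend-matrix idea (a hedged sketch of the concept only, not the authors' Mads mechanism; the function and the example data are hypothetical), entries in {-1, 0, +1} can encode the known monotonicities and suggest a direction expected to reduce the violated constraints:

```python
import numpy as np

# Illustrative sketch: T[i, j] in {-1, 0, +1} records whether constraint
# c_i <= 0 is known to be nondecreasing (+1), nonincreasing (-1), or of
# unknown monotonicity (0) in variable x_j.

def trend_direction(T, violated):
    """Aggregate the trends of violated constraints into a unit direction."""
    # Moving x_j against the trend of a violated constraint should lower it;
    # summing over the violated rows gives a heuristic feasibility direction.
    d = -T[violated].sum(axis=0).astype(float)
    norm = np.linalg.norm(d)
    return d / norm if norm > 0 else d

T = np.array([[+1, -1, 0],    # c_1 increases with x_1, decreases with x_2
              [0, +1, +1]])   # c_2 increases with x_2 and x_3
print(trend_direction(T, violated=[0]))   # direction expected to lower c_1
```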
09:30 - 09:45 | Coffee Break (Arts Building room 386) |
09:45 - 10:15 |
Gabriel Jarry-Bolduc: A derivative-free trust region algorithm using calculus rules to build the model function ↓ In this talk, we consider a box-constrained minimization problem where the objective function is composed of more than one function. We will present the potential benefits of using a calculus-based approach when building the model function. (Arts Building room 386) |
10:15 - 11:00 | Coffee Break (Arts Building room 386) |
11:00 - 11:30 |
Francesco Rinaldi: A weak tail-bound probabilistic condition for function estimation in stochastic derivative-free optimization ↓ In this talk, we use tail bounds to define a probabilistic condition for function estimation that eases the theoretical analysis of stochastic derivative-free optimization methods. In particular, we focus on the unconstrained minimization of a potentially non-smooth function, whose values can only be estimated via stochastic observations, and give a simplified convergence proof for both a direct search and a basic trust-region scheme. Finally, we discuss some possible extensions. (Arts Building room 386) |
11:30 - 13:30 | Lunch (Sunshine Café) |
13:30 - 14:00 |
Sébastien Le Digabel: SOLAR: A solar thermal power plant simulator for blackbox optimization benchmarking ↓ This work introduces SOLAR, a collection of optimization problems provided as a benchmarking tool for blackbox solvers. Each problem optimizes the design of a concentrated solar power plant defined as a blackbox numerical model. The type of variables, dimensionality, and number and type of constraints differ across problems. Optimization may be single- or biobjective. The solar plant model considers several subsystems: a heliostat field, a central cavity receiver, a molten salt thermal energy storage, a steam generator, and an idealized power block. Benchmark optimization results are provided using the NOMAD software package. (Arts Building room 386) |
14:00 - 14:15 | Coffee Break (Arts Building room 386) |
14:15 - 14:45 |
Sébastien Kerleau: Positive k-spanning sets and their use in derivative-free optimization ↓ A positive spanning set (PSS) is a set of vectors that spans the whole space using non-negative linear combinations. Derivative-free optimization methods based on PSSs typically favor those with the best cosine measure. A good cosine measure implies that the directions in the PSS can be useful for optimization purposes; however, this metric does not fully account for the structure of the PSS. In particular, it does not reflect the spanning capabilities of a given subset of the PSS vectors.
In this talk, we will focus on a particular subclass of PSSs, called positive k-spanning sets. A positive k-spanning set (PkSS) remains positively spanning even when some of its elements are removed. After formally defining the positive k-spanning property, we will provide examples of PkSSs. We will then explain how to construct PkSSs of minimum cardinality, using arguments from polytope theory. Finally, we will introduce a new measure for quantifying the quality of a PkSS, inspired by the cosine measure for positive spanning sets. (Arts Building room 386) |
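For readers unfamiliar with the cosine measure mentioned above, the sketch below gives a generic Monte Carlo estimate (an upper bound that tightens with more samples); it is illustrative only and is not the new PkSS measure from the talk. Recall that a set D of nonzero vectors positively spans R^n exactly when its cosine measure cm(D) = min over unit u of max over d in D of (u.d)/||d|| is positive:

```python
import numpy as np

def cosine_measure_estimate(D, n_samples=100_000, seed=0):
    """Monte Carlo upper bound on cm(D); rows of D are the directions."""
    rng = np.random.default_rng(seed)
    Dn = D / np.linalg.norm(D, axis=1, keepdims=True)   # normalize directions
    U = rng.standard_normal((n_samples, D.shape[1]))
    U /= np.linalg.norm(U, axis=1, keepdims=True)       # random unit vectors
    return (U @ Dn.T).max(axis=1).min()   # min over u of max over directions

# Minimal positive basis of R^2: the coordinate vectors and their negative sum.
D = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
print(cosine_measure_estimate(D))   # small positive value: D positively spans
```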
14:45 - 15:30 | Coffee Break (Arts Building room 386) |
15:30 - 16:00 |
Giampaolo Liuzzi: A new derivative-free interior point method for constrained black-box optimization ↓ Nowadays, black-box optimization problems are ubiquitous in many applications. This class of problems is characterized by objective and/or constraint functions that are only known through their input-output behavior, i.e. no analytical expression of the functions is available. Thus, typically, first-order derivatives cannot be used or are, at the very least, untrustworthy. Such problems mainly arise in contexts where the objective and constraint functions are computed by means of relatively complex and time-consuming simulators. Frequently in such applications, constraints are hard in the sense that the functions cannot be computed (or may not even be defined) outside of the feasible region. In such situations, it is customary to use an extreme or dead penalty approach, in which an extremely high value is assigned to the objective function outside of the feasible region. In this way, if a feasible starting point is known, the algorithm remains trapped within the feasible region. However, such an approach frequently causes numerical difficulties, which are basically due to the discontinuity of the objective function on the boundary of the feasible region. In this talk, we approach the problem with hard constraints by means of an interior log-barrier penalty function. We propose an algorithm model and study its theoretical convergence properties. Further, we also present preliminary numerical results which show the viability of the proposed method. (Arts Building room 386) |
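For reference, the classical interior log-barrier for inequality constraints g_i(x) <= 0 takes the standard form below; the penalty used in the talk may differ in detail. Driving the barrier parameter to zero keeps iterates strictly feasible while approaching a solution:

```latex
% Log-barrier penalty for min f(x) s.t. g_i(x) <= 0, defined on the
% strict interior (g_i(x) < 0); mu > 0 is driven to zero.
\[
  B_{\mu}(x) \;=\; f(x) \;-\; \mu \sum_{i=1}^{m} \log\bigl( -g_i(x) \bigr) .
\]
```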
16:00 - 16:15 | Coffee Break (Arts Building room 386) |
16:15 - 16:45 |
Christine Shoemaker: Surrogate-assisted many-objective optimization with reference vector guided candidate search and aggregated surrogate model ↓ Designing many-objective optimization algorithms that cope with more than three objectives is challenging, especially when objective functions are multimodal and expensive to evaluate. In this study, we present a novel surrogate-assisted many-objective optimization algorithm named RECAS. The proposed algorithm is a non-evolutionary method that iteratively determines new points for expensive evaluation via a series of independent reference vector-guided candidate searches. In each candidate search, unlike most prior related studies, RECAS employs a surrogate model in an aggregated manner to directly approximate the acquisition function that maps each point to its quality assessment indicator rather than a certain objective.
We prove that RECAS converges almost surely to the Pareto-optimal front under some mild assumptions. Finally, in the numerical experiments, RECAS generally outperforms three recent state-of-the-art algorithms in maintaining convergent and well-spread approximations of the Pareto-optimal front on a suite of 84 widely used test problems with up to 10 objectives. The good performance of RECAS on two watershed simulation model calibration problems further indicates its great potential in handling computationally expensive real-world applications. (Zoom) |
Tuesday, July 19 | |
---|---|
08:00 - 09:00 | Breakfast (Sunshine/ADM) |
09:00 - 09:30 |
Matt Menickelly: Stochastic average model methods ↓ We consider the solution of finite-sum minimization problems, such as those appearing in nonlinear least squares or general empirical risk minimization problems. However, we are motivated by problems in which the summand functions are computationally expensive, and evaluating all summands on every iteration of an optimization method may be undesirable. We present in this talk the idea of stochastic average model (SAM) methods, inspired by stochastic average gradient (SAG) methods. SAM methods sample component functions on each iteration of a trust-region method according to a discrete probability distribution on component functions; the distribution is designed to minimize an upper bound on the variance of the resulting stochastic model. We present results concerning an implemented variant extending the derivative-free model-based trust-region solver POUNDERS, which we call SAM-POUNDERS. (Arts Building room 386) |
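As a generic illustration of sampling components from a variance-aware discrete distribution (an importance-sampling sketch under assumed per-component bounds; this is not the SAM-POUNDERS implementation, and all names are hypothetical):

```python
import numpy as np

# To estimate f(x) = sum_i f_i(x) from a subsample, draw indices i with
# probabilities p_i and reweight by 1/p_i: the estimator is unbiased, and
# p_i proportional to a known bound b_i >= |f_i(x)| minimizes a standard
# upper bound on its variance.

def subsampled_sum(f_values, bounds, k, seed=0):
    rng = np.random.default_rng(seed)
    p = bounds / bounds.sum()                       # variance-aware weights
    idx = rng.choice(len(f_values), size=k, p=p)    # sample k components
    return np.mean(f_values[idx] / p[idx])          # importance-weighted mean

f_values = np.array([5.0, 0.1, 0.2, 4.0])   # component values at a point x
bounds = np.array([5.0, 0.5, 0.5, 4.0])     # known per-component bounds
print(subsampled_sum(f_values, bounds, k=2), f_values.sum())
```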
09:30 - 09:45 | Coffee Break (Arts Building room 386) |
09:45 - 10:15 |
Kwassi Joseph Dzahini: Constrained stochastic blackbox optimization using a progressive barrier and probabilistic estimates ↓ This work introduces StoMADS-PB, a direct-search method using a progressive barrier approach for constrained stochastic blackbox optimization. The values of the objective and constraint functions can only be computed with random noise whose distribution is unknown. Since the deterministic computable version of the blackbox is unavailable, StoMADS-PB uses estimates and introduces probabilistic bounds (for its constraint violation function), required to be accurate and reliable with high, but fixed, probabilities. Using Clarke calculus and martingale theory, Clarke stationarity convergence results for the objective and the violation function are derived with probability one. (Arts Building room 386) |
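For context, progressive-barrier methods aggregate the inequality constraints g_i(x) <= 0 into a single constraint violation function; a common choice in the MADS literature (the exact definition used by StoMADS-PB is not reproduced here) is the squared-violation sum:

```latex
% Constraint violation function commonly used with the progressive
% barrier: zero exactly on the feasible set, positive elsewhere.
\[
  h(x) \;=\; \sum_{i=1}^{m} \bigl( \max\{ g_i(x),\, 0 \} \bigr)^{2} .
\]
```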
10:15 - 11:00 | Coffee Break (Arts Building room 386) |
11:00 - 11:30 |
Solène Kojtych: Escaping unknown discontinuous regions in blackbox optimization ↓ The design of key nonlinear systems often requires the use of expensive blackbox simulations presenting inherent discontinuities whose positions in the space of variables cannot be analytically predicted. Without further precautions, the solution of related optimization problems leads to design configurations which may be close to discontinuities of the blackbox output functions. These discontinuities may betray unsafe regions of the design space, such as nonlinear resonance regions. To account for possible changes of operating conditions, an acceptable solution must be away from unsafe regions of the space of variables.
The objective of the presented work is to solve a constrained blackbox optimization problem with an additional constraint that the solution should be outside unknown zones of discontinuities or strong variations of the objective function or the constraints. The proposed approach is an extension of the Mesh Adaptive Direct Search (Mads) algorithm and aims at building a series of inner approximations of these zones. The algorithm, called DiscoMads, relies on two main mechanisms: revealing discontinuities and progressively escaping the surrounding zones. A convergence analysis supports the algorithm and preserves the optimality conditions of Mads. Numerical tests will be presented on analytical problems and on three engineering problems illustrating possible applications of the algorithm: the design of a simplified truss, the synthesis of a chemical component and the design of a turbomachine blade. The DiscoMads algorithm successfully solves these problems by providing a feasible solution away from discontinuous regions. (Arts Building room 386) |
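As a rough sketch of the revealing mechanism described above (an illustrative reconstruction, not the DiscoMads implementation; the detection radius r_d, the rate threshold tau_d, and the function name are assumptions), evaluated points can be flagged when outputs change faster than a threshold rate within a small radius:

```python
import numpy as np

def reveal_discontinuities(X, F, r_d=0.1, tau_d=100.0):
    """Flag evaluated points whose pairwise rate of change exceeds tau_d."""
    flagged = set()
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            dist = np.linalg.norm(X[i] - X[j])
            if 0 < dist <= r_d and abs(F[i] - F[j]) / dist > tau_d:
                flagged.update((i, j))   # both endpoints mark a risky zone
    return flagged

X = np.array([[0.0], [0.05], [0.5]])      # evaluated points
F = np.array([1.0, 25.0, 1.2])            # a jump between the first two
print(reveal_discontinuities(X, F))       # {0, 1}
```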
11:30 - 13:30 | Lunch (Sunshine Café) |
13:30 - 14:00 |
Juliane Mueller: Surrogate model based optimization for tuning DL model architectures ↓ The performance of deep learning (DL) models depends to a great extent on the model's architecture, which is defined by hyperparameters such as the number of nodes and layers, the learning rate, and so on. Tuning the hyperparameters is difficult and time-consuming because evaluating the performance of a set of hyperparameters requires a lengthy training of the corresponding model. Moreover, stochastic optimizers are commonly used in model training, leading to variability of the architecture's performance.
We will describe an automated optimization method for tuning DL model architectures that takes the prediction variability into account, with the goal of identifying architectures that make robust predictions. To this end, we will exploit surrogate models and active learning strategies, which enable efficient and effective optimization. We will demonstrate our developments on an application arising in particle physics. (Arts Building room 386) |
14:00 - 14:15 | Coffee Break (Arts Building room 386) |
14:15 - 14:45 |
Dominic Huang: A hybrid direct search and model-based derivative-free optimization method with dynamic decision processing and application ↓ A derivative-free optimization (DFO) method is a type of optimization method that does not make use of derivative information in order to find the optimal solution. In this talk, we develop the framework for a DFO method, referred to as the smart DQL method. It is designed to be a versatile hybrid method capable of performing direct search, quadratic-model search, and line search all within a single framework. The smart DQL method is applied to the problem of solid tank design for 3D radiation dosimetry. We show that, given the same evaluation budget, the smart DQL method produces a higher-quality solution than the Grid Search method that was originally employed by the UBCO 3D Radiation Dosimetry Research Group. (Arts Building room 386) |
14:45 - 15:30 | Coffee Break (Arts Building room 386) |
15:30 - 16:00 |
Youssef Diouane: Bayesian optimization: performance assessment and improvements using DFO techniques ↓ Bayesian Optimisation algorithms (BO) are global optimisation methods that iterate by constructing and using conditional Gaussian Processes (GP). It is a common claim that BO is state-of-the-art for costly functions. However, this claim is weakly supported by experimental evidence, as BO is most often compared to itself, rather than to algorithms of different nature.
In this work, we study the performance of BO within the well-known COmparing Continuous Optimizers benchmark (COCO). We first analyse the sensitivity of BO to its own parameters, enabling us to answer general questions regarding the choice of the GP kernel or its trend, the initial GP budget, and the suboptimisation of the acquisition function. Then, we study on which function class and dimension BO is relevant when compared to state-of-the-art optimizers for expensive functions.
The second part of this talk describes an improved BO algorithm, called TREGO (trust-region-like efficient global optimisation). TREGO alternates between regular BO steps and local steps within a trust region. By following a classical scheme for the trust region (based on a sufficient decrease condition), we demonstrate that our algorithm enjoys strong global convergence properties. The COCO benchmark experiments reveal that TREGO consistently outperforms EGO and closes the performance gap with other state-of-the-art blackbox optimization algorithms.
Joint work with Victor Picheny, Rodolphe Le Riche, and Alexandre Scotto Di Perrotolo. (Arts Building room 386) |
Wednesday, July 20 | |
---|---|
08:00 - 09:00 | Breakfast (Sunshine/ADM) |
09:00 - 09:30 |
Ana Custódio: A trust-region based approach to approximate Pareto fronts in multiobjective optimization ↓ We propose an algorithm based on a trust-region approach for computing an approximation to the complete Pareto front of a given multiobjective optimization problem. The algorithm alternates between two main steps: an extreme point step, which moves towards extreme points of the Pareto front, and a scalarization step, which attempts to reduce the gaps on the Pareto front by solving an adequate scalarization problem. In both steps, interpolation models are built to replace the components of the objective function. Numerical experiments illustrate the good performance of the method.
Authors: A. Mohammadi and A.L. Custódio (Arts Building room 386) |
09:30 - 09:45 | Coffee Break (Arts Building room 386) |
09:45 - 10:15 |
Everton Silva: Global optimality integral conditions and an algorithm for multiobjective problems ↓ In this work, we propose global optimality integral conditions for multiobjective problems that are not necessarily differentiable. The integral characterization, already known for single-objective optimization problems, is extended to multiobjective optimization via the weighted sum and the weighted Chebyshev scalarizations. Using this last scalarization, we propose an algorithm for obtaining an approximation to the weak Pareto front, whose effectiveness is illustrated by solving a collection of multiobjective test problems. (Arts Building room 386) |
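For reference, the two scalarizations named above take the following standard forms, with weights w_i >= 0 and an ideal (reference) point z*; the talk may use a variant:

```latex
% Weighted sum and weighted Chebyshev scalarizations of
% min (f_1(x), ..., f_k(x)) -- standard textbook forms.
\[
  \min_{x}\; \sum_{i=1}^{k} w_i\, f_i(x)
  \qquad \text{and} \qquad
  \min_{x}\; \max_{1 \le i \le k}\; w_i \bigl( f_i(x) - z_i^{*} \bigr).
\]
```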
10:15 - 11:00 | Coffee Break (Arts Building room 386) |
11:00 - 11:30 |
Jeff Larson: A toolbox for optimizing nonsmooth composite objectives ↓ We present a new software toolbox for the solution of a broad class of bound-constrained optimization problems with composite objective functions. The toolbox implements four optimization methods, three of which are novel in the literature. They include an implementation of a manifold sampling method from previous work, a novel manifold sampling algorithm that is dual in a particular sense to previous manifold sampling methods, and two methods that employ global optimization subproblems. We provide rigorous convergence analysis and demonstrate extensive testing of these methods. (Arts Building room 386) |
11:30 - 11:45 | Group photos (Arts Building room 386) |
11:45 - 13:30 | Lunch (Sunshine Café) |
13:30 - 14:00 |
Yves Lucet: Bi-objective optimization for road vertical alignment design ↓ We study a bi-objective optimization model to find road profiles that optimize the road construction cost and the vehicle operating costs, specifically the fuel consumption. The research implements and validates the formula for the fuel consumption cost. It further presents and examines a variety of well-known methods: three classical scalarization techniques (the epsilon-constraint method, the weighted sum method, and weighted metric methods) and two evolutionary methods (NSGA-II and FP-NSGA-II). It also proposes a warm start strategy and analyzes the results using commonly-used performance indicators: hypervolume (to assess the convergence of solutions), spacing (to assess the diversity of solutions), and CPU time (to assess computation speed). We find that the warm start strategy improves the performance of all the scalarization techniques, and that the most promising method for the proposed problem is the epsilon-constraint method with a warm start. (Arts Building room 386) |
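For reference, the epsilon-constraint scalarization named above takes the standard form below, minimizing one objective while bounding the other; sweeping the bound traces out an approximation of the Pareto front:

```latex
% Epsilon-constraint scalarization of a bi-objective problem
% min (f_1(x), f_2(x)) over x in X -- standard textbook form.
\[
  \min_{x \in X} \; f_1(x)
  \qquad \text{subject to} \qquad
  f_2(x) \le \varepsilon .
\]
```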
14:00 - 14:15 | Coffee Break (Arts Building room 386) |
14:15 - 14:45 |
Yiwen Chen: Adjusting the centred simplex gradient to compensate for misaligned sample points ↓ The centred simplex gradient (CSG) is a popular gradient approximation technique in derivative-free optimization. Its computation requires a perfectly symmetric set of sample points, and it is known to provide order-2 accuracy with respect to the radius of the sample set. In this talk, we consider the situation where the set of sample points is not perfectly symmetric. By adapting the formula for the CSG to compensate for the misaligned points, we define a new Adapted-CSG. We prove that the Adapted-CSG retains order-2 accuracy and present numerical examples to demonstrate its properties. (Arts Building room 386) |
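For readers unfamiliar with the CSG, here is a minimal sketch of its standard symmetric form (the Adapted-CSG from the talk, which compensates for misaligned points, is not reproduced; the names are illustrative):

```python
import numpy as np

# For a symmetric sample set {x0 +/- s_i}, the centred simplex gradient g
# solves S g = delta, where row i of S is s_i and
#   delta_i = (f(x0 + s_i) - f(x0 - s_i)) / 2 .

def centred_simplex_gradient(f, x0, S):
    """Rows of S are the directions s_i; returns the least-squares CSG."""
    delta = np.array([(f(x0 + s) - f(x0 - s)) / 2.0 for s in S])
    g, *_ = np.linalg.lstsq(S, delta, rcond=None)
    return g

f = lambda x: x[0] ** 2 + 3.0 * x[1]
x0 = np.array([1.0, 2.0])
S = 1e-3 * np.eye(2)                        # coordinate directions, radius 1e-3
print(centred_simplex_gradient(f, x0, S))   # close to the true gradient [2, 3]
```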
14:45 - 15:30 | Coffee Break (Arts Building room 386) |
15:30 - 16:00 |
Margherita Porcelli: BFO 2.0: a new release of the Brute Force Optimizer that merits its name a little bit less ↓ This talk briefly reviews some of the new features of BFO, a derivative-free optimization Matlab package. Among these, several important new problem-oriented features will be discussed:
a) Using coordinate partially-separable problem structure. This ubiquitous problem structure can now be exploited by BFO, leading to very significant gains in performance (orders of magnitude) and also allowing the use of BFO for large problems (several thousand variables).
b) The BFOSS library of model-based search steps. Release 2.0 of BFO now also supplies BFOSS, a BFO-compatible library whose purpose is to compute interpolation-based search steps. Various options are available, ranging from sublinear to fully quadratic models. BFOSS supports model building for objective functions given in coordinate-partially-separable form. This combination provides even further significant performance improvements.
c) Categorical variables. BFO now supports the use of categorical variables. Categorical variables are unconstrained non-numeric variables whose possible 'states' are defined by strings (such as 'blue'). These states are not implicitly ordered, as would be the case for integer or continuous variables. The user specifies the application-dependent neighbourhoods for categorical variables in two possible ways (static neighbourhoods and dynamic ones), allowing full application-dependent flexibility.
d) Performance and data profile training strategies. Because BFO is a trainable package (meaning that its internal algorithmic constants can be trained/optimized by the user to optimize its performance on a specific problem class), it needs training strategies that decide whether a particular option is better than another. BFO Release 2.0 now includes two new training strategies: performance profiling and data profiling. These strategies use the eponymous tools for comparing and optimizing algorithmic performance.
BFO Release 2.0 is available at https://github.com/m01marpor/BFO free of charge. (Arts Building room 386) |
Thursday, July 21 | |
---|---|
08:00 - 09:00 | Breakfast (Sunshine Café) |
09:00 - 09:30 |
Warren Hare: Understanding positive bases ↓ A positive basis is a set that non-negatively spans R^n and contains no proper subset with the same property. These attributes make positive bases a useful tool in derivative-free optimization (DFO) and an interesting concept in mathematics. In this talk, we examine some properties of positive bases, including how to check if something is a positive basis and how to construct positive bases with nice structures. (Arts Building room 386) |
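One classical way to run the "check" mentioned above is sketched below (a textbook characterization via linear programming, not necessarily the approach taken in the talk; the names are illustrative): a set of nonzero vectors positively spans R^n iff it spans R^n and zero is a strictly positive combination of its elements, and a positive basis is a positive spanning set with no positively spanning proper subset.

```python
import numpy as np
from scipy.optimize import linprog

def positively_spans(D):
    """Columns of D positively span R^n (rank + strictly positive null vector)."""
    n, m = D.shape
    if np.linalg.matrix_rank(D) < n:
        return False
    # Feasibility LP: D @ lam = 0 with lam >= 1 (strict positivity, rescaled).
    res = linprog(c=np.zeros(m), A_eq=D, b_eq=np.zeros(n),
                  bounds=[(1, None)] * m)
    return res.success

def is_positive_basis(D):
    """Positively spans, and no subset missing one column still does."""
    m = D.shape[1]
    return positively_spans(D) and all(
        not positively_spans(np.delete(D, j, axis=1)) for j in range(m))

D = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0, -1.0]])   # minimal positive basis of R^2
print(is_positive_basis(D))        # True
```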
09:30 - 09:45 | Coffee Break (Arts Building room 386) |
09:45 - 10:15 |
Ludovic Salomon: Handling of constraints in multiobjective blackbox optimization ↓ The last decade has seen the development of new efficient convergent derivative-free and blackbox optimization methods to solve multiobjective problems. Most of them are extensions of single-objective blackbox algorithms. However, very few have been designed to handle general inequality blackbox constraints. This work proposes two new extensions of the MADS algorithm to multiobjective blackbox optimization. One is based on the progressive barrier, the other on a two-phase approach. Numerical experiments on synthetic benchmarks and three "real-case" problems show that these two approaches perform well compared to other state-of-the-art algorithms. (Arts Building room 386) |
10:15 - 11:00 | Coffee Break (Arts Building room 386) |
11:00 - 11:30 |
Stefan Wild: Randomized subspace and adaptive methods for large-scale derivative-free optimization ↓ An especially challenging regime in data-driven science and engineering is when one can only query a noisy oracle. In this talk, we seek to improve the scalability of derivative-free methods in this regime by investigating methods that iteratively work in randomized subspaces or otherwise adapt noisy estimates. We highlight theoretical and empirical results for these new methods and address ongoing opportunities for improved performance in practice. (An illustrative sketch of the subspace idea appears after the collaborator list below.)
Joint work with:
Raghu Bollapragada, University of Texas (https://sites.google.com/view/raghub)
Kwassi Joseph Dzahini, Argonne (https://www.anl.gov/profile/kwassi-joseph-dzahini)
Cem Karamanli, University of Texas (https://www.cemkaramanli.com)
Xiaoqian Liu, NC State University (https://xiaoqian-liu.github.io) (Arts Building room 386) |
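The illustrative randomized-subspace sketch referenced in the abstract above (a generic scheme for intuition only, not the methods of the talk; every parameter choice is an assumption): each iteration draws a random low-dimensional subspace, estimates directional derivatives there by central differences on the noisy oracle, and steps only within that subspace.

```python
import numpy as np

def subspace_step(f, x, p, h, step, rng):
    n = x.size
    P, _ = np.linalg.qr(rng.standard_normal((n, p)))   # orthonormal n x p basis
    g = np.array([(f(x + h * P[:, j]) - f(x - h * P[:, j])) / (2 * h)
                  for j in range(p)])                  # directional derivatives
    return x - step * (P @ g)                          # move within span(P)

rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2) + 1e-3 * rng.standard_normal()   # noisy oracle
x = np.ones(50)
for _ in range(1000):
    x = subspace_step(f, x, p=2, h=0.1, step=0.1, rng=rng)
print(np.linalg.norm(x))   # decays toward the noise floor near the minimizer
```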
11:30 - 13:30 | Lunch (Sunshine Café) |
13:30 - 17:00 | Break out groups (TBA) |
Friday, July 22 | |
---|---|
09:00 - 16:30 |
Excursion to Big White ↓ Hiking, biking, or just sightseeing. Transportation will be arranged. If you need to return by a specific time, please let us know. (TBA)