FORUM

Some thoughts on the role of robustness analysis in decision-aiding processes
 

Luis C. Dias

INESC Coimbra and Faculty of Economics,

University of Coimbra, Portugal.

LDias@inescc.pt

 

This third series of our Newsletter, with José Figueira as Editor, has witnessed the appearance of a very interesting Forum on the theme of Robustness Analysis (RA), which already contains an excellent mix of articles. Bernard Roy opened the series with an enlightening set of questions (No. 6, Fall 2002), accompanied by Jonathan Rosenhead’s article on the perspective behind the first uses of the expression “robustness analysis”, and followed by Philippe Vincke (No. 8, Fall 2003) who, like Roy, provides us with a wide-scope perspective on the area. Four contributions followed, one focused on Bayesian inference and decision analysis, the others more focused on optimization contexts. This modest contribution brings us back to the more general scope of multi-criteria decision aiding, to share some thoughts about the role that RA can play in such decision-aiding processes.

 

Motivation

RA is motivated by the difficulties in setting the parameter values of decision-aiding models. Indeed, it is well known that setting technical and economic parameter values is often problematic: instruments and statistics can be imprecise (e.g., confidence intervals), measurement can be arbitrary and subjective (e.g., measuring noise pollution), some information (e.g., clinical data) may be controversial or contradictory, not to mention uncertainties about the future. These are the types of difficulties that most easily come to mind when talking about RA in classical optimization models.

When considering multi-criteria decision aiding, as we wish to do, we also incorporate in the models parameters related to the preferences of the Decision Maker (DM). Eliciting parameter values about preferences is also problematic. In cognitive terms, the parameters are artifacts whose semantics may be difficult for the DM to understand, not to mention biases related to the way questions are posed. For him or her, value judgments are naturally easier to express through words than through numbers. Furthermore, preferences may evolve, as they are often unstable outcomes of unresolved internal conflicts in the DM’s mind. Adding to these fundamental difficulties, other constraints of a more pragmatic nature may be present: for example, the DM may be reluctant to divulge precise parameter values about his or her preferences in public, or his or her time and patience may be rather limited.

Moreover, we often need to address the concerns of a group of actors, rather than a single DM. The above-mentioned difficulties in fixing preference-related parameter values are still present, if not reinforced by the diversity of judgments. In such cases, the existence of “hidden agendas” may hinder an open discussion about parameter values. Even in the case of consensus, one must be aware of phenomena such as groupthink.

 

Concepts of Robustness Analysis

Sensitivity Analysis (SA) is a traditional answer to the difficulties in setting the parameter values. In optimization, it indicates how much the parameters may vary without changing some conclusion of interest. RA is often seen as a reverse perspective of SA, but that depends on the notion of RA being considered. Indeed, we may find multiple perspectives on the concept. To Rosenhead (No. 6, Fall 2002), RA is used to choose an action that leaves many good options open as regards the choices to be made in the future. Kouvelis and Yu [7] define a robust solution to an optimization problem as the one that has the best performance in its worst case (e.g., the max-min rule). Another possibility is proposed by Aloulou et al. in this Forum (No. 12, Fall 2005). Mulvey et al. [9] differentiate being solution robust (the solution yields a near-optimal value for every acceptable version of the parameter values) from being model robust (the solution is always feasible or almost feasible for every version). In this Forum (No. 10, Fall 2004), Sevaux and Sörensen introduce a concept of solution robustness meaning that the solution (a plan) does not change much in optimization programs that are to be repeated regularly. More generally, Hites et al. [6] call for a multicriteria evaluation of robustness.
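As a small illustration of the max-min rule just mentioned, consider the following sketch in Python (all alternative names and performance figures are invented for the purpose of the example; this is not taken from [7]):

    # Hypothetical performances (higher is better) of three alternatives
    # under three equally acceptable versions of the parameter values.
    performance = {
        "a1": [10.0, 2.0, 9.0],
        "a2": [7.0, 6.0, 7.0],
        "a3": [8.0, 7.0, 3.0],
    }

    # Max-min rule (absolute robustness, in the spirit of [7]): choose the
    # alternative whose worst-case performance is the best one.
    robust_choice = max(performance, key=lambda a: min(performance[a]))
    print(robust_choice, min(performance[robust_choice]))  # prints: a2 6.0

In this invented example, a2 is not the best alternative under any single version, yet it is the most robust choice in the max-min sense.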

The perspectives that are nearest to the reverse of SA are Roy’s definition of a robust conclusion [11,12], as an assertion that is valid for the set of results compatible with the different model versions envisaged, and Vincke’s [13] definition of a robust solution as one that is always near (or does not contradict) any other solution obtainable using an acceptable version ([13] also introduced the notion of a robust method).

Here, a version (to use the term recently proposed by Roy) of the model (or problem, in Roy’s words) is formally a combination of parameter values defining a model (e.g., a linear programming model, or an Electre model). Usually, the model versions are considered as equally acceptable, without attempting to define a “meta-model” that would attribute different degrees of probability (or possibility, or importance…) to different versions.

 

Roles for Robustness Analysis

 The role of RA in decision aiding does not seem to have been much discussed so far. Most of the proposed RA approaches can be separated according to their placement (ex-ante vs. ex-post) with respect to using a method to obtain a solution.

One of the possibilities is to consider RA as an ex-ante concern, which amounts to embedding this concern in a model to be optimized. In these cases, usually optimization problems, a model is built and an algorithm is used to obtain a solution that is robust according to some pre-specified criterion. The obtained solution will be optimal with respect to that criterion (e.g., it minimizes the maximum cost or the maximum regret), even though it might not be optimal for any of the versions considered. Examples of such approaches are [7,9]. An approach that seems particularly promising is to use several criteria to be optimized rather than a single one (as [6] suggested).
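The following sketch (continuing in Python, with invented cost figures) contrasts the two ex-ante criteria mentioned above, minimizing the maximum cost and minimizing the maximum regret, and shows that they may recommend different alternatives:

    # Hypothetical costs (lower is better) of three alternatives under three versions.
    costs = {
        "a1": [100.0, 120.0, 145.0],
        "a2": [110.0, 115.0, 140.0],
        "a3": [ 90.0, 160.0, 130.0],
    }
    n_versions = 3

    # Min-max cost: minimize the worst-case cost.
    minimax_cost = min(costs, key=lambda a: max(costs[a]))

    # Min-max regret: the regret of a in version v is its cost minus the best
    # cost achievable in that version; minimize the maximum regret.
    best = [min(costs[a][v] for a in costs) for v in range(n_versions)]
    minimax_regret = min(costs, key=lambda a: max(costs[a][v] - best[v]
                                                  for v in range(n_versions)))
    print(minimax_cost, minimax_regret)  # prints: a2 a1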

A second possibility is to consider RA as an ex-post concern, substituting or complementing SA, to assess how robust a solution derived from a decision-aiding process is and to supply additional robust conclusions. Arguably, the first example of this type of approach is found in [12]. Such approaches may be useful to question the validity of the recommendation and to see how its evaluation might change from version to version, possibly identifying its limits and enriching the information that may be provided to the DM. For instance, rather than saying that x is the best alternative in a choice problem, one may inform the DM that all alternatives are outranked by either x or y, explaining the main differences between the versions that favour x and those that favour y, and adding that y is always a relatively good choice, while there are versions where x receives a poor evaluation.
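Such an ex-post robust conclusion can be checked mechanically once the results of each version are available. The sketch below (again with invented data, here a set of outranking pairs per version) simply verifies whether the assertion “every other alternative is outranked by either x or y” holds in all versions:

    # Hypothetical outranking results: each set contains the ordered pairs (a, b)
    # such that a outranks b in the corresponding version.
    outranking_per_version = [
        {("x", "z"), ("x", "w"), ("y", "w")},
        {("x", "z"), ("x", "w"), ("y", "z")},
        {("x", "z"), ("y", "z"), ("y", "w")},
    ]
    alternatives = {"x", "y", "z", "w"}

    def conclusion_holds(outranking):
        # "Every alternative other than x and y is outranked by x or by y."
        return all(("x", a) in outranking or ("y", a) in outranking
                   for a in alternatives - {"x", "y"})

    # A robust conclusion must hold for every version considered.
    print(all(conclusion_holds(v) for v in outranking_per_version))  # prints: True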

Before discussing a third possibility, we may note that, for the approaches mentioned so far, the set of versions is considered to have been defined a priori. As Roy notes in this Forum, this may cause a dilemma between the wish to take into account every conceivable version and the wish to obtain some useful conclusions. It is perhaps because of this dilemma that Roy [11] had earlier proposed the notion of an approximately robust conclusion: a formal assertion that is verified for all the versions except a few, which are considered negligible.

When we consider preference-related parameters, a third possibility is based on the idea of trying to progressively reduce the set of versions considered. This means using RA throughout the whole decision process as a tool to guide that process. The decision-aiding process will then alternate between elicitation phases and RA phases. In elicitation phases, the DM will be questioned about parameter values, possibly indirectly, without requesting precise numbers (e.g., the answer can be an interval, or a comparison relation between two parameters), noting that difficult elicitation questions may be avoided at early stages (allowing the DM to learn before answering). The DM’s answers will then be used to constrain the set of versions considered. In RA phases, the robust conclusions corresponding to the current set of versions are discussed. This may in turn motivate new elicitation questions when returning to an elicitation phase.

If this third possibility is adopted, then RA becomes interactive, which is best achieved when there exists software to aid the DM and (possibly) an analyst during the successive iterations. We next provide two examples of such software.

 

VIP Analysis (for details see [2])

This software is intended to support choice decisions using additive value functions, allowing robust conclusions to be drawn when different versions of the scaling weights (k1,…,kn) are used. In elicitation phases, the DM may indicate any information that can be translated into a linear constraint, such as intervals for weights or weight ratios, parameter comparisons (e.g., k1 ≥ k2), or holistic comparisons (e.g., a1 is not worse than a2). In RA phases, VIP Analysis uses linear programming to identify the minimum and maximum value that each alternative may achieve, as well as the minimum and maximum differences of value between each alternative and the other ones.
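The computation just described can be sketched with a general-purpose LP solver. The code below is not the actual VIP Analysis implementation, only an illustration of the underlying idea: given each alternative’s scores on the criteria and linear constraints on the weights (here non-negativity, normalization, and an assumed elicited constraint k1 ≥ k2), it computes the minimum and maximum additive value each alternative may achieve (scipy is assumed to be available; all figures are invented):

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical value scores of three alternatives on three criteria (one row per alternative).
    scores = np.array([[0.9, 0.2, 0.5],
                       [0.6, 0.7, 0.6],
                       [0.3, 0.9, 0.8]])

    # Acceptable versions: weight vectors k with k >= 0 and k1 + k2 + k3 = 1,
    # plus the elicited constraint k1 >= k2 (written as k2 - k1 <= 0).
    A_eq, b_eq = [[1.0, 1.0, 1.0]], [1.0]
    A_ub, b_ub = [[-1.0, 1.0, 0.0]], [0.0]
    bounds = [(0.0, 1.0)] * 3

    for i, row in enumerate(scores):
        # Minimum and maximum additive value of alternative i over all acceptable versions.
        vmin = linprog(row, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
        vmax = -linprog(-row, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
        print(f"a{i+1}: value in [{vmin:.2f}, {vmax:.2f}]")

Comparing such intervals (and, analogously, the extreme value differences between pairs of alternatives) is what supports the conclusions discussed next.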

The outputs of RA indicate which alternatives are most affected by imprecision, also indicating the versions leading to the extreme results (hence inviting the DM to ponder whether such versions are acceptable or not). In a choice problematic, RA also highlights which alternatives may be discarded (dominated or quasi-dominated with respect to the versions), allowing a progressive reduction of the number of alternatives.

 

IRIS (for details see [5])

This software is intended to support sorting decisions using Electre Tri models, allowing different versions of the weights (k1,…,kn) and the cutting level (λ). It implements the idea of integrating RA with an aggregation/disaggregation (parameter inference) approach proposed in [4]. In elicitation phases, besides linear constraints on the weights, the DM may indicate sorting examples, which should be reproduced by IRIS. In RA phases, IRIS uses linear programming to show the range of categories into which each alternative may be sorted, and to infer which of the versions would satisfy the constraints with maximum slack. It also provides some guidance when the inputs happen to be inconsistent.
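To give a flavour of what a range of categories means, the sketch below enumerates a few versions explicitly and runs a deliberately simplified, concordance-only pessimistic Electre Tri assignment for each of them. This is a didactic simplification, not how IRIS works: IRIS relies on linear programming over a continuous set of versions and on the full Electre Tri model, and the profiles, performances and versions here are invented.

    # Lower profiles of categories C2 and C3 on two criteria (category C1 lies below profile 1).
    profiles = [[10, 10], [15, 15]]

    def pessimistic_category(perf, weights, lam):
        # Assign the alternative to the highest category whose lower profile it outranks,
        # where "outranks" is reduced here to: concordance >= cutting level lam.
        for h in range(len(profiles), 0, -1):
            concordance = sum(w for w, p, b in zip(weights, perf, profiles[h - 1]) if p >= b)
            if concordance >= lam:
                return h + 1
        return 1

    # A few acceptable versions: (weights, cutting level).
    versions = [([0.5, 0.5], 0.70), ([0.7, 0.3], 0.60), ([0.3, 0.7], 0.75)]
    alternatives = {"a1": [16, 12], "a2": [11, 14]}

    for name, perf in alternatives.items():
        cats = [pessimistic_category(perf, w, lam) for (w, lam) in versions]
        print(name, "may be sorted into categories", min(cats), "to", max(cats))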

IRIS encourages the DM to interact with it by communicating sorting examples, aiming to progressively reduce the interval of categories into which each alternative may be sorted. As in VIP Analysis, IRIS indicates the versions corresponding to extreme results (worst and best categories for each alternative), thus inviting the DM to ponder the acceptability of such versions.

 

What these tools have in common is that they implement RA as a means to guide a decision-aiding process, prompting questions for the DM to analyze, indicating which results are most affected by his or her answers, and showing what can be robustly concluded. The aim is not to select a version, but to highlight a set of robust conclusions that is found to be requisite (in the sense of [10]). This type of approach seems particularly well suited when the RA concerns parameters related to preferences, in that the number of versions can be reduced as a result of learning or increased effort from the DM (it may also be indicated for other parameters that can be known with higher precision but at an additional cost, e.g., data from surveys or experimental data).

When the motivation for RA stems from the existence of multiple DMs, this type of approach also seems promising as a tool to guide a group decision process. In such processes, many versions may be needed to accommodate all the different views, and this set of versions can be discussed throughout an interactive process based on successive agreements. RA will show where disagreement is strongest, motivate the issues to be discussed, and highlight robust conclusions (agreement). Some steps exploring these ideas have been taken recently [1,3,8].

  

References:

[1]     Damart S, Dias LC, Mousseau V (to appear), Supporting groups in sorting decisions: methodology and use of a multi-criteria aggregation-disaggregation DSS, Decision Support Systems.

[2]     Dias LC, Clímaco JN (2000), Additive aggregation with variable interdependent parameters: the VIP Analysis software, Journal of the Operational Research Society 51, 1070-1082.

[3]     Dias LC, Clímaco JN (2005), Dealing with imprecise information in group multicriteria decisions: a methodology and a GDSS architecture, European Journal of Operational Research 160, 291-307.

[4]     Dias L, Mousseau V, Figueira J, Clímaco J (2002), An aggregation/disaggregation approach to obtain robust conclusions with ELECTRE TRI, European Journal of Operational Research 138, 332-348.

[5]     Dias LC, Mousseau V (2003), IRIS: A DSS for multiple criteria sorting problems, Journal of Multi-Criteria Decision Analysis 12, 285-298.

[6]     Hites R, De Smet Y, Risse N, Salazar-Neumann M, Vincke P (2003), A comparison between multicriteria and robustness frameworks, Preprint SMG/ULB 2003/16.

[7]     Kouvelis P, Yu G (1997), Robust discrete optimization and its applications, Kluwer.

[8]     Lamboray C (2005), Some ways of analyzing prudent orders from a robustness perspective, Preprint SMG/ULB 2005/04.

[9]     Mulvey JM, Vanderbei RJ, Zenios SA (1995), Robust optimization of large-scale systems, Operations Research 43, 264-281.

[10]   Phillips LD (1984), A theory of requisite decision models, Acta Psychologica 56, 29-48.

[11]   Roy B (1998), A missing link in OR-DA: robustness analysis, Foundations of Computing and Decision Sciences 23, 141-160.

[12]   Roy B, Bouyssou D (1993), Aide multicritère à la décision: méthodes et cas, Economica, Paris.

[13]   Vincke Ph (1999), Robust solutions and methods in decision aid, Journal of Multi-Criteria Decision Analysis 8, 181-187.

 

EWG-MCDA Newsletter, Series 3, No.13, Spring 2006


