GDRR 2011 - Second Symposium on

GAMES AND DECISIONS IN RELIABILITY AND RISK

Hotel Villa Carlotta, Belgirate (VB), Italy

May 19-21, 2011

TALKS


Debarun Bhattacharjya and Ali Abbas

Normative Executive-Manager Incentive Plans with Multiple Target Ranges

Target-based incentive structures are a practical way to delegate decision making in organizations. When set arbitrarily, the target-based incentive structure may lead to conflicting choices between manager and organization: a manager under such an incentive structure will make decisions that do not necessarily maximize the organization's expected utility. A target is said to be normative when it ensures that decision making by the manager (person trying to achieve the target) is consistent with expected utility maximization of the executive (person representing the organization and setting incentives). The literature on normative target-based decision analysis considers a single target and a single payoff: the manager is rewarded when s/he exceeds the target regardless of the amount by which s/he exceeds it. In many practical situations, incentive plans are designed by considering multiple target ranges, where the payoff is determined by the amount by which the manager exceeds the target. We show how to set normative target-based incentives with multiple payoffs and target ranges. Our approach works for all risk attitudes, so that a manager choosing projects simultaneously maximizes the expected utility of the executive. Moreover, the alternative that is optimal for the executive stochastically dominates all other alternatives for the manager.
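As a hedged illustration of the bookkeeping only (the thresholds $t_j$, payoffs $w_j$ and the matching condition below are our notation, not the authors' construction): if the manager receives $w_j$ when the outcome $X$ falls in $[t_j, t_{j+1})$, with $t_1 < \dots < t_m$, $t_{m+1} = \infty$, $w_0 = 0$ and no payoff below $t_1$, then

$E[\mathrm{incentive}] = \sum_{j=1}^{m} w_j \Pr(t_j \le X < t_{j+1}) = \sum_{j=1}^{m} (w_j - w_{j-1}) \Pr(X \ge t_j),$

so one natural reading of "normative" is that the $(t_j, w_j)$ are chosen so that this step function agrees with a positive affine transformation of the executive's utility, making the manager's and the executive's expected-value rankings of alternatives coincide.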

Debarun Bhattacharjya and Ross Shachter

Solving Asymmetric Decision Problems with Decision Circuits

Decision analysis problems have traditionally been solved using either decision trees or influence diagrams. While decision trees are well-equipped to solve asymmetric problems, they are inefficient for problems that involve several decisions and uncertainties. On the other hand, influence diagrams can solve larger real-world problems by exploiting conditional independence, but they are often ineffective when asymmetry is involved, which is often the case in reliability and risk analysis applications.

Decision circuits are graphical representations for decision analysis computations, building upon the advances in arithmetic circuits for belief network inference. A compact decision circuit can be compiled by exploiting both global structure (conditional independence) as well as local structure (asymmetry) in a decision problem. Decision circuits are particularly effective in real-time decision making and expert systems since they can be compiled offline and then used for analysis through subsequent online sweeps through the circuit. The wealth of partial derivative information available in a circuit can be used for efficient sensitivity analysis and value of information calculations. We show how to directly construct a decision circuit from an asymmetric problem, thereby leveraging all the benefits that a circuit can offer the analyst. We demonstrate the concepts using classic examples from the literature.

Ji Hwan Cha

Decision Problems in Burn-in for Heterogeneous Populations

Burn-in is a method used to eliminate the initial failures in field use. To burn in a component or system means to subject it to a period of use prior to the time when it is actually to be used. Under the assumption of decreasing or bathtub-shaped population failure rate functions, various problems of determining optimal burn-in for homogeneous populations have been intensively studied in the literature. In this paper, we consider burn-in procedures for heterogeneous populations. Recently, there have been new approaches in this direction of research on burn-in: (i) the elimination of weak items by using environmental shocks; (ii) the elimination of weak items based on the failure/repair history during the process; (iii) a burn-in procedure which minimizes the risks of selecting items with large levels of individual failure rates. These approaches will be introduced and relevant new decision problems will be discussed in detail.
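The abstract does not fix a specific cost model; as a hedged, generic illustration of the kind of decision problem involved, the sketch below numerically minimizes a simple burn-in cost for a population that is a mixture of weak and strong exponential items. The cost structure and every number (mixing weight, rates, costs, mission time) are assumptions for illustration only, not from the talk.

```python
# Hypothetical burn-in cost model for a heterogeneous (mixture) population.
# Assumed setting: weak and strong items are exponential with rates lw >> ls;
# burn-in of length b costs c_b per unit of burn-in time plus c_shop per item
# failing in the shop; a field failure within mission time tau costs c_f.
import numpy as np

p_weak, lw, ls = 0.1, 2.0, 0.05        # mixing weight and failure rates (per year)
c_b, c_shop, c_f, tau = 1.0, 5.0, 100.0, 1.0

def expected_cost(b):
    surv_b = p_weak * np.exp(-lw * b) + (1 - p_weak) * np.exp(-ls * b)   # P(survive burn-in)
    w_weak = p_weak * np.exp(-lw * b) / surv_b                           # weak share among survivors
    p_field_fail = (w_weak * (1 - np.exp(-lw * tau))
                    + (1 - w_weak) * (1 - np.exp(-ls * tau)))            # survivor fails in the field
    return c_b * b + c_shop * (1 - surv_b) + c_f * surv_b * p_field_fail

bs = np.linspace(0.0, 2.0, 401)
costs = np.array([expected_cost(b) for b in bs])
print(f"optimal burn-in ~ {bs[costs.argmin()]:.2f} years, expected cost {costs.min():.2f}")
```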

Alireza Daneshkhah and Tim Bedford

Sensitivity Analysis of the System Availability and Reliability: an Emulator-Based Method

The availability of a repairable system is a function of time and of parameters that can be determined through a set of integral equations and are usually calculated numerically. Our main purpose is to study the sensitivity of the system availability - and of other related quantities of interest in reliability analysis, such as expected costs - with respect to changes in the main parameters. Unfortunately, computing such sensitivity measures for a complex system can be infeasible or very time-consuming. In addition, the well-known approaches used to deal with these issues are first-order reliability methods and Monte Carlo simulation methods, both of which have disadvantages and in most cases are not practical.

Here, we introduce a computationally cheap alternative sensitivity measure which enables us to understand how changes in the model input (failure/repair times) influence the output (availability measure). Our approach, which uses a Gaussian process as an emulator, is computationally highly efficient, suitable for highly complex models, and requires a far smaller number of model runs.

The emulator-based sensitivity method is illustrated on several examples, and we discuss the further implications of the technique for reliability and maintenance analysis.
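As a hedged sketch of the emulator idea only (not the authors' implementation), the code below fits a plain Gaussian-process regression with a squared-exponential kernel to a handful of runs of a toy "availability model" and differentiates the emulator mean to obtain cheap local sensitivities; the toy availability formula, the fixed kernel hyperparameters and the design are all assumptions.

```python
# Minimal GP-emulator sketch (toy example, not the authors' method): emulate
# steady-state availability A = mu / (lam + mu) from a few "model runs", then
# read local sensitivities off the derivative of the GP posterior mean.
import numpy as np

def model(x):                              # x = (failure rate lam, repair rate mu)
    lam, mu = x
    return mu / (lam + mu)                 # stand-in for an expensive availability code

rng = np.random.default_rng(0)
X = rng.uniform([0.1, 0.5], [1.0, 3.0], size=(25, 2))     # design: 25 model runs
y = np.array([model(x) for x in X])

ell, sf, jitter = np.array([0.4, 1.0]), 0.3, 1e-6          # fixed kernel hyperparameters

def k(A, B):                                # squared-exponential kernel
    d = (A[:, None, :] - B[None, :, :]) / ell
    return sf**2 * np.exp(-0.5 * np.sum(d**2, axis=-1))

K = k(X, X) + jitter * np.eye(len(X))
alpha = np.linalg.solve(K, y - y.mean())

def emulator_mean(xs):
    return y.mean() + k(xs, X) @ alpha

def sensitivity(x0, h=1e-4):                # finite-difference gradient of the emulator mean
    g = np.zeros(len(x0))
    for i in range(len(x0)):
        e = np.zeros(len(x0)); e[i] = h
        g[i] = (emulator_mean(np.atleast_2d(x0 + e))
                - emulator_mean(np.atleast_2d(x0 - e)))[0] / (2 * h)
    return g

x0 = np.array([0.5, 1.5])
print("emulated  dA/d(lam), dA/d(mu):", sensitivity(x0))
print("exact     dA/d(lam), dA/d(mu):", np.array([-x0[1], x0[0]]) / (x0[0] + x0[1])**2)
```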

Silvia Deandrea, Eva Negri and Fabrizio Ruggeri

Using Bayesian informative priors for decision making: the case of the risk/benefit assessment of benzodiazepines and antiepileptics prescription in older people

Background: Accidents are the fifth leading cause of death in adults at least 65 years old, and falls cause two thirds of accidental deaths. Falls are also very common, with about 30% of community-dwelling older adults falling every year. Drugs are both a powerful way to treat various diseases and a risk factor for falls. When a drug is prescribed for an older person, a risk/benefit assessment should be carefully carried out.
Methods: Observational published studies about benzodiazepines and antiepileptics as risk factors for falls in community-dwelling older people were retrieved. A questionnaire to elicit beliefs about risk factors for falls in older people was administered to a sample of geriatricians and general practitioners. The elicited opinions were translated into different prior distributions and included in a fully Bayesian meta-analysis of prospective studies.
Results: There were 8 studies about benzodiazepines and only 4 studies about antiepileptics. When considering the information provided by published literature, the association between benzodiazepines and falls was significant; a stronger association was detected between antiepileptics and falls, but with great variability across studies and a borderline significance. Integrating clinicians' opinion in the meta-analysis confirmed the role of benzodiazepines as a risk factor for falls and strengthened the association between antiepileptics and falls.
Conclusions: Antiepileptics are associated with an increased risk of falls. The literature evidence was scanty and the use of a valid elicitation procedure for integrating experts' opinion in meta-analysis provided external and useful information for decision making.
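A hedged sketch of the general mechanics only (a fixed-effect normal approximation on the log odds-ratio scale, not the study's fully Bayesian random-effects model, and with invented numbers in place of the real studies and the elicited prior):

```python
# Toy Bayesian meta-analysis on the log odds-ratio (logOR) scale. The prior
# stands in for an elicited clinicians' prior; all numbers are invented.
import numpy as np

log_or = np.array([0.30, 0.45, 0.10, 0.55])   # hypothetical study estimates of logOR
se     = np.array([0.20, 0.25, 0.30, 0.22])   # their standard errors

prior_mean, prior_sd = 0.40, 0.30             # elicited informative prior on logOR

# Normal-normal conjugate update: precisions add, means are precision-weighted.
post_prec = 1.0 / prior_sd**2 + np.sum(1.0 / se**2)
post_mean = (prior_mean / prior_sd**2 + np.sum(log_or / se**2)) / post_prec
post_sd = np.sqrt(1.0 / post_prec)

lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
print(f"posterior odds ratio {np.exp(post_mean):.2f} "
      f"(95% credible interval {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```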

Philippe Delquié and Alessandra Cillo

Portfolio Selection with Enhanced Behavioral Features

We present results of stock portfolio optimization using a new risk measure, so-called M-R, which we introduced in previous work (Cillo & Delquié 2010). The risk measure was derived from a simple behavioral assumption that preferences depend on multiple reference points, and it was shown to fulfill a number of normative properties. The portfolios resulting from M-R present several desirable features: they tend to be more diversified and retain more positive skewness in the returns than the Mean-Variance portfolios. In some cases, we observe Restricted Second Order Stochastic Dominance of the distribution of M-R portfolio returns over the distribution of Mean-Variance returns. Finally, unlike the classic model, our risk measure allows thresholds in investment decisions, that is, the investor may decline to hold an efficient portfolio if its risk/reward profile is insufficiently favorable. In sum, the M-R model can help construct diversification strategies and portfolios of projects or securities that may be more to the taste of decision makers.

Josep Freixas and Montserrat Pons

Structural Importance Measures in Reliability Systems and Power Indices for Simple Games

This work considers several structural measures of importance for components in a reliability system and links them with power indices for simple games and power indices for games with several levels of approval. All these measures are founded on probability arguments. Indeed, we prove that each of them is the probability of a certain event and give different interpretations of them.

In general, it is computationally difficult to calculate these measures, even for systems with a small number of components. Thus, we propose a mathematical tool, based on generating functions, to compute them in an efficient way for a reasonable number of components. We provide examples taken from different real-world fields, among others, social sciences and reliability.
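As a hedged illustration of the generating-function idea in the simplest setting (a weighted majority game with one level of approval, computing the normalized Banzhaf index), the sketch below counts swings by polynomial multiplication; the authors' tool is more general.

```python
# Banzhaf index of a weighted majority game [q; w_1,...,w_n] via generating
# functions: for player i, expand prod_{j != i} (1 + x^{w_j}); the coefficient
# of x^s counts coalitions without i of total weight s, and i is a swing voter
# exactly when q - w_i <= s <= q - 1.
def banzhaf(quota, weights):
    swings = []
    for i, wi in enumerate(weights):
        coeff = [0] * (sum(weights) - wi + 1)   # coeff[s] = #coalitions (without i) of weight s
        coeff[0] = 1
        for j, wj in enumerate(weights):
            if j == i:
                continue
            for s in range(len(coeff) - 1, wj - 1, -1):   # in-place polynomial product
                coeff[s] += coeff[s - wj]
        swings.append(sum(coeff[max(0, quota - wi):quota]))
    total = sum(swings)
    return [s / total for s in swings]          # normalized Banzhaf index

print(banzhaf(51, [40, 30, 20, 10]))            # -> [5/12, 3/12, 3/12, 1/12]
```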

Simon French

Expert Judgement, Meta-Analysis and Participatory Risk Analysis

Some twenty-five years ago, I distinguished three contexts in which one might wish to combine expert judgements of uncertainty: the expert problem, the group decision problem and the textbook problem. Over the intervening years much has been written on the first two, both of which focus on a single decision context, but little on the third, though the closely related field of meta-analysis has developed considerably. The textbook problem relates to how one should draw expert judgements into a decision analysis when those judgements were made originally in other contexts. However, as societal decision making has become more open, as stakeholders have become involved in framing and deciding on policy, and as the imperatives for evidence-based decision making have grown, the need to address the textbook problem has become more apparent. This is particularly the case in societal risk management. Further, the growth of the Web and the ease with which we may all access past reports and studies have exacerbated our need for coherent methodologies to combine and use expert judgement "out of context". Put simply, we have meta-analyses for data; we need them for judgements too.

Yigal Gerchak and Yaniv Mordecai

Strategy-Proof Method for Group Assessment of Probability Distributions in Risk Analysis

In complex and unprecedented settings, it is often difficult and, arguably, undesirable to assess the outcome of uncertain events based on objective or statistical data, and experts' assessments are used. Strategic bias is the conscious consequence of personal or local cost/benefit considerations, and arises in formally collaborative but, in reality, not quite cooperative assessment processes. It clearly arises in risk assessment processes, especially when assessors are also stakeholders. It has been recognized and discussed in the Social Choice and Decision Analysis literatures, but not in the context of risk assessment. Moreover, common assessment combination methods, such as averaging and Bayesian inference, are vulnerable to strategic bias.

We study the characteristics of strategic bias in risk assessment, and provide a strategy-proof assessment combination mechanism called MEDAS - Median Distribution Assessment Scheme - based on the Median Voter scheme. The mechanism is useful in collaborative assessment of general continuous and discrete probability distributions and can be applied in risk analysis.
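The abstract does not spell out the MEDAS mechanism; as a hedged sketch of the underlying median-voter idea only, the code below combines experts' assessed CDFs by taking their pointwise median on a common grid, a combination that a single misreporting expert cannot pull towards an extreme. The grid, the expert CDFs and the use of a plain pointwise median are all assumptions.

```python
# Pointwise-median combination of expert CDFs (illustrative only; the actual
# MEDAS scheme may differ). Each expert reports a CDF on a common grid.
import numpy as np
from math import erf

def normal_cdf(z):
    return 0.5 * (1.0 + erf(z / 2**0.5))

x = np.linspace(0.0, 10.0, 201)                  # grid for the uncertain quantity
experts = np.vstack([                            # three hypothetical assessed CDFs
    [normal_cdf((xi - 4.0) / 1.0) for xi in x],
    [normal_cdf((xi - 5.0) / 2.0) for xi in x],
    [normal_cdf((xi - 6.5) / 1.5) for xi in x],
])

combined = np.median(experts, axis=0)            # pointwise median is again a CDF
print("combined median estimate ~", round(float(x[np.searchsorted(combined, 0.5)]), 2))
```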

Ahmad Izhar

Optimality Conditions and Duality in Nondifferentiable Multiobjective Programming

A nondifferentiable multiobjective problem is considered. Fritz John and Kuhn-Tucker type necessary and sufficient conditions are derived for a weakly efficient solution. Kuhn-Tucker type necessary conditions are also obtained for a properly efficient solution. Weak and strong duality theorems are established for a Mond-Weir type dual. Moreover, for a converse duality theorem we discuss a special case of the nondifferentiable multiobjective problem, in which subgradients can be computed explicitly.
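For orientation only (the smooth, differentiable case; the talk itself works with subgradients), a Mond-Weir type dual of the multiobjective program $\min\, f(x) = (f_1(x),\dots,f_p(x))$ subject to $g(x) \le 0$ is commonly written as

$\max\, f(u)$ subject to $\sum_{i=1}^{p} \lambda_i \nabla f_i(u) + \sum_{j=1}^{m} y_j \nabla g_j(u) = 0$, $\; y^{\top} g(u) \ge 0$, $\; y \ge 0$, $\; \lambda \ge 0$, $\; \lambda^{\top} e = 1$,

with weak and strong duality relating its efficient points to those of the primal.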

Francesca Ieva, Alessandra Guglielmi, Anna Maria Paganoni and Fabrizio Ruggeri

Provider Profiling for Supporting Decision Makers in Coronary Patient Care

Provider profiling is the process of evaluating the performance of hospitals, doctors, and other medical practitioners in order to increase the quality of medical care. This work reports the analysis, through a semiparametric Bayesian model, of the MOMI2 survey (MOnth MOnitoring Myocardial Infarction in Milan); the data concern patients with a STEMI (ST-Elevation Myocardial Infarction) diagnosis admitted to one of the hospitals belonging to the Urban Area Network of Milano. The main aim of the work is to point out process indicators to be used in health-care evaluation to support decisions of the clinical and organizational governance.

Serguei Kaniovski and Alexander Zaigraev

Exact Bounds on the Probability of at Least k Successes in n Exchangeable Bernoulli Trials as a Function of Correlation Coefficients

We compute the minimum and maximum of the probability of k or more successes in n exchangeable Bernoulli trials as a function of the correlation coefficients. Finding the minimum and maximum requires solving linear programming problems with constraints imposed by the non-negativity of the joint distribution. We show that the maximum can be lower than certainty (no certain success), whereas the minimum can be higher than zero (positive residual risk).
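As a hedged sketch of the computation in its simplest form (a single common pairwise correlation and only the first two moment constraints; the authors' constraint set may be richer), the two linear programs can be set up as follows.

```python
# Min/max of P(K >= k) for n exchangeable Bernoulli trials with marginal p and
# common pairwise correlation c, optimizing over q_j = P(exactly j successes);
# for exchangeable trials, non-negativity of the joint law reduces to q_j >= 0.
import numpy as np
from scipy.optimize import linprog

def bounds(n, k, p, c):
    j = np.arange(n + 1)
    A_eq = np.vstack([np.ones(n + 1),      # probabilities sum to one
                      j,                    # E[K] = n p
                      j * (j - 1)])         # E[K(K-1)] = n(n-1)(p^2 + c p(1-p))
    b_eq = [1.0, n * p, n * (n - 1) * (p**2 + c * p * (1 - p))]
    obj = (j >= k).astype(float)            # P(K >= k) = sum_{j >= k} q_j
    lo = linprog(obj,  A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * (n + 1))
    hi = linprog(-obj, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * (n + 1))
    return lo.fun, -hi.fun

print(bounds(n=5, k=3, p=0.6, c=0.3))       # (minimum, maximum) of P(K >= 3)
```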

The above probability plays an important role in the reliability of k-out-of-n systems. The reliability of a system is defined as the probability that the system will function. Factors that may lead to dependent component performance include the influence of a common operating environment, and the fact that failure of one component may increase the strain on the remaining components, leading to failure cascades.

The same probability finds an application in decision theory. The literature on Condorcet's Jury Theorem studies the expertise of a group of experts. The experts cast their vote in favor of one of two alternatives. Individual votes are aggregated into a collective judgment using a voting rule. Stochastic independence cannot be reconciled with commonalities in experts' preferences, information asymmetries and strategic behavior.

Krzysztof Kolowrocki and Joanna Soszynska

Integrated Safety and Reliability Decision Support System

The Integrated Safety and Reliability Decision Support System (IS&RDSS) presented in Kolowrocki and Soszynska (2011) comprises methods for modeling the operation processes of complex technical systems; methods for identifying the unknown parameters of their operation, reliability, availability and safety models; methods for evaluating and predicting their reliability, availability and safety; methods for improving their reliability, availability and safety; and methods for optimizing their operation, reliability, availability, safety and cost.

The procedure for using the IS&RDSS is presented in the form of a detailed and clear scheme-algorithm.

Kolowrocki K., Soszynska J.: Reliability and Safety of Complex Technical Systems and Processes: Modeling - Identification - Prediction - Optimization. Springer, 2011 (to appear).

Miguel Lejeune

Game Theoretical Approach for Reliable Enhanced Indexation

Enhanced indexation is a structured investment approach which combines passive and active financial management techniques. The objective is to construct a fund that replicates the behavior of a benchmark, while providing an excess return. We propose a reliable enhanced indexation model whose goal is to maximize the excess return that can be attained with high probability, while ensuring that the relative risk does not exceed a given threshold. The relative risk is measured with the coherent semideviation risk functional. The asset returns are modelled as random variables characterized by a joint discrete probability distribution. To hedge against the estimation risk, we consider that only limited information about the probability distribution of the index return is available, and we derive an ellipsoidal distributional set for the random index return. We formulate the enhanced indexation problem as a minimax game theoretical model, in which the inner optimization problem identifies the worst possible outcome within the distributional set. In order to avoid model specification issues, we compute the variance of the excess return with an approach that does not need the estimation of a (positive-definite) variance-covariance matrix. Finally, we show that the game theoretical model can be recast as a convex programming problem, and discuss the results of numerical experiments.
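A hedged sketch of the general flavour only (not the paper's model, which uses a coherent semideviation risk functional and a probabilistic excess-return guarantee): with an ellipsoidal set for the mean excess return, the worst case has the familiar closed form $\hat\mu^{\top}(w-w_b) - \kappa\,\lVert\Omega^{1/2}(w-w_b)\rVert$, which yields a small convex program. All data below are synthetic and the cvxpy modelling layer is assumed to be available.

```python
# Generic robust "enhanced indexation" sketch (NOT the paper's exact model):
# maximize the worst-case mean excess return over an ellipsoid for the mean,
# subject to a tracking-error budget. Synthetic data.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n = 8
w_bench = np.full(n, 1.0 / n)                    # benchmark (index) weights
mu_hat = rng.normal(0.08, 0.03, n)               # estimated mean returns
A = rng.normal(0, 0.1, (n, n))
Sigma = A @ A.T + 0.05 * np.eye(n)               # return covariance
L = np.linalg.cholesky(Sigma)
Omega_sqrt = np.linalg.cholesky(Sigma / 252)     # assumed shape of the mean-estimate ellipsoid
kappa, te_max = 1.5, 0.05                        # robustness level, tracking-error budget

w = cp.Variable(n)
excess = w - w_bench
worst_case_mean = mu_hat @ excess - kappa * cp.norm(Omega_sqrt.T @ excess, 2)
constraints = [cp.sum(w) == 1, w >= 0, cp.norm(L.T @ excess, 2) <= te_max]
cp.Problem(cp.Maximize(worst_case_mean), constraints).solve()
print("worst-case expected excess return:", float(worst_case_mean.value))
print("weights:", np.round(w.value, 3))
```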

Marco LiCalzi and Oktay Surucu

The power of diversity over large solution spaces

We consider a team of agents with limited problem-solving ability facing a disjunctive task over a large solution space. We provide sufficient conditions for the following four statements. First, two heads are better than one: a team of two agents will solve the problem even if neither agent alone would be able to. Second, teaming up does not guarantee success: if the agents are not sufficiently creative, even a team of arbitrary size may fail to solve the problem. Third, "defendit numerus": when the agent's problem-solving ability is adversely affected by the complexity of the solution space, the solution of the problem requires only a mild increase in the size of the team. Fourth, groupthink impairs the power of diversity: if agents' abilities are positively correlated, a larger team is necessary to solve the problem.

Marco Maggis and Marco Frittelli

Complete Duality for Quasiconvex Dynamic Risk Measures

Quasiconvex analysis has important applications in several optimization problems in science, economics and finance. Recently, a complete duality in Decision Theory involving quasiconvex real-valued functions has been proposed in [1]. Important contributions to finance have been made by Maccheroni et al. [2] and Kupper et al. [3] in the theory of Risk Measures. During a financial crisis, a lack of liquidity in the market may make it difficult to cover losses, and consequently cash additivity must be dropped. Giving up the cash additivity of risk measures, convexity and quasiconvexity are no longer equivalent: the authors in [2] argue that, from a theoretical point of view, quasiconvexity gives a better explanation and description of the diversification principle.
In this talk we consider quasiconvex maps and analyze their dual representation, comparing two different possible approaches. The aim is to present a conditional version of the dual representation.
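As a hedged reminder of the shape such dual representations take (a static, monotone, quasiconvex version of Penot-Volle type; sign conventions differ across the papers cited below, and the talk's object is the conditional version):

$\pi(X) = \sup_{Q} R\big(E_Q[X], Q\big), \qquad R(t, Q) = \inf\{\pi(\xi) : E_Q[\xi] \ge t\}.$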

Cerreia-Vioglio, S., Maccheroni, F., Marinacci, M. and Montrucchio, L. (2009) Complete monotone quasiconcave duality. The Carlo Alberto Working Papers No. 80.

Cerreia-Vioglio, S., Maccheroni, F., Marinacci, M. and Montrucchio, L. (2009) Risk measures: rationality and diversification. To appear in Mathematical Finance.

Drapeau, S. and Kupper, M. (2010) Risk Preferences and their Robust Representation, Preprint.

El Karoui, N. and Ravanelli, C. (2009) Cash sub-additive risk measures and interest rate ambiguity, Mathematical Finance, 19(4) 561-590.

Frittelli, M. and Maggis, M. (2009) Dual representation of quasiconvex conditional maps, preprint.


Massimo Marinacci and Itzhak Gilboa

Ambiguity and the Bayesian Paradigm

This is a survey of some of the recent decision-theoretic literature involving beliefs that cannot be quantified by a Bayesian prior. We discuss historical, philosophical, and axiomatic foundations of the Bayesian model, as well as of several alternative models recently proposed. The definition and comparison of ambiguity aversion and the updating of non-Bayesian beliefs are briefly discussed.

Alfonso Mateos, Antonio Jimenez and Hector J. Pichardo

Risk Assessment on the National Railway System of Spain

Fatal train accidents are not frequent, but they attract a high level of public attention. The appraisal of safety measures requires estimates of the numbers of casualties that they can be expected to save. These estimates can be obtained by projecting trends in the frequencies and consequences of fatal train accidents. This paper is aimed at estimating the likely scale of future fatalities in train accidents, and the benefits of possible safety measures, in the national railway system of Spain, using data over the 21-year period 1988-2009. Accidents are presumed to occur randomly at a rate of λ per train-kilometer. The number of fatalities in an accident is also random, with a probability distribution of mean μ. The mean number of fatalities per train-kilometer is then λμ, which can be considered the primary measure of the risk of accidental death. Either or both of λ and μ may change over time, and there may be different λ's and μ's for different types of accident.

Parameters λ and μ are not directly observable, but they can be estimated from data on past accidents. Estimating the fatality risk λμ consequently involves three steps: first, estimating mean accident rates λ, and trends in λ, from data on accident frequencies; then, estimating mean accident consequences μ; and, finally, multiplying the two estimates to give the fatality risk λμ.
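A hedged illustration of those three steps with invented figures (not the Spanish data):

```python
# Toy illustration of the three estimation steps (all figures invented):
# lambda = fatal accidents per train-km, mu = mean fatalities per accident,
# fatality risk = lambda * mu fatalities per train-km.
accidents = 42            # fatal accidents observed over the study period
train_km = 3.5e9          # train-kilometres run over the same period
fatalities = 180          # total fatalities in those accidents

lam = accidents / train_km        # step 1: mean accident rate per train-km
mu = fatalities / accidents       # step 2: mean consequences per accident
risk = lam * mu                   # step 3: fatalities per train-km

print(f"lambda = {lam:.2e} accidents per train-km")
print(f"mu     = {mu:.2f} fatalities per accident")
print(f"risk   = {risk:.2e} fatalities per train-km"
      f" ({risk * 1e9:.1f} per billion train-km)")
```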

Sophie Mercier

Bivariate Subordinators as Wear Processes in Reliability, with Application to Preventive Maintenance

Over the past few decades, the development of on-line system monitoring has made it possible to consider enhanced reliability models based on the effective measurement of a system's deterioration level. In the common case of increasing deterioration, classical models are (univariate) compound Poisson and gamma processes, both of which are subordinators (non-decreasing Lévy processes). Based on such univariate models, many different preventive maintenance policies have already been proposed, according to the available observation of the system (continuous monitoring or point-wise inspections). A system's deterioration cannot, however, always be reduced to a single indicator, and multivariate increasing models may be necessary. This point has not yet been much studied in the reliability literature.

In this talk, we shall show that bivariate subordinators are well adapted to the modeling of bivariate deterioration, and we shall present first results on the optimization of associated preventive maintenance policies. An application to railway track data from the SNCF (the French national railway company) will be provided.

Roberto Monte and Sara Massotti

Inefficient Stock Prices and Stock Market Anomalies under Rational Expectations

This paper presents a stock pricing model with asymmetrically informed risk-averse investors, extending Wang's celebrated model (1993) (see also Campbell & Kyle (1993)) by accounting for correlation between private and public information changes, liquidity investors' overreaction (underreaction) to changes in public information, and the impact of transaction costs on rational investors' trading strategies. Wang focuses on the determination and properties of a single equilibrium candidate of the model, characterized by a stock price which is ex ante informationally efficient in the semi-strong form. The existence of other candidates is not addressed. By contrast, in a Bayesian-Nash perspective, we discover multiple equilibrium candidates, among which Wang's equilibrium, many of which exhibit informationally inefficient stock prices and Pareto dominate Wang's candidate under high market noise volatility or high subjective risk aversion. In addition, we show how the overreaction (underreaction) to dividend changes and the transaction costs allow us to address several empirically documented stock market anomalies (e.g. price momentum, reversal, post-earnings announcement drift, excess return volatility) with respect to the perfectly competitive rational expectations benchmark.

Robert Nau

Imprecise Probabilities in Non-cooperative Games

Game-theoretic solution concepts such as Nash equilibrium and Bayesian equilibrium are commonly used to predict or prescribe strategic behavior in terms of precise probability distributions over outcomes. However, there are many potential sources of imprecision in beliefs about the outcome of a game: incomplete knowledge of payoff functions, non-uniqueness of equilibria, heterogeneity of prior probabilities, and distortions of revealed beliefs due to risk aversion, among others. This paper presents a unified approach for dealing with these issues, in which the general solution of a game is a convex set of correlated equilibria expressed in terms of risk-neutral probabilities.
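The paper's solution concept is a convex set of correlated equilibria in risk-neutral probabilities; as a hedged sketch of just the correlated-equilibrium ingredient, the code below finds a welfare-maximizing correlated equilibrium of a 2x2 game (a Chicken-type game with invented payoffs) by linear programming.

```python
# Welfare-maximizing correlated equilibrium of a bimatrix game via LP (sketch
# of one ingredient only). Game: Chicken with invented payoffs; actions (C, D).
import numpy as np
from scipy.optimize import linprog

U1 = np.array([[6, 2],        # row player's payoffs
               [7, 0]])
U2 = U1.T                     # symmetric game: column player's payoffs
n1, n2 = U1.shape

def var(a, b):                # index of the probability of profile (a, b)
    return a * n2 + b

A_ub, b_ub = [], []
for a in range(n1):           # row player: recommended a beats any deviation a_dev
    for a_dev in range(n1):
        if a_dev == a:
            continue
        row = np.zeros(n1 * n2)
        for b in range(n2):
            row[var(a, b)] = U1[a_dev, b] - U1[a, b]
        A_ub.append(row); b_ub.append(0.0)
for b in range(n2):           # column player: recommended b beats any deviation b_dev
    for b_dev in range(n2):
        if b_dev == b:
            continue
        row = np.zeros(n1 * n2)
        for a in range(n1):
            row[var(a, b)] = U2[a, b_dev] - U2[a, b]
        A_ub.append(row); b_ub.append(0.0)

res = linprog(-(U1 + U2).flatten(),           # maximize total payoff
              A_ub=np.array(A_ub), b_ub=b_ub,
              A_eq=np.ones((1, n1 * n2)), b_eq=[1.0],
              bounds=[(0, 1)] * (n1 * n2))
print("CE over profiles (C,C),(C,D),(D,C),(D,D):", np.round(res.x, 3))
```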

Tapan K. Nayak and Haojin Zhou

Equivariant Statistical Prediction: A General Framework and Some New Results

We consider a general statistical decision problem involving a random observable X, an unobservable Y, a family of joint distributions of (X, Y), indexed by a parameter θ (possibly vector-valued), and a loss function that depends on the decision d and y and possibly also on x and θ. This setup covers prediction, estimation and many nonstandard inference problems. We formalize the equivariance concepts in our context by describing the formal structure of the problem appropriately and defining loss invariance suitably under a general transformation group that is consistent with the functional and formal equivariance principles. Such a group is more complex than the ones which arise in standard decision problems. We present a characterization of equivariant decision rules in terms of a maximal equivariant and a method for finding a best equivariant predictor. We explore the possibility of simplifying the problem by integrating out the unobservable (Y) in the loss function. This approach works out more conveniently in many applications. We also discuss connections between equivariance and risk unbiasedness. Applications of our theoretical results are illustrated with several examples.

Monica Oliveira, Carlos Bana e Costa and Mafalda Figueiredo

Improving Probability Impact Diagram Matrices using Multiple Criteria Decision Analysis Tools

Risk matrices have been recommended by international organizations and are widely used as a framework for practical risk analysis by many enterprises and risk consultants. Nonetheless, available studies indicate not only that the use of risk matrices might generate inconsistent risk criticality ratings but also that risk matrices disrespect important theoretical properties. This study investigates how multiple criteria decision analysis (MCDA) tools can be used to improve the design and deployment of risk matrices in the context of prioritizing risks and risk reductions. Using MCDA, we propose a modeling approach based on: (1) a multicriteria additive value model applying the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) both to measure risk impacts and to build subjective probabilities; (2) the transformation of a risk matrix into a Probability Impact Graph (PIG) that uses probabilities and multicriteria value scores; (3) the use of multicriteria and non-compensatory classification procedures to classify risks from the PIG by severity; and (4) the use of multicriteria resource allocation models to derive the most effective set of interventions to reduce risk, taking into account cost and other constraints. The proposed modeling approach is illustrated with data from a real case study developed at ALSTOM Power.

Shariefuddin Pirzada and Muhammad Ali Khan

Clusterability of Portfolio Signed Graphs

Signed graphs have been used to model positive and negative correlations between the securities in a portfolio. The idea is to represent the securities as nodes and the correlations as signed edges, and to study the structural balance of the resulting signed graph. Harary, Lim and Wunsch [IMA Journal of Management Mathematics 13 (2002), 201-210] argued that portfolios leading to balanced signed graphs with at least one negative edge are more predictable than the ones characterized by unbalanced signed graphs. However, in practice only a few signed graphs are balanced and it is unlikely that a complex real-life portfolio is balanced. We therefore introduce a relaxed notion of structural balance for portfolio signed graphs, namely clusterability. We show that a clusterable portfolio is more predictable than a non-clusterable portfolio and that a clusterable portfolio with at least one negative edge provides a good hedging mechanism. It is noteworthy that the portfolio analysis presented in this paper is accessible to all practitioners due to the availability of free tool support with detailed documentation. This makes the implementation easier for analysts with limited or no background in graph theory.
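As a hedged sketch of a clusterability test (cluster by the components of the positive-correlation subgraph and check that no negative edge stays inside a component, a Davis-type criterion), with an invented five-security portfolio:

```python
# Clusterability test for a portfolio signed graph: take the connected
# components of the positive-edge subgraph as candidate clusters; the graph is
# clusterable iff no negative edge joins two securities in the same component.
def clusters_or_none(n, signed_edges):
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]       # path compression
            v = parent[v]
        return v
    for u, v, sign in signed_edges:             # union positively correlated securities
        if sign > 0:
            parent[find(u)] = find(v)
    for u, v, sign in signed_edges:             # negative edge inside a component -> not clusterable
        if sign < 0 and find(u) == find(v):
            return None
    groups = {}
    for v in range(n):
        groups.setdefault(find(v), []).append(v)
    return list(groups.values())

# Invented 5-security portfolio: +1 = positive correlation, -1 = negative
edges = [(0, 1, +1), (1, 2, +1), (3, 4, +1), (0, 3, -1), (2, 4, -1)]
print(clusters_or_none(5, edges))               # -> [[0, 1, 2], [3, 4]]  (clusterable)
```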

Nicholas G. Polson and Morten Sorensen

A Simulation-based Approach to Stochastic Dynamic Programming

We develop a simulation-based approach to stochastic dynamic programming. To solve the Bellman equation, we provide Monte Carlo estimates of the associated Q-values. Our method is scalable to high dimensions and applies to both continuous and discrete state and decision spaces, whilst avoiding the discretization errors that plague traditional methods. We provide a geometric convergence rate. We illustrate our methodology with a dynamic stochastic investment problem.
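As a hedged sketch of the general idea only (Monte Carlo estimates of Q-values inside value iteration on a deliberately tiny discretized problem; not the authors' algorithm, state representation or convergence scheme):

```python
# Monte Carlo Q-value iteration for a toy investment problem. State = wealth
# level (discretized), action = fraction held in a risky asset; transitions are
# sampled rather than integrated exactly. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)
wealth_grid = np.linspace(1.0, 10.0, 19)          # discretized wealth states
actions = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # fraction in the risky asset
gamma, n_samples, n_iters = 0.95, 200, 60
rf, mu, sigma = 0.01, 0.05, 0.20                  # risk-free and risky-return parameters

def step(w, a, z):
    """Next wealth for wealth w, risky fraction a and standard-normal shock z."""
    growth = (1 - a) * (1 + rf) + a * (1 + mu + sigma * z)
    return np.clip(w * growth, wealth_grid[0], wealth_grid[-1])

def nearest(w):
    return np.abs(wealth_grid[:, None] - w).argmin(axis=0)

Q = np.zeros((len(wealth_grid), len(actions)))
for _ in range(n_iters):
    V = Q.max(axis=1)                             # current value estimate per state
    for si, w in enumerate(wealth_grid):
        for ai, a in enumerate(actions):
            z = rng.standard_normal(n_samples)
            w_next = step(w, a, z)
            reward = np.log(w_next) - np.log(w)   # log-utility of wealth growth
            Q[si, ai] = np.mean(reward + gamma * V[nearest(w_next)])

print("greedy risky fraction per wealth state:", actions[Q.argmax(axis=1)])
```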

Jesus Rios, David Rios Insua and Juan Carlos Sevillano

Managing Adversarial Risks: The Somali Pirates Case

We have recently introduced a framework for adversarial risk analysis (ARA), aimed at the analysis of decision making when there are intelligent opponents and uncertain outcomes. We present here how such a framework may help the owner of a cargo ship in managing risks from piracy off the coast of Somalia. The ship's owner will proactively decide on a defensive strategy to reduce piracy risks, ranging from different levels of deployed armed security to sailing through alternative, much longer routes avoiding the area. The Pirates will respond to the defender's move by launching (or not) an attack with the intention of taking over the ship and asking for a ransom. If the Pirates' operation is successful, the ship's owner will have to decide whether to pay the ransom or even to send armed forces to release the ship. We illustrate how the sequential defend-attack-defend model can be used to formulate our supported decision maker's problem. We show the kind of solution proposed by our ARA methodology and computations. Emphasis will be put on explaining how we can model the Pirates' thinking in order to anticipate their behaviour, and how this leads to a predictive probability distribution, from the defender's perspective, over what the Pirates may do. We also compare our solution to those obtained using a standard game-theoretic approach.
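A hedged, heavily simplified numerical sketch of the defend-attack-defend fold-back (the discrete choice sets, probabilities and losses below are invented, not the paper's assessments, and the real analysis would model the Pirates' decision problem to derive the predictive probabilities):

```python
# Toy defend-attack-defend fold-back. Stage 1: the owner picks a defence;
# stage 2: predictive probabilities of an attack and of a successful hijack;
# stage 3: the owner reacts optimally to a hijack. All numbers are invented.
defences = {                  # defence: (cost, P(attack | d1), P(hijack | attack, d1))
    "no protection": (0.0, 0.40, 0.60),
    "armed guards":  (0.3, 0.25, 0.10),
    "longer route":  (0.8, 0.02, 0.60),
}
reactions = {                 # stage-3 reaction: expected loss if the ship is hijacked
    "pay ransom":  2.0,
    "send forces": 2.5,
}

best_reaction_loss = min(reactions.values())      # owner reacts optimally at stage 3

def expected_loss(cost, p_attack, p_hijack):
    return cost + p_attack * p_hijack * best_reaction_loss

scores = {d: expected_loss(*v) for d, v in defences.items()}
print({d: round(s, 3) for d, s in scores.items()})
print("recommended defence:", min(scores, key=scores.get))
```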

David Rios Insua

Bayesian Analysis for semi-Markov Processes with Applications to Reliability and Maintenance

We study Bayesian inference, forecasting and decision making procedures with semi-Markov processes. We consider both short-term and long-term forecasting. We then consider reliability, availability and maintenance applications, using flowgraph models to deal with generalized phase-type absorbing times, and illustrate them with a HW reliability problem.

Renata Rotondi

Bayesian Inference in Seismic Risk Analysis: Nonparametric Estimation of Earthquake Recurrence Time Density

The knowledge of a physical model able to explain in its entirety the generation of earthquakes at the various time-space scales involved in the process is still, and will be for a long time, a challenging goal. Consequently, stochastic model development is grounded on the combination of widely shared physical hypotheses and powerful statistical tools. Among the former is the idea that the time elapsed since the last strong shock affects the occurrence of a future event, that is, that after a large earthquake the stress accumulation process restarts, so that the times between consecutive large seismic events can be considered as realizations of independent, identically distributed random variables. Various probability distributions $F$ have been proposed in the literature for the inter-event times, but the results are not completely satisfactory, partly because the data, generally sparse and irregular, are difficult to fit with parametric models. The proposed solution consists in assuming that $F$ is a random distribution modelled by a mixture of Polya tree processes which, under mild assumptions, assigns probability one to the space of continuous distributions. It relies on a binary tree partitioning of the domain $R^{+}$ obtained through the quantiles of a generalized gamma distribution, a family that properly includes the distributions most used for the recurrence time: gamma, Weibull and lognormal. The parameters of this distribution are themselves gamma-distributed random variables, and the hyperparameters are chosen by fusing the geological and tectonic information provided by the Italian database of seismogenic sources (DISS). The definition of this hierarchical model is completed by assigning to each $F(B_{\epsilon})$, where $B_{\epsilon}$ is any set of the partition, a beta distribution with parameters guaranteeing the continuity of $F$. The estimation technique is based on an MCMC sampler using Metropolis-Hastings within Gibbs sampling, and on data simulation.
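A hedged sketch of a single finite-depth Polya tree draw (not the paper's mixture-of-Polya-trees model, which also places priors on the centering parameters): the partition is built from quantiles of a centering distribution, here a lognormal standing in for the generalized gamma, and branch probabilities are Beta with the usual level-dependent parameters.

```python
# One draw of a random distribution F from a finite-depth Polya tree on R+
# (illustrative sketch only; all settings, including the lognormal centering
# distribution, are assumptions).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
depth, c = 6, 1.0
centering = stats.lognorm(s=0.8, scale=20.0)          # stand-in centering distribution (years)

edges = centering.ppf(np.linspace(0.0, 1.0, 2**depth + 1))   # dyadic quantile partition of R+
edges[0], edges[-1] = 0.0, np.inf

leaf_mass = np.ones(2**depth)                         # leaf mass = product of branch probabilities
for level in range(1, depth + 1):
    alpha = c * level**2                              # usual choice giving a continuous F
    splits = rng.beta(alpha, alpha, size=2**(level - 1))
    block = 2**(depth - level)                        # leaves under each child node
    for node, y in enumerate(splits):
        left = 2 * node * block
        leaf_mass[left:left + block] *= y
        leaf_mass[left + block:left + 2 * block] *= 1 - y

tail = leaf_mass[np.searchsorted(edges, 50.0, side="right") - 1:].sum()
print("total mass (should be 1):", leaf_mass.sum())
print("P(recurrence time > 50 yr) under this draw (approx.):", round(float(tail), 3))
```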

The model is applied to the Italian earthquakes with $M_w$ ≥ 5.3 drawn from the CPTI04 catalogue and with epicenters in the seismogenic areas of the DISS database, subdivided into eight tectonically coherent macro-regions. Maps of occurrence probability at different forecasting horizons are evaluated by estimating, for each region, the density of the inter-event time.

Retrospective validation on the events recorded in some decades of the past century is used: (1) to test the model; (2) to support changes to the geological database; and (3) to inspire guidelines for a national protocol for the operational use of long-term forecasts by Civil Protection.

Sujit Sahu

On Assessing the Probability of Exceedance of Secondary Ozone Air Quality Standards

The Clean Air Act requires certain air quality standards to be maintained for important air pollutants such as ozone. These standards set limits to protect public health and also provide protection against decreased visibility and damage to animals, crops, vegetation, and buildings. Specifically, this paper considers an ozone metric called W126, the annual maximum consecutive three-month running total of a weighted sum of hourly concentrations observed between 8AM and 8PM on each day during the high ozone season of May through September. The paper develops a hierarchical Bayesian spatio-temporal model to assess the risk of W126 exceeding the regulatory standards at unmonitored locations inside a vast study region, such as the eastern United States, monitored by a sparse network of air quality measurement stations. The model, auto-regressive in time, combines monitoring data with the output of a relevant numerical model which produces estimates of ozone levels on a grid. Model-based Bayesian spatial interpolation of the W126 metric at several validation sites is shown to be very accurate for the corresponding observed values. These methods are used to obtain an accurate map of the risk to public health arising from exceedance of the W126 standards. Such a map is shown to be of great benefit to policy makers in environmental decision making and risk management.
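As a hedged sketch of the bookkeeping behind the metric itself (the sigmoidal weighting shown is the one commonly quoted for W126; the paper's contribution is the spatio-temporal model, not this computation):

```python
# Compute the W126 ozone metric from one year of hourly concentrations (ppm).
# Commonly quoted weight: w(C) = 1 / (1 + 4403 * exp(-126 * C)), C in ppm.
import numpy as np

def w126(hourly_ppm, hours, months):
    """hourly_ppm, hours (0-23), months (1-12): aligned 1-D arrays for one year."""
    hourly_ppm = np.asarray(hourly_ppm, float)
    daylight = (hours >= 8) & (hours < 20)                  # 8AM-8PM window
    weights = 1.0 / (1.0 + 4403.0 * np.exp(-126.0 * hourly_ppm))
    monthly = np.array([np.sum((hourly_ppm * weights)[daylight & (months == m)])
                        for m in range(1, 13)])
    running3 = monthly[:-2] + monthly[1:-1] + monthly[2:]   # consecutive 3-month totals
    return running3[4:7].max()                              # May-Jul, Jun-Aug, Jul-Sep windows

# Tiny synthetic example: a year of hourly data with a summer ozone peak
rng = np.random.default_rng(0)
n = 365 * 24
hours = np.tile(np.arange(24), 365)
months = np.repeat(np.arange(1, 13),
                   [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]).repeat(24)
seasonal = 0.03 + 0.03 * np.exp(-((np.arange(n) / 24 - 200) / 60) ** 2)
ozone = np.clip(seasonal + rng.normal(0, 0.01, n), 0, None)
print(f"W126 = {w126(ozone, hours, months):.1f} ppm-hours")
```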

Michele Scuotto, Pasquale di Tommaso, Massimiliano Giorgio and Alfredo Testa

Operational Availability Evaluation of Complex Systems with Non Exponential Downtimes

The impact of some management/maintenance decisions on the operational availability of a fleet of trams is studied. Each tram of the fleet is treated as a multi-state system. Failure times and inherent repair times of the trams are modeled as stochastically independent random variables. It is assumed that trams awaiting repair are arranged in a queue. Failure times are modeled as exponential random variables, whereas inherent repair times are realistically assumed to be non-exponential. The presence of a queue gives rise to stochastic dependence between the unscheduled downtime due to a failure of a given tram and the fleet state. To account for this form of dependence, the whole fleet of trams is modeled as a multi-state system. The process describing the state of the fleet turns out to be a semi-regenerative process. Thus, the device-of-stages technique and the Monte Carlo simulation method, two widespread approaches used to handle semi-regenerative processes, are adopted to compute the fleet operational availability. Advantages and drawbacks of these two alternative approaches are discussed.
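A hedged Monte Carlo sketch of the basic setting only (one repair channel, exponential failures, lognormal repairs, FIFO queue, two-state trams); the model in the talk is richer (multi-state trams, semi-regenerative analysis, device of stages), and all parameters below are invented.

```python
# Monte Carlo estimate of fleet operational availability with a single repair
# channel: exponential failure times, lognormal (non-exponential) repair times,
# failed trams waiting in a FIFO queue.
import heapq
import numpy as np

def simulate(n_trams=10, lam=0.02, rep_mu=1.0, rep_sigma=0.5,
             horizon=10_000.0, seed=0):
    rng = np.random.default_rng(seed)
    t, up, queue, in_repair = 0.0, n_trams, 0, False
    events = []                                  # (time, kind) with kind in {"fail", "repaired"}
    for _ in range(n_trams):                     # first failure of each operating tram
        heapq.heappush(events, (rng.exponential(1 / lam), "fail"))
    up_area = 0.0
    while events:
        t_next, kind = heapq.heappop(events)
        t_next = min(t_next, horizon)
        up_area += up * (t_next - t)             # integrate number of operational trams
        t = t_next
        if t >= horizon:
            break
        if kind == "fail":
            up -= 1
            if in_repair:
                queue += 1                       # wait for the repair channel
            else:
                in_repair = True
                heapq.heappush(events, (t + rng.lognormal(rep_mu, rep_sigma), "repaired"))
        else:                                    # repair completed: tram returns to service
            up += 1
            heapq.heappush(events, (t + rng.exponential(1 / lam), "fail"))
            if queue > 0:
                queue -= 1
                heapq.heappush(events, (t + rng.lognormal(rep_mu, rep_sigma), "repaired"))
            else:
                in_repair = False
    return up_area / (horizon * n_trams)         # time-average fraction of operational trams

print("estimated operational availability:", round(simulate(), 3))
```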

Nozer Singpurwalla

The Stochastics of Diagnostic and Threat Detection Tests

Tests for medical diagnosis and threat detection pose some interesting issues vis-à-vis the characteristics of their ROC curves. In this talk I will overview such tests and show how notions of partial order (convexity and star-shapedness), the Lorenz curve, and information-theoretic ideas such as entropy and the Kullback-Leibler distance can be fruitfully brought into this arena. The topic poses opportunities for connecting stochastic ordering and information theory. This work is preliminary and in progress, almost surely.

Jim Q. Smith and Lorraine Dodd

On Command and Control Management of Autonomous Agents facing Conflicting Objectives

UK military commanders of different battle groups, who have devolved responsibility from command and control (C2), are trained and expected to act rationally and accountably. In this paper we interpret this demand from a Bayesian perspective: they should act as if they were expected utility maximizers. Among the most stressful environments in which fielded commanders must retain rationality and coherence are those where current tactical objectives conflict with broader campaign objectives. In this talk we assume that, even when C2 has only a limited understanding of the broad shape of her agents' utility functions, she is still often able to identify those environments where different commanders, players in a collaborative game, may not act coherently and will be faced with rationality-eroding "contradictions" - that is, the stress of later learning that they should have acted very differently than they did, with no possibility of recovery. We do this by analysing the different geometries of expected utilities arising when decision makers are assumed to have a utility with two value-independent attributes. The ideas are illustrated and informed by results from various simulated decision scenarios.

Relevant papers:

Dodd, L and Smith, J.Q.(2011) "Developing C2 regulatory agents for modelling and simulation of C2 agility in network enabled forces" (submitted to Journal of Defense Modeling and Simulation)

Dodd, L. and Smith, J.Q. (2010) "Devolving Command Decisions in Complex Operations" CRISM Res. Rep. 10-17 (submitted to J. of the Operational Res. Soc.)

Dodd L., Moffat, J. and Smith, J.Q. (2006) "Discontinuity in decision making when objectives conflict: a military command decision case study", Journal of the Operational Research Society, 57, 643-654

Joanna Soszynska and Krzysztof Kolowrocki

Integrated Safety and Reliability Decision Support System Applications in Maritime and Port Transport

The Integrated Safety and Reliability Decision Support System (IS&RDSS) [1] is demonstrated through its application to the operation, reliability, safety and operation-cost modeling, identification, prediction and optimization of the port oil piping transportation system and the maritime ferry technical transportation system.

The successive steps of the scheme-algorithm proposed in [1] that should be followed, and the way of using the support given in the form of practical instructions and theoretical background included in [2], are clarified by these applications.

[1] Kolowrocki K., Soszynska J.: Integrated Safety and Reliability Decision Support System. Talk at the 2nd Symposium on Games and Decisions in Reliability and Risk - GDRR 2011.

[2] Kolowrocki K., Soszynska J.: Reliability and Safety of Complex Technical Systems and Processes: Modeling - Identification - Prediction - Optimization. Springer, 2011 (to appear).

Fabio L. Spizzichino and Rachele Foschi

The Burn-in Problem and Related Aspects of Ageing and Risk Aversion

The burn-in problem is a classical and very natural decision problem in the field of reliability. The related literature is by now very extensive, and several burn-in models have been analyzed by different authors. The decision whether or not to implement a burn-in procedure is related to the possibility of early failures or, in other words, to aspects of negative ageing in the life-distribution of the units of interest. If a burn-in procedure is justified, then a further decision problem concerns the choice of the optimal duration of the test. In this decision, more detailed aspects of the ageing properties of the units' lifetimes become relevant.

The solution of the problem, of course, also depends on the structure of the associated costs, and it is interesting to analyze how cost structure and ageing interact in determining an optimal solution.

In this talk we will analyze in detail different aspects of this type. We shall see that the assumed structure of costs can influence the probabilistic description of ageing properties.

During the talk we shall also recall, for a fixed utility function $u$, the definition and basic properties of $\alpha_{u}(x)$, the local coefficient of risk aversion (introduced by de Finetti, Arrow, and Pratt). Let $G$ be the survival function of the lifetime of a unit to be tested, with failure rate function $r_G$, and let $\hat{u}:[0,\infty) \to [0,1)$ be the utility function obtained by letting

$\hat{u}(x) = 1 - G(x).$

We shall see in particular how $\alpha_{\hat{u}}$ and $r_G$ interact in determining the optimal solution of the burn-in problem.
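For reference (standard definitions, not results of the talk): the local coefficient of risk aversion is $\alpha_u(x) = -u''(x)/u'(x)$, while the failure rate is $r_G(x) = -\frac{d}{dx}\log G(x)$; since $\hat{u}(x) = 1 - G(x)$, one has $\hat{u}'(x) = -G'(x)$ and hence $r_G(x) = \hat{u}'(x)/(1 - \hat{u}(x))$.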

Andreas Thümmel and Dennis Bergmann

Agent Based Simulation of Market Economy Dynamics for Risk Management

This work presents dynamic models of, on the one hand, a market and the related firms and, on the other hand, the banks that finance the firms with debt (liability) capital, hold deposits and are bound by regulatory rules; finally, a central bank is defined as the player responsible for monetary stability policy. A game-theoretic approach defines the relationships between these partners in this closed economy, each driven by its own interests (strategies) and constraints (limitations). This leads to a combined principal-agent and evolutionary setup. The models were checked for plausibility, e.g. against the power law of the firm-size distribution. The results give hints about optimal strategies for the firms and banks with the greatest economic success, and additional regulatory rules are studied with regard to economic stability.

Fabio Trojani and Ilaria Piatti

Predictable Risks and Predictive Regression in Present-Value Models

Within a present-value model with time-varying risks, we develop a latent variable approach to estimate expected market returns and dividend growth rates consistently with the conditional risk features implied by present-value pricing constraints. We find that expected dividend growth and expected returns are time-varying, but while the explained fraction of dividend variability is low (with average $R^2$ values below 1%), the portion of return variation predicted is large (with average $R^2$ values of about 50%). Expected dividend growth is more persistent than expected returns and generates a substantial price-dividend ratio component, which masks the large predictive power of valuation ratios for future returns. The model implies (i) standard predictive regressions consistent with weak return predictability and failing dividend predictability by aggregate price-dividend ratios, (ii) predictable market volatilities and (iii) volatile and often counter-cyclical Sharpe ratios. Our results show the importance of accounting for time-varying risks and the potential long-run effect of persistent dividend forecasts when predicting future returns using valuation ratios.

Paola Vicard, Julia Mortera and Cecilia Vergani

Object-Oriented Bayesian Networks for Solving Symmetric Repeated Games

Here we present a new approach to dealing with symmetric repeated games. Firms in many cases have incentives to cooperate (collude) to increase their profits. The Antitrust Authority's (AA) main task is to monitor and prevent potential anti-competitive behaviour and its effects. Here the AA decision process is modelled via a Bayesian network (BN) estimated from real data. We study how monitoring by the AA affects firms' strategies about cooperation. Firms' strategies are modelled as a repeated prisoner's dilemma using Object-Oriented Bayesian Networks (OOBNs). We first reformulate single-stage games as a Bayesian network that encodes the probabilistic relationships among the variables of interest, allowing for the application of fast general-purpose algorithms to compute inferences. Using this approach we can also incorporate various sources of uncertainty within the game itself.

Here we show how OOBNs can be used to model a duopolist's decision process integrated with external market information. Both the relational structure and the parameters of the market behaviour model are estimated (learned) from a real dataset provided by the AA. Thanks to the modularity and flexibility of this approach, generalizations and complications of the repeated prisoner's dilemma can be analysed. Various decision scenarios are shown and discussed.

Robert Wolpert

Hazard Assessment for Pyroclastic Flows

We combine use of (deterministic) computer models and (stochastic) statistical models to assess the probability of inundation at a specified location by a pyroclastic flow from an active volcano. As a testbed we study pyroclastic flows of the Soufrière Hills Volcano on the Caribbean island nation of Montserrat.

We compute the probability of a catastrophic event at a specific location in the next T years, for values of T ranging up to fifty years, using
  • computer implementations of mathematical models of flows to allow extrapolation to unseen situations;
  • statistical models for needed stochastic inputs to the computer model, appropriate for rare events;
  • a computational strategy for rare events, based on adaptive emulation of the computer model.