# GoldSim Appendices

Date posted: 14-Apr-2015



## Appendix A: Introduction to Probabilistic Simulation

> *Our knowledge of the way things work, in society or in nature, comes trailing clouds of vagueness. Vast ills have followed a belief in certainty.*
>
> Kenneth Arrow, *I Know a Hawk from a Handsaw*

### Appendix Overview

This appendix provides a very brief introduction to probabilistic simulation (the quantification and propagation of uncertainty). Because a detailed discussion of this topic is well beyond the scope of this appendix, readers who are unfamiliar with this field are strongly encouraged to consult additional literature. A good introduction to the representation of uncertainty is provided by Finkel (1990), and a more detailed treatment is provided by Morgan and Henrion (1990). The basic elements of probability theory are discussed in Harr (1987), and more detailed discussions can be found in Benjamin and Cornell (1970) and Ang and Tang (1984).

This appendix discusses the following:

- Types of Uncertainty
- Quantifying Uncertainty
- Propagating Uncertainty
- A Comparison of Probabilistic and Deterministic Analyses
- References

### Types of Uncertainty

Many of the features, events and processes which control the behavior of a complex system will not be known or understood with certainty.
Although there are a variety of ways to categorize the sources of this uncertainty, for the purpose of this discussion it is convenient to consider the following four types:

- **Value (parameter) uncertainty:** the uncertainty in the value of a particular parameter (e.g., a geotechnical property, or the development cost of a new product);
- **Uncertainty regarding future events:** the uncertainty in the ability to predict future perturbations of the system (e.g., a strike, an accident, or an earthquake);
- **Conceptual model uncertainty:** the uncertainty regarding the detailed understanding and representation of the processes controlling a particular system (e.g., the complex interactions controlling the flow rate in a river); and
- **Numerical model uncertainty:** the uncertainty introduced by approximations in the computational tool used to evaluate the system.

Incorporating these uncertainties into the predictions of system behavior is called probabilistic analysis or, in some applications, probabilistic performance assessment. Probabilistic analysis consists of explicitly representing the uncertainty in the parameters, processes and events controlling the system and propagating this uncertainty through the system such that the uncertainty in the results (i.e., predicted future performance) can be quantified.

### Quantifying Uncertainty

When uncertainty is quantified, it is expressed in terms of probability distributions. A probability distribution is a mathematical representation of the relative likelihood of an uncertain variable having certain specific values. There are many types of probability distributions.
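The propagation step described above is commonly performed by Monte Carlo sampling: draw each uncertain input from its distribution, evaluate the model, and repeat. The following is a minimal sketch using Python's standard `random` module and a hypothetical cost model; the model, its parameter values, and all names are illustrative assumptions, not GoldSim's own interface.

```python
import random
import statistics

# Hypothetical model: total cost = unit_cost * quantity + fixed overhead.
# The model form and parameter values are illustrative assumptions.
def total_cost(unit_cost, quantity):
    return unit_cost * quantity + 10_000.0

random.seed(42)  # for reproducibility of this sketch
results = []
for _ in range(10_000):
    # Sample each uncertain input from its probability distribution.
    unit_cost = random.normalvariate(50.0, 5.0)           # normal(mean, std dev)
    quantity = random.triangular(800.0, 1200.0, 1000.0)   # (min, max, most likely)
    results.append(total_cost(unit_cost, quantity))

# The input uncertainty has been propagated to the result, which can now
# be summarized as a distribution rather than a single number.
print(f"mean cost: {statistics.mean(results):,.0f}")
print(f"std dev:   {statistics.stdev(results):,.0f}")
```

Each realization is one possible future; the collection of realizations approximates the probability distribution of the result.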
Common distributions include the normal, uniform and triangular distributions, illustrated below:

*(Figure: probability density functions of a normal, a uniform, and a triangular distribution.)*

All distribution types use a set of arguments to specify the relative likelihood for each possible value. For example, the normal distribution uses a mean and a standard deviation as its arguments. The mean defines the value around which the bell curve will be centered, and the standard deviation defines the spread of values around the mean. The arguments for a uniform distribution are a minimum and a maximum value. The arguments for a triangular distribution are a minimum value, a most likely value, and a maximum value.

The nature of an uncertain parameter, and hence the form of the associated probability distribution, can be either discrete or continuous. Discrete distributions have a limited (discrete) number of possible values (e.g., 0 or 1; yes or no; 10, 20, or 30). Continuous distributions have an infinite number of possible values (e.g., the normal, uniform and triangular distributions shown above are continuous). Good overviews of commonly applied probability distributions are provided by Morgan and Henrion (1990) and Stephens et al. (1993).

There are a number of ways in which probability distributions can be graphically displayed. The simplest way is to express the distribution in terms of a probability density function (PDF), which is how the three distributions shown above are displayed. In simple terms, this plots the relative likelihood of the various possible values, and is illustrated schematically below:

*(Figure: schematic probability density function.)*

Note that the height of the PDF for any given value is not a direct measurement of the probability.
Rather, it represents the probability density, such that integrating under the PDF between any two points results in the probability of the actual value being between those two points.

Numerically generated PDFs are typically presented not as continuous functions (as shown above), but as histograms, in which the frequencies of the various possible values are divided into a discrete number of bins. Histograms of the same three PDFs shown above would look like this:

*(Figure: histograms of the same normal, uniform, and triangular distributions.)*

**Note:** Discrete distributions are described mathematically using probability mass functions (pmf), rather than probability density functions. Probability mass functions specify actual probabilities for given values, rather than probability densities.

An alternative manner of representing the same information contained in a PDF is the cumulative distribution function (CDF). This is formed by integrating over the PDF (such that the slope of the CDF at any point equals the height of the PDF at that point). For any point on the horizontal axis $r$, the CDF shows the cumulative probability that the actual value will be less than or equal to $r$. That is, as shown below, a particular point, say $[r_i, P_1]$, on the CDF is interpreted as follows: $P_1$ is the probability that the actual value is less than or equal to $r_i$.

*(Figure: schematic cumulative distribution function.)*

By definition, the total area under the PDF must integrate to 1.0, and the CDF therefore ranges from 0.0 to 1.0. A third manner of presenting this information is the complementary cumulative distribution function (CCDF). The CCDF is illustrated schematically below:

*(Figure: schematic complementary cumulative distribution function.)*

A particular point, say $[r_i, P_2]$, on the CCDF is interpreted as follows: $P_2$ is the probability that the actual value is greater than $r_i$. Note that the CCDF is simply the complement of the CDF; that is, $P_2$ is equal to $1 - P_1$.
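The CDF/CCDF relationship is easy to check numerically. The sketch below uses the normal distribution, whose CDF has a closed form in terms of the error function (a standard mathematical result, not anything GoldSim-specific); the parameter values are arbitrary.

```python
import math

def normal_cdf(x, mean, std):
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (std * math.sqrt(2.0))))

def normal_ccdf(x, mean, std):
    """P(X > x): the CCDF is simply the complement of the CDF."""
    return 1.0 - normal_cdf(x, mean, std)

mean, std = 20.0, 5.0
r_i = 25.0                         # one standard deviation above the mean
p1 = normal_cdf(r_i, mean, std)    # probability the actual value is <= r_i
p2 = normal_ccdf(r_i, mean, std)   # probability the actual value is > r_i
print(round(p1, 4), round(p2, 4))  # 0.8413 0.1587
assert abs(p1 + p2 - 1.0) < 1e-12  # P2 = 1 - P1, by definition
```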
Probability distributions are often described using quantiles or percentiles of the CDF. Percentiles of a distribution divide the total frequency of occurrence into hundredths. For example, the 90th percentile is that value of the parameter below which 90% of the distribution lies. The 50th percentile is referred to as the median.

Probability distributions can be characterized by their moments. The first moment is referred to as the mean or expected value, and is typically denoted as $\mu$. For a continuous distribution, it is computed as follows:

$$\mu = \int x \, f(x) \, dx$$

where $f(x)$ is the probability density function (PDF) of the variable. For a discrete distribution, it is computed as:

$$\mu = \sum_{i=1}^{N} x_i \, p(x_i)$$

in which $p(x_i)$ is the probability of $x_i$, and $N$ is the total number of discrete values in the distribution.

Additional moments of a distribution can also be computed. The nth moment of a continuous distribution is computed as follows:

$$\mu_n = \int (x - \mu)^n \, f(x) \, dx$$

For a discrete distribution, the nth moment is computed as:

$$\mu_n = \sum_{i=1}^{N} (x_i - \mu)^n \, p(x_i)$$

The second moment is referred to as the variance, and is typically denoted as $\sigma^2$. The square root of the variance, $\sigma$, is referred to as the standard deviation. The variance and the standard deviation reflect the amount of spread or dispersion in the distribution. The ratio of the standard deviation to the mean provides a dimensionless measure of the spread, and is referred to as the coefficient of variation.

The skewness is a dimensionless number computed based on the third moment:

$$\text{skewness} = \frac{\mu_3}{\sigma^3}$$

The skewness indicates the symmetry of the distribution. A normal distribution (which is perfectly symmetric) has a skewness of zero. A positive skewness indicates a shift to the right (an example is the log-normal distribution). A negative skewness indicates a shift to the left.
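For a small discrete distribution, the moment formulas above can be evaluated directly. The values and probabilities in this sketch are arbitrary illustrative numbers:

```python
# A discrete distribution: three possible values and their probabilities.
values = [10.0, 20.0, 30.0]
probs = [0.2, 0.5, 0.3]
assert abs(sum(probs) - 1.0) < 1e-12  # probabilities must sum to 1

# First moment: the mean (expected value), mu = sum of x_i * p(x_i).
mean = sum(x * p for x, p in zip(values, probs))

# nth central moment about the mean: sum of (x_i - mu)^n * p(x_i).
def central_moment(n):
    return sum((x - mean) ** n * p for x, p in zip(values, probs))

variance = central_moment(2)   # second moment
std_dev = variance ** 0.5      # standard deviation
coeff_var = std_dev / mean     # coefficient of variation
skewness = central_moment(3) / std_dev ** 3

# mean = 21.0, variance = 49.0, std dev = 7.0 (up to floating-point rounding)
print(mean, variance, std_dev)
print(round(coeff_var, 4), round(skewness, 4))
```

The negative skewness here reflects the slightly longer left tail of this particular distribution (more probability mass sits above the mean than below it).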
The kurtosis is a dimensionless number computed based on the fourth moment:

$$\text{kurtosis} = \frac{\mu_4}{\sigma^4} - 3$$

The kurtosis is a measure of how "fat" a distribution is, measured relative to a normal distribution with the same standard deviation. A normal distribution has a kurtosis of zero. A positive kurtosis indicates that the distribution is more "peaky" than a normal distribution. A negative kurtosis indicates that the distribution is "flatter" than a normal distribution.

Given the fact that probability distributions represent the means by which uncertainty can be quantified, the task o
