Posterior Sampling using Particle Swarm Optimizers and Model Reduction Techniques

J. L. Fernández Martínez (Stanford University, University of California-Berkeley, USA and University of Oviedo, Spain), E. García Gonzalo (University of Oviedo, Spain), Z. Fernández Muñiz (University of Oviedo, Spain), G. Mariethoz (Stanford University, USA) and T. Mukerji (Stanford University, USA)
Copyright: © 2010 | Pages: 22
DOI: 10.4018/jaec.2010070102


Inverse problems are ill-posed, and posterior sampling provides an estimate of the uncertainty based on a finite subset of the family of models that fit the observed data within the same tolerance. Monte Carlo methods are commonly used for this purpose but are highly inefficient. Global optimization methods can instead address the inverse problem as a sampling problem; among them, Particle Swarm Optimization (PSO) is a very interesting algorithm that is typically used in an exploitative form. Although PSO was not originally designed to perform importance sampling, the authors show practical applications in the domain of environmental geophysics where, used in its explorative form, it provides a proxy for the posterior distribution. Finally, this paper presents a hydrogeological example showing how to perform a similar task for inverse problems in high-dimensional spaces through the combined use of PSO with model reduction techniques.
Article Preview

Particle Swarm Optimization (PSO) Applied to Inverse Problems

Particle swarm optimization is a stochastic evolutionary computation technique inspired by the social behavior of individuals (called particles) in nature, such as bird flocking and fish schooling (Kennedy & Eberhart, 1995).

Let us consider an inverse problem of the form $\mathbf{F}(\mathbf{m}) = \mathbf{d}$, where $\mathbf{m} \in M \subset \mathbb{R}^n$ are the model parameters, $\mathbf{d} \in \mathbb{R}^s$ the discrete observed data, and

$$\mathbf{F}(\mathbf{m}) = \left(f_1(\mathbf{m}), f_2(\mathbf{m}), \dots, f_s(\mathbf{m})\right)$$

is the vector field representing the forward operator, with $f_j(\mathbf{m})$ the scalar field that accounts for the $j$-th datum. Inverse problems are very important in science and technology and are sometimes referred to as parameter identification, reverse modeling, etc. The "classical" goal of inversion, given a particular data set (often affected by noise), is to find a unique set of parameters $\mathbf{m}$ such that the data prediction error $\|\mathbf{d} - \mathbf{F}(\mathbf{m})\|_p$, in a certain norm $p$, is minimized.
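As a minimal sketch of this misfit formulation, assuming for illustration a linear forward operator $\mathbf{F}(\mathbf{m}) = G\mathbf{m}$ (the paper's geophysical forward models are nonlinear, and the matrix and data below are invented):

```python
import numpy as np

# Hypothetical linear forward operator F(m) = G m; G and the observed
# data are illustrative, not taken from the paper.
G = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

def forward(m):
    return G @ m

def misfit(m, d_obs, p=2):
    # Data prediction error ||d_obs - F(m)||_p in the chosen norm p
    return np.linalg.norm(d_obs - forward(m), ord=p)

m_true = np.array([1.0, -1.0])
d_obs = forward(m_true)
print(misfit(m_true, d_obs))   # 0.0 at the true model
```

Any model whose prediction error falls within the noise tolerance is an admissible solution, which is why a family of such models, rather than a single minimizer, is of interest.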

The PSO algorithm to approach this inverse problem is at first glance very easy to understand and implement:

  1. A prismatic space of admissible models, $M$, is defined:

$$M = \{\mathbf{m} \in \mathbb{R}^n : a_j \le m_j \le b_j,\ j = 1, \dots, n\},$$

where $a_j, b_j$ are the lower and upper limits for the $j$-th coordinate of each particle in the swarm, $n$ is the number of parameters in the optimization problem, and $n_{\mathrm{size}}$ is the swarm size.

  2. The misfit for each particle of the swarm is calculated, $\|\mathbf{d} - \mathbf{F}(\mathbf{m}_i)\|_p$, and for each particle its best position found so far (called $\mathbf{l}_i^k$, the local best) is determined, as well as the minimum over all of them, called the global best ($\mathbf{g}^k$).

  3. The algorithm updates at each iteration the positions $\mathbf{x}_i^k$ and velocities $\mathbf{v}_i^k$ of each model in the swarm. The velocity of each particle $i$ at iteration $k$ is a function of three major components:

     a) The inertia term, which consists of the old velocity of the particle, $\mathbf{v}_i^k$, weighted by a real constant $\omega$ called inertia.

     b) The social learning term, which is the difference between the global best position found so far (called $\mathbf{g}^k$) and the particle's current position ($\mathbf{x}_i^k$).

     c) The cognitive learning term, which is the difference between the particle's best position (called $\mathbf{l}_i^k$) and the particle's current position ($\mathbf{x}_i^k$):

$$\mathbf{v}_i^{k+1} = \omega \mathbf{v}_i^k + \phi_1 (\mathbf{g}^k - \mathbf{x}_i^k) + \phi_2 (\mathbf{l}_i^k - \mathbf{x}_i^k), \qquad \mathbf{x}_i^{k+1} = \mathbf{x}_i^k + \mathbf{v}_i^{k+1},$$

where $\phi_1 = r_1 a_g$ and $\phi_2 = r_2 a_l$, with $r_1, r_2$ random numbers uniformly distributed in $[0, 1]$ and $a_g$, $a_l$ the global and local acceleration constants.
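The three steps above can be sketched in Python on a toy linear inverse problem. All names and constants here are illustrative assumptions (in particular $\omega = 0.7$ and $a_g = a_l = 1.5$ are common textbook values, not the tuned parameters the paper studies):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear inverse problem: recover m_true from d = G m_true.
G = rng.normal(size=(8, 4))
m_true = np.array([1.0, -2.0, 0.5, 3.0])
d = G @ m_true

def misfit(m):
    # Data prediction error in the Euclidean norm
    return np.linalg.norm(d - G @ m)

# Step 1: prismatic search space M with limits a_j, b_j per coordinate.
n, n_size, n_iter = 4, 30, 200
a, b = -5.0 * np.ones(n), 5.0 * np.ones(n)
omega, a_g, a_l = 0.7, 1.5, 1.5   # inertia and acceleration constants (assumed)

# Step 2: evaluate the swarm; record local bests l_i and the global best g.
x = rng.uniform(a, b, size=(n_size, n))   # particle positions
v = np.zeros((n_size, n))                 # particle velocities
f = np.array([misfit(xi) for xi in x])
l_best, l_val = x.copy(), f.copy()
g_best = x[f.argmin()].copy()
initial_best = f.min()

# Step 3: update velocities (inertia + social + cognitive) and positions.
for _ in range(n_iter):
    r1 = rng.uniform(size=(n_size, n))    # randomness in the social term
    r2 = rng.uniform(size=(n_size, n))    # randomness in the cognitive term
    v = omega * v + a_g * r1 * (g_best - x) + a_l * r2 * (l_best - x)
    x = np.clip(x + v, a, b)              # keep particles inside M
    f = np.array([misfit(xi) for xi in x])
    better = f < l_val
    l_best[better], l_val[better] = x[better], f[better]
    g_best = l_best[l_val.argmin()].copy()

print(g_best)   # approaches m_true as the swarm converges
```

With explorative parameter choices the swarm keeps sampling the low-misfit region rather than collapsing onto a single minimizer, which is the behavior the paper exploits for posterior sampling.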