SIAM News Blog

Dakota Software: Explore and Predict with Confidence

By Michael Eldred, Brian Adams, and Laura Swiler

The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification (UQ). These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models [1].

In its simplest mode, Dakota automates iterative analysis using a general-purpose interface to a computational model, as shown in Figure 1. Its fundamental strength is a broad suite of algorithmic techniques facilitating parameter exploration, global sensitivity analysis, design optimization, model calibration, UQ, and statistical inference. These core algorithms provide a foundation for more advanced, multicomponent solution approaches, including hybrid optimization, surrogate-based optimization, multi-fidelity UQ and optimization, mixed aleatory-epistemic uncertainty analyses, and optimization under uncertainty. By integrating these capabilities within a single software tool, Dakota lets users easily transition between different types of studies when exploring a computational model: from identifying influential parameters to calibrating them, and from characterizing the effect of uncertainties to performing design optimization in their presence.

Figure 1. Interaction between Dakota and a parameterized simulation. Image credit: Sandia National Laboratories.

Dakota’s development activities span a spectrum from algorithm research and prototyping to production application deployment, with the goal of delivering exploration and prediction capabilities for all kinds of computational models. Efficient computing is also a central goal, with support ranging from desktops to the latest supercomputers.

Algorithm Research and Development

The Dakota project started in 1994 as an internal research and development effort, and has retained this emphasis throughout its history. Research in new algorithms is guided by challenges in deploying methods to complex, high-fidelity engineering and science applications where parameter spaces may be high-dimensional, quantities of interest may be nonsmooth or unreliable, and simulation budgets may be severely constrained.

In the case of optimization methods, we have primarily addressed these issues through surrogate-based approaches relying on data fit, multi-fidelity, or other approximations. We mitigate simulation defects and accelerate local and global search processes through the use of adaptive model management approaches.
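To make the data-fit, model-management idea concrete, the sketch below implements a generic trust-region surrogate-based optimization loop (an illustration of the technique, not Dakota's implementation): fit a local quadratic surrogate to truth-model samples, minimize it within the trust region, then accept or reject the step and resize the region based on the ratio of actual to predicted improvement. The truth model and all tuning constants are illustrative assumptions.

```python
import numpy as np

def truth(x):
    # Stand-in for an expensive "truth" simulation (Rosenbrock function).
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def fit_quadratic(center, radius, f, n_samples=15):
    """Least-squares data-fit surrogate: full quadratic in two dimensions."""
    rng = np.random.default_rng(0)
    X = center + radius * rng.uniform(-1, 1, size=(n_samples, 2))
    y = np.array([f(x) for x in X])
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 0] * X[:, 1], X[:, 1] ** 2])
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda x: c @ np.array([1, x[0], x[1],
                                   x[0] ** 2, x[0] * x[1], x[1] ** 2])

def tr_sbo(x0, radius=0.5, iters=20):
    """Trust-region management: accept/reject steps and resize the region
    by comparing actual to surrogate-predicted improvement."""
    x = np.asarray(x0, float)
    fx = truth(x)
    for _ in range(iters):
        s = fit_quadratic(x, radius, truth)
        # Minimize the surrogate over the trust-region box (dense grid, for
        # simplicity; a real implementation uses a local optimizer).
        grid = x + radius * np.stack(np.meshgrid(
            np.linspace(-1, 1, 21), np.linspace(-1, 1, 21)), -1).reshape(-1, 2)
        cand = grid[np.argmin([s(g) for g in grid])]
        f_cand = truth(cand)
        pred, actual = fx - s(cand), fx - f_cand
        rho = actual / pred if pred > 0 else -1.0
        if rho > 0:                       # accept only true improvements
            x, fx = cand, f_cand
        radius *= 1.5 if rho > 0.75 else (0.5 if rho < 0.25 else 1.0)
    return x, fx
```

Because only improving steps are accepted, the best truth value is monotone non-increasing, which is the key robustness property this model management provides when surrogates are inaccurate.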

For UQ, we seek scalability by exploiting anisotropy, sparsity, and low-rank structure using spectral expansion methods. Dovetailing these methods with multilevel and multi-fidelity model hierarchies further amplifies the efficiency gains. Moreover, these forward UQ techniques enable scalable statistical inversion; we are presently crafting inference approaches that leverage dimension reduction, emulator acceleration, and multi-fidelity modeling.
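A minimal example of the spectral-expansion idea (a sketch using a total-order Legendre polynomial chaos basis with least-squares regression; Dakota's implementations are far more general, including sparse and adaptive variants) shows why these expansions are attractive: once the coefficients are computed, output moments follow in closed form from basis orthogonality.

```python
from itertools import product
import numpy as np
from numpy.polynomial import legendre

def pce_mean_var(model, dim=2, order=2, n_samples=200, seed=0):
    """Non-intrusive polynomial chaos by regression, for inputs xi ~ U(-1,1)^dim.
    Mean is the constant coefficient; variance is sum_k c_k^2 E[psi_k^2]."""
    rng = np.random.default_rng(seed)
    Xi = rng.uniform(-1, 1, (n_samples, dim))
    # Multi-indices with total polynomial order <= order.
    idx = [a for a in product(range(order + 1), repeat=dim) if sum(a) <= order]

    def psi(a, xi):  # tensor-product Legendre basis function
        return np.prod([legendre.Legendre.basis(k)(xi[:, j])
                        for j, k in enumerate(a)], axis=0)

    A = np.column_stack([psi(a, Xi) for a in idx])
    y = np.array([model(x) for x in Xi])
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    # E[P_k^2] = 1/(2k+1) for Legendre polynomials under U(-1,1).
    norms = np.array([np.prod([1.0 / (2 * k + 1) for k in a]) for a in idx])
    mean = c[0]                                  # idx[0] is the all-zeros index
    var = np.sum((c[1:] ** 2) * norms[1:])
    return mean, var
```

For a model that lies in the basis (e.g., a low-order polynomial), the regression recovers the expansion exactly, so the statistics are exact rather than sampled, which is the source of spectral methods' efficiency for smooth responses.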

Building on these optimization and UQ investments, we tailor model management-based optimization algorithms to the capabilities of specific UQ approaches, enabling efficient optimization under uncertainty. This allows for the design of statistically robust and reliable engineered systems.

Persistent investment in algorithm research allows us to deploy the latest algorithmic approaches and evolve and mature them alongside time-tested and production-hardened methods. This continuous influx and maturation of novel capabilities is essential for supporting new challenges within our mission space.

Enabling Architecture

Dakota’s object-oriented C++ architecture empowers developers to effectively craft and deliver algorithm capabilities and enables users to productively apply them. Supported by software engineering infrastructure and processes, the architecture facilitates the concurrent research, development, production, and deployment necessary to meet diverse Department of Energy (DOE) program goals.

Iteration is a central theme that inspires the C++ class abstractions in Dakota. Models manage the mapping of variables (parameters) through an interface to responses (quantities of interest). Model instance types support mappings based on computational simulations, surrogate approximations, formulation recastings, and nested recursions. Method classes implement iterative analysis algorithms, which are broadly grouped into parameter exploration and design of experiments, non-deterministic methods for UQ and inference, and minimization for optimization and calibration. These classes are both composable and extensible, allowing developers to prototype new algorithms and users to flexibly configure them in Dakota studies.
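The composability described above can be sketched in a few lines (a simplified Python analogue of the C++ abstractions; the class names follow the description but are hypothetical): any Method iterates on whatever Model it is handed, whether that model wraps a simulation directly or recasts another model.

```python
class Model:
    """Maps variables (parameters) through an interface to responses."""
    def evaluate(self, x):
        raise NotImplementedError

class SimulationModel(Model):
    """Wraps a computational simulation (here, just a Python callable)."""
    def __init__(self, simulation):
        self.simulation = simulation
    def evaluate(self, x):
        return self.simulation(x)

class RecastModel(Model):
    """Reformulates another model, e.g. transforming its response."""
    def __init__(self, inner, transform):
        self.inner, self.transform = inner, transform
    def evaluate(self, x):
        return self.transform(self.inner.evaluate(x))

class SamplingMethod:
    """A method iterates on whichever Model it is composed with."""
    def __init__(self, model):
        self.model = model
    def run(self, points):
        return [self.model.evaluate(x) for x in points]
```

Because methods see only the Model interface, the same study configuration can run against a simulation, a surrogate, or a nested recursion without change, which is what makes the abstractions both composable and extensible.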

Dakota supports parallel computing from desktops to supercomputers. We use multilevel parallel computing (combining message passing, asynchronous local, and hybrid approaches) to exploit coarse-grained parallel concurrency within a recursive scheduling scheme that augments fine-grained simulation parallelism. A new Dakota graphical analysis environment aims to help users interface to simulations, create and execute studies in parallel, and interpret Dakota results.
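At the coarse-grained level, the "asynchronous local" scheduling idea reduces to launching independent model evaluations concurrently and collecting the results. A minimal sketch, assuming each evaluation is an independent callable (in practice, a forked simulation process):

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_batch(driver, points, max_workers=4):
    """Asynchronous local scheduling sketch: launch each point's evaluation
    concurrently (e.g., a forked simulation) and return results in input
    order. Dakota layers message passing on top of this for clusters."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(driver, points))
```

Because a sampling or optimization method typically generates many independent evaluation points per iteration, this coarse-grained concurrency multiplies whatever fine-grained parallelism the simulation itself already exploits.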

To facilitate greater interactivity, we are transforming Dakota’s software architecture into a more modular and extensible system of components with an increasingly flexible integration layer. Additional fine-grained C++ application program interfaces (APIs) will ease library integration of Dakota into simulation codes for a better-integrated user experience. Developers and end users alike will be able to directly access individual components and orchestrate them with Python, or perhaps with a domain-specific language. In essence, Dakota’s architecture is evolving to support problems that are orders of magnitude larger in parameter and response dimension, to span the spectrum from black-box to embedded methods, and to scale to next-generation extreme-scale hybrid computer architectures.

Impact

Dakota is open source and distributed under the GNU Lesser General Public License (LGPL). More than 25,000 users worldwide have downloaded the software since January 2010. It has hundreds of users at DOE laboratories and is widely used across government, industry, and academic sectors.1 Dakota runs on Linux, Mac, and Windows operating systems, including high-performance computing clusters. Software downloads (source and binary), system requirements, and installation details are available on the Dakota website. Given Dakota’s active research efforts, accompanying release notes indicate emerging capabilities and their corresponding maturity.

Computational science and engineering practitioners use Dakota across many disciplines and in conjunction with a wide variety of computational models, some of which are shown in Figure 2. For example, Dakota has been used to support the following DOE mission applications:

  • Optimization of the performance of neutron generators to ensure that designs meet specifications in terms of voltage, current, and space
  • Establishment of a simulation model’s credibility for thermal battery performance through a detailed verification and validation/UQ analysis
  • Sensitivity analysis of nuclear reactor fuels performance to understand parameter influence in pressurized water reactors versus boiling water reactors
  • Calibration of parameters governing thermal-hydraulic models that simulate cooling flows within a reactor core
  • Abnormal thermal safety analysis using sparse grids, compressed sensing, and mixed aleatory-epistemic UQ methods
  • Analysis of circuit performance and circuit variability given radiation damage to electrical components
  • Quantification of the performance of vertical axis wind turbines subject to uncertain gust conditions
  • Inference of uncertain basal conditions underlying the Greenland ice sheet, based on available observational data
  • Estimation and propagation of uncertain atomistic potentials to quantify material performance

Figure 2. Dakota supports a variety of mission areas, including problems related to climate (Greenland ice sheet model), energy (¼ core model of a nuclear reactor), and defense (thermal load on a weapon). Defense and climate images courtesy of Sandia National Laboratories, energy image courtesy of the Consortium for Advanced Simulation of Light Water Reactors, www.casl.gov.

These applications motivate both our stewardship of time-tested production methods and our investment in new algorithms to support growth in problem scale and complexity.

Community

The Dakota team strives to cultivate an active worldwide user community that contributes to collaborative research, software development, requirements definition, and user support. Research collaborations currently span dozens of universities and labs, and are essential for advancing the state of the art in model-based prediction and decision-making. Publicly accessible repositories facilitate joint software development. Contributors can implement algorithms, help Dakota scale to next-generation computing platforms, improve architecture, develop adapters or interfaces, and improve quality through software engineering infrastructure. Developer resources are continually improving to expedite such cooperation.

Dakota users come from diverse science and engineering domains, business sectors, and geographies, thus mandating that we nurture a self-sustaining user community. Members support each other via a public mailing list and (soon-to-come) web-based forums, and users can contribute bug reports, enhancement ideas, and case studies. Domain- or simulation code-specific topical groups might integrate Dakota with popular simulation codes and workflows, sponsor focused user group meetings or training sessions, or contribute tutorials. A vibrant and engaged user community is central to sustaining Dakota as a leading open-source optimization and UQ tool. We invite your participation!


1 In "Building Sustainable Decision Tools for a Sustainable Environment," read about Dakota's role in developing an integrated modeling system for water management.  

References
[1] Adams, B.M., Bohnhoff, W.J., Dalbey, K.R., Eddy, J.P., Ebeida, M.S., Eldred, M.S., … Wildey, T.M. (2016, November). Dakota, a Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.5 User’s Manual. Technical Report SAND2014-4633. Albuquerque, N.M.: Sandia National Laboratories. Retrieved from http://dakota.sandia.gov/documentation.html.

Michael Eldred (mseldre@sandia.gov), Brian Adams (briadam@sandia.gov), and Laura Swiler (lpswile@sandia.gov) develop Dakota in the Optimization and Uncertainty Quantification Department in the Center for Computing Research at Sandia National Laboratories. Eldred, a Distinguished Member of Technical Staff and aerospace engineer by training, founded Dakota in 1994 and now leads its algorithm R&D activities. Adams, a Principal Member of Technical Staff, is the Dakota Project Lead and has been a team member in various roles since completing his applied mathematics degrees. Swiler, a Distinguished Member of Technical Staff, applies her operations research and engineering training to Dakota research, development, and applications. 

Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.


