By Thomas Halsey
In the upstream oil and gas industry, our aim is to identify and produce subsurface accumulations of oil and gas. From our industry's earliest days, our key scientific and engineering challenges have stemmed from our ignorance of the detailed structure of the subsurface. Our abilities to image the subsurface, thereby locating oil and gas fields, and to predict the flow of these fluids through subsurface porous rocks are often unsatisfactory, particularly in view of the large capital investments our business requires. While we apply the most advanced technologies available to these problems, much uncertainty inevitably remains; one industry proverb holds that we only truly understand an oil or gas field the day we close it down (and perhaps not even then!).
Because of our inability to directly measure important features of the subsurface, much of our decision-making is based on models. We use a variety of models to reconstruct seismic data into three-dimensional images of the subsurface, and we construct large reservoir models based on these images and other information to predict the flow of multiphase fluids through the reservoir during the production phase of an asset. Successful construction of these models requires the skills of expert applied physicists, mathematicians, geoscientists, and engineers; these are among the most advanced modeling challenges, in scope and technical depth, to be found in any industry.
These models are invariably embedded in software. A large service industry has grown up to provide the software needed to construct and solve these models; in addition, many oil and gas operating companies have internal software development units that develop standalone or plug-in software used by technical decision makers throughout their businesses. While equipment and hardware advances remain critical to the industry, much of the innovation has moved into the software space, with research methods moving seamlessly into software used throughout our businesses.
The size of the computational problems we face has made the oil and gas industry one of the largest private sector users of high-performance computing. Numerous oil and gas operating and service companies have advertised their rapidly improving capabilities in petascale computing, and these companies have developed sophisticated technical teams to deploy, maintain, and develop specialized code for these systems.
Within ExxonMobil, we often respond to the need to develop and maintain technical excellence in an area by forming a functional organization dedicated to it. Thus, in the last few years we have formed internal functional organizations dedicated to computational sciences in both our research and information technology organizations. These organizations drive rapid sharing of best practices among teams working on diverse business problems, as well as assisting in key skills assessment, development, and maintenance. Of course, it is important to maintain strong links between computational work and the geoscientists and engineers with whom we must collaborate closely to impact our business.
We have concluded that there are four key skills areas that must be closely combined to succeed in computational sciences in our company. These are as follows:
- Modeling physics: the construction and validation of practical multi-physics models covering phenomena that are key to the performance of our business
- Computational and applied mathematics: the design and implementation of robust, accurate algorithms to solve our models on the necessary time scales
- Technical software engineering: the development of computer code suitable for solution of scientific and engineering problems within the constraints posed by our IT environment
- High-performance computing: the design, procurement, maintenance, and specialized code development required to solve industry grand challenge problems on leading high-performance computing architectures
Since we increased our focus on the computational sciences a few years ago, these skills, combined with more traditional industry geoscience and engineering skills, have enabled us to advance several major innovations based on new computational approaches to seismic imaging and to reservoir modeling and simulation.
In addition to the subsurface element in our business, we also manage complex logistical operations and global supply chains in both our upstream and our refining and petrochemicals businesses. Business analytics and optimization are important tools to improve our profitability in this aspect of our business as well.
Our industry is currently experiencing a cyclical downturn in oil and gas prices, which is driving an urgent need to improve business performance. One of the key methods to do so is to ensure that we are getting as much production as possible from our current assets, which requires optimization of a complex system including the subsurface reservoir, wells, pipelines, and surface facilities.
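The kind of coupled optimization described above can be made concrete with a deliberately simplified sketch (every well, rate, and capacity here is invented for illustration; real asset models couple reservoir physics, wells, and facilities in far richer ways). In its simplest form, allocating production across wells subject to shared surface-facility constraints is a small linear program:

```python
from scipy.optimize import linprog

# Toy production-allocation problem (all numbers hypothetical):
# choose gross liquid rates q1, q2, q3 for three wells to maximize
# total oil, subject to facility liquid and water handling limits.
max_rates = [120.0, 80.0, 60.0]   # per-well deliverability limits
water_cut = [0.5, 0.2, 0.1]       # water fraction of produced liquid
liquid_cap = 200.0                # facility liquid handling capacity
water_cap = 60.0                  # facility water handling capacity

# linprog minimizes, so negate the oil fractions to maximize oil.
c = [-(1.0 - w) for w in water_cut]
A_ub = [[1.0, 1.0, 1.0],          # total liquid <= liquid_cap
        water_cut]                # total water  <= water_cap
b_ub = [liquid_cap, water_cap]
bounds = [(0.0, m) for m in max_rates]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("allocated rates:", res.x)
print("total oil rate: ", -res.fun)
```

With these numbers, the liquid-handling constraint binds, so the solver fills the wells with the lowest water cut first. The real problem is of course nonlinear and dynamic, but this captures the flavor of system-level allocation decisions.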
A new wave of innovations is emerging in the industry to help with this optimization; many of these innovations are based on data analytics methods that have been very successful in the finance, marketing, and social media industries. The oil and gas industry differs from these industries in the richness of the physics and geology embedded in our legacy modeling approaches, which must be effectively combined with the newer data analytics methods to enable our continued success. I am certain that the oil and gas industry will emerge as a leader in developing new science and mathematics that combine data-driven and physics- or geoscience-driven methods in powerful ways.
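One common pattern for such a combination can be sketched in a few lines (synthetic data; the exponential decline-curve prior and polynomial residual model are illustrative assumptions, not any specific industry workflow): fit a physics-motivated model first, then let a data-driven model correct its residuals.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)                         # time (synthetic units)
true_rate = 100.0 * np.exp(-0.3 * t) + 5.0 * np.sin(t) # physics + unmodeled effect
observed = true_rate + rng.normal(0.0, 1.0, t.size)    # noisy "field data"

# Step 1: fit the physics prior, q(t) = q0 * exp(-D * t),
# by linear least squares on log(rate).
slope, intercept = np.polyfit(t, np.log(np.clip(observed, 1e-6, None)), 1)
physics_pred = np.exp(intercept + slope * t)

# Step 2: fit a data-driven correction to the physics residuals
# (a low-order polynomial stands in for a richer ML model here).
residual = observed - physics_pred
correction = np.polyval(np.polyfit(t, residual, 5), t)
hybrid_pred = physics_pred + correction

rmse = lambda y: float(np.sqrt(np.mean((observed - y) ** 2)))
print(f"physics-only RMSE: {rmse(physics_pred):.2f}")
print(f"hybrid RMSE:       {rmse(hybrid_pred):.2f}")
```

The appeal of this structure is that the physics model extrapolates sensibly where data are sparse, while the data-driven layer absorbs effects the physics does not capture; because the residual fit is itself a least-squares problem, the hybrid can do no worse than the physics model alone on the fitted data.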