New Boeing Method Accelerates Turbulence Modeling Uncertainty Analysis

A simulation of a physical wind tunnel airplane model (the NASA Common Research Model), widely used for CFD benchmarking and analysis. Boeing researchers recently used OLCF resources to perform simulations that would aid them in identifying and reducing uncertainty in a computational turbulence model called the Spalart–Allmaras model. Quantifying the uncertainty in predictive models provides a measure of confidence that may eventually accelerate the time to certify new aircraft and reduce cost.

Boeing has long used computational tools as part of its aircraft design process, but now engineers at the world’s largest aerospace company are increasingly shifting their focus from traditional subscale models in wind tunnels to computational models for the most challenging components of an aircraft.

Computational simulations can provide information about how predictive mathematical models will perform and can complement tests of physical models and actual flight testing, thereby saving companies time, money, and computing hours. However, these mathematical models contain varying amounts of uncertainty, which can make assessing confidence in different outcome scenarios difficult.

For engineers at Boeing, putting a number to uncertainty means determining which values in important flight simulations are most likely to widen the range of possible outcomes. Reducing the variability of outcomes, and thereby the uncertainty of predictive models, can decrease the number of complex simulations and experiments scientists must perform, accelerating the design of new aircraft and reducing its cost. In addition, Boeing and the aerospace industry would like to leverage simulations to reduce some of the extremely expensive flight testing required for Federal Aviation Administration certification. Achieving this goal requires reducing the uncertainty in their predictive models and increasing confidence in them.

John Schaefer, aerodynamics engineer at Boeing, and Boeing Technical Fellow Andrew Cary have been studying mathematical models for turbulent airflows to determine how much variation in simulation outcome can occur because of the modeling unknowns. The pair recently studied a model found in Reynolds-Averaged Navier–Stokes computational fluid dynamics (CFD) codes used for simulating fluid flows such as air over a wing.

“We were looking at what happens when we change the parameters within our turbulence model,” Schaefer said, referencing the Spalart–Allmaras (SA) turbulence model, which uses a partial differential equation to model the turbulent airflows near solid surfaces such as the wing or body of an aircraft. The SA turbulence model was developed by Boeing researchers more than 25 years ago and is widely used in industry today.

Schaefer continued, “We wanted to know: if we treat these parameter values as uncertain, how does that impact the uncertainty in our output values?”

But running enough simulations to obtain an accurate measure of the variation in results proved extremely difficult without leadership-class computing. The researchers turned to the computational resources of the Oak Ridge Leadership Computing Facility (OLCF), a US Department of Energy (DOE) Office of Science User Facility located at DOE’s Oak Ridge National Laboratory.

Using the Boeing CFD (BCFD) code containing the SA turbulence model, the team simulated the conditions of NASA’s Common Research Model (CRM), a 3D physical wind tunnel airplane model that boasts a realistic geometry and is used in the CFD community for benchmarking.

After completing hundreds of simulations, the team discovered that some of the model parameters have unexpectedly strong relationships with one another. This allowed the researchers to reduce the dimensions of their problem. The finding will significantly reduce the number of simulations that scientists need to perform for precise results, leading to faster time to solution and more accurate predictive models.

Computational models usually contain constants: numbers that never change. Constants can represent values such as the velocity of oncoming airflow, how much momentum is being transported in different regions of the flow, and how much heat is leaving particular flows. When these constants are used to solve a mathematical model such as the SA turbulence model, the resulting output data can provide engineers with information about the force that holds an airplane in the air (lift), the air resistance on an airplane (drag), and other variables such as the pressure distribution on solid surfaces. However, sometimes these constants are not known precisely and exist only as possible ranges of numbers, generating uncertain output values.

“We end up with a minimum and maximum possible value on the output quantities,” Schaefer said. “So instead of just getting a single value of drag, now we have a minimum possible drag and a maximum possible drag.” Understanding this range is one of the keys to increasing the accuracy of predictive models.
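The idea of turning uncertain model constants into a minimum and maximum possible drag can be illustrated with a minimal sampling sketch. Everything here is invented for illustration: `drag_surrogate` is a toy stand-in, not the SA model or BCFD, and the uncertainty ranges on the constants are assumptions, not the values the Boeing team used.

```python
import random

# Toy stand-in for a drag prediction that depends on turbulence-model
# constants (NOT the SA model or BCFD; coefficients are invented).
def drag_surrogate(kappa, cb1, cw2):
    return 270.0 + 40.0 * (kappa - 0.41) + 15.0 * (cb1 - 0.1355) - 8.0 * (cw2 - 0.3)

random.seed(0)
samples = []
for _ in range(1000):
    # Sample each constant uniformly over an assumed uncertainty range.
    kappa = random.uniform(0.38, 0.42)
    cb1 = random.uniform(0.128, 0.140)
    cw2 = random.uniform(0.055, 0.350)
    samples.append(drag_surrogate(kappa, cb1, cw2))

# Instead of a single drag value, the analysis yields an interval.
lo, hi = min(samples), max(samples)
print(f"drag interval: [{lo:.2f}, {hi:.2f}]")
```

In practice each sample would be a full CFD run rather than a cheap function call, which is why hundreds of simulations and leadership-class computing were needed.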

The team previously studied 2D geometries to obtain general results and then zeroed in on NASA’s CRM to obtain results for more realistic 3D geometries that could be used to look at uncertainty in the mathematical SA model. But Schaefer and Cary needed millions of CPU hours — substantially more computing power than they had internally at Boeing — to run the simulations necessary to assess the uncertainty in the 3D CRM predictions in a timely manner.

The team applied for computing time at the OLCF and received a Director’s Discretionary allocation through the OLCF’s industrial partnership program, Accelerating Competitiveness through Computational Excellence (ACCEL). To analyze the uncertainty and sensitivity for multiple cases of conditions, the team estimated that it needed to run a minimum of 440 simulations on computational grids of 109 million cells. Using BCFD, the team simulated NASA’s CRM model at cruise condition — the condition at which an airplane travels for the majority of a flight, maintaining a constant speed and altitude.

Treating their nine input parameters as individually uncertain, the engineers solved for the impact of each one separately. But in working with one of the authors of the SA model — Boeing Senior Technical Fellow Philippe Spalart — and after analyzing the results of 460 completed simulations, they discovered they could reduce the dimensions of their problem.

“We were able to pair some of the values in novel ways, which reduced the statistical problem dimension from nine to five uncertain variables,” Schaefer said. “This gave us a better representation of the uncertainty, and it also greatly reduced the amount of computation required for our analysis.”

Of the remaining five variables, three were shown to contribute more significantly to uncertainty in output quantities. Once the researchers removed the insignificant variables and applied the physics-based relations described above, they realized they had discovered a new method for characterizing the input uncertainty for the SA turbulence model. With this new method, they would need only 20 simulations in total, instead of hundreds, to complete this class of uncertainty analysis in the future on their internal Boeing systems.
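The screening step described above can be sketched as a simple one-at-a-time sensitivity ranking. This is a minimal illustration, not Boeing's actual procedure: the `response` function, parameter names, and ranges are all invented, and real variance-based methods (e.g., Sobol indices) are more sophisticated.

```python
# Toy stand-in for a CFD output such as drag (coefficients invented).
def response(p):
    return 3.0 * p["a"] + 0.5 * p["b"] + 0.01 * p["c"]

nominal = {"a": 1.0, "b": 1.0, "c": 1.0}
ranges = {"a": (0.9, 1.1), "b": (0.8, 1.2), "c": (0.5, 1.5)}

# Vary each parameter over its range while holding the others at nominal,
# and record how much the output swings.
swings = {}
for name, (lo, hi) in ranges.items():
    p_lo = dict(nominal, **{name: lo})
    p_hi = dict(nominal, **{name: hi})
    swings[name] = abs(response(p_hi) - response(p_lo))

# Rank parameters by output swing; weak contributors can be frozen at
# their nominal values, shrinking the statistical problem.
ranked = sorted(swings, key=swings.get, reverse=True)
print(ranked)  # → ['a', 'b', 'c']
```

Freezing the weak contributors is what shrinks the number of simulations needed: the sampling budget then covers only the dimensions that actually drive the output uncertainty.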

“We originally estimated that this uncertainty analysis project would have taken years to run with our in-house resources, or we would have had to scale this down to a much smaller problem,” Cary said. “This would have led to more significant questions about whether we were capturing the full 3D dynamics of the problem.”

“The new method that we’ve developed at OLCF reduces the computational requirements of our uncertainty quantification by 95.5 percent, permitting us to do future uncertainty analyses on our in-house systems as well as do larger-scale analyses in-house,” Schaefer said.
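The quoted 95.5 percent figure is consistent with the simulation counts given earlier in the article, dropping from the roughly 440-run study to a 20-run study:

```python
# Reduction in computational requirements implied by the run counts.
full_study = 440      # minimum runs estimated for the original analysis
reduced_study = 20    # runs needed with the new method
reduction = 1 - reduced_study / full_study
print(f"{reduction:.1%}")  # → 95.5%
```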

By correlating variables in their model and removing insignificant variables from the statistical problem, the researchers established a new method for performing analyses of uncertainty and sensitivity of the SA model, one that will significantly cut down computing hours and time to solution for future simulations.
