On the Importance of Abstraction Validation for Fluid Flow Applications: Case Studies Demonstrating Difficulties in Ensuring All Relevant Physics are Considered for CFD Validation

26/10/2021 17:35 conference time (CEST, Berlin)

Room: F

S. Howell, P. Middha (Abercus Limited, GBR)

The ASME V&V 10 diagram outlines a framework for the verification and validation of engineering simulations. It comprises an abstraction step to yield a conceptual model that is representative of the reality of interest, and two subsequent branches: a simulation branch and an experimental branch. Verification activities occur strictly within each branch and are required to ensure that the outcome of each branch is robust, whereas validation requires the quantitative comparison of the outcomes of the two branches. For the simulation branch, the verification activities lie entirely within the realm of mathematics, and validation by comparison with the experimental outcome is a check on the physics. This diagram is also endorsed by NAFEMS and is included in the joint NAFEMS/ASME document WT09.
The V&V 10 diagram provides a robust framework for the verification and validation of the conceptual model, from which both the simulation branch and the experimental branch stem. However, there is no guarantee that this represents robust validation of the reality of interest. The difficulty is that the abstraction process falls outside of the validation comparison between the simulation branch and the experimental branch. If the validation comparison is assumed to be applicable to the reality of interest, this implicitly assumes that the conceptual model has captured all of the relevant physics and is representative of the reality of interest, but this may not be the case.
Three CFD case studies are presented which highlight the disconnect between the reality of interest and the conceptual model. For all three case studies, it turns out that thermal radiative transfer can be important, but this may not be obvious at first and, therefore, this aspect of the physics could be missed during the abstraction process. Indeed, it is even possible that missing the physics of thermal radiative transfer is commonplace for many other CFD applications, and this may be important for low-momentum and buoyancy-driven flows.
Exploring how good a representation the conceptual model is of the reality of interest is the domain of abstraction validation. However, this is not necessarily straightforward because the abstraction process can include unknown unknowns. A useful recent addition is the Simulation Validation Methods diagram, which identifies a range of admissible validation methods under the Engineering Simulation Quality Management Standard (ESQMS), published by NAFEMS in 2020 (see Annex D to the ESQMS). This diagram shows seven admissible forms of validation for the purpose of demonstrating ISO 9001 compliance, shown (from left to right) in order of decreasing strength of validation. Simply by including seven forms of validation, the diagram acknowledges that some forms of validation are better than others, which is in itself useful. Only for the strongest form of validation (1.1, on the left of the diagram) can it be assumed that the conceptual model is actually representative of the reality of interest. For the remaining six forms of validation, there is an implicit acknowledgement that the conceptual model may represent a departure from the reality of interest.
Whilst this may not provide abstraction validation in a quantitative sense, the ESQMS validation diagram provides a standard mechanism for clearly identifying shortcomings associated with abstraction validation. It is noted from the case studies presented that the necessary level of validation can depend upon the level of risk associated with the flow application and the level of dependence upon the simulation predictions. For low-risk applications, one of the weaker forms of validation is perhaps appropriate. For high-risk applications that are entirely reliant upon the simulation predictions, a stronger form of validation will generally be appropriate.
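As a toy illustration of the kind of quantitative validation comparison the framework calls for, and of how missing physics in the conceptual model only surfaces at that comparison, the following Python sketch compares simulation and experimental outcomes with a simple relative-error metric. The temperature values, the radiation on/off scenario, and the metric itself are invented for illustration; they are not taken from the case studies or from V&V 10.

```python
# Minimal sketch of a quantitative validation comparison between the
# simulation branch and the experimental branch. All numbers below are
# invented; the simple relative-error metric is an assumption, not a
# metric prescribed by ASME V&V 10.

def relative_error(simulation, experiment):
    """Relative comparison error |S - D| / |D| at each validation point."""
    return [abs(s - d) / abs(d) for s, d in zip(simulation, experiment)]

# Hypothetical peak-temperature predictions (K) with and without a thermal
# radiation model, against measured values. If the abstraction omits
# relevant physics (here, radiative transfer), the validation comparison
# degrades even though both branches are individually verified.
measured       = [412.0, 455.0, 498.0]
with_radiation = [408.0, 451.0, 502.0]
no_radiation   = [455.0, 512.0, 570.0]

print(max(relative_error(with_radiation, measured)))  # small (< 2%)
print(max(relative_error(no_radiation, measured)))    # large (> 10%)
```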

Abstraction validation, ESQMS, ISO 9001

How FEA and Structural Verification According to Standards Helps EMO Terminal to Assure Safe Operations and Extend the Lifetime of the Lifting Equipment

26/10/2021 17:55 conference time (CEST, Berlin)

Room: F

O. Ishchuk (SDC Verifier, NLD)

Finite element analysis and structural verification of lifting equipment at the EMO Terminal according to Eurocode 3, EN 13001, FEM 1.001, and other industry standards: a case study of assuring continuous, safe, and effective operations with the help of FEA simulations and checks according to the standards.
Steel structures such as cranes, heavy machinery, and other equipment subjected to repetitive loading are highly likely to develop cracks or failures because of fatigue damage. Every hour of equipment downtime is usually very expensive, making it mandatory to perform residual-life calculations, fatigue checks, buckling and stability verification, and analysis of bolts, welds, and other connections.
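As a hedged illustration of the residual-life and fatigue checks mentioned above, the following Python sketch applies the Palmgren-Miner linear damage rule to an invented load spectrum. The S-N curve constants and cycle counts are placeholders, not values from EN 13001, FEM 1.001, or the EMO analyses.

```python
# Illustrative fatigue residual-life estimate using the Palmgren-Miner
# linear damage rule with an S-N curve of the form N = C / S^m.
# The constants and the load spectrum are invented for illustration.

def cycles_to_failure(stress_range, C=2.0e12, m=3.0):
    """Allowable cycles at a given stress range from the S-N curve N = C / S^m."""
    return C / stress_range**m

def miner_damage(spectrum):
    """Accumulated damage D = sum(n_i / N_i); failure is predicted at D >= 1."""
    return sum(n / cycles_to_failure(s) for s, n in spectrum)

# Hypothetical annual load spectrum: (stress range in MPa, cycles per year)
spectrum = [(80.0, 2.0e5), (120.0, 5.0e4), (160.0, 1.0e4)]

damage_per_year = miner_damage(spectrum)
residual_life_years = 1.0 / damage_per_year
print(damage_per_year, residual_life_years)
```

In practice a code-based check would use the standard's detail categories and partial safety factors rather than a bare S-N curve; the sketch only shows the damage-accumulation arithmetic.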
This case study describes how complete structural verification according to multiple industry standards was performed for EMO with the help of the finite element analysis method. EMO, a European dry bulk commodities and transshipment company, operates one of the largest dry bulk terminals in Europe.
General-purpose FEA programs such as Ansys and Femap were used to build finite element models of the equipment and to simulate their behavior under different operational and exceptional conditions. These programs give engineers outstanding simulation capabilities, but to verify and certify the calculation results, and to check multiple failure modes rather than stresses and displacements alone, it was necessary to follow the design rules and industry standards that already encode best practice and recommended procedures. Static stress checks, fatigue, beam member checks, plate buckling, weld strength, bolted and riveted connection strength, joint verification, stability analysis, and other checks were performed for EMO Terminal unloaders, grabbers, conveyors, and other equipment. Structural verification of the Femap and Ansys FEA models was performed with the help of SDC Verifier. The results of this analysis, the issues and typical problems found, and the checking procedure will be shown in this presentation.
With the method presented in this use case, it is possible to perform code-checking analysis for multiple loading conditions and multiple structural members according to the Eurocodes, AISC, ABS, DNV, Norsok, API, DIN, FEM, DVS, FKM, ISO, ASME, and other codes directly in a general FEA program.
A detailed workflow for FEA and code checking was developed during the multiple engineering analyses performed for EMO Terminal equipment. The presented use case demonstrates how to save up to 40% of the time spent on repetitive engineering routines, simplify the design process, and reduce deadline pressure. This presentation shows how to make computer-aided engineering fast, accurate, and effective.

FEA, Civil, Offshore, Marine, Machinery, Heavy Lifting, Aerospace, Oil & Gas, Equipment, structural verification, Static stress check, fatigue, beam buckling, plate buckling, weld strength, bolted and riveted connection strength, joints verification, calculation, Eurocodes, AISC, ABS, DNV, Norsok, API, DIN, FEM, DVS, FKM, ISO, ASME

Accuracy of Predicting Stress Risers for Several Ansys Element Types

26/10/2021 18:15 conference time (CEST, Berlin)

Room: F

C. Roche, R. Teja Dhanalakota (Western New England University, USA)

Master's degree students in civil and mechanical engineering were given specific geometries to model in a finite element analysis course to validate the theoretical stress concentration factor for a common stress riser. Results that met all proper analysis guidelines were considered typical of industrial applications, even though each student obtained a different K_t.
The authors take the uniaxial, isothermal results and expand the study to several element types in ANSYS. For more challenging finite element analyses, one must consider composite material definitions, biaxiality of stress, thermal gradients, pre-stress modeling, nonlinear property modeling, etc. The effects of these factors on prediction add stochastically to variation due to real-world loads, wall-thickness variation, actual thermal values, real-world stress-strain relationships, etc. One can postulate that the level of difficulty of a finite element analysis increases with the level of complexity and variation in the system being modeled.
The analyses detailed include the K_t of a hole in a finite-width plate, for which the theoretical K_t approaches 2.5. This represents the simplest of analyses in that it is effectively two-dimensional, isothermal, uniaxial, and isotropic. The authors present results for gradually improved mesh densities.
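For reference, the theoretical K_t for a circular hole in a finite-width plate under tension is commonly approximated by a polynomial curve fit of the Heywood/Peterson form. The short sketch below evaluates such a fit, with the d/W ratio chosen as an assumption (not taken from the paper) so that K_t comes out near 2.5:

```python
# Net-section stress concentration factor for a circular hole of diameter d
# in a plate of width W under uniaxial tension, using a common polynomial
# curve fit (Heywood/Peterson form). The d/W value below is illustrative.

def kt_hole_finite_plate(d_over_W):
    r = d_over_W
    return 3.0 - 3.14 * r + 3.667 * r**2 - 1.527 * r**3

print(kt_hole_finite_plate(0.0))   # infinite-plate limit, Kt = 3
print(kt_hole_finite_plate(0.2))   # roughly 2.5, as in the case study
```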
The accuracy of the results would be suspect in most applications, and so the authors attempt a patch-test study.
Verification and validation have been critical to the aerospace industry. The authors have also noted variation in the simplest of analyses. This computational variation can be added to physical variation in the same set of analyses and may be inherently present. The authors considered the following computational influences:
1) Linear isotropic material properties
2) Element shape biasing
3) Element skewness
4) Element aspect ratio
5) Convergence criteria for load and displacement
6) Solver chosen
7) Thick-walled vs. thin-walled shell theory
Finite element analysis results can vary for the same problem due to changes in the finite element code, the operating system, the user, the mesh bias, the element skewness, the mesh density, the exactness of the boundary conditions, and unknown sources. We postulate that the more variables there are in an analysis, the greater the deviation from the exact solution can be, and that a design system has to consider these factors. We further postulate that a critical review of certain elements may be necessary.
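One standard way to quantify the mesh-density effect noted above is Richardson extrapolation across three systematically refined meshes. A minimal sketch with invented K_t values (not the authors' results):

```python
# Richardson extrapolation for a mesh-refinement study: given results on
# three meshes with a constant refinement ratio r, estimate the observed
# order of convergence p and the extrapolated (mesh-independent) value.
# The sample Kt values below are invented for illustration.

import math

def richardson(f_coarse, f_medium, f_fine, r=2.0):
    p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    return p, f_exact

# Hypothetical Kt from coarse, medium, and fine meshes (ratio r = 2)
p, kt_extrapolated = richardson(2.380, 2.470, 2.4925)
print(p, kt_extrapolated)
```

The observed order p also serves as a sanity check: a value far from the element's theoretical convergence rate is itself a warning sign of the kinds of computational variation listed above.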


Applying Machine Learning to Detect Errant Behavior in Multiscale Physics-Based Models

26/10/2021 18:35 conference time (CEST, Berlin)

Room: F

A. Cox (Aerospace Systems Design Laboratory, USA); H. Johnston, D. Mavris (Georgia Institute of Technology, USA)

Developing physics-based models requires a detailed understanding of the important aspects of a system as well as many modeling-related factors. The trade between fidelity and efficiency requires abstractions of physical behavior that impose limits on the predictive ability of a model. The computational intensity of these models often requires a multifidelity approach, increasing the desire to understand the risks associated with each model. While traditional statistical techniques can identify irregularities in some model responses, determining the valid operating space for a non-linear, time-dependent, or stochastic simulation can be difficult due to both complexity and the amount of generated data.
Identifying when a model fails to provide adequate response prediction is necessary for troubleshooting, verification, and analysis. Computational models often have difficulty reaching a stable solution, and even when a model runs to completion, the behavior can still be erroneous. Additionally, the region of applicability might typically be presumed to be a simple bounded area around the developed baseline, but some models have non-intuitive failures. It is easy for a human to visually verify the validity of a non-linear simulation response such as a stress-strain curve, but it is difficult for a computer program to do the same. Furthermore, when deviating from a baseline to perform uncertainty analysis or design space exploration, manual inspection quickly becomes impractical as the amount of generated data increases.
This work seeks to develop a method using machine learning techniques of classification and clustering to provide automated detection of regions where predictive capability is lost. This is done by allowing the algorithm to compare trendlines and return recommendations for filtering prior to the application of traditional statistical or uncertainty quantification techniques. Automating this process allows for model developers and users to avoid tedious data processing steps and more effectively understand the limitations of a given model.
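As a minimal sketch of the idea, and emphatically not the authors' implementation, the following Python uses a simple robust-distance test on per-curve features as a stand-in for the classification/clustering step; all curve data, feature choices, and thresholds are invented:

```python
# Flag simulation response curves whose shape departs from the bulk of
# the ensemble. Each curve is reduced to simple features and compared to
# the ensemble median; curves beyond a distance threshold are flagged for
# filtering before statistical post-processing. All data are invented.

import numpy as np

def flag_errant_curves(curves, threshold=3.0):
    """Return indices of curves whose feature distance from the ensemble
    median exceeds `threshold` (in robust MAD units)."""
    # Features per curve: mean level, overall slope, total variation.
    feats = np.array([
        [c.mean(), c[-1] - c[0], np.abs(np.diff(c)).sum()] for c in curves
    ])
    med = np.median(feats, axis=0)
    mad = np.median(np.abs(feats - med), axis=0) + 1e-12
    dist = np.abs(feats - med) / mad          # robust per-feature score
    return [i for i, d in enumerate(dist) if d.max() > threshold]

# Ensemble of smooth monotone responses plus one non-physical,
# oscillating curve (e.g., a solver that failed to converge).
x = np.linspace(0.0, 1.0, 50)
curves = [x * s for s in (0.9, 1.0, 1.1, 1.05, 0.95)]
curves.append(np.sin(40 * x))                 # errant response

print(flag_errant_curves(curves))             # -> [5]
```

A trained classifier or clustering algorithm, as described in the abstract, would replace the hand-picked features and fixed threshold, but the filtering step it feeds is the same.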

machine learning, multiscale modeling, verification and validation, simulation