B11
Integration of Analysis & Test 1


Statistical Simulation: Integrating Analysis, Test, and Reality Through Bayesian Inference
27/10/2021 10:40 conference time (CEST, Berlin)
Room: B
F. Günther (Knorr-Bremse SfS GmbH, DEU)
A digital twin based on rigorous statistical principles is motivated for simulation-based product verification and validation as well as condition-based maintenance. Two examples from the railway industry illustrate the application of the method:
• a digital twin of the longitudinal kinematics of a train, combined with incomplete, noisy field data,
• the integration of test and analysis for fatigue validation of mechanical components, including multilevel modeling.
Modeling and numerical solution of the examples are accomplished by Bayesian Inference using the statistical modeling language and open-source software Stan. Bayesian Inference has become popular and feasible only over the last couple of decades, with the development of ever more powerful simulation hardware. Our simulations were performed partly on standard workstations and partly on a numerical simulation cluster. Bayesian Inference has several attractive properties:
• In effect, we can fuse data and information from many different sources – simulation, test, field data, engineering knowledge, prior experience – in a statistically rigorous way.
• We can flexibly create arbitrary, physically motivated statistical models.
• Modeling assumptions are clearly stated, not implicit as with many traditional statistical techniques. They can therefore be discussed and improved, and they contribute to our engineering understanding of the results.
• Uncertainty and predictive accuracy can be quantified and analyzed.
• The amount of data required is much smaller than for physics-agnostic, data-centric methods such as neural networks.
• We can easily deal with partially missing data.
Some challenges must also be mentioned: statistical modeling and simulation require combined in-depth expertise in engineering, statistics, and computational science, as well as sufficient computational resources.
As with any other simulation method, convergence and correct results cannot be assured unless you are working with a well-posed model and know what you are doing. Overall, we highly recommend Bayesian Inference and Statistical Simulation for Uncertainty Quantification.
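The data-fusion property described above can be sketched, for a single scalar quantity, as a conjugate Bayesian update: a prior from earlier engineering experience is combined with new noisy measurements. All numbers below are hypothetical illustrations; a real application of the abstract's method would use a full Stan model rather than this closed-form shortcut.

```python
import numpy as np

# Hypothetical numbers for illustration only (not from the paper):
# prior from earlier test campaigns, likelihood from new noisy field data.
prior_mean, prior_sd = 200.0, 20.0      # e.g. a fatigue strength in MPa
meas = np.array([185.0, 210.0, 195.0])  # new measurements
meas_sd = 15.0                          # known measurement noise (std. dev.)

# Conjugate normal-normal update: precisions add,
# and the posterior mean is the precision-weighted average.
prior_prec = 1.0 / prior_sd**2
like_prec = len(meas) / meas_sd**2
post_prec = prior_prec + like_prec
post_mean = (prior_prec * prior_mean + like_prec * meas.mean()) / post_prec
post_sd = post_prec**-0.5
```

The posterior is both shifted toward the data and narrower than either source alone, which is the "fusing information in a statistically rigorous way" the abstract refers to.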
Towards Matching Simulation and Experiment
27/10/2021 11:00 conference time (CEST, Berlin)
Room: B
R. Helfrich (INTES GmbH, DEU)
Ever since Finite Element Analysis (FEA) emerged, the comparison of simulation results with experimental results has been a central issue. Which side is to be considered the ‘Master’, and which has to follow? Experiments, in use far longer than FEA, were often seen as the ‘Master’, and the question was how to match the simulation results to the experimental results. It was clear from the very beginning that this challenge can only be solved with suitable optimization methods, where the parametrization is done on the simulation side. Such optimization methods are now available in simulation. One therefore has to consider the type of experiment and the type of test specimen in order to see which analysis and optimization methods are required to start a matching process. One frequently used experiment is the ‘Hammering Test’, in which acceleration transfer functions are measured over a certain frequency range. This allows direct comparison with a frequency response analysis in simulation. The situation becomes more complex, however, if we consider not only single parts but assembled parts in a prestressed state. The simulation then has to take static loading into account, and contact becomes a crucial point. The main problem is that the experiment usually does not provide the contact pressure distribution of the assembly, but the simulation has to take this effect into account. This is important because a low contact pressure allows more vibration of two coupled parts than a high contact pressure. And what is a low contact pressure, and what is a high one? As an industrial example, we consider a disc brake, where we use the hammering test results not only for the single parts but also for the assembly at different brake pressures. We can therefore try to match the single parts as well as the assembly. The remaining question is which parameters to use for the matching process.
Under the assumption that the geometry of the parts is sufficiently exact in both experiment and simulation (their congruency can be measured), the remaining parameters are the material properties. In fact, a non-homogeneous distribution of material properties is to be expected for cast or forged parts. The paper describes the process of modeling, analysis, and optimization used to match simulation results to experimental results. For the comparison of the two, the concept of FRAC (Frequency Response Assurance Criterion) is used, which correlates the frequency response curves of test and simulation.
Contact analysis, frequency response analysis, experimental results, optimization
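The FRAC correlation mentioned in the abstract can be computed directly from two frequency response vectors. A minimal sketch of the standard definition (analogous to the MAC, but applied over frequency lines; the paper's exact implementation may differ):

```python
import numpy as np

def frac(h_test, h_sim):
    """Frequency Response Assurance Criterion between two FRF vectors.

    FRAC = |H_test^H H_sim|^2 / ((H_test^H H_test) (H_sim^H H_sim))

    Returns 1.0 for identical curves (up to a complex scale factor)
    and approaches 0.0 for uncorrelated ones.
    """
    h_test = np.asarray(h_test, dtype=complex)
    h_sim = np.asarray(h_sim, dtype=complex)
    # np.vdot conjugates its first argument, giving the Hermitian product.
    num = np.abs(np.vdot(h_test, h_sim)) ** 2
    den = np.vdot(h_test, h_test).real * np.vdot(h_sim, h_sim).real
    return float(num / den)
```

Because the criterion is scale-invariant, it rewards matching the shape of the frequency response curve, which is what the optimization loop described in the abstract drives toward 1.0.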
A Novel Dynamic Material Characterisation Method
27/10/2021 11:20 conference time (CEST, Berlin)
Room: B
S. Sriraman, H. Goyder (Cranfield University, GBR); D. Brown, P. Ind (AWE, GBR)
Non-metallic materials have frequency-dependent dynamic properties which must be characterised for use in computer simulations. The characterisation of these properties is important, as most modern structures utilise such materials. Hence, a novel test method has been developed which combines vibration testing with finite element analysis to yield the dynamic modulus of elasticity and damping. Material properties can be measured in the frequency range 2 Hz to 2000 Hz. The test method involves a cantilever beam. Two samples of the test material sandwich the root of the beam and are held in place between inertial masses. Experimental modal analysis techniques, in which an instrumented hammer excites the beam, are used to exercise the material. Acceleration and force time histories are measured and processed to obtain a spectrum, from which the natural frequencies of the test setup are derived. The modulus of elasticity of the material is found by constructing a finite element model of the test setup and tuning the simulated response to that of the experiment. This is done for each of the bending modes, resulting in a measurement of elastic modulus as a function of frequency. Damping properties are extracted by applying data-fitting techniques to the time histories and spectrum. These are converted into material properties by simulating and accounting for the energy balance between the samples and the test setup. Doing so also eliminates the need to estimate the damping ratio at each frequency, as it can be calculated using the strain energies of the undamped modes. It is important that the sample is gripped in a manner that exercises it effectively and is simple to simulate. Eliminating slipping, and thus difficult-to-model friction, is a key concern and has been investigated in depth. The best solution found is an axisymmetric bolting arrangement which holds the samples in place.
Additionally, the test setup utilises suspension, together with inertial masses and an orthogonal layout, to isolate against external vibrations. Polymers and rubbers, which exhibit complex frequency-dependent behaviour, have been successfully characterised using this method. The damping material Sorbothane has also been characterised, producing results within the manufacturer's specifications. The method thus proves to be a reliable procedure for dynamic material property testing.
Non-Metals, Finite Element Analysis, Simulation, Dynamic Material Properties, Vibration, Damping, Modulus of Elasticity, Sample Gripping
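The modulus-extraction step can be illustrated with a simplified analytical surrogate: for a plain rectangular cantilever (not the sandwiched test setup of the abstract, which requires a full finite element model), the Euler-Bernoulli first-mode frequency formula can be inverted for E. A sketch under that simplifying assumption:

```python
import math

LAMBDA1 = 1.8751  # first-mode eigenvalue of a clamped-free (cantilever) beam

def modulus_from_frequency(f1, rho, length, width, thickness):
    """Invert the Euler-Bernoulli first-mode formula
        f1 = (lambda1^2 / (2*pi)) * sqrt(E*I / (rho*A*L^4))
    for the elastic modulus E of a rectangular-section cantilever.
    f1 in Hz, rho in kg/m^3, dimensions in m; returns E in Pa."""
    area = width * thickness                 # cross-section area A
    inertia = width * thickness**3 / 12.0    # second moment of area I
    return (2.0 * math.pi * f1 / LAMBDA1**2) ** 2 * rho * area * length**4 / inertia
```

Repeating such an inversion mode by mode, with the FE model in place of the closed-form expression, yields the elastic modulus as a function of frequency, as described in the abstract.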
A Digital Twin for Lightweight Thermoplastic Composite Part Production
27/10/2021 11:40 conference time (CEST, Berlin)
Room: B
M. Meyer, A. Delforouzi (Fraunhofer SCAI, DEU); R. Schlimper, M. John (Fraunhofer IMWS, DEU); T. Link (Fraunhofer ICT, DEU); D. Koster, U. Rabe (Fraunhofer IZFP, DEU); C. Krauß (KIT FAST, DEU)
Simulation- and data-driven digital twins of individual machines and systems already lead to significant efficiency gains through improved production control and machine maintenance, but their promised potential across the stages of value chains is still largely untapped. This work proposes a digital twin system for the series production of thermoplastic composite (TPC) structures, which requires hierarchically organised manufacturing steps across diverse production systems and facilities. The need to integrate non-destructive measurements and simulation-based analyses entails flows of materials, data, and goods among several institutions. This work reports on the project MAVO digitalTPC, in which the Fraunhofer Institutes IMWS, ICT, IZFP and SCAI jointly develop a distributed digital twin for thermoplastic composite production in an industry-representative, multi-institutional setting. In this digital twin use case, the heterogeneous microstructure of the composite as well as the influence of the manufacturing parameters place enormous demands on process control and quality assurance. On the production side, the focus is on the manufacturing of unidirectional tapes from raw materials and on large-scale hybrid injection moulding processes. Cognitive sensor technology is integrated to characterize components and detect defects at multiple points along the process chain, while machine data is fully logged. The proposed AI tools add high-level knowledge to the raw data of the tape production, and a chain of inter-mapped multi-physics simulations assesses physical stresses, fibre orientations, and further quantities in 3D. These heterogeneous data sources are integrated into a first data space demonstrator, in which the resolved raw data remains distributed, while metadata is structured according to an ontology that semantically defines the unique correspondences of individual parts, simulations, and measurements in the overall system.
Based on this demonstrator, the final system will enable engineers to identify and individualize resources along the overarching real and virtual process chain and to simulate production under the real material and process conditions.
distributed digital twin, thermoplastic composite, process optimization
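The semantic correspondence of parts, simulations, and measurements described above can be sketched as minimal metadata records, where only references (URIs) are stored centrally and the raw data stays with the owning institute. All class names, field names, and URIs below are hypothetical illustrations, not the project's actual ontology:

```python
from dataclasses import dataclass, field

# Illustrative metadata records only; the digitalTPC ontology itself is
# not specified in the abstract.
@dataclass
class Measurement:
    sensor: str
    data_uri: str        # raw data remains distributed at the owning institute

@dataclass
class SimulationRun:
    solver: str
    data_uri: str

@dataclass
class PartRecord:
    part_id: str
    process_step: str    # e.g. "tape production", "injection moulding"
    measurements: list = field(default_factory=list)
    simulations: list = field(default_factory=list)

# Link one hypothetical part to a measurement and a simulation run.
part = PartRecord("TPC-0001", "injection moulding")
part.measurements.append(Measurement("ultrasound", "https://izfp.example/raw/0001"))
part.simulations.append(SimulationRun("moulding-sim", "https://scai.example/runs/0001"))
```

The point of such a structure is that every measurement and simulation resolves uniquely to a physical part, which is the correspondence the abstract's ontology is said to define.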