F19
Simulation Management


Measuring the True Value of Your IT Assets
28/10/2021 16:05 conference time (CEST, Berlin)
Room: F
K. Trappe, S.M. Stenseth (Open iT Norge AS, NOR)
Does your IT spending align with your corporate goals? As worldwide IT investment is projected to keep growing, IT often ranks among the highest expenditures in an enterprise. Much of this is due to the complexity of managing IT assets, especially with the multitude of software vendors and the increasing shift to cloud services. To maximize your IT budget, track all IT systems to measure how often each is used, detect duplication, and assess whether any upgrades are needed. Software is generally regarded as an intangible asset, and its business value is hard for IT managers to quantify. A deeper understanding of an asset's worth compared with its cost requires the right system of measurement applied to a large, unfiltered collection of data. Collecting and analyzing data about asset and user utilization is not only about cost but about maximizing the value of assets to your business. Metering and analyzing IT usage is the key to effective cost-cutting measures. Without the analytics to accurately gauge active usage of their software investments, organizations frequently resort to renewing their previous licensing contracts, resulting in a mismatch between purchases and needs, lost productivity, and unmet business objectives. By using usage-data reports, IT leaders can plan their budgets with reliable forecasting of future costs. This provides a more structured approach to cost management, while also gaining more control and visibility in optimizing IT assets to serve the organization. In this presentation, we will share best practices for preventive and strategic methods of implementing a software optimization program, and we will demonstrate its results with case-study examples. Join us to learn how focusing on the right metrics can guide you to an understanding of the true value of your IT assets, leading to successful alignment with corporate goals.
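The metering idea above can be made concrete with a small sketch. This is a hypothetical illustration, not Open iT's actual product or schema: it takes a log of (product, user) usage events and a count of purchased seats per product, and reports how many distinct users actually touch each product, exposing under-utilized licenses before a renewal.

```python
# Hypothetical sketch of license-utilization reporting from a usage log.
# Field names, event format and products are illustrative assumptions.
from collections import defaultdict

def utilization_report(usage_events, purchased_seats):
    """usage_events: iterable of (product, user) tuples from a metering log.
    purchased_seats: dict mapping product -> number of licenses owned."""
    active_users = defaultdict(set)
    for product, user in usage_events:
        active_users[product].add(user)   # count distinct users, not events
    report = {}
    for product, seats in purchased_seats.items():
        used = len(active_users[product])
        report[product] = {
            "seats": seats,
            "active_users": used,
            "utilization": used / seats if seats else 0.0,
        }
    return report

events = [("CAD", "alice"), ("CAD", "bob"), ("CAD", "alice"), ("FEA", "carol")]
owned = {"CAD": 10, "FEA": 5}
print(utilization_report(events, owned))
```

A utilization well below 1.0 at renewal time is the "mismatch of purchases vs. needs" the abstract describes; real metering tools would additionally weight by session duration and time window.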
analytics licensing, software usage, license utilization, engineering software management, IT asset optimization
Best & Worst Practices To Exploit Simulation Tools and Capability Across Enterprises: Lessons Learned
28/10/2021 16:25 conference time (CEST, Berlin)
Room: F
S. Demattè, N. Saccenti, F. Franchini, V. Primavera (EnginSoft SpA, ITA)
The specific design knowledge and the build-up of economic factors draw on the skills and know-how of multiple teams and experts. Today many organizations, even those that already apply simulation and design optimization, still approach product-development tasks as a handful of disconnected iterations that are not aligned with the broader business strategy. Among the ways to boost performance, reduce time-to-market and cut overall product-development costs is a holistic approach that includes collaboration, data sharing and the pervasive use of multidisciplinary frameworks, advanced simulation technologies, parametric analysis, artificial intelligence and automatic multidisciplinary design optimization techniques. We have learned that supplying the right software tools and the related technical training is only a small part of the journey of meshing state-of-the-art practices into existing engineering processes, data platforms, and tool-chains. Indeed, organizational and cultural aspects are fundamental to maximizing the outcomes of an IT investment, because you essentially need to bring together people and technology. Structured procedures are mandatory to exploit simulation capabilities and design-process integration, and to ensure that data is safeguarded and that security levels and intellectual property are respected. This implies that the peculiarities of each industrial sector and each individual organization have to be identified and understood. If teams are not ready, processes and culture must be the first focus of intervention. Lessons learned in the field suggest that the first steps should be to identify and then ensure the correct flow of data across different departments and with external stakeholders. In doing so, our aim is also to provide a clear and transparent view of the assumptions, capabilities and limitations of the various models.
Our experience with companies from multiple industries and countries, of various sizes and with heterogeneous readiness levels will be presented, focusing on dos and don’ts that we have learned along our way.
knowledge, culture, costs, multidisciplinary product-development processes, collaboration, data-sharing
A Capability Scale for Engineering Simulation
28/10/2021 16:45 conference time (CEST, Berlin)
Room: F
K. Meintjes (NAFEMS Technical Fellow, USA); B. Webster (Fidelis FEA, USA)
A Capability Scale for Engineering Simulation. Keith Meintjes, CIMdata (k.meintjes@cimdata.com); Bill Webster, Fidelis FEA (Bill.Webster@fidelisfea.com). NAFEMS World Congress, 2021. Session: Simulation Governance and Management.

Overview: We present a capability scale for engineering (digital, physics-based) simulation. This scale is used as a planning tool to subjectively assess simulation applied in product design and development, particularly in relation to physical testing.

Introduction: Following Moore's Law, the capability for technical computing has doubled every eighteen months for nearly sixty years. This has completely changed how products are designed and developed. It means the product engineering process must be continually reassessed and changed to leverage increasing simulation capability.

A Scale for Simulation Capability: A scale for simulation capability has been used for over two decades to provide a somewhat objective planning framework. This scale is in the spirit of the CMMI and other progressions proposed with various goals in mind. We note: (a) the assessment is subjective and represents an organization's self-assessment; (b) the assessment is primarily of technical ability; (c) it does not so much measure maturity or process capability; (d) it seeks to balance simulation and physical test, not to make a binary choice between them. For every product system and subsystem requirement, we ask: can this be addressed by simulation? If yes, we rank the simulation capability on the following scale:
1. Simulation is not useful; it cannot address this requirement.
2. Simulation can be used to rank design alternatives; A is better than B.
3. Simulation can assess whether a design meets the requirement, but testing is required to calibrate the simulation model.
4. Simulation can assess whether a design meets the requirement, and no development testing is required; final product validation testing is required.
5. Simulation can assess whether a design meets the requirement, and no testing is required.

Comments and Examples: If a requirement is not ranked on this scale, it has not been evaluated.
Level 1: Many requirements are assessed by expert opinion: neither test nor simulation is needed. Other requirements may have no basis in physics, and simulation simply cannot be applied.
Level 2: At a first pass, most simulation procedures will fall in this category. But this capability is particularly important: it can support early decision-making by ranking alternatives, and it can guide experimental evaluations.
Level 3: As an organization develops its simulation capability, most simulations will be ranked at level 3. Engineers will use fractional rankings that asymptote to level 4: giving up that confirmation test is a difficult leap to make.
Level 4: Imagine: products are engineered without development testing and, by implication, are expected to pass validation with no major issues.
Level 5: Feasible in many cases of simple physics, for example, linear strength and deflection.

Discussion: This is immediately seen to be a very fine-grained assessment, at the level of each individual load case. For a complex product like an automobile, there are thousands of items to consider. This leads to a process of reaching consensus on how the organization does its work; we ask for agreement between all stakeholders.

Test vs. Simulation: We comment on the issue of test and simulation competing for resources. "Simulation is Better Than Test": here we discuss applications (stochastics, robust design, etc.) that are infeasible to accomplish by physical test alone.

Closure: This process has been applied very successfully over two decades to help create an organizational culture that excels, and a culture that is quick to change and adapt.
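The per-requirement bookkeeping this scale implies can be sketched in a few lines. This is an illustrative data structure of our own, not the authors' tool: each requirement gets a level from 1 to 5 (or None if not yet evaluated), and the summary shows where an organization still depends on physical testing.

```python
# Illustrative sketch (not the authors' actual tool): tallying
# per-requirement capability levels on the 1-5 scale from the abstract.
SCALE = {
    1: "simulation not useful",
    2: "can rank design alternatives",
    3: "assesses requirement; test calibrates the model",
    4: "no development testing; validation test only",
    5: "no testing required",
}

def summarize(assessments):
    """assessments: dict mapping requirement name -> level (1-5) or None.
    Returns (count per level, number of unassessed requirements)."""
    counts = {level: 0 for level in SCALE}
    unassessed = 0
    for requirement, level in assessments.items():
        if level is None:
            unassessed += 1          # not ranked means not evaluated
        else:
            counts[level] += 1
    return counts, unassessed

counts, missing = summarize({
    "door slam durability": 3,       # hypothetical automotive load cases
    "interior acoustics": 2,
    "linear roof strength": 5,
    "paint appearance": None,        # no physics basis captured yet
})
```

For an automobile-scale product this dictionary would hold thousands of load cases, which is exactly why the authors describe the assessment as a consensus-building exercise across stakeholders.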
Governance, Management, Capability, CMMI, Maturity, Product Development, Engineering Process
Functional Decomposition (FuDe) Method as a Simulation Planning Tool
28/10/2021 17:05 conference time (CEST, Berlin)
Room: F
Z. Turhan (ASML Netherlands B.V., NLD)
Functional Decomposition is an engineering method in which any technical artefact's functions are divided into manageable lower-level functions and the inter-connecting architectural relationships are identified. When a function definition is clear, we know what that artefact is intended to do. Putting this scheme at the center, we started using FuDe at ASML as one of the primary methods for handling complex simulation planning in a structured way. It not only helps the analyst clarify and communicate the analysis plan, but also helps the design team with robust product development and project management with accurate capacity planning. Thanks to the modularity of FuDe models, engineering teams can also benefit from this method for their large system/multiphysics models, i.e. digital prototypes and twins. The method uses the well-known "p-diagrams" as the baseline, where functional inputs and outputs are classified as intended or unintended (non-controllable outside influences, or noises) variable sets. This scheme was first explained by J.M. Juran back in 1993 and was later introduced into the FMEA process by the Automotive Industry Action Group in 2008. "Breakdown of Functions" by Pahl & Beitz (1996) explains in more depth that every input or output of a function can be classified into only three types of physical quantity: energy, material or signal. These ultimately form our simulation loads (heat, force, acceleration etc.), material inputs (cooling water, mass) and signals (control parameters/loops). We use all of this content, along with a process definition, to build robust simulation plans.
The FuDe process is explained in the following steps:
Step 0: System structuring / hierarchy layers
Step 1: Identifying and naming the functions
Step 2: Identifying the inputs & outputs of each function
Step 3: Identifying the limits of the functions, and visualization
Step 4: Creating the input/output matrices
Step 5: Creating the simulation cases table
Once the FuDe is complete, the team can then choose the problems/cases that are most critical and urgent for the success of the product. It is much easier to reach agreement on details once they are laid out with clarity.
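The steps above can be sketched as a minimal data structure. This is a hypothetical illustration, not ASML's internal tooling: flows carry the Pahl & Beitz energy/material/signal classification and the p-diagram intended/unintended flag (Steps 2-3), and the last function derives a simple simulation-cases table (Step 5) by pairing each intended output with the energy inputs that load it.

```python
# Hypothetical FuDe sketch; class and field names are illustrative,
# not ASML's actual implementation.
from dataclasses import dataclass, field

# Pahl & Beitz: every functional input/output is energy, material or signal.
TYPES = ("Energy", "Material", "Signal")

@dataclass
class Flow:
    name: str
    kind: str               # one of TYPES
    intended: bool = True   # False = non-controllable noise (p-diagram)

@dataclass
class Function:
    name: str
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

def simulation_cases(functions):
    """Step 5: one candidate case per (function, intended output) pair,
    with the energy-type inputs collected as simulation loads."""
    cases = []
    for f in functions:
        for out in f.outputs:
            if out.intended:
                loads = [i.name for i in f.inputs if i.kind == "Energy"]
                cases.append({"function": f.name,
                              "response": out.name,
                              "loads": loads})
    return cases

# Invented example function in the spirit of the abstract:
lens = Function(
    "position lens",
    inputs=[Flow("actuator force", "Energy"),
            Flow("cooling water", "Material"),
            Flow("floor vibration", "Energy", intended=False)],
    outputs=[Flow("lens displacement", "Signal")],
)
print(simulation_cases([lens]))
```

Note that the unintended "floor vibration" input still appears among the loads: in a p-diagram, noises are exactly the influences a robust simulation plan must carry through to the response.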
Functional Decomposition, Simulation Planning, Inputs, Outputs, Analysis Plan