EnginSoft Newsletter - Simulation Based Engineering & Sciences
Year 12 n°1 - Spring 2015
Custom Prosthesis Optimization with Mechanical Simulation

The role of innovation in the railway industry: a testimonial from Trenitalia

Vibration analysis and chassis damping design on a packaging machine

Multi-objective optimization of the timing system on a small 2-wheeler engine

ESAComp meets challenging marine applications

Evaluation of the air flow within a climate control unit

1D simulation applied in an Oil&Gas Company


Special Issue on the Oil&Gas Industry

EnginSoft Newsletter - Simulation Based Engineering & Sciences in the Oil&Gas Industry

CFD analysis of a lube oil tank: air ingestion investigation

The SuperCups Trays – Saipem’s further innovation in Urea Technology

Second Hydro Power Plant in Turkey is taking off

CO2 Stripping Column Effluent System Application employing Saipem's innovative mechanical technology

System Control logic enhancements through Fluid-Mechanical valve dynamic transfer functions

Evaluation of Surge Pressure Dynamics in a Closed Fluid Circuit

The Oil&Gas industry encompasses a wide range of diverse and highly demanding environments, which pose a continuous and varied stream of issues and problems to solve. These must be addressed both for safe working practices and for retaining competitiveness in the marketplace. EnginSoft has compiled a special publication dedicated to the Oil&Gas industry from the point of view of engineering simulation. Here, the challenges are all characterized by an important impact on the operating life of an installation. Tackling virtual project risk assessment is an exacting sector requirement, and simulation engineers must consider, on a daily basis, the industry regulations that establish criteria for ensuring reliability and safety.
Within this complex world of simulation, EnginSoft has been making all the difference thanks to its globally distributed, multi-disciplinary consultancy and thirty years of hands-on experience with virtual tools. EnginSoft allocates the necessary resources through its strategic partnerships with major companies such as ESSS, and in this context ANSYS is a natural partner, where an intelligent fusion of technology and software can make a real difference for continuous improvement. Engineering simulation software from ANSYS has been critical in enabling Oil&Gas engineers worldwide to create virtual prototypes and to capture new knowledge about the detailed workings of equipment and processes.
With our special issue dedicated to Oil&Gas, we provide our readers with an overview of real Computer Aided Engineering applications, highlighting EnginSoft's experience and competencies, showcasing the benefits of our partnerships via exemplars from ESSS, and affirming the innovative application of ANSYS technology by highlighting the real value delivered to individual customers.
As change continues to accelerate, we know that adopting and developing simulation-based engineering can make a real difference. If you want to investigate in depth a range of diverse issues, including tanks, valves, connectors, offshore structures, ventilation systems, furnaces and even power plants, our Oil&Gas publication will provide you with real testimonials, relevant case studies and articles dedicated to our specific approaches.

DOWNLOAD your free copy on: www.enginsoft.com/oilgas-newsletter


FLASH

“Software is a great combination between artistry and engineering.” I agree with this quotation from Bill Gates, and I am sure you will agree too. Each day, our highly committed engineers use a vast array of software solutions that enable them to simulate and optimize even the most complex processes and designs. For us, Finite Element Modelling, Rigid Body Dynamics, Computational Fluid Dynamics, Design Optimization and the like are not pretentious phrases: these terms reflect our daily business and are the artists’ tools with which we create beautiful engineered designs.
In this quarterly edition of the Newsletter, we will again take you on a tour through a ‘gallery’ of the latest design developments, all achieved using CAE tools. Each ‘work’ has helped deliver value to its customers by exceeding design expectations. One main ‘exhibition’ area in this edition is the application of different tools, such as ANSYS. Find out on page 8 how ANSYS Mechanical has been used in the design and testing of bespoke orthopaedic implants for the anatomical reconstruction of a leg, an area where both good structural design and aesthetic appeal matter. modeFRONTIER, the leading solution for multi-objective and multidisciplinary optimization, also occupies a significant part of this Newsletter. For example, turn to page 19 to discover how modeFRONTIER has been used to conduct a multi-objective analysis of a valve train system design, in order to optimize the engine performance of a motor scooter.
In addition, read about the developments within the EnginSoft Group. We are proud to announce the launch of a new subsidiary in Turkey and the acquisition of TTC³ and VSA, the German experts in complex dimensional and tolerance management. Both developments reflect our persistent and strong commitment to maximizing value for our stakeholders.
Finally, I would like to encourage you to participate in one of this year’s main events – the international CAE Conference 2015 – to seize the opportunity of learning, understanding and shaping the way in which CAE technologies will be applied in future. The conference will take place in October, with preparations already well under way. Make your contribution by submitting a technical paper or entering the Student Poster Award competition and help create another successful edition of the CAE Conference. I would like to take the opportunity to thank all of you for your contribution to the overwhelming success of last year’s activities, and look forward to continuing our partnership in the ‘art’ of engineering through the coming year.

Stefano Odorizzi, Editor in chief

3 - Newsletter EnginSoft Year 12 n°1

Flash


Contents

INTERVIEW
6 The role of innovation in the railway industry: a testimonial from Trenitalia

CASE HISTORIES
8 Custom prosthesis optimization with ANSYS Mechanical
10 ESAComp meets challenging marine applications
12 CFD analysis for the evaluation of the air flow within a climate control unit
14 Vibration analysis and chassis damping design on VT 6000 OF packaging machine
19 Multi-objective optimization of timing system of a small 2-wheeler engine (SOHC): methodology and case study
26 Bonfiglioli: a complete range of gear motors for the industry
28 Hardware resources optimization in Pierburg Pump Technologies
29 Optimization of the curing system of a Hot-Box Core with MAGMA C+M
31 Accelerating the Innovation of Design Process using CAE and 3D Printing
35 Integrated Life Cycle Design for Buildings: Invest in Simulation to Increase Comfort and Environmental Sustainability at a Reasonable Cost
38 Simulation of Quasi-Static Impact Response of Flat Marine Grade Foam-Filled Composite Sandwich Hull Panels
40 Structural optimization of a bridge beam section subjected to instability of equilibrium
42 Investigation about the warm deep drawing of Mg alloys using Metamodels

SOFTWARE UPDATE
18 1D simulation applied in an Oil&Gas company
46 ANSYS CFX R16.0
48 ANSYS R16.0 Mechanical
51 Flowmaster V7.9.3
52 Total Materia Integrator
54 Pre-salt Simulation Issues

RESEARCH
56 Assembly Layout & Process Estimator

COMPANY NEWS
58 EnginSoft launches a new subsidiary in Istanbul, Turkey
59 EnginSoft Turkey started business with great inaugurations both in İstanbul and Ankara
60 EnginSoft Group grows

JAPAN COLUMN
61 Certification Program for Computational Mechanics Engineers in Japan


OUR ACKNOWLEDGEMENT AND THANKS TO ALL THE COMPANIES, UNIVERSITIES AND RESEARCH CENTRES THAT HAVE CONTRIBUTED TO THIS ISSUE OF OUR NEWSLETTER

To receive a free copy of the next EnginSoft Newsletters, please contact our Marketing office at: newsletter@enginsoft.it
All pictures are protected by copyright. Any reproduction of these pictures in any media and by any means is forbidden unless written authorization from EnginSoft has been obtained beforehand. ©Copyright EnginSoft Newsletter.

EnginSoft S.p.A.

24126 BERGAMO c/o Parco Scientifico Tecnologico Kilometro Rosso - Edificio A1, Via Stezzano 87 - Tel. +39 035 368711 • Fax +39 0461 979215
50127 FIRENZE Via Panciatichi, 40 - Tel. +39 055 4376113 • Fax +39 0461 979216
35129 PADOVA Via Giambellino, 7 - Tel. +39 049 7705311 • Fax +39 0461 979217
72023 MESAGNE (BRINDISI) Via A. Murri, 2 - Z.I. - Tel. +39 0831 730194 • Fax +39 0461 979224
38123 TRENTO fraz. Mattarello - Via della Stazione, 27 - Tel. +39 0461 915391 • Fax +39 0461 979201
10133 TORINO Corso Marconi, 10 - Tel. +39 011 6525211 • Fax +39 0461 979218
www.enginsoft.it - www.enginsoft.com - e-mail: info@enginsoft.it
The EnginSoft Newsletter is a quarterly magazine published by EnginSoft SpA

COMPANY INTERESTS EnginSoft GmbH - Germany EnginSoft UK - United Kingdom EnginSoft France - France EnginSoft Nordic - Sweden EnginSoft Turkey - Turkey VSA-TTC3 - Germany www.enginsoft.com

CONSORZIO TCN www.consorziotcn.it • www.improve.it Cascade Technologies www.cascadetechnologies.com Reactive Search www.reactive-search.com SimNumerica www.simnumerica.it M3E Mathematical Methods and Models for Engineering www.m3eweb.it

ASSOCIATION INTERESTS NAFEMS International www.nafems.it • www.nafems.org TechNet Alliance www.technet-alliance.com

ADVERTISEMENT For advertising opportunities, please contact our Marketing office at: newsletter@enginsoft.it

RESPONSIBLE DIRECTOR

Stefano Odorizzi - newsletter@enginsoft.it

ART DIRECTOR

Luisa Cunico - newsletter@enginsoft.it
PRINTING
Grafiche Dal Piaz - Trento

The EnginSoft Newsletter editions contain references to the following products which are trademarks or registered trademarks of their respective owners: ANSYS, ANSYS Workbench, AUTODYN, CFX, FLUENT and any and all ANSYS, Inc. brand, product, service and feature names, logos and slogans are registered trademarks or trademarks of ANSYS, Inc. or its subsidiaries in the United States or other countries. [ICEM CFD is a trademark used by ANSYS, Inc. under license]. (www.ANSYS.com) modeFRONTIER is a trademark of ESTECO Spa (www.esteco.com) - Flowmaster is a registered trademark of Mentor Graphics in the USA (www.flowmaster.com) MAGMASOFT is a trademark of MAGMA GmbH (www.magmasoft.de) - FORGE, COLDFORM and FORGE Nxt are trademarks of Transvalor S.A. (www.transvalor.com) FENSAP-ICE is a trademark of NTI Newmerical (www.newmerical.com) - LS-DYNA is a trademark of LSTC (www.lstc.com) - Particleworks is a trademark of Prometech Software Inc. (www.prometech.co.jp) - RecurDyn™ is a registered trademark of FunctionBay, Inc. (www.functionbay.org)



Authorization of the Court of Trento n° 1353 RS, dated 2/4/2008



The role of innovation in the railway industry: a testimonial from Trenitalia

Trenitalia was founded to meet the needs of the railway sector and aims to face market challenges through customer care in all its aspects. Trenitalia is 100% controlled by Ferrovie dello Stato Italiane SpA, which manages passenger transport and logistics. It strives to make its industrial organization more modern and efficient, with increased productivity, a positive balance sheet and excellent service quality, together with a high level of social responsibility and environmental sustainability. This approach is complemented by a strong commitment to remaining flexible and quickly adapting its strategies and objectives. Trenitalia is one of the leading railway operators in Europe, with over 9 thousand trains travelling every day, used by over half a billion travelers, as well as transporting 80 million tons of commercial goods every year. Its activity is supported by strong international connections: several commercial agreements with other European railway operators and considerable shareholdings in foreign operators.
We had the pleasure of interviewing Eng. Claudio Goracci, who has been working for Trenitalia in Florence since 1987, within the Office of Tecnologie Meccaniche, part of Direzione Tecnica/Ingegneria Rotabili e Tecnologie di Base. He is involved in the mechanical design of the rolling stock body frame (shell) and related FEM calculations. As complementary activities, he also deals with the technical criticalities of important components/units assembled on the traction system, such as the mechanical drives between the traction motor and the wheelset:
• gear boxes;
• Cardan shafts;
• elastic transmissions with rubber/metal elements.
All the activities mentioned above are carried out to meet the maintenance needs of Trenitalia's fleet of locomotives, coaches and wagons. In this context, it is necessary to re-design some parts/components affected by wear and damage, for repair and/or performance increase.

Fig. 1 – Ing. Claudio Goracci, Trenitalia

1. How much relevance does innovation have (and should have) in the industrial world?
The technological innovation of the industrial system is an essential tool to guarantee economic progress and therefore enable the spread of wealth in society.

Interview



2. What are the strategies for being innovative and what steps of evolution lead towards innovation?
In my opinion, the strategies to be adopted in order to create innovation are the following:
• devote adequate economic resources to research and development activities, and maintain them constantly over time;
• arrange structured collaborative relationships with local Universities engaged in the relevant technical/scientific disciplines, and promote research activity in line with the company's aims.
This second action is particularly important in two ways. On one hand, it helps orient academic training towards disciplines that are appropriate for companies in the local area, thus working as a professional trainer that supports the growth of the local economy. On the other hand, it provides more innovative technical/scientific knowledge and capability to local companies, fostering their improvement and development.

3. What role do CAE tools and virtual prototyping play in this context?
The advantage of CAE is the possibility of performing engineering analyses using only information technologies (numerical simulations), thus reducing the time and costs that would otherwise be required for the same activities if carried out with traditional tools (prototype production and experimental tests).

4. How have users' needs changed over the last years?
In recent years, information technology has advanced yet again, thanks to great hardware innovation. This has made it possible to build very detailed FEM (Finite Element Method) models, with several million degrees of freedom, and to solve them in quite a short time. Today, this is possible with just a PC costing only a few thousand Euros, whereas it would have been impossible 10-15 years ago. Not too long ago, similar analyses would have required a powerful network of centralized computation stations with local work stations, or connected clusters; both solutions would have cost at least 5-10 times more.

5. From your professional experience, what advantages have you recognized and how has design/production changed?
The use of FEM software for structural analysis has proved very fruitful from the very beginning (30-40 years ago). It has allowed the move from a global verification of structures, which were simplified as frames of girders, each long enough and of constant section for the De Saint Venant principle to apply, to a detailed verification of the simulated structures in their precise conformation/geometry. For the first time, designers were able to verify all those structural details characterized by strong variations in shape/section, where both static and fatigue damage is normally concentrated.

6. What contribution has EnginSoft provided and in what ways has it expressed the potential capability of the company?
EnginSoft has contributed significantly to the dissemination and transfer of the knowledge needed to use ANSYS FEM simulation tools to obtain the most value for our company. They organize training courses with skilled personnel and provide competent post-sale software support to ensure maximum return and results.

7. From your perspective, do you believe computation codes will need to evolve to handle future challenges?
As far as the design/verification of large metallic structures is concerned, I think the most relevant needs are mainly related to hardware improvements, so as to increase the model size on one hand (more FEM elements/nodes) and to reduce the solution time on the other. This is particularly true for the most time-consuming structural simulations, such as those in the non-linear field (elastoplastic analyses, crash computations, analyses with non-linear elements such as GAP elements).

8. What projects, objectives and new targets are you aiming towards thanks to such tools and application capability?
In the company I work for, ANSYS Structural Mechanics has been successfully used for many years in the continual improvement of rolling stock maintenance. We have also recently acquired ANSYS nCode software, which we are planning to start using in the coming months to simulate the evolution/propagation of cracks that occur on particular vehicle structures/equipment during commercial service.

9. What do you wish for the world of scientific technology, which continuously seeks a balanced dimension between creativeness and competitiveness?
I wish that the international community and all its institutions and organizations understand and remember that progress has a social role before creating an economic impact. Just think about the health sector and research concerning rare diseases and pathologies; applied research, creativeness and engagement are all necessary and should be adequately rewarded. Last but not least, technology has to make companies competitive without forgetting workers' health and dignity, and respect for the environment.

Edited by the editorial staff


Fig. 2 – First natural frequency from modal analysis of railway car body frame



Custom prosthesis optimization with ANSYS Mechanical
Lima Corporate is a global medical device company that employs more than 600 staff across 17 subsidiaries worldwide, providing reconstructive orthopaedic solutions to surgeons who face the challenges of improving their patients' quality of life. Lima Corporate designs, develops, manufactures and markets joint replacement and repair solutions that include large joint implants for hip and knee, and systems for the restoration of the extremities focusing on disorders of the shoulder and traumatic injuries. The company's product portfolio is made up of a large standard series of products for Knee, Shoulder, Hip, Small Joints and Trauma. A particular new challenge for Lima is represented by custom-made products: the peculiarity of these devices is that they are designed entirely to the surgeon's request, so as to adapt perfectly to the patient's specific anatomy. Custom-made products apply mostly to very critical cases, where standard products have no application; they require specific engineering to be produced, often in a very short time (3-4 weeks from the request).
A hinged knee custom case
A very tough case was brought to our attention in 2014: a patient with a failed previous knee implant needed a complete leg reconstruction and the substitution of the knee joint to restore leg motion.

Case Histories

Fig.1 – Final configuration of a hip and knee implant solution

The starting situation of the patient's leg with the failed implant is shown in fig.2. The x-ray shows the presence of a long femoral stem (from a competitor) inside the right leg, and a cage whose function was to stabilize the connection between the bone and the stem. Unfortunately, the lower portion of the leg broke, leading to knee failure. Lima's custom division was asked to solve this problem with a custom system, as displayed in fig.3. The solution for this particular case was to design a tough structure, a custom femur made of titanium alloy (Ti6Al4V), to replace the old cage and connect the femoral stem already in situ with a hinged knee system (Multigen H, Cobalt Chromium alloy component). There were some structural critical points related to this case, in particular the two mechanical interfaces: the first between the custom femur and the stem, the second between the custom femur and the femoral component. ANSYS has officially been a partner of Lima Corporate since 2008, chosen after evaluation of several other software packages for its superior capabilities, user-friendly interface and superior contact features.



Once the part was designed, it was optimized by iterating with FE results (ANSYS Mechanical, static structural environment) in order to reduce stresses. Two different structural configurations were set up to test the custom component in the worst-case conditions for its two critical areas, taking into account all common human activities (standing up/sitting down, going upstairs, normal walking, etc.). The stem thread is heavily loaded when standing up from a chair, and its weak point is located at the interface with the upper surface of the custom femur. The second critical point is the conical taper of the femoral component, which must withstand all the loads together with the two fixing screws. As shown in figure 4, all the simulations returned positive results in terms of maximum principal stresses. The custom component was implanted at the end of January 2015, to the satisfaction of both surgeon and patient.
Matteo Boccalon, Lima Corporate
Ask the expert in Simulation applied to Health: Michele Camposaragna, EnginSoft m.camposaragna@enginsoft.it
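The acceptance metric used above is the maximum principal stress. As a minimal illustration of how that metric is obtained at a point (with an invented stress state and an assumed allowable, not Lima's actual data), the principal stresses are the eigenvalues of the symmetric 3x3 stress tensor:

```python
import numpy as np

# Illustrative nodal stress tensor [MPa]; values are invented, not FE output.
sigma = np.array([[120.0,  35.0,  10.0],
                  [ 35.0,  80.0,  -5.0],
                  [ 10.0,  -5.0,  40.0]])

# Principal stresses = eigenvalues of the symmetric stress tensor,
# sorted so that s1 >= s2 >= s3.
principal = np.sort(np.linalg.eigvalsh(sigma))[::-1]
s1 = principal[0]
print(f"max principal stress = {s1:.1f} MPa")

# Simple pass/fail check against an assumed design allowable for Ti6Al4V
# (illustrative number only).
allowable = 800.0  # MPa
assert s1 < allowable
```

The same eigenvalue evaluation, applied at every node of the model, is what produces plots like those in figure 4.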

Fig.2 – Patient situation Implant breakage

Fig.3 – Patient situation Restored leg length with Lima custom

Fig.4 – Results of ANSYS analysis on conical taper (left) and stem thread (right) in terms of max principal stresses [MPa]

VIRTUAL PROTOTYPING AND BIOMEDICAL DEVICES
Medical devices, especially those implanted in the human body, are some of the most complex and cutting-edge products available today. While they can save or improve the quality of a patient's life, they also pose some of the biggest engineering challenges, leaving no room for error. Because these devices are surgically implanted, they must work correctly the first time, every time. Even as organizations push the boundaries of medical applications, biomedical device developers anticipate a return on their sizeable investments, which include design, prototyping, selection of materials, manufacturing and testing. ANSYS tools allow researchers to work through these issues completely before devices are implanted in patients. For example, simulation software can verify that a given device is an ideal fit for a specific patient, which can maximize surgical success in both the short and long term.



ESAComp meets challenging marine applications
About Olivari Composites
Olivari Composites specializes in the engineering of composite structures for motorboats and sailboats, with many years of experience in the design of a vast number of racing and cruising yachts. Their knowledge and expertise are also applied to many other sectors: from aeronautics to transportation, renewable energy and natural fibers. Dr. Luca Olivari is a naval architect and a leading European expert in the engineering of composite materials. “To give you an example, over the years I’ve been involved in the design and analysis of over 60 composite rudders of all sizes, up to 350kg”, explains Dr. Olivari. “I was also involved with the Fincantieri Military Dept. on the project of a stealth composite upper structure for a 90m ship. Since 1987 I have done all the structural projects for Del Pardo and, from 2004, all the structural projects for the Azimut-Benetti Group”.
Over 15 years ago, Olivari Composites became the first Italian user of ESAComp, the software solution provided by Componeering Inc. (Finland), which remains their first choice for the design of many composite structures. “My practical experience, gained from building and sailing many of the boats, allowed me to optimize the theoretical aspects of the calculation and construction techniques with composite materials”. “I mainly use ESAComp for my marine projects, both sail and powerboat composite structures”, explains Dr. Olivari.
RIB “Rigid-hulled Inflatable Boat”: the latest project challenge for Olivari Composites

Fig. 1 – Dr. Luca Olivari, Olivari Composites Engineering


Let’s talk about the design of the carbon Rigid-hulled Inflatable Boat
“The design and development of the rigid-hulled inflatable boat (RIB) was really challenging: it is a high-performance boat, 16.5m in length, with carbon fiber primary structures (keel and hull). This boat has a very powerful engine with surface propellers and an advanced transmission system, reaching a maximum speed of about 60 knots. The choice of material was driven by the main design requirement, the stiffness/mass ratio. The design of the hull was heavily influenced by constraints such as the interior spaces, the low profile and the external boat profile”.

Fig. 2 – Olivari Composites Engineering technical team

What are the design drivers for this kind of boat?
“First of all, the boat is classified by length and performance requirements. The reinforced panels of the hull were verified according to several industry standard rules. The ABS Rules (American Bureau of Shipping) can be considered a good reference, but in this case the Italian RINA and Norwegian DNV rules were also taken into account. Once the shape and size of the boat were fixed, the structural grid and the inner constraints were defined. The entire structure was then simplified into several elementary subsystems, which were verified through ESAComp. According to these standards, the pressure distribution is applied to the panels one by one”.

What is the added value provided by ESAComp?
A suitable definition of the material properties is mandatory to build a robust and reliable numerical model. “The ESAComp Data Bank is an excellent reference source that is continuously updated. I usually couple this data with my personal material library, built over years of working with the main Italian shipyards”. In addition, ESAComp provides several analysis features used to verify the static and dynamic mechanical response of the composite reinforced panels under the operating conditions suggested by the standards.

Do you interface ESAComp with external FE software?
“Sometimes I use this capability: the material properties of the plies and the laminate lay-ups are directly exported, so the FE model set-up is really straightforward and quick”. “In this case, however, only the analysis tools provided by ESAComp were able to evaluate the mechanical performance of each panel, guaranteeing a suitable margin of safety”.

Fig. 4 – ESAComp “through-the-thickness” stress analysis
Fig. 5 – ESAComp “through-the-thickness” stress analysis

Jo Hussey - ESAComp Team - Componeering Inc.
Fabio Rossetti, EnginSoft
Ask the expert in composites: Fabio Rossetti, EnginSoft f.rossetti@enginsoft.it

Fig. 3 – ESAComp panel analysis, section 10
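The panel checks discussed in this article rest on classical lamination theory. The following Python sketch, using generic carbon/epoxy ply properties and an invented lay-up (not Olivari's actual laminates or ESAComp's implementation), shows the kind of laminate stiffness calculation such a tool automates:

```python
import numpy as np

# Assumed unidirectional ply properties (typical carbon/epoxy, SI units);
# illustrative values only.
E1, E2, G12, nu12 = 135e9, 10e9, 5e9, 0.30
t_ply = 0.25e-3                            # ply thickness [m]
layup = [0, 45, -45, 90, 90, -45, 45, 0]   # symmetric stacking [deg]

nu21 = nu12 * E2 / E1
denom = 1.0 - nu12 * nu21
# Reduced ply stiffness matrix Q in the material axes
Q = np.array([[E1 / denom, nu12 * E2 / denom, 0.0],
              [nu12 * E2 / denom, E2 / denom, 0.0],
              [0.0, 0.0, G12]])

def Qbar(theta_deg):
    """Rotate the ply stiffness matrix to the laminate axes (Qbar = T^-1 Q R T R^-1)."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    T = np.array([[c*c, s*s, 2*c*s],
                  [s*s, c*c, -2*c*s],
                  [-c*s, c*s, c*c - s*s]])
    R = np.diag([1.0, 1.0, 2.0])   # engineering-strain correction matrix
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

# Extensional (membrane) stiffness matrix A = sum over plies of Qbar_k * t_k
A = sum(Qbar(th) * t_ply for th in layup)
h = t_ply * len(layup)
# Effective in-plane modulus of the laminate along the x axis
Ex = 1.0 / np.linalg.inv(A / h)[0, 0]
print(f"effective laminate modulus Ex = {Ex/1e9:.1f} GPa")
```

Through-the-thickness stress checks like those in Figs. 4 and 5 then recover ply-by-ply stresses from the laminate strains via each ply's rotated stiffness.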



CFD analysis for the evaluation of the air flow within a climate control unit
Introduction
In order to assess the possible advantages offered by CFD analysis in the new product design process, a series of numerical simulations was performed by Climaveneta SpA with the support of EnginSoft. The aim of the activity was to reproduce the new ACU171 Expanded close control unit (see Figure 1), matching numerical results with experimental measurements. Afterwards, some geometric modifications were tested to optimize the air flow velocity distribution on the filter surface, in order to improve the heat exchange properties.

Validation
The reliability of the CFD model was initially verified by comparing numerical results with laboratory measurements. Two different machine configurations were available and reproduced: the baseline, equipped with three 630mm diameter fans, and a larger version with two bigger 710mm fans. Experimental data was collected regarding the total mass flow and air velocity distribution on the suction side, as well as the absorbed power and electric current. A very satisfactory correlation between the numerical solution and the measured parameters was obtained. As a further comparison, the pressure drops through the filter and the heat exchanger were checked. The main results of these simulations are reported in Table 1. The analyses also demonstrated the possibility of obtaining a wide set of qualitative information about the behavior of the fluid within the close control unit. For example, Figure 2 shows a global streamline distribution.

Model description
The simplified 3D model of the baseline geometry was created with SolidEdge. Further preparation was carried out with the ANSYS CAD tools, while all the analyses were performed in ANSYS Workbench using the CFX R15.0 solver. The core components of the machine were treated as porous sub-domains (the filter and the exchanger) and general momentum sources (the fans), although in a second phase a three-dimensional representation of the fans was also tested. The porous losses were expressed by Darcy's law and set according to the pressure drop curves determined by the manufacturer, then applied to the respective domains. Due to the complexity of the system, a 5.5 million element tetrahedral mesh with prism inflation layers was generated. The two domain openings were characterized by an averaged static pressure equal to 0 Pa relative to a reference pressure of 1 atm. The turbulence model chosen was the Shear Stress Transport (SST).

Fig. 1 – 3D view of the top-down flow machine
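Setting porous losses from a manufacturer's pressure-drop curve amounts to fitting the linear and quadratic coefficients of a Darcy-type loss law. The sketch below is illustrative only: the curve data, layer thickness, and resulting coefficients are invented, not Climaveneta's values or the exact CFX input form.

```python
import numpy as np

# Hypothetical manufacturer pressure-drop curve for the filter:
# superficial velocity [m/s] vs. pressure drop [Pa] across thickness L.
v = np.array([0.5, 1.0, 1.5, 2.0, 2.5])        # m/s
dp = np.array([12.0, 30.0, 54.0, 84.0, 120.0])  # Pa

L = 0.05      # porous-layer thickness [m] (assumed)
mu = 1.8e-5   # dynamic viscosity of air [Pa s]
rho = 1.2     # air density [kg/m^3]

# Darcy-Forchheimer form used by many CFD solvers:
#   dp/L = (mu / K_perm) * v + K_loss * (rho / 2) * v**2
# Least-squares fit of a = mu/K_perm and b = K_loss*rho/2.
A = np.column_stack([v, v**2])
(a, b), *_ = np.linalg.lstsq(A, dp / L, rcond=None)

K_perm = mu / a          # permeability [m^2]
K_loss = 2.0 * b / rho   # quadratic (inertial) loss coefficient [1/m]
print(f"permeability = {K_perm:.3e} m^2, loss coefficient = {K_loss:.1f} 1/m")
```

The two fitted coefficients are then entered as the porous sub-domain's momentum-loss parameters.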


Table 1 – Model validation results

Optimization of the filter frontal velocity
All the previous simulations showed a highly non-uniform distribution of the frontal velocity upstream of the filter. The same consideration holds for the exchanger below. This situation is not optimal for the global yield of the heat exchanger, which is forced to work differently depending on the local air velocity. For this reason, the validated CFD model was used to investigate the introduction of geometric devices to prevent this problem. The expanded two-fan model was chosen as the base configuration. Some of the interventions taken into account are presented below.
The first solution implemented was a reduction of the finned pack's tilt, compatible with the machine's lateral size, but this change did not produce a meaningful benefit. The second and third simulated modifications were quite similar: both concerned one or more vertical separation septa located above the filter along the entire upstream volume. The concept was to divide the total air flow into two or more equal sub-flows, trying to rebalance the velocity distribution from left to right. In fact, the analysis of the baseline geometry showed that the region of maximum air velocity was located in the lower left corner of the finned pack. The simulations demonstrated the inefficacy of this type of intervention, because the velocity distribution on the filter retained its initial asymmetric appearance.

Fig. 2 – Air patterns within the machine

Much better results were obtained with a 54° inclined deflector, fixed horizontally under the electrical panel. In this case, a quite uniform velocity distribution over the filter was reached (Figure 3). Compared to the baseline configuration, a slight reduction was detected in the total air flow (less than 4%) and in the mean velocity at the suction-side inlet (less than 2%). Nevertheless, the standard deviation of the frontal velocity upstream of the filter dropped from 0.34 to 0.146 with the deflector, confirming the effectiveness of this simple improvement.

Conclusions
In summary, the CFD simulations performed with ANSYS Workbench/CFX achieved a good correlation between the experimental data and the numerical solutions. Furthermore, these analyses provided a wide range of additional information which would be very difficult to obtain in the laboratory. The virtual prototyping capabilities were put to good use in seeking a solution to the unevenness of the exchanger's frontal velocity distribution. A satisfactory result was achieved in a few attempts, with a very simple mounted device which does not require any modification to the initial design.

Ufficio Sviluppo Prodotti, Climaveneta
Ask the expert in Climate Control Simulation: Luca Brugali, EnginSoft l.brugali@enginsoft.it

Fig. 3 – Comparison of the frontal velocity field before (left) and after (right) the introduction of the deflector
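A uniformity figure like the one quoted above (the standard deviation of the frontal velocity dropping from 0.34 to 0.146) can be reproduced from sampled probe values. The velocities below are hypothetical placeholders; only the metric itself mirrors the article:

```python
import statistics

def uniformity(frontal_velocities):
    """Population standard deviation of the frontal velocity samples:
    lower values mean a more even flow over the filter face."""
    return statistics.pstdev(frontal_velocities)

# Hypothetical probe values (m/s) before and after adding the deflector.
baseline = [1.9, 2.4, 2.8, 1.6, 2.1, 3.0]
with_deflector = [2.2, 2.3, 2.1, 2.4, 2.2, 2.3]

# The deflector should reduce the scatter around the mean velocity.
assert uniformity(with_deflector) < uniformity(baseline)
```

The same single number can be tracked across design variants to rank them, exactly as done when comparing the septa and deflector interventions.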


Case Histories


Vibration analysis and chassis damping design on the VT 6000 OF packaging machine

Thanks to simulation, the amplification at the resonances has been reduced to values up to 50 times lower than in the undamped case, with obvious benefits for the noise level of the machine during operation.

ILAPAK designs and produces industrial packaging machines, electronic multi-head scales and automatic systems for food. The ILAPAK offering includes a wide range of machines, each available in different models in order to meet specific industrial application requirements, from the simplest to the most demanding. ILAPAK has many years of experience in the design and production of specialized wrapping solutions for a wide range of products, each with its own unique packaging requirements. Furthermore, having installed thousands of packaging machines (both horizontal and vertical) in a great variety of production processes, it has acquired a detailed knowledge of the operating environment and production conditions. Apart from packaging machines, ILAPAK offers a series of automated food systems that can be integrated with third-party devices.

The collaboration between ILAPAK and EnginSoft started with the VT 6000 OF project, a new machine for vertical packaging. This is a continuous wrapping machine used for high production speeds. ILAPAK's goal was a food packaging solution able to work under different operating conditions (from 120 to 219 envelopes/minute), to reduce handling vibration and to keep production costs down. To achieve this goal, EnginSoft and ILAPAK first collaborated to understand the kinematics of the machine and to identify the mechanical parts responsible for the vibrations. The kinematics of continuous machines are mainly characterized by a vertical and a horizontal motion: the vertical motion of the slide follows the film during the welding process, and the horizontal one drives the two jaws that close the sealing bars. Since the horizontal motion involves balanced masses, the main vibration is caused by the vertical motion of the slide, whose mass (56 kg) acts on the structure through the two pulleys. A preliminary dynamic FEM analysis with LS-DYNA, on a simplified model of two pulleys linked by a belt (associated with a concentrated mass of 56 kg), helped verify that the load acting on each pulley during the motion varies with time and with the belt pre-load. Since the pre-load value is not precisely known for each machine, the subsequent vibration analysis considered the most conservative load situation, namely the whole slide load applied at the top pulley, the one farthest from the ground support of the machine. The motion of the slide over a cycle was provided by ILAPAK for a packaging speed of 219 bags/minute. From this, the acceleration curve of the slide (and hence the m*a resultant on the structure at the pulley connection) and the torque pattern as a function of time (Figure 2) are known. In order to understand the dynamic behavior of the structure under repeated operating cycles (shown in Figure 2), the cycle has been assumed to repeat indefinitely and the components of the Fourier series of the signal have been evaluated.



Fig. 2 - Pattern of the vertical resultant on the structure at the pulley connection and torque pattern (at the reduction unit)

Fig. 3 - Coefficients of the Fourier series of the slide acceleration

Fig. 4 - Comparison of the acceleration signal reconstructed from the first three frequencies (pink) with the original signal (blue)

Figure 3 shows the slide acceleration amplitude as a function of frequency, where each frequency is the harmonic number multiplied by the basic frequency of the slide cycle repetition (up-down). The cycle time is 0.274 s, so the basic frequency is 3.65 Hz. The analysis of the Fourier series highlights that the first 3 frequencies of the series represent the full signal almost completely (see Figure 4); in any case, the first 5 frequencies have been taken into account in the subsequent harmonic analyses. Once the dynamic load conditions had been characterized in frequency, EnginSoft set up a structural FEM simulation with ANSYS Workbench 15, in order to understand the machine vibration level in the current design state ("AS IS" model) and to study how to modify the structure to reduce such vibrations. The first step of the analysis flow is the generation of a FEM model of the machine, simplifying the CAD of the structure so as to make it suitable for numerical simulation and to capture the physics of the problem. The FEM model is characterized by numerous bodies discretized with SHELL elements, so as to dramatically reduce the number of degrees of freedom of the system and to parameterize the thicknesses (useful in the structural modification phase). No concentrated masses have been included in the model, thus keeping the real topology of the connection of the "mass" units (motors, pump, paper reel, ...) to the structure. First of all, a modal analysis was run on the "AS IS" model to identify the main eigenfrequencies in the 3-50 Hz range. Then five harmonic analyses were set up, one for each of the five characteristic frequencies of the input signal (related to the packaging speed of 219 bags/minute). The analyzed frequencies and the related load amplitudes are reported in Table 1 (the amplitude coincides with the real part, since the imaginary part is almost null).
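The harmonic decomposition shown in Figures 3 and 4 can be sketched in a few lines. The signal below is a hypothetical stand-in for the real slide acceleration (which comes from the machine kinematics supplied by ILAPAK); the DFT-based amplitude extraction is a generic technique, not the actual tooling used in the project:

```python
import cmath
import math

T = 0.274   # slide cycle time (s) -> basic frequency ~3.65 Hz
N = 256     # samples per cycle (hypothetical sampling)

def accel(t):
    """Hypothetical periodic acceleration signal, dominated by a few harmonics."""
    w = 2 * math.pi / T
    return 30 * math.sin(w * t) + 12 * math.sin(5 * w * t) + 4 * math.sin(7 * w * t)

samples = [accel(k * T / N) for k in range(N)]

def harmonic_amplitude(samples, n):
    """One-sided amplitude of the n-th Fourier harmonic of one sampled period."""
    m = len(samples)
    c = sum(x * cmath.exp(-2j * math.pi * n * k / m)
            for k, x in enumerate(samples)) / m
    return 2 * abs(c)

base_freq = 1 / T  # ~3.65 Hz
for n in range(1, 8):
    print(f"{n * base_freq:6.2f} Hz : amplitude {harmonic_amplitude(samples, n):6.2f}")
```

Ranking the amplitudes this way shows directly how many harmonics are needed to reconstruct the signal, which is how the "first 3 frequencies are enough, 5 retained for safety" decision above can be justified.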

The damping coefficients β of aluminum, steel and plastic are entered as functions of the frequency f, so as to obtain a loss factor tan δ equal to 2πβf, with tan δ = 0.1% for steel and aluminum components and 4.0% for plastic ones. Since the machine has to work under different operating conditions, depending on the number of bags packed per minute (not necessarily 219), it is fundamentally important to understand the machine's response at frequencies different from the five related to the 219 bags/minute case. Changing the packaging speed from 120 to 219 bags/minute, the first five characteristic frequencies vary and cover the whole range between 3 and 50 Hz. It has therefore been necessary to set up a frequency sweep (a series of harmonic analyses) on the "AS IS" model between 3 and 50 Hz, with a frequency resolution high enough to identify the peaks at the eigenfrequencies (the lower the imposed damping, the narrower the resonance peaks). The sweep used 1410 harmonics in geometric progression, i.e. with a non-constant frequency pitch, so as to capture with the same number of points the resonance bells at both low and high frequencies. In order to compare the displacements for the same applied loads, each harmonic of the sweep has been performed applying a vertical force of 1000 N (instead of the real value obtained from the Fourier series of the slide motion load) and a torque of 40.333 Nm (so as to keep the real ratio to the vertical force). Thanks to the linearity of the problem, it is enough to scale the result by a factor x to get the response for loads x times larger. Figure 6 shows the Z displacement of one of the control points of the structure as a function of frequency, as obtained from the sweep. By analyzing the results for the "AS IS" model, in particular the displacement fields from the five harmonic analyses and from the frequency sweep, it was decided to damp the machine chassis (and the electronic components cabinet) with the "constrained layer" technique, in order to reduce the vibration level of the system in the operating range. The constrained layer method consists of overlapping a damping material layer on the structure, constrained by an adequate metallic retainer plate. With this solution, particularly effective for the dissipation of flexural waves, the energy is dissipated in shear in the damping element (shown in green in Figure 7). The theory behind this damping solution was developed by Ross, Kerwin and Ungar (RKU) and is quite complex. The damping obtained on the whole structure in this way is very high, reaching an average tan δ of about 1/3 of the tan δ of the damping material alone. Even when considering a damping material with a frequency-constant tan δ, the achievable damping is still not uniform with frequency. The frequency at which the highest damping of the whole system is obtained can be calibrated by properly managing the damping material, the base plate and retainer plate thicknesses, and the damping material type (shear modulus and tan δ).

Fig. 7 - Scheme of the "constrained layer" plate
Fig. 5 - FE model of the machine VT 6000 OF

Freq (Hz)    Spectrum Vertical Force (N)    Spectrum Torque (Nm)
3.646973     1605.4108                      64.75159328
18.23487     1021.6248                      41.20554909
25.52881     617.46552                      24.904452
40.1167      60.095616                      2.423857423
47.41065     32.4981944                     1.310761

Table 1 - Amplitude of the vertical force related to the slide motion and of the torque at the reduction unit for the first five investigated frequencies


This methodology allows the design and optimization of the materials and thicknesses to be used, so as to selectively reduce the vibration amplitudes at the critical frequencies under investigation. The calculated sandwich properties can be included in the ANSYS FE model already used, to quantify the amplification reduction in the frequency range of interest. The elastomeric damping material ISD112, produced by 3M, is available in different thicknesses. Considering this specific case, and since the main signal frequencies are not very high (up to 50 Hz), a thickness of 5/1000 inch has been chosen (lower thicknesses are recommended for higher frequency ranges). Once the Young's modulus and tan δ were known, the properties of the plate-rubber-retainer plate system were calculated by applying the RKU theory with a specific tailor-made software tool created by EnginSoft (in collaboration with Eng. Mauro Peselli), able to provide the equivalent properties as a function of frequency for an infinite plate. This software made it possible to carry out a real design optimization of the machine chassis, identifying the best thickness of each sandwich layer while taking into account the constructive constraints and the weight required by ILAPAK. The tan δ of the plate-elastomer-retainer plate sandwich obtained for one of the modified structural components (in white in Figure 8) is shown in the figure as a function of frequency. These results have been conservatively scaled by 60% to account for the edge effect with respect to the infinite-plate theory. The tan δ and Young's modulus of the sandwiches of the different chassis components have been assigned as functions of frequency in the ANSYS FE model.

Fig. 6 - Displacement of a control point in the Z direction vs. frequency

Fig. 8 - Tan δ as a function of frequency for the sandwich plate included in the white component

Once the damping material had been designed for the different components of the chassis, the five harmonic analyses and the frequency sweep were repeated in the ANSYS Workbench simulation environment, in order to evaluate the reduction of the machine vibration amplitude. For example, Figure 9 compares the Z displacement at the control point as a function of frequency for the "AS IS" machine and for the constrained-layer machine. This comparison has been performed, for the three displacement components (X, Y and Z), at a number of control points agreed with ILAPAK. To summarize, the results show that the structure damped with the constrained layer method, while still presenting some resonances, has a displacement pattern with clearly lower amplifications at the resonances, behaving in a much more uniform way with respect to frequency. Since the steady state covers a variable range between 120 and 219 bags/min rather than the 219 bags/min condition alone, the first five characteristic frequencies of the Fourier series of the input signal vary as follows:

1st frequency: from 2 to 3.646973 Hz
2nd frequency: from 10 to 18.23487 Hz
3rd frequency: from 14 to 25.52881 Hz
4th frequency: from 22 to 40.1167 Hz
5th frequency: from 26 to 47.41065 Hz

As a consequence, the whole frequency range from 3 to 50 Hz can be excited in steady-state operation; it has therefore been fundamental to analyze the frequency sweep results (and not just those related to the 5 harmonics alone), which show that the constrained-layer damping solution, presenting a much more uniform pattern, would not be excessively affected even if the structure were excited exactly at an eigenfrequency. The amplification at the resonances has been reduced to values up to 50 times lower than in the undamped case, with a clear benefit for the noise level of the machine at work.
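The trade-off behind the constrained-layer sizing above can be illustrated with one widely quoted simplified form of the RKU result for a three-layer sandwich, in which the composite loss factor depends on a frequency-dependent shear parameter g of the viscoelastic core and a geometric stiffness parameter Y. The formula is a textbook simplification, and the parameter values below are hypothetical, not those used for the VT 6000 OF:

```python
import math

def rku_loss_factor(g, Y, eta_v):
    """Simplified RKU estimate of the composite loss factor of a
    plate / viscoelastic layer / constraining plate sandwich.
    g:     shear parameter of the viscoelastic core (frequency dependent)
    Y:     geometric stiffness parameter of the sandwich
    eta_v: loss factor (tan delta) of the damping material alone
    """
    return eta_v * Y * g / (1 + (2 + Y) * g + (1 + Y) * (1 + eta_v**2) * g**2)

Y, eta_v = 4.0, 1.0  # hypothetical values for an ISD112-like core

# The composite damping peaks at an optimum shear parameter, which is
# why the layer thicknesses must be tuned to the frequency band of interest.
g_opt = 1 / math.sqrt((1 + Y) * (1 + eta_v**2))
for g in (0.1 * g_opt, g_opt, 10 * g_opt):
    print(f"g = {g:6.3f} -> composite loss factor = {rku_loss_factor(g, Y, eta_v):5.3f}")
```

Sweeping g (i.e. frequency) shows the bell-shaped damping curve mentioned in the article: away from the optimum, on either side, the composite loss factor falls off, which is why the sandwich had to be calibrated for the 3-50 Hz band.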

Figure at the top of the article: picture of the new 6000 machine of ILAPAK

Ing. Valentina Peselli, Ing. Daniele Calsolaro - EnginSoft
Ing. Nicola Baldecchi - ILAPAK

Ask the expert in Machinery Simulation
Valentina Peselli, EnginSoft - v.peselli@enginsoft.it

Fig. 9 - Comparison of the displacement (m) of a control point in the Z direction as a function of frequency for the "AS IS" machine and the machine with constrained layer




1D simulation applied in an Oil&Gas company
Interview with Gabriele Bertoni, Gas Treatment Process Group Leader

KT - Kinetics Technology (KT) is an international Process Engineering Contractor with high expertise in the hydrocarbon processing industry. The company, a subsidiary of Maire Tecnimont Group, operates as a provider of proprietary technologies and as an EPC (Engineering, Procurement, Construction) Contractor in chemical, petrochemical and oil & gas complexes. KT has significant references in the field of Sulphur Recovery Facilities, Gas Processing, Hydrogen and Syngas Production, Refinery Process Units and the supply of Process Fired Heaters. With its own proprietary know-how and expertise, KT is an experienced EPC Contractor for Refinery Licensed Catalytic Units: Continuous Catalytic Reforming (CCR), Hydrocracker (HCK), Hydrotreater (HDT), and others. Based on its strong process engineering background, it boasts major expertise in project implementation and can provide services and projects based, whenever required, on third-party licensed technologies. Since 1971, KT has successfully completed more than 500 projects, with a continuous evolution from process engineering to EPC contracts, improving its core competences and know-how as well as developing and updating in-house technologies.

Gabriele Bertoni joined KT in 2009 and is the Gas Treatment Process Group Leader. He heads a group of 15 process engineers dedicated to Oil & Gas projects. In parallel with the group management activities, he is directly involved in strategic projects as Process Manager.

Fig. 1 - Ing. Gabriele Bertoni, Kinetics Technology

The Oil & Gas enterprise has several requirements and needs. Can you explain the main issues and challenges you have to face in your work?
Clients nowadays have drastically increased their level of competence; they know what they want, and most of the time they ask the Contractor to push the available technologies to their limits. Our greatest challenge is to provide them with the best product every time. To do this we need to be constantly updated on the latest and best technologies available on the market.

Can you briefly describe the last project you worked on?
SP19 is a project for the development of the process engineering for a 2000 BSCFD Gas Field in the south of Iran (Tombak area) based on open art technology. KT was in a joint venture with two Iranian engineering companies (Sazeh and Nargan) and was in charge of the process design activities and the HSE studies for the open art technology units.

What was the contribution provided by EnginSoft?
EnginSoft developed for us the surge analysis for the cooling water pipeline using Flowmaster, the CFD software for system-level simulations.

How did CAE help you to add value to your work and your company?
EnginSoft developed this water hammer dynamic simulation in order to confirm the material selection of the water pipeline. This system was a 20,000 m3/h piping network, requested by our Client in GRP. We needed to be sure not to exceed the design pressure of the pipes even in surge scenarios, without the installation of a surge drum. Together with EnginSoft we achieved the goal, and this resulted in a sensible saving of capital investment for our Client.
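A first-order sanity check of the surge pressures discussed in the interview can be made with the classical Joukowsky relation, dP = rho * a * dv, for an instantaneous velocity change. The numbers below are hypothetical, and a full network transient analysis (as performed in Flowmaster) is needed for real design decisions; the sketch only illustrates why the lower pressure-wave speed of a compliant GRP pipe reduces surge compared to steel:

```python
def joukowsky_surge(rho, wave_speed, delta_v):
    """Joukowsky estimate of the pressure rise (Pa) caused by an
    instantaneous velocity change delta_v (m/s) in a pipeline."""
    return rho * wave_speed * delta_v

rho = 1000.0      # water density, kg/m3
a_grp = 450.0     # hypothetical pressure-wave speed in a GRP pipe, m/s
a_steel = 1200.0  # hypothetical wave speed in a steel pipe, m/s
dv = 2.0          # sudden velocity change, m/s (e.g. fast valve closure)

for name, a in (("GRP", a_grp), ("steel", a_steel)):
    dp_bar = joukowsky_surge(rho, a, dv) / 1e5  # Pa -> bar
    print(f"{name:5s}: surge of roughly {dp_bar:.1f} bar")
```

In a real study the wave speed depends on pipe wall stiffness and diameter, and reflections in the network can raise or lower the local peaks, which is exactly what the 1D transient simulation resolves.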

Interview



Multi-objective optimization on the timing system of a small 2-wheeler engine (SOHC): methodology and case study

In recent years, 2-wheeler engine companies have focused on increasing overall engine efficiency, which can be pursued through engine down-speeding, engine down-sizing and friction reduction. However, to maintain or improve vehicle performance, a corresponding increase in specific power is necessary. In line with these trends, the approaches and methodologies described here have been exploited during the development of the new Piaggio small scooter engine. In this project, a multi-objective analysis was applied to the valve train system design, in order to optimize the engine performance in terms of friction reduction, power curve and timing system response. These calculation methodologies were implemented using the modeFRONTIER platform for the multi-objective optimization analysis and the commercial simulation software GT-SUITE®. The several configurations investigated achieved the desired trade-off between appropriate, efficient valve train behavior and engine performance. The global engine friction reduction was obtained thanks to low resisting cam torque profiles and a specific valve spring set-up.

Introduction

The growing demand for fuel economy in recent years calls for the development and application of new efficient technologies, driven by rising fuel prices, more stringent government fuel economy standards and awareness of the increasing environmental impact. In this scenario, the control of engine friction (in other words, fuel efficiency) and of performance in internal combustion engines becomes decisive to provide a competitive vehicle for 2-wheel mobility, and numerical models and calculation methodologies provide important support in pursuing these goals.


The methodology followed is based on the development of an optimization model. It investigates and identifies the appropriate valve lift event shape, with the objective of maintaining or increasing engine performance while minimizing the resisting cam torque and satisfying the design constraints. The cam design started from a multi-polynomial valve lift curve, which is fundamental in designing the timing system to take into account valve train stability, durability and noise, as well as engine breathing. The valve train system is one of the major parts of an internal combustion engine: the valves draw the air and fuel into the cylinders and let the exhaust gas out. Therefore, the method used to design the valve lift profiles and the valve train components is essential in defining engine performance, valve train durability and NVH. The valve train system should be optimized to avoid abnormal valve movement (jump or bounce phenomena) throughout the engine's speed range. Further advantages of optimizing the valve events are that the cam-follower separation speed is increased, the valve spring margin is improved and the resisting cam torque is reduced. This allows the design of a new valve spring set-up, with lower loads and mass, in order to reduce friction. This work aims to demonstrate the effectiveness of the optimization methodology and its robust practical application to the scooter engine's valve timing: the described approaches have been exploited for the development of the new Piaggio scooter engine (125cc, 4-stroke, 1-cylinder, 4-valve), whose valve train arrangement includes a SOHC (Single OverHead Camshaft) with a roller follower. The project started from the existing valve lift profiles (intake and exhaust) of the 125cc 3-valve engine, which were initially adopted for the new 4-valve engine. The exploitation of the methodology described provided a new, optimized valve lift event configuration.

Valve lift profile design optimization

The valve lift event is one of the most important factors when improving an engine's performance by maximizing the area under the valve lift curve. It is also true, however, that each engine working condition requires a certain amount of charge to be trapped in the cylinder, depending on the engine speed to be optimized. It would be desirable to open and close the valves as quickly as possible, but the lift area integral is limited by kinematic constraints. Therefore, when a new valve lift profile is conceived, the designer needs to evaluate whether the guidelines are satisfied: maximum positive and minimum negative acceleration, valve-piston distance, Hertzian stress at the cam lobe-roller interface, minimum cam radius of curvature, spring margin to avoid cam-follower separation, etc. Manually modifying the lift event to satisfy the valve train recommendations, while getting direct feedback on the engine response across the operating speed range, can be time consuming. It therefore becomes important to automate the process of controlling the engine and valve train system behavior.

The automated process has been created within modeFRONTIER to perform a multi-objective optimization driving the calculation software (GT-SUITE®), used to evaluate engine performance and valve train system behavior. Figure 1 shows a snapshot of the workflow of the entire optimization model (to improve readability, the data workflow is represented by subsystems).

Fig. 1 - Snapshot of the workflow of the entire optimization process

The model workflow in Figure 1 shows two flows: a logical flow and a data flow. The first is the sequence of operations performed automatically by modeFRONTIER during the optimization. This logical flow consists of the initial DOE and the Scheduler node; these are the twin "nodes" that drive the whole optimization process. In particular, the scheduler is the "engine" that learns from the results of each simulation and plans the next step to improve the input variables and fulfill the objectives. Central to this process are two nodes built with GT-SUITE® to perform the operations needed to reach the goal. The data flow begins from the input variables and ends with the output variables, which can become objectives or design constraints (the details of this phase are described hereafter).

Input parameters for the multi-polynomial design approach

The optimization model starts with the definition of a valve lift event. A valve profile design process generally begins by defining the shape of the valve acceleration curve. In this work, a multi-polynomial representation is described and applied to the valve acceleration and, consequently, to the valve lift definition. The design technique used to identify the valve acceleration is based on a multiple-polynomial scheme: the profile is divided into a total of 14 zones, and the shape of the profile in each zone is modeled with a 5th-order polynomial. In this lift curve synthesis, the constraints on the derivatives of the lift are expressed in terms of non-dimensional design parameters. The half-cam acceleration curve schematic (excluding ramps), with the zone lengths and the non-dimensional design parameters, is shown in Figure 2 (a symmetrical lift profile has been assumed in the model described).

Fig. 2 - Half-cam profile acceleration curve

Therefore, 10 design parameters have been selected in the optimization analysis to modify the shape of the acceleration curve; they have been constrained in modeFRONTIER to vary within conveniently defined ranges, and are shown in Table 1. The total lift event duration, computed indirectly from the lengths of zones 1-6, has been constrained to maintain an appropriate value. The duration of the cam ramp is calculated by the solver based on the inputs for ramp type, ramp height and ramp velocity. Afterwards, once the maximum lift is specified, the solver is able to find a solution for the flank and nose accelerations (Figure 2). The valve timing anchor has been kept constant. The choices adopted in this application for the input variables, as well as for the output variables (in particular the objectives and constraints set for the optimization process), represent just one of the possible applications of this methodology: any other problem configuration could be applied and evaluated, in order to satisfy other design constraints and fulfill whatever objectives the user needs. The model built in modeFRONTIER follows a modular approach to the problem.

Table 1 - Design parameters defined in the model
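The synthesis step described above, an acceleration shape integrated into a lift curve that must satisfy kinematic checks, can be illustrated with a minimal sketch. The single-cosine acceleration shape and the numbers below are hypothetical simplifications; the real profile is built from 14 polynomial zones:

```python
import math

CAM_DEG = 120.0   # hypothetical lift-event duration (cam degrees)
MAX_LIFT = 9.5    # hypothetical target maximum lift (mm)
N = 1200
d = CAM_DEG / N

def accel(theta):
    """Hypothetical symmetric acceleration shape: positive flanks and a
    negative nose, chosen so the valve leaves and returns to the seat at rest."""
    return math.cos(2 * math.pi * theta / CAM_DEG)

# Double trapezoidal integration: acceleration -> velocity -> lift.
vel, lift = [0.0], [0.0]
for k in range(1, N + 1):
    vel.append(vel[-1] + 0.5 * (accel((k - 1) * d) + accel(k * d)) * d)
    lift.append(lift[-1] + 0.5 * (vel[-2] + vel[-1]) * d)

scale = MAX_LIFT / max(lift)          # rescale the shape to the target lift
lift = [scale * x for x in lift]

# Kinematic sanity checks of the kind listed in the design guidelines.
assert min(lift) > -1e-9              # no negative lift
assert abs(lift[-1]) < 1e-6           # valve back on its seat
assert abs(vel[-1]) < 1e-9            # ...and at (near) zero velocity
```

In the real workflow these checks (plus Hertzian stress, radius of curvature, spring margin, etc.) are the constraints evaluated automatically for every candidate acceleration curve the optimizer proposes.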

Engine CFD and valve train model node description

During the optimization loop, the first step consists in having the solver node run the engine simulation. The commercial code used to build the numerical model is GT-POWER® (the block diagram of the engine model is shown in Figure 3). The above-mentioned 125cc, 4-stroke engine has been modeled (1D simulation). The good predictivity of the model with respect to the engine-related phenomena can be seen in Figure 4.

Fig. 5 - Snapshot of the GT® valve train model

The graph in Figure 6 shows the good numerical-experimental matching in relation to the dynamic valve displacement at engine over-speed, confirming the good level of prediction of the model.

Fig. 6 - Valve displacement: kinematic and dynamic profiles: comparison between measured and computed

Fig. 3 - Snapshot of the GT-Power® engine 1D model

Fig. 4 - Engine torque and power curves: comparison between measured and computed

The next step has been to carry out the valve train simulation in the relevant solver node. The specific lumped-mass model has been built using GT-VALVETRAIN® (the schematic diagram of the valve train analysis model is shown in Figure 5, where a roller rocker arm layout has been evaluated).


GT-SUITE® and modeFRONTIER have been interfaced by means of the valve lift profile definition: the parameters of the multiple-polynomial scheme are the input variables in the modeFRONTIER environment. The output variables controlled during the optimization are the engine torque and power curves on the one hand, and the valve train kinematic and dynamic characteristic parameters on the other.

Logical flow and approach followed during the optimization analysis

The optimization process has been conducted on several levels, involving first a statistical analysis and then the optimization proper. A preliminary statistical analysis has been carried out to check workflow redundancies, to perform a sensitivity analysis and to simplify the subsequent optimization. The second step consists of a robust optimization for a global search, based on guidelines coming from the previous analysis. It has then been possible to rank the top lift profiles based on the valve train and engine performance results. The last step, to address user needs and targets for the specific application consistently with the computational time, could be a final refined optimization using an accurate optimizer, starting from the previous global search result, to precisely hit an optimum. This last step is not reported here, being beyond the scope of this work.
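Conceptually, the DOE-plus-scheduler logic reduces to sampling the design space, evaluating each candidate with the solvers, discarding infeasible designs and ranking the rest. The sketch below is purely illustrative: the analytic `evaluate` function is a hypothetical stand-in for the GT-SUITE® runs, and real schedulers plan adaptively rather than sampling at random:

```python
import random

random.seed(0)  # reproducible DOE

def evaluate(x1, x2):
    """Hypothetical stand-in for the solver nodes: returns
    (cam_torque, engine_power) for a candidate design."""
    cam_torque = (x1 - 0.3) ** 2 + 0.5 * x2 + 0.1
    engine_power = 10.0 - (x2 - 0.6) ** 2 + 0.2 * x1
    return cam_torque, engine_power

POWER_FLOOR = 9.8  # constraint: do not fall below the target power curve

# Step 1: space-filling DOE for the statistical screening.
doe = [(random.random(), random.random()) for _ in range(3000)]
results = [(x, evaluate(*x)) for x in doe]

# Step 2: keep feasible designs, then rank by the conflicting objectives
# (minimize cam torque while keeping power above the floor).
feasible = [(x, t, p) for x, (t, p) in results if p >= POWER_FLOOR]
best = min(feasible, key=lambda r: r[1])
print(f"best feasible design: x={best[0]}, torque={best[1]:.3f}, power={best[2]:.3f}")
```

In the actual project the screening result then feeds the scheduler, which narrows the variable ranges and concentrates further runs where the trade-off between the objectives is most favorable.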



The analysis has focused on minimizing the cam driving torque resistance, maximizing the engine torque at maximum torque speed rotation and maximizing the engine power at maximum power speed rotation. The constraints to be satisfied have been related to the most significant valve train recommendations and guidelines; additionally a target power curve as a lower boundary condition has been considered (this numerical curve has represented the engine performance before to perform the optimization process). Objectives and constraints investigated are shown in the Table 2.

Cases studied have shown that 3000÷5000 DOE input profiles provide enough resolution to explore the investigated design space. The “DOE Sequence” has been used to determine the general behavior of the examined objective functions. Table 3 summarizes the number of blocks used in modeFRONTIER model during the statistical analysis.

Table 3 - Number of blocks used in modeFRONTIER model during the statistical analysis

After the DOE table has been evaluated, it has been possible to postprocess the results extracting important information about the problem.

Table 2 - Objectives and constraints defined in the model

In Figure 7 the optimization process flow-chart implemented on modeFRONTIER is shown.

Data post-processing has shown there have been some inputs that have interfered insignificantly with outputs variables, while others have affected the output results in a more or less substantial way. This can be seen in the matrix correlation (Figure 8), that illustrates a first order dependency. The correlation value is a normalized index spanning from -1 to +1: a value equal to +1 (-1) denotes a full direct (inverse) correlation, while a low absolute value means low correlation. The same correlation is identified by shades of red (direct) and blue (inverse) color. It has been found that some variables are the least significant input variables and others have been found to affect significantly the results of the analysis (in relation to the ranges defined during the entire process). This is summarized in the Table 4. Finally it has been found that some pairs of objectives or outputs are negatively correlated, that means that such objectives (outputs) are conflicting and thus an optimization strategy should be used to find a good compromise. During the next step of optimization some design variables will be considered a constant and a more suitable range for other input parameters will be adopted, according to the indications coming from the statistical analysis. As an example, the Parallel coordinates chart (Figure 9) clearly

Fig. 7 - Optimization flow-chart

According to the flow-chart, the data flow starts from the 10 input variables, whose values define the valve lift event. After the analysis of each design is completed, the output values are generated. Once each numerical analysis has been performed (running the engine performance and valve train models), the results have been evaluated by means of suitable post-processing instruments: the best configurations are those that meet the chosen targets and guidelines.

Statistical analysis
A starting DOE has been selected for the statistical analysis, taking care that the set of configurations was representative of the whole design space and that the input values were not correlated, to avoid redundancies.

Case Histories

Fig. 8 - Correlation matrix chart: relationships between input and output variables
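The first-order (linear) correlation index shown in the matrix chart can be computed directly from the DOE table. A minimal sketch in Python — the variable names and values here are illustrative, not taken from the actual model:

```python
import math

def pearson(x, y):
    """Normalized linear correlation index in [-1, +1]."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical DOE columns: one input (maximum valve lift) and one output (cam torque)
max_lift = [8.0, 8.5, 9.0, 9.5, 10.0]
cam_torque = [10.1, 10.9, 11.8, 12.6, 13.5]

r = pearson(max_lift, cam_torque)  # close to +1: strong direct correlation
```

A value near +1 would be rendered in deep red in a chart like Figure 8, a value near -1 in deep blue.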

Newsletter EnginSoft Year 12 n°1 - 22


Table 4 - Input variables influence

indicates how variable sensitivity can be investigated by applying filters that narrow the ranges defined for the individual variables. In this type of graph, a set of parallel axes is first drawn, one for each variable, either input or output. Each design is then represented by a single line intersecting each variable axis at the value held by that variable for that design. Since the parallel coordinates chart permits the range of every single variable to be modified in real time, it can be used to filter the most interesting solutions in the database. Figure 9 shows the benefit of adopting low values for the maximum valve lift, particularly in pursuing the target of reducing the resisting cam torque.
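The interactive range filtering performed by a parallel coordinates chart can be mimicked in a few lines of code; a sketch, where the design database and the variable names are purely illustrative:

```python
# Each design is a dict of variable -> value; filters map variable -> (lo, hi).
# This mimics the interactive range filtering of a parallel coordinates chart.

def filter_designs(designs, filters):
    """Keep only designs whose variables all fall inside the given ranges."""
    def ok(d):
        return all(lo <= d[v] <= hi for v, (lo, hi) in filters.items())
    return [d for d in designs if ok(d)]

# Hypothetical database of evaluated designs
designs = [
    {"max_lift": 8.2, "cam_torque": 10.4},
    {"max_lift": 9.7, "cam_torque": 13.1},
    {"max_lift": 8.6, "cam_torque": 10.9},
]

# Restrict the cam-torque axis to low values, as done visually in Figure 9
low_torque = filter_designs(designs, {"cam_torque": (0.0, 11.0)})
# The surviving designs all have a low maximum valve lift as well
```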

Fig. 9 - Parallel coordinates chart (maximum valve lift vs minimization cam driving torque)

Similar considerations have been made using the method based on the Student's t-distribution, which again highlights the specific relations between all the inputs and a single output. This tool adopts both pie chart and histogram representations (Figure 10).
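The t statistic behind this kind of significance test can be derived from the sample correlation coefficient r over n designs, as t = r·sqrt(n-2)/sqrt(1-r²); a sketch with illustrative numbers:

```python
import math

def t_statistic(r, n):
    """t value of a sample correlation r computed over n designs (n > 2)."""
    return r * math.sqrt(n - 2) / math.sqrt(1.0 - r * r)

# Hypothetical case: correlation 0.6 observed over a 3000-design DOE
t = t_statistic(0.6, 3000)
# A large |t| indicates that the input's influence on the output is significant
```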

Fig. 10 - t-Student distribution charts


Analyzing these charts, it is easy to see that one of the benefits of this procedure is the possibility to understand how each cam design parameter affects the valve train system (in terms of kinematic, quasi-dynamic and dynamic characteristics) and the engine performance (torque and power curves). Using this post-processed data as a sensitivity tool, the cam designer is able to parametrically define the valve lift profile so as to change any output response, aware that the days are over when good simulation results came only after long working experience. Once the statistical analysis had been carried out, the following stage of the procedure was aimed at performing an optimization analysis, with the main purpose of defining one or more appropriate valve lift events.

Optimization analysis
During the optimization analysis a traditional MOGA-II algorithm has been used. This is a multi-objective genetic algorithm. The initial population size and the number of generations were chosen as a trade-off between robustness, accuracy and calculation time. Despite the choice of a genetic (and robust) algorithm, particular care has been taken to provide a DOE able to cover the domain of the functions sufficiently, so that it can provide multiple starting points for the optimization. Run times have been reduced by launching multiple designs simultaneously. The percentage of feasible designs depends on how strict the constraints are. The conflict between negatively correlated outputs or objectives has to be considered too. The calculation time also depends on how the problem has been formulated. In this work, the choice adopted has been to evaluate unfeasible designs too (from the point of view of performance) and to run the valve train analysis (the solver node that follows the engine performance node) regardless of the result coming from the performance node.
In fact, the design constraints have been defined rigidly and therefore it has been preferred to evaluate even the designs not fulfilling all the constraints. Alternatively, it would have been possible to include a logic “if” node in the optimization model and redirect the workflow path to the exit node whenever the performance constraints were not respected. During this phase the number of inputs has been reduced (from 10 to 7 variables). Additionally, a more suitable range for all the input parameters has been adopted. The optimization analysis has generated a small number of feasible designs, demonstrating that the problem formulation was rather rigid (in terms of design constraints) and highlighting the inverse correlation characterizing the output variables (as shown in the scatter matrix in Figure 11, which confirms the trends of the statistical analysis). The scatter matrix chart contains three kinds of information: the Probability Density Function chart

Fig. 11 - Scatter matrix chart



for each variable (along the diagonal), all the pairwise scatter plots (above the diagonal) and the correlation values between the variables (below the diagonal, as described in relation to Figure 8). Nevertheless, the few feasible designs generated reached the goals planned for the analysis, respecting the settled constraints. With more than two objectives, the so-called “Pareto Frontier” (the set of all non-dominated solutions in the search space) is no longer a curve and becomes a hyper-surface. The modeFRONTIER user can instead use a Bubble chart representation (Figure 12), where the two Cartesian axes represent two objectives and the third is depicted by the bubble size. The feasible designs lying on the Pareto Frontier are circled in green in Figure 12.
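Extracting the Pareto Frontier from a set of evaluated designs amounts to a non-dominance filter; a minimal sketch, assuming every objective is to be minimized (the design values are illustrative):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset (minimization of every objective)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical designs evaluated on two conflicting objectives
designs = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0), (5.0, 5.0)]
front = pareto_front(designs)  # the trade-off set between the two objectives
```

With two objectives the result traces a curve; with three or more it becomes the hyper-surface mentioned above.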

Fig. 12 - Bubble chart and Pareto Frontier

The three most promising solutions are highlighted in blue in the Parallel coordinates chart (Figure 13).

Fig. 13 - Parallel coordinates chart showing the three most promising solutions (the arrows point out the objectives to minimize or maximize)

According to the priorities assigned to each objective, a unique final design has been selected. The valve lift and acceleration profiles of the original and the optimized solutions are compared in Figure 14. The new valve lift has provided the valve train results displayed in Table 5. The impact on the engine performance at full load is depicted in Figure 15: despite a slight worsening at low engine speeds (within the tolerance constraints), the optimized curves display a numerical improvement at the two engine operating conditions considered, in particular at high speeds. The improvement is modest because of the problem formulation, in which many objectives and design constraints overlap one another. The multi-objective approach permits the analyst to assign a priority to each objective or constraint; making the thermodynamic objective the top priority would have been another possible route (to the detriment of the valve-train aspects). The entire optimization procedure has been applied both to the intake and to the exhaust valve lift (the latter analysis is not reported, being beyond the scope of this paper). The graph in Figure 15 takes account of both optimized diagrams. As shown above, the effects on gas exchange have not involved any significant variation in engine performance.

Fig. 14 - Valve lift and acceleration profiles: comparison between the original and the optimized solution


Table 5 - Optimized valve train characteristics



In this project the optimization procedure for the intake valve lift event has been described. The same approach has then been applied to the exhaust valve lift (as mentioned earlier, this last analysis is not reported, being beyond the scope of the present work). The whole valve train system has benefitted from this optimization procedure, which involved a new valve spring setup and a new cam design. These changes have reduced engine friction and improved stability and durability thanks to the decreased stress, while the engine performance has slightly improved.

Fig. 15 - Engine indicated torque and power curve: comparison between original and optimized solution (highlighting the engine speeds most significant for the analysis).

Fig. 16 - Cam torque: comparison between original and optimized solution.

The graph in Figure 16 shows the improvement in resisting cam torque, and therefore the friction reduction, between the original and the optimized curve at over-speed. The original data refers to the cam design initially adopted for the 4-valve engine, before the adoption of the current methodology. As a result of the new optimized valve lift event, it has been possible to modify the valve spring characteristics (Figure 17), using a new spring setup with lower loads and mass in order to reduce friction. The original data again refers to the valve spring configuration used prior to the optimization analysis.

Conclusions
With market pressure to decrease the environmental impact of new engines, and with increasing software reliability, it is imperative to find new efficient methods to improve existing solutions through numerical optimization approaches. The idea at the basis of this paper has been to build a numerical tool for valve lift profile definition, focusing on the behavior of the valve train system and on engine performance. It thus becomes possible to have a new, robust methodology to test valve lift solutions effectively and efficiently. In this work modeFRONTIER has been used to perform a multi-objective optimization. The multi-polynomial valve lift has been the starting point to drive the integrated procedure in optimizing the cam definition with a view to the valve train behavior and the engine performance. The optimized profile has been found with the use of genetic algorithm tools. An integrated approach between the modeFRONTIER platform and the GT-SUITE® numerical code has therefore been implemented in order to build a methodology to define the valve lift event and to support the analyst during the design of a cam profile. This work has focused on the valve train system because it is an integral part of any engine, is closely related to flow efficiency and engine performance, and impacts the durability and friction of the timing system. Firstly, a statistical analysis has been performed, which sped up the optimization phase by reducing the complexity of the problem, limiting the number of variables and their definition ranges. The subsequent optimization has then been used to improve the timing system lift event, in terms of the valve train system's dynamic characteristics and the thermodynamic performance requirements. Particular care has been taken to verify some important characteristics, such as the engine power and torque, the resisting camshaft torque, the cam-follower separation speed, the Hertz stress, etc.
This approach has allowed the simultaneous modification of both the valve spring set-up and the cam profile shapes, in order to obtain the required response in terms of engine friction reduction; furthermore, the whole timing system has benefitted from this procedure, improving stability and durability. To conclude, this project demonstrates the use of state-of-the-art simulation tools and their correct implementation in the development process. It also highlights the great benefit of such a process in the development of Piaggio's new small 4-valve scooter engines.

Marco Serafini, EnginSoft
Francesco Maiani, Piaggio & C. s.p.a.

Fig. 17 - Valve spring characteristics: comparison between original and modified solution.


Ask the expert in Optimization technologies Francesco Franchini, EnginSoft f.franchini@enginsoft.it



Bonfiglioli: a complete range of gear motors for industry
“After having considered the best alternatives available on the market, we set up a pilot project in collaboration with EnginSoft with the aim of accurately evaluating modeFRONTIER, so as to avoid any possible risk; the results of this activity have fully satisfied our needs, thanks also to EnginSoft's expertise and competence in this sector.”
“All in all, we are confident that our choice is the best solution on the market and in line with the style of Bonfiglioli Riduttori SpA, which is oriented to invest in the best technologies so as to always offer its customers innovative solutions in accordance with market requirements.”
Eng. Andrea Torcelli, Research and Development Manager, Bonfiglioli Riduttori SpA

Bonfiglioli Riduttori SpA designs, manufactures and distributes a complete range of gearmotors, drive systems, planetary gearboxes, inverters and photovoltaic solutions to satisfy the most challenging needs in the fields of industrial automation, mobile machinery and renewable energy. Bonfiglioli offers tailored solutions, whose strength lies in the advanced content of each product and the intelligent integration of different technologies. As a leader in global power transmission and control, Bonfiglioli is committed to satisfying its customers' requirements by supplying high-quality products and providing excellent service on an increasingly wide scale. Bonfiglioli has always worked to integrate the complex requirements of growth and productivity generated by its development. The essential element of its strategy has been the creation of market-oriented business units, able to design tailored solutions meeting customers' needs. The four business units are: Industrial (with mechatronics and power transmission divisions), Photovoltaic (also focused on regeneration solutions analysis), Wind and Mobile.


Why FRONTIER in design
The pilot project at Bonfiglioli has been developed around one of the company's recurring applications: the performance optimization of a braking system. The project was selected also considering the possibility of integrating an MDO tool into the design and analysis flows of the company. The main objective of the project was to determine the optimum geometric configurations of the braking system, able to guarantee the best performance from a thermal and dynamic point of view in accordance with the design constraints and specifications. The analysis procedure, previously managed in Bonfiglioli using complex Excel sheets, has been integrated and fully automated within the modeFRONTIER multi-disciplinary optimization platform, which has made it easy to manage different mono-objective and multi-objective optimization processes according to the user's needs. In this way, over 1,400 different configurations of the system have been automatically analyzed in half a day, achieving improved results both in terms of system performance and in the development



time of the analysis process. As far as mono-objective problems are concerned, just a few minutes are enough for a full optimization, compared with the 4-8 hours required by the manual approach. In this way, thanks to modeFRONTIER, the optimum configurations relative to the starting nominal conditions have been determined, obtaining an improvement ranging between 18% and 44%, depending on the objective. The customer has highly appreciated the possibility of automatically analyzing a much larger number of configurations in the same time frame (from 10 to 1,500), allowing a real and exhaustive exploration of the whole design space. The next natural step will be to integrate into the optimization flows the CAD and CAE tools traditionally used by Bonfiglioli to develop and analyze its products.

HPC Computing: new perspectives in ease of use
The use of HPC within science has always aroused a lot of interest and great expectations, but has not always paid off adequately. Over the last seven years, the computational power of a single workstation has considerably improved to match the requirements of new software. In CFD, electromagnetic and non-linear structural mechanics applications, a clear difference in solver efficiency can be observed, even for the most common software, both open and proprietary. Fluid dynamics is the discipline that has always considered this solver efficiency a “must” rather than something to wait for: CFD solvers in fact reach parallel efficiencies close to unity (when the number of cores doubles, the computation time is halved). The situation for electromagnetic and non-linear structural mechanics applications is very different, since it is harder for solver development to achieve a balanced load across the cores. One of the biggest problems in these disciplines is the complexity of the algorithms and equations, and its effect on inter-core communication speed (the cores may be spread over different workstations). A fundamental characteristic is the communication speed and latency between the different workstations used in a parallel computation: nowadays speeds of 40 Gb/s are available with only a few microseconds of latency, roughly two orders of magnitude better than the “usual” corporate network. In order to achieve such performance, it is necessary to have dedicated devices and communications in clusters


(that is, a homogeneous group of workstations fully dedicated to numerical computation) rather than normal workstations adapted ad hoc. New platforms and processors released in March 2015 can certainly simplify computation for the lowest segment (i.e. around 10 million degrees of freedom). It is possible to have a workstation with 36 cores, suitable for in-house management using the company's IT resources, and able to meet most model-size requirements. Such a system cannot be expanded further, and the real bottleneck is the correct sizing of the resources (RAM and disk): these machines require at least 128 GB of RAM to avoid situations where the whole investment would become fruitless. Another relevant aspect is the cost of these solutions, their life cycle and their maintenance, so as to guarantee the expected performance in terms of speed and number of analyses. EnginSoft, thanks to its great experience in proposing customized HPC solutions and managing them, is able to offer both turnkey solutions (hardware and software) and consulting services to evaluate the correct sizing of the HPC computation system for any company. The HPC solutions maintained for customers today total more than 2,000 computation cores, with almost 100 TB of RAM and 1,500 TB of managed disk.

Ask the expert in HPC: Gino Perna, EnginSoft - gino@enginsoft.it
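The solver efficiency discussed in this section is the standard parallel-efficiency metric: with T1 the single-core time and TN the time on N cores, efficiency = T1 / (N · TN), so a value of 1 means that doubling the cores halves the computation time. A quick sketch with illustrative timings:

```python
def parallel_efficiency(t1, n, tn):
    """Speedup per core: 1.0 means ideal scaling (doubling cores halves time)."""
    return t1 / (n * tn)

# An ideal CFD-like solver: 100 h on 1 core, 50 h on 2 cores
ideal = parallel_efficiency(100.0, 2, 50.0)   # 1.0

# A communication-bound solver: on 2 cores the time only drops to 70 h
bound = parallel_efficiency(100.0, 2, 70.0)   # about 0.71
```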



Hardware resources optimization at Pierburg Pump Technology
The use of simulation software has become more and more relevant in any company willing to improve its products. Finding the right balance between budget, software and the best computation workstation is a hard challenge. To face it, we counted on EnginSoft, whose technical equipment and in-depth knowledge of the HPC sector allowed us to find the right solution and to get the most out of it. The first phase focused on evaluating the resources necessary for the simulation software, leading to simulation tests performed by EnginSoft on different workstations. The aim was to evaluate which technical characteristics of the workstation hardware most affected the computation time, such as the processor speed, the quantity of installed/used RAM and the disk read/write speed. Once we understood which hardware components affect a workstation's performance, we optimized our choice following EnginSoft's suggestions and adopted the workstation with the right configuration. EnginSoft supported us in the workstation installation and optimization. The importance of configuring a system correctly is frequently


underestimated, but it is essential to obtaining the best computational performance: limiting the software on one hand and customizing the system on the other. What is the final result? The tests carried out on the workstation have proved that our choice was the correct one, and our results have been even better than those from EnginSoft's tests. The perfect balance between budget, software and hardware.

Raffaele Squarcini - Pierburg Pump Technology The company Pierburg has traditionally been one of the closest partners to the automotive industry by successfully advancing the development of the automobile. Decades of experience, combined with comprehensive, innovative and globally acknowledged capabilities in every aspect of the engine, are factors that have driven Pierburg in its mission of repeatedly developing and manufacturing forward-looking components, modules, and systems. The wide portfolio of Pierburg ranges from oil and water pumps to vacuum and purge pumps: Pierburg Pump Technology S.p.A offers high-technology solutions to guarantee the best functionality and reliability of the vehicle. All these developments help to create an economically and ecologically balanced automobile. Gaining significance in the development of new engine generations are factors such as reduced fuel consumption, lower emissions, and enhanced performance, comfort, and safety. As established in the past, Pierburg is still continuously shaping the future of the automobile.



Optimization of the curing system of a Hot-Box Core with MAGMA C+M
Several manufacturing processes involve transforming raw material by supplying heat, using heating systems that are frequently designed only for effectiveness (product quality) rather than efficiency (cost). The kind of heat source and the way heat is transmitted to the transformation system affect costs in proportion to the production volume. For this reason, the energy required by a hot production process should be taken into account in the design phase, at least for mass production. The core-making process for the core of a ventilated cast-iron brake disc has been simulated using MAGMA C+M. The analysis of the thermal results has made it possible to estimate the power consumption and to validate the numerical model by comparison with metrological measurements of the real process. This reference simulation has then been used to compare three different virtual design alternatives, aiming at identifying the configuration with the optimum energy performance.

Description of the reference process
The reference process is the automatic core-making of a hot-box core for the mass production of ventilated brake discs in cast iron. The consolidated heating system has been in use for a long time and does not present any critical aspect as far as core quality is concerned; it has therefore been assumed to comply with the appropriate curing conditions. The necessary equipment consists of four core-boxes, each made of two H11 steel inserts fixed to two heating plates. The plate heating system consists of nickel-chrome alloy cartridges driven by a temperature feedback control using thermocouples embedded in the inserts. The blend shot into the cavity consists of premixed silica sand with an organic binder for hot-box curing. Each cavity has a volume of 1 liter and requires a total cycle time of about 1 minute.
MAGMA C+M reference model
Leaving the kinematic details of these phenomena aside, the energy conservation principle is taken into account; a feasible


hypothesis is to disregard the thermal effect of the shot, due to its very short duration. The starting condition is therefore the state just after core shooting, in which the cavity is completely and uniformly filled with the hot-box sand-binder blend. The MAGMA C+M computation code allows a precise definition of the thermo-physical characteristics of the materials in use, including the specific thermal transmittance between the contact surfaces and the feedback control of the heaters. Specific attention has been paid to the geometry, so as to obtain a final representation faithful to the real domain, in which all heat-transmission phenomena at the boundary can be considered negligible in relation to the transformation under investigation.

Fig. 1 - Measurements of the electric energy absorption



Fig. 2 - Thermal regulation of the equipment

Fig. 3 - Temperature distribution in the section of the original equipment

To validate that the virtual model faithfully represents the thermal evolution of the equipment in use, the real and simulated ON/OFF switching frequency trends of the heating cartridges have been compared. The validation of the energy consumption has been possible thanks to electrical measurements on the power line of the heaters of some equipment (Figure 1). The model has been simulated until thermal regime conditions were reached and beyond, allowing the system to settle down. A further validation has come from the comparison between the real and virtual regulation times, which proved satisfactory (Figure 2). The analysis of the results obtained from the reference simulation has allowed the compliant curing conditions to be formalized, also defining the temperature range and standard baking times. Specific conditions have been detected in which the binder either does not cure sufficiently or burns, preventing the core from acquiring the required mechanical properties. Furthermore, the temperature distribution has highlighted a pronounced thermal imbalance of the heating plates, which could explain the non-homogeneous curing detected on the cores (Figure 3).

Analyzed solutions and results
Once the virtual model had been validated, and taking into account the knowledge provided by experience, some further solutions were proposed, considered in line with the project objectives (quality and cost). Three different configurations with homogeneous heating have therefore been implemented, applying the same boundary conditions and material properties as before. Each new heating configuration is the result of a close collaboration between Fonderia di Torbole and EnginSoft, combining the creativity of both groups with the feasibility of the design and the validation of the related virtual model.
The first solution analyzed has shown how the efficiency of the current equipment can be increased by improving the heating control system and introducing insulation. From a quantitative point of view, an estimated saving of 38% is possible.


The second solution analyzed applies advanced electrical heaters installed directly in the inserts, so as to induce selective heating of the core and dramatically reduce the amount of metal to be heated. The plant investment is rewarded by an estimated saving of 60%. The third solution is an oil-heating system thermo-regulated by a methane-fired unit, chosen to explore a cheaper energy source that does not require any transformation into electricity. This third solution requires a complete re-design of both the equipment and the heating system, as well as a different power supply. Its estimated performance is similar to that of the second alternative when the unit efficiency is considered; the overall saving in this case is around 85%.

Conclusions
A virtual approach has been adopted to analyze the curing process of a hot-box core for mass production using the MAGMA C+M software. Three different virtual alternatives for the curing system have been implemented, aiming at improving energy efficiency. Comparing the simulation results has shown that all configurations provide an improvement in performance, reducing the energy consumption, balancing the temperature distribution in the cavity parts (Figure 4) and minimizing the heating and/or cycle time, in proportion to the plant investment requested. Assuming that the average curing power is 20 kW, that the production volume is 800,000 pieces/year at 2 pieces/cycle with a cycle time of 60 s, and that the electric energy cost is €0.15/kWh, the energy cost of keeping the process as it is would be €20,000/year. With this quantification, it is clear that the return on investment is extremely favorable, with savings of 38%, 60% and 85% respectively for the three proposed solutions.
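The yearly energy cost quoted above follows directly from the figures stated in the text; a quick check:

```python
# Figures stated in the text
power_kw = 20.0            # average curing power
pieces_per_year = 800_000
pieces_per_cycle = 2
cycle_time_s = 60.0
cost_per_kwh = 0.15        # euro per kWh

cycles = pieces_per_year / pieces_per_cycle      # 400,000 cycles/year
hours = cycles * cycle_time_s / 3600.0           # about 6,667 h/year of curing
annual_cost = power_kw * hours * cost_per_kwh    # about 20,000 euro/year
```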
The comparison of the virtual results has allowed the results expected from the design hypotheses to be validated, providing valuable support for decision-making.

Ask the expert in MAGMA C+M: Lorenzo Trevisan, EnginSoft - l.trevisan@enginsoft.it

Fig. 4 - Temperature distribution on the surface of the cores manufactured with 4 investigated solutions



Accelerating the Innovation of Design Processes using CAE and 3D Printing
The Lion Corporation
For more than 120 years, the Lion Corporation (LION) has developed and manufactured a wide variety of toiletries which many consumers use on a daily basis: soap, detergents, and oral hygiene products. In the company's Packaging and Container Technical Research Center, Mr. Nobuhito Nakagawa and Mr. Yoshiyasu Murata have focused on the design and development of packaging for various LION products for many years. Because their products are used by consumers of different demographics, LION puts great emphasis on tolerability, ease of use, and environmentally friendly designs and ingredients, to ensure usability for customers of all ages. The company applies and relies on digital engineering technologies, which its engineers have adapted for their design and development processes. Thus, the evaluation of ideas and plans for designs has become more flexible and efficient than ever. Today, the engineers at LION are concentrating on exciting new designs which call for the use of the most advanced engineering techniques. The LION Research Center has further improved its design processes by combining CAE and 3D printing for packaging design. The company started to use CAE almost 30 years ago, at a time when it was not yet common in the Japanese manufacturing industry, and CAE has been used for various design studies since then. LION has been able to further its innovative design and manufacturing by constantly upgrading its CAE technologies and taking advantage of 3D printing - a method that has received a lot of attention recently. In this article, I would like to present an interview, which I had the pleasure to conduct, with Mr. Nobuhito Nakagawa and Mr. Yoshiyasu Murata, about their CAE case studies, examples of 3D printer use, and the skills and techniques they count on to further enhance design performance.
Akiko Kondoh: Can you tell us about the CAE environment in LION? Mr. Nakagawa: One of the key reasons for the growing importance of CAE in LION over the last 30 years, can be explained with the rules and


regulations of the Container and Packaging Recycling Law in Japan, which was established in 1995. This law demands that manufacturing companies cover recycling fees depending on the amount of plastic material used. In a trial calculation, we found out that the cost was becoming huge, and subsequently we began to focus on saving container weight. It had become a crucial factor for cost reduction. Around the same time, refillable products had become popular, and we decided to conduct a study of refill containers with thinner bottles. CAE showed immense power here. It strengthened our identity; it helped us respond to the need for lighter containers. Since that time, CAE has been used more and more often and our teams are aware of the importance of these technologies for our developments. Mr. Murata led the CAE team before and after this phase. In the past, CAE was just regarded as a technology that could become useful in the future and that might be worth giving a try. Then in

Case Histories


the 1990s, CAE came into practical use and, finally, the ultra-thin bottle was developed by Mr. Murata's team. The design received the Kinoshita Award, a highly esteemed recognition in the Japanese packaging industry. The award and other successes underlined the importance of CAE for the company's product development. Today, we still work mainly with LS-DYNA. As there were only a few software tools available on the market when we started using CAE, we naturally chose LS-DYNA, which had a high reputation as a solver. The solver is a very important tool because it influences the results directly. This is why we haven't changed the solver environment since then. We always compare the real phenomenon with the simulation results, which usually means comparing against old data, and when old data is to be used effectively, I believe the best thing to do is to continue using the same solver.

Fig.1 - The comparison of the unloading restoration between the new design (above) and the old design (below)

Akiko Kondoh: Can you please share some examples of the use of CAE at LION?
Mr. Nakagawa: As a first example of our work, I'd like to mention the analysis of blow-molded bottles, such as squeeze bottles for toilet cleaners. We assess compression, pressure increase and reduction, as well as the squeezing phenomenon. As for compression, experienced designers are able to find the areas where problems occur and where adjustments in the manufacturing process are needed; they also know how to stay within budget, so simulation is just for checking. For the evaluation of the squeezing phenomenon, however, simulation is indispensable. The performance of a squeeze bottle depends on the structure of the bottle shape, and it is very difficult to control. If the product specification says that 10ml is the amount to dispense from the bottle, we need to verify the dispensing behavior. For example, we have to make sure that the bottle does not dispense 20ml, that it is easy to squeeze until 10ml have been dispensed, and that it becomes more resistant beyond 10ml. These deformation profiles must be determined at the design stage, so that we can modify

Mr. Murata: Both factors, the feedback from users (the consumers) and the simulation of a realistic design, allowed us to achieve a high-quality bottle. The designer in our team, who is also a human engineering specialist, is in charge of the feeling tests: she monitors the design, and I handle the simulation. We can indeed say that the interaction between human engineering and simulation is successful.
Mr. Nakagawa: We also simulated human chewing motions within our feeling-test program. In general, chewing solid gum is regarded as good for improving teeth alignment, because the movements spread out the palatine bones. To simulate this, we cooperated with dentists who had already done research on the phenomenon. Normally, a long-term evaluation is necessary to determine which factors cause bone growth, but in this case we simulated the chewing of gum over a short period of time. The actual phenomenon is that the center of the two plate bones, called the palatal suture, is deformed by the chewing motions. If deformation occurs at this center, which is the growth point, the body responds by adding tissue and bone; the bones are then pushed outwards and the mouth broadens. By continuing these actions, the teeth become well aligned with age.

Fig.2 - Stress concentration at the center of the upper jawbone, which is generated by the chewing motions (solid gum)


the design before production. Hence, it is necessary to start the simulations at the design stage.
Mr. Nakagawa: Apart from engineering issues such as the compression strength of bottles, the squeezing phenomenon also has to be seen in the context of the user and how he or she experiences the squeezing. User perception differs depending on the squeezing profile, even if the final required force is the same. For example, if 30N are necessary, one possible squeezing profile would gradually increase the force up to 30N, with a sharp increase at the end; another would start squeezing at 30N and gradually decrease from there. The perception would be different for the user in each case. This means that the simulation results are only one part of our evaluation: we also always take into account how people feel and perceive the use of the product.

The dentists had old data of children’s long term growth records. However, the effects of chewing gum had not been



Mr. Murata: To evaluate whether a handle will break, we simulate stress and deflection under load, as well as a weight being dropped onto the handle. As we recommend brushing your teeth with a load of about 200g, the flexibility of the handle under a 200g load is the guidepost for our design. We use simulation to accommodate requests from users about the handle's flexibility, and opinions from designers about its visual appeal.

Fig.3 - Differences in teeth brushing depending on brush bristle shapes

recognized during this time. So we decided to evaluate it using engineering simulation. Over the course of a few months, tests were also carried out with elementary school students who collaborated with us. The simulation results were within the range of our expectations, and after our research, a new, adapted chewing gum product was commercialized.

Mr. Nakagawa: In further research on teeth actions, we measured the muscle strength used for chewing by means of electromyography. Here, it was necessary that the subjects did not move and that the chewing speed was defined with metronomic regularity. This was a challenging experiment: the procedures were fixed so that the resulting wave patterns could be handled easily, and the same chewing motions had to be repeated in different orders using a variety of gums. Thanks to this experiment, we found that the nostrils were distorted because the chewing actions influenced the palatine bones under the nose. This could explain why sinus pain often eases after chewing gum. I plan to continue our experimental studies, including electromyogram measurements, whenever there are questions that cannot be answered by simulation alone. Of course, our main goal is not to explore human engineering; it is simply one of the techniques we use to develop and manufacture good products, and we rely on such tools and ideas for better products today and in the future.

Mr. Nakagawa: The simulation of a toothbrush is one of our long-standing case studies. At the beginning, we used simulation for the strength design and deflection control of the toothbrush handle. Earlier toothbrush handles were made from a single material and the design evaluation was easy. Later, handles started to be made from two different materials to improve design and functionality, and as a consequence the design problems became more difficult.
If the color combination of the two materials is changed for visual appeal, the amount of core material is often reduced. Hence, the functionality depends on the design, which creates challenges for our engineering.


Mr. Nakagawa: Although the tip of the brush contributes to the overall functionality, the development of the brush tip and the handle are separated in the design process. The combined product shape only becomes apparent in a later design phase; until then, we focus on the handle design for the required functionality. The basic design of our core products is evaluated thoroughly using simulation. As there are so many kinds of toothbrush products, we do not simulate each type.

Akiko Kondoh: Can you please outline the use of 3D printing at LION?
Mr. Nakagawa: We introduced a 3D printer in 2007 as part of our digital process innovation. In our small organization, all the different specialists who support our manufacturing (designers, design engineers, CAE engineers and prototype engineers) work closely together. This is why and how we could introduce not only CAD and CAE, but also 3D printing, a leading-edge technology, at an early stage. It was put into practical use and represents one phase of our digital process. The evaluation targets of CAE and 3D printing are different; each has its own role, and everything is related to manufacturing. For example, it takes about 12 design trials to create one product, and about 10 of these trials are output using 3D printing. We can find the problems of a design at this point, and simulation is not necessary here. We discuss how to improve and narrow down the optimal designs, and then we simulate them as one of the final checks.

Fig.4 - The comparison between the traditional and current development flows



One example of the use of 3D printing is the evaluation of the residual quantity in a bottle. When taking the pump out of a bottle after using the liquid inside, the residual quantity is very small if the liquid has a low viscosity like water. However, a certain amount of liquid remains if the viscosity is high and the liquid surface does not level out. In this case, the liquid inside does not come out even when we feel that the bottle has been exhausted; the problem is that there is still some residual liquid, and we can hear it when shaking the bottle. To examine this phenomenon, we used 3D printing; it would have been too time-consuming to use CFD for it.

We also applied 3D printing to the design of a bottle cap for an ultra-condensed liquid detergent product. The detergent is a new-generation eco product which offers the same washing power with only half the amount of detergent. It can be packaged in a smaller bottle and thus contributes to transport efficiency. We wanted to develop a suitable bottle for the new detergent; our goals were innovative design and user-friendliness: it should be easy to measure and refill, and hard to drip. To evaluate and develop the bottle, we created more than 90 kinds of prototypes using 3D printing, for the nozzle cap alone, and chose the final design after completing this phase.

Our organization's strength is that we have different technologies and ways of solving problems within the same team. I think there are not many organizations today that use both CAE and 3D printing. It is an advantage for us that we are a small company and that many different skills are available in our undivided organization. We cannot compete with bigger companies by doing the same things they do.

Akiko Kondoh: What are your plans for CAE in the future?
Mr. Murata: We are thinking about starting to use CFD, and we will probably choose the CFD tool that is best for us at that point in time. Mr.
Nakagawa: Our detergents and other products are high-viscosity fluids, so we have to design and develop bottles that are best suited to the detergents' viscosity and behavior. The characteristics of new fluids differ from existing ones, and this may change again with our product development. When we receive compositions that are under development, testing bottle prototypes with 3D printing may lead us to a final decision faster than simulation would. However, experimentation does not give us information about the whole process: it is good for observation, but evaluating the test results only means that we see the static situation after the phenomenon has ended. Therefore, we want to use simulation for evaluating why and how a problem occurs. For instance, when there is a problem with a completed composition during the manufacturing process, we use simulation to improve manufacturing efficiency by changing the manufacturing conditions, not by changing the composition itself. If we can use simulation efficiently in this way, I would also like to apply these technologies to the development of our fluid products.

Akiko Kondoh: How do you encourage the use of CAE and increase its effectiveness in your product development?
Mr. Nakagawa: CAE requires complex skills. Material mechanics is pure theory, and calculating it on a computer is CAE. Apart from knowing the theories and CAE itself, it is also important to understand the solvers and pre- and post-processors. Moreover, general skill in using computer software tools is required. Of course, experimenting is necessary, and making theoretical models is also needed, because complicated real phenomena


Fig.5 - Design of a bottle cap for an ultra-condensed liquid detergent product

cannot be recreated. I also have to say that choosing CAE software tools requires business management skills on top of technical know-how.
Mr. Murata: All factors, from business management to simulation methods and how to prepare models, should be considered thoroughly. All this knowledge is necessary for our CAE engineers, and in fact it is difficult to gain such a wide range of experience. We always try to solve problems ourselves whenever there is a concern. I believe it is important to see and understand why a phenomenon occurs, clarify the problems, and perform the appropriate simulation steps to solve them.
Mr. Nakagawa: We always try to keep our young engineers motivated. In many manufacturing companies today, designers take center stage while CAE engineers tend to work behind the scenes. Especially in larger corporations, it might be difficult for the CAE department to express and pass on opinions to the design department. In our company, the organization is compact and well structured, so we can always pass on comments if we find a problem in the simulations ordered by the designers. It is vital to keep young engineers motivated, and I believe we can make good products in the current organization by keeping our communication alive.
Mr. Murata: The power of our organization can be compared to a conversation between two people that produces the same output as one involving three or four colleagues. CAE plays a key role in the design process, and this is important for our motivation. Our CAE engineers want to give something extra back to the designers after verifying their designs by simulation: even if a design seems to be fine, we would love to be able to suggest a slight improvement for an even better one. This kind of proactive communication in the organization creates innovative ideas and is very helpful for our design work.
Mr. Nakagawa: Our company's core strength is that it is a well-structured organization.
This gives us the flexibility to implement any kind of 3D printing, CAD and CAE tool. By using these tools efficiently, we can clarify their position and impact, and define the best role for them in our design and product development.

Summarized and presented by Akiko Kondoh, Consultant for EnginSoft in Japan



Integrated Life Cycle Design for Buildings: Invest in Simulation to Increase Comfort and Environmental Sustainability at a Reasonable Cost

With the evolution of both technological and cultural aspects of sustainability, the building industry is facing the need to reconcile energy savings and environmental and economic sustainability during both the construction and operation phases. The more comprehensive theme of "life cycle sustainability" involves all members of the production chain, from designers to final users, and it increases attention to different aspects, ranging from the capability to meet final users' needs (safety, comfort and aesthetics) to reducing energy dependence and environmental impact (consumption of natural resources and materials, and emissions). However, it often happens that the "life cycle thinking" approach turns out to be nothing more than an "expanded" look at the global process, while it hardly ever involves an effective effort to break down and analyze each single contribution. The latter requires a strong change in the building industry. As with manufactured products, there is a need for software tools and methods that can grasp and integrate all the opportunities offered by materials and technologies.

BENIMPACT PROJECT: PROGRAM AND OBJECTIVES
Definition of the technical-technological and functional requirements of the work-flow
In 2008, EnginSoft started to think of a workflow to support the building design process by taking into account not only energetic performances but also environmental aspects and costs, as well as considering the entire life cycle/period of use (construction, management, and disposal/recovery). As a first step, methods and simulation software available on the market, and their diffusion, were compared to identify the most suitable approach for each evaluation. Functional specifications
A decision support tool for considering, analyzing and optimizing energetic, environmental and economic performances would be helpful - especially during early design phases, when it is really important to:
• Identify issues
• Identify design alternatives
• Highlight differences
• Give a consistent point of view
• Consider multiple criteria

Fig. 1 - Input/output and data workflow in energy simulation software



• Envelope: floors, opaque vertical closures, windows, roof. • Installations: generation systems (conventional and renewable) and the distribution grid. • Energy demand (quantity and type of energy source) required for heating, cooling and domestic hot water production.

This methodology was developed by considering the whole building as the "object of the analysis", taking into account impacts related to raw material extraction, the construction of the building and its components, and the use of the building. In particular, the latter includes both energy consumption and the maintenance/replacement of main elements, but not disposal/recycling, which largely depends on the specific characteristics of each case.

Development of simulation and optimization methodologies and strategies
The optimization process of a complex system such as a building involves an integrated design approach that takes into account the various disciplines and parts of the system, and their interactions, to identify configurations that better respond to one or more (often conflicting) objectives. While it is quite simple to find solutions which optimize a single objective, such as cost or energy consumption, it is much harder to efficiently identify good compromises whose behavior lies in between the extremes. This goal can be achieved by applying multi-objective optimization strategies, which reduce the time and effort needed to evaluate the multiple configurations and responses of a building system. Multi-objective optimization provides strong support especially during the early stages, when design choices have the greatest influence on building performance and changes can still be made easily (and less expensively). It is important to emphasize that the aim of the tool is to support the decision-making process, while the actual task of choosing between the proposed solutions remains in the hands of the designer.

Two pre-requirements need to be satisfied in order to optimize a problem automatically. The first is to reformulate it so that it can be addressed and solved with mathematical tools, with all the disciplines that contribute to the response of the system adequately modelled. The second comprises the definition of objectives, constraints and responses to be monitored, and the parameterization of the model. During the BENIMPACT project, an automatic procedure was implemented based on modeFRONTIER, a commercial multi-purpose, multi-objective and multi-disciplinary optimization platform. The search for the optimal solutions was based on the Pareto frontier method.
Static and dynamic energetic behavior, costs and the environmental impact were analyzed for various building configurations; and significant input variables were identified based on their influence on the response of the “whole building system.” For each input variable, a “library” of alternative solutions were defined, and all technical information required for the simulation (type and thickness of its layers, thermal transmittance, specific cost and environmental impact indicators, etc...) were listed for each option.
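As a rough illustration of the Pareto frontier idea used in the project, the following sketch extracts the non-dominated designs from a set of candidate configurations. The cost/energy figures are purely hypothetical and stand in for two of the minimization objectives; they are not project data.

```python
# Minimal sketch of Pareto-front extraction for two minimization
# objectives (e.g. refurbishment cost vs. yearly primary energy).
# Candidate values below are illustrative, not taken from the study.

def dominates(a, b):
    """True if design a is at least as good as b on every objective
    and strictly better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the non-dominated (Pareto-optimal) designs."""
    return [d for d in designs if not any(dominates(o, d) for o in designs)]

# (cost in k-euro, primary energy in kWh/m2 per year) -- hypothetical
candidates = [(120, 95), (180, 60), (150, 80), (160, 85), (130, 100)]
print(sorted(pareto_front(candidates)))
```

In practice an optimizer such as modeFRONTIER explores the design space iteratively rather than filtering a fixed list, but the dominance test above is the criterion by which trade-off solutions are retained.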

The analyzed system consists of an accurate virtual model of the building (evaluation model) with a series of input and output data, which are measurable, quantifiable and clearly assignable to specific materials/components/issues (life cycle inventory). Parameters required for the calculation of the energy level, environmental and economic impact were calculated by summing up the following factors:

TEST CASE: OPTIMIZATION OF ENERGY, THERMAL COMFORT AND ECONOMIC PERFORMANCE
The analysis of a 9-story building to be refurbished was developed as follows:
1. On-the-spot investigation of the building characteristics, a check of energy bills to verify the current energy consumption, and surveys among users about the actual comfort conditions.

Fig. 2 - 9-story building simplification

Fig. 3 – Central floor breakdown into thermal zones

and integration/interoperability requirements (data exchange mode) were also considered during the comparison. The result of the project was a mathematical workflow for the evaluation and optimization of different configurations from the energetic, environmental and economic point of view during the entire life cycle.



Fig. 4 – Simulated indoor operative temperature in the actual building during two days in January with a set-point of 21°C

2. Identification of redevelopment goals and alternative options to achieve them.
3. Evaluation and preliminary cost/benefit comparison of the different design solutions.
The goal was to identify key areas of improvement for the owners, such as reduced energy consumption, increased comfort, aesthetic requalification and the implementation of new services such as air conditioning. The added value of this approach lies in the tight integration of design and management aspects (choosing, sizing, monitoring and controlling) from the early stages of the design process, while the use of advanced techniques to analyze and compare a large number of solutions supports the decision-making process. One peculiarity of this approach is that it does not recommend a single optimal solution, excluding other interesting alternatives; instead, it offers a set of options that can be objectively compared with each other based on their technical and economic performances.

System comparison and multi-variable optimization
As an optimization problem is faster to solve if the model is "simple", the variables were limited (in number and variation) and the model was simplified in order to make the simulations run quickly. Only the middle floor of the 9-story building was modelled (Figure 2); it was divided into thermal zones (Figure 3), and the analysis focused on optimizing:
• The type and thickness of the insulation coat.
• The type of transparent enclosures (windows).
• The heating and cooling system.
Three objectives were established for the optimization (minimization):
• Primary energy for heating and cooling, calculated on a yearly basis (standard design year).
• Total cost of the refurbishment.
• Pay-back time relative to a reference solution.
Firstly, the model of the floor was calibrated and validated for


energy consumption and indoor temperature. For example, the graph in Figure 4 shows that the tenants feel cold because they cannot reach the desired perceived comfort temperature, even with a heating set-point of 21°C, due to a much lower operative temperature. This phenomenon is further accentuated during shutdown (10 hours a day). The next steps were:
• A Design of Experiments (DoE) campaign to create a first dataset of "attempted solutions" to support the optimization algorithm.
• Optimization, in order to efficiently search for and identify optimal/attractive configurations.
The graph in Figure 5 shows the potentially interesting configurations and their pay-back time relative to the savings in terms of primary energy. Optimal configurations are characterized by a good trade-off between a reasonable investment and a good capability to reduce costs related to energy consumption (i.e. investments in energy systems or external wall insulation have a reasonable pay-back time, whilst replacing windows is characterized by a longer pay-back time).

CONCLUSIONS
It has been shown that the implementation of a software process can strongly support "sustainable" building design by making it possible to progressively evaluate and optimize energy performance, indoor thermal comfort, environmental impact and costs. By providing scientific evidence of the benefits, this kind of approach can help spread sustainable design within the construction sector.
Silvia Demattè, Angelo Messina - EnginSoft

Ask the expert in Green Building Silvia Demattè, EnginSoft s.dematte@enginsoft.it



Simulation of Quasi-Static Impact Response of Flat Marine Grade Foam-Filled Composite Sandwich Hull Panels

Composite sandwich structures provide a superior strength-to-weight ratio when compared to counterpart composite laminates. The core of a composite sandwich structure contributes to increasing the second moment of area of the structure, and therefore its overall flexural rigidity, while the top and bottom facesheets resist the compressive and tensile stresses arising from out-of-plane loading. Despite this structural superiority with respect to composite laminates, the applications of composite sandwich structures are usually limited to secondary, non-critically loaded components due to the complexity of their impact response. Computer Aided Engineering techniques such as finite element analysis (FEA) are attractive engineering tools for investigating the impact response of composite sandwich structures. Such methods can potentially reduce the need to conduct destructive inspection tests to analyse the type and extent of damage imparted to a structure subjected to such impact conditions.

As part of a nationally funded research project at the Department of Mechanical Engineering, University of Malta, an experimentally validated FE model was set up to simulate the quasi-static impact (QSI) response of a certifiable, advanced, marine grade hybrid composite sandwich hull panel. The hull panel, shown in Figure 1, consists of a marine grade closed-cell foam core sandwiched between identical top and bottom facesheets, each comprising chopped strand mat and/or woven roving E-glass fibre sheets embedded in a matrix of orthophthalic polyester resin.

Simulation of Impact Event
The hull panel was subjected to QSI conditions by a 12.7mm diameter hemispherical hardened steel indentor travelling at a controlled constant downward velocity of 1.25mm/min in the central out-of-plane direction of the panel.
The hull panel was subjected to simply-supported boundary conditions


by resting the panel centrally on a rectangular steel plate with a 127mm diameter circular opening. These conditions are identical to those specified in procedure B of ASTM D7766-11 for determining the damage resistance of sandwich constructions.

Finite Element Model and Fabrication Set-Up
The Finite Element (FE) model set up for this work (Figure 2) was implemented in the implicit-based ANSYS Mechanical v15.0. It consists of two identical composite facesheets and a foam core. The FE model comprises only one quarter of the full three-dimensional geometry due to the presence of symmetry in the loading, material properties and boundary conditions. The E-glass/orthophthalic polyester facesheets were modelled on the macroscopic scale, such that the material properties used for the homogeneous facesheets were equivalent to the combined properties of the two single fibre sheets. The material behaviour of the

Fig. 1 - Foam-filled composite sandwich panel



closed-cell cross-linked polyvinylchloride foam-core was simulated by a linear best fit approximation of the GAZT yield surface through the linear Drucker-Prager yield criterion.
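For reference, a linear Drucker-Prager criterion of the kind mentioned above expresses yielding as a linear combination of the first stress invariant and the square root of the second deviatoric invariant. The sketch below evaluates that yield function; the constants alpha and k are placeholders, not the values calibrated to this foam's GAZT fit.

```python
import math

def drucker_prager_yield(stress, alpha, k):
    """Linear Drucker-Prager yield function f = alpha*I1 + sqrt(J2) - k,
    evaluated from principal stresses (s1, s2, s3); yielding when f >= 0.
    alpha and k here are placeholder material constants."""
    s1, s2, s3 = stress
    i1 = s1 + s2 + s3                      # first stress invariant
    mean = i1 / 3.0
    # second deviatoric stress invariant from principal deviatoric stresses
    j2 = ((s1 - mean) ** 2 + (s2 - mean) ** 2 + (s3 - mean) ** 2) / 2.0
    return alpha * i1 + math.sqrt(j2) - k

# Hydrostatic compression vs. pure shear check (MPa, placeholder constants)
print(drucker_prager_yield((-1.0, -1.0, -1.0), alpha=0.1, k=1.2))
print(drucker_prager_yield((1.0, 0.0, -1.0), alpha=0.1, k=1.2))
```

The pressure-dependent term (alpha·I1) is what distinguishes the criterion from von Mises and makes it suitable for crushable foams, whose shear strength varies with hydrostatic stress.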

Fig. 2 - FE model

A custom user-built instant stiffness degradation material model was also set up and validated in this work in order to simulate the initiation and progression of damage in the top facesheet. In this material model, a degradation factor (DF) is applied for specific damage mechanisms to simulate the degradation of the original material properties due to fibre failure and delamination.

A vacuum bagging process was implemented to produce the sandwich panels. The fabrication technique involved curing the composite laminate facesheets and the foam core simultaneously to produce a coherent structure. This technique produces visibly thinner facesheets, more compacted fibres in the matrix, and increased mass and volume fibre fractions when compared to hand lay-up fabrication methods. The vacuum bagging process produces facesheets with a 70% mass fibre fraction, corresponding to a 50% volume fibre fraction; it increases the mass and volume fibre fractions over the hand lay-up technique by approximately 25% and 20% respectively. Seamless and perfect bonding between the facesheets and the core was assumed as a result of the application of a constant vacuum pressure for four to six hours during the co-curing process. The perfect bonding was simulated by the node constraining method at the interface, such that overlapping nodes have equal x-, y- and z-translational degrees of freedom. Both the indentor and the support fixture were modelled as rigid bodies, and surface-to-surface contact elements were included in the FE model to account for the interaction of the hull panel with the indentor and support fixture.

Results and Conclusions
The results show that the FE model is valid with respect to the experimental data: the FE force-displacement history was found to fall within the limits of the 99% confidence interval of the mean experimental data, as shown in Figure 3.
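The quoted correspondence between a 70% mass fibre fraction and roughly a 50% volume fibre fraction can be cross-checked with the standard mass-to-volume conversion, assuming typical handbook densities for E-glass fibre and polyester resin (the density values below are textbook assumptions, not measured data from the study).

```python
def volume_fibre_fraction(wf, rho_f, rho_m):
    """Convert a mass fibre fraction wf to a volume fibre fraction,
    given fibre density rho_f and matrix density rho_m (g/cm^3)."""
    wm = 1.0 - wf
    return (wf / rho_f) / (wf / rho_f + wm / rho_m)

# Typical handbook densities: E-glass ~2.54 g/cm^3, polyester ~1.2 g/cm^3
vf = volume_fibre_fraction(wf=0.70, rho_f=2.54, rho_m=1.2)
print(f"volume fibre fraction ~ {vf:.2f}")
```

With these assumed densities the conversion lands close to the 50% volume fraction reported above, which is a useful sanity check on the stated fractions.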
The experimental results show that the force-displacement history, under the conditions considered in this work, experiences a sharp drop in force after the maximum force is reached. The sharp drop is due to localised puncturing of the top facesheet and crushing of the foam core. The force-displacement history shown in Figure 3 shows the impact response up to this maximum force. The FE model was unable


to converge at indentation displacements greater than that at which the maximum force occurs. At the maximum force, the elements in the FE model beneath the hemispherical indentor were found to experience severe deformation. In practice, the mechanical strain energy absorbed by the sandwich hull panel during this severe deformation is dissipated by the formation and propagation of cracks. Hence, a fracture mechanics and/or continuum damage mechanics material model has to be incorporated in the FE model to simulate the force-displacement history beyond the maximum force. The initiation and propagation of cracks are highly undesirable in the design of marine grade composite sandwich hull panels, as they must prevent the ingress of water into the marine craft.

In view of the above discussion, the validity of the presented FE model refers to the numerical agreement between the FE and experimental results up to the point at which the maximum force is experienced by the panel. This validity is attributed to the implementation of the custom instant stiffness degradation material model used to simulate the initiation and progression of damage in the hull panels. Figure 4 illustrates typical visible damage experienced by the hull panel during the QSI event.
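The instant stiffness degradation idea credited above can be sketched very simply: once a damage criterion is met in an element, the affected modulus is scaled by the degradation factor and kept there. The DF values, the strength used in the criterion and the modulus below are illustrative placeholders, not the calibrated model from the study.

```python
# Sketch of an "instant stiffness degradation" rule. All numbers are
# hypothetical; the actual model used a DF per damage mechanism
# (fibre failure, delamination) inside the FE solution.

FIBRE_FAILURE_DF = 0.1   # hypothetical degradation factor
DELAMINATION_DF = 0.3    # hypothetical degradation factor

def degrade_modulus(e0, stress, strength, df):
    """Return the element modulus: instantly degraded once the stress
    reaches the relevant strength, otherwise unchanged."""
    return e0 * df if abs(stress) >= strength else e0

e0 = 12.0e9  # Pa, undamaged facesheet modulus (placeholder)
print(degrade_modulus(e0, stress=310e6, strength=300e6, df=FIBRE_FAILURE_DF))
print(degrade_modulus(e0, stress=150e6, strength=300e6, df=FIBRE_FAILURE_DF))
```

The "instant" character (a step change rather than a gradual softening law) is what lets the model capture damage onset cheaply, at the cost of not tracking crack growth beyond the maximum force, consistent with the convergence limit discussed above.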

Fig. 4 - Visible damage imparted to the hull panel

The research work disclosed in this publication is funded by the Strategic Educational Pathways Scholarship (Malta). The scholarship is part-financed by the European Union – European Social Fund (ESF) under Operational Programme II – Cohesion Policy 2007-2013, “Empowering People for More Jobs and a Better Quality of Life”.

Rudie Vella, Claire De Marco Muscat-Fenech and Pierluigi Mollicone - pierluigi.mollicone@um.edu.mt
Department of Mechanical Engineering, Faculty of Engineering, University of Malta

Fig. 3 - FE and experimental force-displacement history

Case Histories


Structural optimization of a bridge beam section subjected to instability of equilibrium

It is well known that in the design of thin-walled structures, extreme attention must be paid to the problem of instability of equilibrium. This phenomenon involves an almost instantaneous loss of the performance characteristics of the element in which it manifests. If the problem affects a single structural element it is called "local instability"; if it affects the full structure it is called "global instability". In both cases, the failure usually occurs suddenly and with extreme danger. In this work, starting from a typical steel bridge, the design optimization targeted the instability of equilibrium phenomenon in the web of the main beams. The starting structure is characterized by two main beams with a 40 m span and a 2 m height. The beams are obtained by the union of three segments (one central and two lateral). The thicknesses of the flanges and web of the segments are modified depending on the project needs. The two main beams, following a classic constructive scheme, are internally connected through a series of reticular substructures. These transverse stiffening elements, in addition to stabilizing the beams during assembly, can also be used as reinforcement against possible flexural-torsional instability of equilibrium. The webs of the beams are further reinforced with welded ribbings. These panels have variable sizes (typically smaller in the zones under a shear stress state combined with axial compression) depending on the needs.

Optimization
In order to realize the optimization process of the initial design, 6 parametric models of the structure were created with the CAE software ANSYS, using as parameters:

• thickness of the transverse stiffening elements;
• thickness of the beam;
• equivalent stresses;
• stresses in the main beam web;
• total displacements of the structure;
• lateral shift of the single web.

Each model also presents a different number of transverse stiffening elements. Due to the complexity of the problem, some static simplifications were made in the parametric modelling phase of the analysis; however, these changes did not significantly influence the overall structural behavior. The optimization was performed using the multidisciplinary multiobjective software modeFRONTIER. In order to evaluate simultaneously the dependence on multiple parameters, the genetic algorithm MOGA-II (Multiobjective Genetic Algorithm) was used. Through this algorithm, successive generations of models were produced with a recursive method, each inheriting the best features of the previous generations. Because of the number of models used in the optimization, and the high number of parameters each model carries into the modeFRONTIER workflow, a logic "switch" node was used.

Fig. 1 - Deformation response of optimum design, front view

Table 1 - Percent of performance improvement compared to initial design


Fig. 2 - Deformation response of optimum design



Fig. 3 - Parallel Coordinates Chart of Pareto design

Assigning a specific value to the associated input variable allows the software to make a univocal choice of the model to be used in the optimization process. Furthermore, to quantify the economic impact resulting from this process, the total length of the welds fixing the stiffening elements to the main structure was used.

Optimum solution
The optimization process produced a large number of design solutions, which was greatly reduced through the use of the Pareto frontier. The remaining solutions are all to be considered “optimal”, and for this reason a sensitivity analysis was performed to study the various hypotheses made on the output parameters. This was done to determine a global optimum solution. These solutions have characteristics that optimize the selected output parameters:

Table 2 - Optimum design table (T_1, 2, 3 thickness of the web segments of the beam; T_agg thickness of the additional transverse stiffening elements placed between the longitudinal stiffener and the lower flap of the beam upper flange; i_mod reference model)

• minimum total displacement of the structure and of the beam web alone;
• minimum tension in the web of the beam;
• minimum equivalent stress state in the entire bridge;
• minimum length of the weld.
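The Pareto reduction described above (keeping only non-dominated designs, with all objectives to be minimized) can be sketched in a few lines. The design triples below are invented for illustration, not values from the study.

```python
def pareto_front(designs):
    """Return the non-dominated designs; each design is a tuple of
    objective values, all to be minimized (e.g. displacement, stress,
    weld length)."""
    front = []
    for a in designs:
        dominated = any(
            all(b[i] <= a[i] for i in range(len(a))) and b != a
            for b in designs
        )
        if not dominated:
            front.append(a)
    return front

# Illustrative (displacement_mm, stress_MPa, weld_length_m) triples
designs = [(12.0, 210.0, 80.0), (10.0, 220.0, 75.0),
           (13.0, 230.0, 90.0), (11.0, 205.0, 85.0)]
print(pareto_front(designs))  # the dominated (13.0, 230.0, 90.0) is dropped
```

Every design that survives this filter is "optimal" in the Pareto sense, which is why a further sensitivity analysis (and, here, a parallel coordinates chart) is needed to pick a single compromise design.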

It was observed that each model has a constant value of the weld length; therefore, minimizing this parameter means choosing the design with the minimum number of transverse stiffeners. The Parallel Coordinates Chart was used to make a selection among the Pareto designs, obtaining six possible optimal solutions. It was not possible to define an absolute optimum solution.

Conclusion
The final optimum solution was chosen through static and economic considerations. The use of parameters such as the weld length, the number of stiffening elements and the beam web thickness directed the design choice towards a good compromise between improvements and cost. The selected final design is characterized by a mean value of the total improvements obtained by the various solutions found. The use of modeFRONTIER helped obtain important information about:
• change of geometry;
• change of cross-section;
• reaction of the structure.
For example, a high number of transverse stiffeners, associated with the presence of additional stiffening elements placed in the compressed portion of the beam, produces an increase of the web displacements. This increase is due to a low inertial state in the longitudinal direction compared to a high inertial state in the transversal direction.

Luca Candito, Giorgio Zavarise - Università del Salento

Fig. 4 - Synchronized view of Parallel coordinates chart and Bubble chart (upper picture), variation of the parameter values (lower picture)


Ask the expert in bridge structural optimization:
Vito Primavera, EnginSoft - v.primavera@enginsoft.it



Investigation of the warm deep drawing of Mg alloys using metamodels

The present work is focused on the deep drawing process in warm conditions, and in particular on the evaluation of the optimal working conditions and/or the enhancement of the process limits using metamodels. Experimental tests were carried out with the aim of investigating the effect of the most important process parameters affecting the DD process: the blank holder pressure, the temperature of the blank holder and the punch speed. The output variable “Progress” (defined as the ratio between the effective and the maximum punch stroke) was fitted using a separate parallel approach based on two different Response Surface construction algorithms. Further experimental tests were then designed using an optimization approach based on the above-mentioned RSs, the aim being to enhance the process limits in terms of the maximum achievable drawing ratio. The experimental results coming from the second campaign could also be used as a validation set for the results obtained from the initial optimization, thus allowing the identification of which of the two initial RSs was the most reliable in evaluating the process window.

Introduction
During the last decade, lower fuel consumption and more limited polluting emissions have been the main concerns of the normative framework (Euro 6). In order to reach such objectives, improvements in terms of vehicle lightweighting must be pursued: reduction of the engine size and structural mass reduction are the most attractive solutions. As concerns the latter, the wider use of light alloys is an attractive option. Due to their light weight and high specific strength, magnesium (Mg) alloys are widely used for the production of structural components: with a density of 1.74 g/cm3, their adoption in the automotive industry gives a real push toward the two above-mentioned needs. In addition, alloying magnesium permits improved properties to be obtained, both from a structural and a corrosion resistance point of view. On the other hand, the most limiting drawback when using Mg alloys is the poor formability exhibited at room temperature, due to their Hexagonal Close-Packed crystal structure, in which only the basal plane can move at room temperature. For this reason, thermal activation of further deformation mechanisms becomes necessary to improve ductility and formability. The present work deals with one of the most common tests adopted to obtain information about magnesium alloy formability, the Deep Drawing (DD) test. Experimental tests were carried out using specifically designed equipment: the initial Mg alloy blank was positioned between the die and the blankholder, both of which were heated by means of electrical cartridges. In order to enhance formability, as documented in several studies of this process, the punch is maintained at room temperature by a cooling system, so that a temperature gradient is obtained in the radial direction of the blank. In addition, numerical simulations reproducing the process proved to be in good accordance with the experimental results.

As concerns the procedure followed, the first experimental campaign was carried out with the aim of investigating the effect of the most important process parameters affecting the forming of sound cups: the blank holder pressure (BHp), the temperature of the blank holder (TBH) and the punch speed (PS). The output variable “Progress”, i.e. the ratio between the effective and the maximum punch stroke, was fitted adopting two different Response Surface (RS) algorithms, respectively a second order polynomial SVD and a Radial Basis Function (RBF); the determined surfaces were the starting point for the subsequent initial optimization. Based on the obtained virtual designs, a second experimental activity was carried out, aimed at investigating the process limits according to temperature and punch speed: in particular, blanks with different initial diameters were used and the Limit Drawing Ratio (LDR) was evaluated for different TBH and PS levels. In this second case, even considering a separate approach based on different RS algorithms, the optimization results were similar.

Fig. 1 - a) Experimental setup for deep drawing test; b) simplified scheme of the main components

Material and Equipment
Warm DD tests were performed at first on AZ31 circular blanks (0.7 mm thick); since recrystallization usually occurs only above 250°C, the alloy was tested in a range from 100 to 230°C. The scheme of the experimental setup is shown in Figure 1b; the equipment, assembled on a 200 kN standard tensile test machine INSTRON 4480 (see Figure 1a), is composed of: i) a draw die flange; ii) a blankholder with an annular cavity in which to position the electrical heating device; iii) a water-cooled punch. As the heating system, a 1000 W electric rod was positioned inside the blank holder flange; punch cooling was guaranteed by a constant water flow rate (at room temperature) passing through a properly designed axial circuit. In particular, three different water flow rate values were tested in order to evaluate the punch cooling efficiency, showing that the temperature gradient between the flange region and the blank center is not much affected by the water flow rate [6]. Temperature monitoring was possible thanks to two J-type thermocouples: one had a pin shape and was located in a cavity in the blank holder for measuring its temperature, while the second was a wire type and was glued at the blank centre for acquiring the temperature in this region.

Fig. 2 - Experimental campaign according to the CCD statistical plan

Response Surface Methodology (RSM)
The first experimental campaign was designed using a statistical approach: a Central Composite Design (CCD) consisting of 15 tests; the complete list of tests is reported in Figure 2. As input variables, besides PS and TBH, the BHp was not directly considered, but related to the material yield stress; in particular, the BHp, and consequently the Blank Holder Force (BHF) to be used in the test, was evaluated according to both temperature and strain rate using equation (1):

(eq. 1)

Equation (1) relates the strain rate in the flange region to the radial position r, as a function of the punch speed and of the die cavity radius (r1). Thus, once the strain rate was calculated, and according to the temperature level, the corresponding value of the yield stress was determined. Finally, the initial temperature at the blank center (TBC_in), i.e. the temperature established in the center of the specimen (when the punch starts moving) as a consequence of the heating by the BH and the cooling by the punch, was also monitored. The experimental results, collected in terms of the above-defined input parameters and of the output variable Progress, were arranged in a spreadsheet in order to be implemented within the modeFRONTIER integration platform through the procedure in Figure 3.

Fig. 3 - Workflow in the modeFRONTIER integration platform

The RS Methodology was adopted to analyze the experimental data obtained from the CCD plan. As a first approach, a second order polynomial SVD algorithm was chosen: the RS capability of fitting the experimental data was evaluated in terms of residuals and distances. The RS obtained using the 2nd order polynomial SVD algorithm optimally fits the starting data, being characterised by negligible values of the maximum absolute and relative errors (respectively equal to 2E-15 and 6E-16). In order to understand if the Radial Basis Functions algorithm could guarantee a comparable level of fitting, the data set was divided into two groups: the Validation Set (about 10% of the entire starting data group) and the Training Set on which the RSs were initially trained. As shown in Figure 4, the RSs obtained applying both the Duchon’s Polyharmonic Splines and the Wendland’s Compactly Supported algorithms exhibited the highest values of the Mean Leave One Out parameter (MLOOE), which indicates a limited capability of fitting the experimental data. As a confirmation of this, the two RSs were characterized by the highest values of the residuals on both of the designs belonging to the Validation Set. The remaining three obtained



RSs were considered the most reliable: Hardy’s MultiQuadrics, Inverse MultiQuadrics and Gaussians. Thus a second training considering the whole data set was performed, and the determined RSs were plotted in a common graph through the Function Plots in Figure 5. Since the output variable (Progress) had to be maximized, the RS that underestimated the maximum value was to be preferred; using the Function Plot in Figure 5, the Hardy’s MultiQuadrics RS was therefore chosen among the considered formulations, since it was always placed below the other two surfaces.

Fig. 4 - MLOOE and SP evaluated on Training Set Designs

Fig. 5 - Function Plot: Hardy’s MultiQuadrics, Inverse MultiQuadrics and Gaussians trained on the whole data set

The second order polynomial SVD and the RBF Hardy’s MultiQuadrics were considered as the starting points for the subsequent optimization, with the precise goal of: (i) understanding the optimal working conditions; (ii) evaluating how the analytical RS formulation could influence the results. In order to reach this goal, the workflow included a constraint in the objective function (see Figure 3): in particular, since the goal was to obtain a sound drawn cup while maximizing the output variable Progress, the constraint imposed a Progress value equal to 100%. An initial population of 10 designs was set within the DoE node, while the MOGA-II algorithm was adopted with 100 successive generations. When the RBF was chosen as the basis for the optimization (Figure 6), the PS had to be set at intermediate values (in the range 45-60 mm/min), as did the TBH (at almost 200°C).

Fig. 6 - Results of Optimization (RBF as starting RS): a) entire set of virtual designs, b) only feasible designs

On the other hand, the results of the optimization performed considering the second order polynomial SVD as the starting RS were different from the previous ones: according to the bubble charts shown in Figure 7, the optimal working conditions could be achieved by adopting low values of the punch speed (15 mm/min) and setting the blankholder temperature at high values (between 240 and 260 °C). The further analysis and experiments detailed in the next section were necessary in order to understand which algorithm is the more suitable.

Fig. 7 - Results of Optimization (SVD as starting RS)

Fig. 8 - Residuals on Validation Set
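The response-surface comparison described above can be sketched with a leave-one-out check on RBF surrogates. SciPy's `RBFInterpolator` stands in for the modeFRONTIER algorithms, and the data set is synthetic; both are assumptions for illustration only.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
ps = rng.uniform(15.0, 75.0, 15)     # punch speed, mm/min
tbh = rng.uniform(130.0, 260.0, 15)  # blank holder temperature, degC
X = np.column_stack([(ps - 15) / 60, (tbh - 130) / 130])  # scaled inputs
y = 100 - 30 * X[:, 0] + 10 * X[:, 1]  # synthetic "Progress" response

def mean_loo_error(X, y, kernel):
    """Mean absolute leave-one-out error of an RBF response surface,
    analogous to the MLOOE indicator used to rank the RS algorithms."""
    errs = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        rs = RBFInterpolator(X[mask], y[mask], kernel=kernel, epsilon=1.0)
        errs.append(abs(rs(X[i:i + 1])[0] - y[i]))
    return float(np.mean(errs))

for kernel in ("multiquadric", "inverse_multiquadric", "gaussian"):
    print(kernel, round(mean_loo_error(X, y, kernel), 3))
```

Ranking kernels by this error mirrors the MLOOE-based screening in the article: formulations with a high leave-one-out error are discarded before the optimization stage.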

Enhancing the process limit
The second part of the experimental activity was mainly focused on understanding how PS and TBH could influence the LDR; the procedure started by setting the working temperature at three levels, respectively 130, 180 and 230 °C: at each temperature level, the punch speed was varied and the LDR was evaluated. The second workflow resembled the one shown in Figure 3, except for the absence of a constraint: in this case there was no precise limit on the drawing ratio, which simply had to be maximized. The procedure maintained the double parallel approach: in this second case, too, both the SVD and the RBF algorithms were taken into account. As shown in Figure 8, according to the residuals the second order polynomial SVD proved more accurate in fitting the starting data; the same algorithm was therefore adopted on the whole data set, and the RS was trained a second time. At the same time, the RBFs were considered as alternative solutions, with the same Validation Set used to evaluate the performance of each RS. The bar charts in Figure 9 show that the Hardy’s MultiQuadrics, Inverse MultiQuadrics and Duchon’s formulations exhibited the lowest values of the MLOOE; a Function Plot was therefore needed to evaluate their fitting capability (Figure 10).

Fig. 9 - RS capability of fitting starting data: MLOOE, SP and residual on Validation Set Designs

Fig. 10 - Function Plot of the most accurate RSs

The second Function Plot in Figure 10 confirms that the trend of the three surfaces was repeatable. For this reason, none of the three surfaces was singled out for the subsequent optimization; instead, all of them were separately used as starting points, and the results were compared with those coming from the optimization performed considering the second order polynomial SVD. As shown in Figure 11, although the optimization procedures started from different RSs, the results were consistent: in order to improve the drawing ratio, the temperature had to be set at higher values, while the punch speed had to be kept low. Moreover, at a low punch speed an increasing temperature led to an increasing drawing ratio (the alloy's formability improved), whereas at a high temperature an increasing punch speed led to a decreasing drawing ratio (a higher PS means a higher strain rate). The second optimization demonstrated that the first analysis (based on the SVD) was able not only to identify the optimal range of the input parameters for obtaining a sound cup, but also the process window to be adopted if the process limit (in terms of LDR) is to be enhanced.

Fig. 11 - Comparison of the results of the optimization procedures

Conclusions
The present work aimed at determining the best-performing working conditions for a deep drawing process: metamodels were built in order to understand how the most influential parameters, such as the punch speed and the blankholder load, affect the process performance. The results of the first experimental set were analysed by evaluating the fitting capability of two RSs obtained from different algorithms, and a first optimization procedure was carried out.

A subsequent experimental activity, based on the previous results, was designed to understand how TBH and PS had to be set in order to increase the LDR. The results of the second optimization demonstrated that the first optimization procedure, based on the 2nd order polynomial SVD, was the most reliable and gave robust results, since the operating window in which the process performance was maximized was the same one in which not only a sound component could be formed, but the LDR could also be enhanced.

Acknowledgment
The authors wish to thank Dr. Vito Primavera (EnginSoft - Mesagne) for his support in the analysis using modeFRONTIER.

G. Palumbo, A. Piccininni, V. Piglionico, P. Guglielmi, D. Sorgente and L. Tricarico - Politecnico di Bari
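The Limit Drawing Ratio evaluated in the article is simply the largest blank-to-punch diameter ratio successfully drawn at a given (TBH, PS) level; a minimal bookkeeping sketch, with invented test data rather than the paper's measurements:

```python
# LDR = max successfully drawn blank diameter / punch diameter
def limit_drawing_ratio(punch_diameter, tests):
    """tests: iterable of (blank_diameter, drawn_ok) pairs for one
    (temperature, punch speed) level."""
    drawn = [d for d, ok in tests if ok]
    return max(drawn) / punch_diameter if drawn else None

# Illustrative campaign at one (TBH, PS) level, with an assumed 50 mm punch:
tests = [(90, True), (100, True), (110, True), (120, False)]
print(limit_drawing_ratio(50, tests))  # 2.2
```

Repeating this evaluation over the temperature and punch speed grid yields the LDR map that the second optimization campaign set out to maximize.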



Software Update

ANSYS CFX R16.0
ANSYS CFX announces new capabilities and enhancements that allow engineers to improve their product designs. This brief article summarizes the major changes introduced in ANSYS CFX Release 16.0.

New Features and Enhancements

Parallel Processing
• Dramatic HPC scalability advances were previously implemented, but required additional expert parameter settings for activation. R16 incorporates numerous HPC improvements and changes to deliver excellent scalability “out-of-the-box”.
• Partitioning in ANSYS CFX Parallel is a preprocessing step. In CFX-Solver Manager, the user can choose one of several partitioning methods and can specify the number of partitions (up to a maximum of 16384). In R16.0, partitions are smoothed by default before solver runs. Partition smoothing attempts to minimize the surface area of partition boundaries by swapping vertices between partitions (during partitioning, “Iso-Partition Connection Statistics” are written to the Output File). This reduces communication overhead and improves solver robustness, thus improving HPC performance and scaling for large numbers of partitions. Smoothing is enabled by default, but the user can configure its settings in either the Execution Control object in CFX-Pre or the Define Run dialog box in CFX-Solver Manager; if smoothing is not desired, it can be disabled by changing Partition Smoothing > Option.
• Since Release 15, the large problem partitioner has supported the MeTiS partitioning method as an expert-parameter-enabled option, allowing larger cases to be partitioned with MeTiS. In Release 16.0, MeTiS is the default partitioning method for the large problem partitioner.
• The default remote access protocol on UNIX/Linux has been changed from rsh to ssh.

Solution Monitoring and Convergence Control
• In CFX-Pre, the user can now specify one or more statistical quantities (Arithmetic Sum, Arithmetic Average, Difference from the Mean, Maximum, Minimum, Root Mean Square, Standard Deviation, Time Integral - the latter only for transient simulations) to be evaluated for the expression value, defined as a Monitor, over a moving interval representing the recent history of the solution. The user can then use these statistics as part of an interrupt control for solver runs. When the user runs the simulation, CFX-Solver computes the statistical quantities in addition to the expression value.
• In CFX-Solver Manager, the user can apply “derived variables”, such as graphical offsets or statistics over moving intervals, to existing monitors. The available statistical quantities are the same as described above.
• Standard CFX out files have become increasingly compact in recent releases, but parallel host information could still swamp the output when running with larger numbers of partitions. R16.0 makes the standard diagnostics more succinct again, highlighting only the most important information. Full output is still available when desired through the argument “-output-summary-option”.
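The moving-interval statistics described above (usable as interrupt controls) can be sketched in a few lines. The quantity names follow the list in the text; the implementation itself is only an illustrative stand-in, covering a subset of the quantities.

```python
import math
from collections import deque

class MonitorStats:
    """Statistics of a monitored expression value over a moving interval."""
    def __init__(self, interval):
        self.window = deque(maxlen=interval)  # keeps the last `interval` values

    def update(self, value):
        self.window.append(value)
        n = len(self.window)
        mean = sum(self.window) / n
        return {
            "Arithmetic Sum": float(sum(self.window)),
            "Arithmetic Average": mean,
            "Maximum": max(self.window),
            "Minimum": min(self.window),
            "Root Mean Square": math.sqrt(sum(v * v for v in self.window) / n),
            "Standard Deviation": math.sqrt(sum((v - mean) ** 2 for v in self.window) / n),
        }

m = MonitorStats(interval=3)
for v in (1.0, 2.0, 3.0, 4.0):
    stats = m.update(v)
print(stats["Arithmetic Average"])  # 3.0, over the last 3 values (2, 3, 4)
```

An interrupt control would then, for example, stop the run once the Standard Deviation over the interval falls below a chosen tolerance, i.e. once the monitor has settled.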

Turbulence
• A new reattachment modification option is available for the SST model, to address the potentially exaggerated zones of flow separation that can occur - even though the onset of separation is typically well predicted by the SST model.

Transient Blade Row
• Up to R15, the Fourier Transformation method could be used in cases involving rotational flow boundary disturbances, blade flutter or forced response. In R16 it can now also be used to solve transient rotor-stator interaction cases (single stage only). A new tutorial is also available.
• In past releases, time step changes and solution restarts for TBR simulations affected the Fourier coefficient sampling, requiring care about when to stop/start the simulation; in fact, the time step size was restricted to an integer division of the period. In R16, it is possible to exploit a flexible time step and moving-average accumulation of the Fourier coefficients. This allows a simulation to be started with a large time step and then restarted with a smaller time step for the last few blade passings, reducing the overall simulation time and improving accuracy. A Frequency Filtering option was also added to the Fourier Transformation model to avoid instabilities.

Boundary Conditions
• When using Mass Flow Rate or Exit Corrected Mass Flow Rate boundary conditions in a single-phase, rotationally periodic model, you can now specify whether the mass flow rate value applies to the modeled sector (“As Specified”) or to the full geometry (“Total for All Sectors”), which avoids inconsistency if the passage number changes during the design process.
• The “Mass Flow Update” outlet boundary condition has been renamed to “Mass Flow Outlet Constraint” for clarity. All associated settings have also been renamed.

Export
• The user can now export complex pressure from a blade flutter case for use in the ANSYS Mechanical solver.

Incompatibilities
• The “Last Period” Fourier coefficient accumulation method has been deprecated in favor of the “Moving Averages” method, which continuously updates the Fourier coefficients as CFX-Solver runs progress. “Last Period” no longer appears in the Monitor object for new CFX-Pre cases.

Updates Affecting Code Behavior
• Parallel collection is switched on by default for runs with at least 64 partitions.
• The viscous work term is now enabled by default for cases involving multiple frames of reference and the Total Energy heat transfer model.
• In run continuation mode, the interpolator will now copy zone-global data for different meshes in the same domain (geometry). This corrects a transient rotor-stator problem in which the gravity vector is perpendicular to the axis of rotation.
• Accuracy has been improved for cases containing radiometers specified within a material with a refractive index greater than one.
• Incompressible flows with the Thermal Energy model now return the correct total temperature.
• Parallel radiation cases with the Monte Carlo or Discrete Transfer models, and transmissive domain interfaces, now produce the correct intensity when a parallel partition boundary touches that interface.
• Previous mass source results from Initial Values files are now fully ignored when used with a Solver Input File without mass source definitions.


• FSI calculations in parallel involving heat transfer now produce the correct heat flux at the interface when a boundary crosses that interface.
• The High Speed Numerics option now sets the nodal pressure gradients to zero at all pressure boundaries and openings.
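The "As Specified" vs. "Total for All Sectors" choice described under Boundary Conditions above amounts to a simple scaling by the number of passages; a sketch with illustrative numbers:

```python
def sector_mass_flow(total_mass_flow, n_sectors):
    """Mass flow applied to one modeled sector of a rotationally periodic
    model when the value is given as 'Total for All Sectors' (kg/s)."""
    return total_mass_flow / n_sectors

# E.g. 12 kg/s over the full annulus, one of 24 blade passages modeled:
print(sector_mass_flow(12.0, 24))  # 0.5 kg/s on the modeled sector
```

Specifying the total rather than the per-sector value is what keeps the boundary condition consistent when the passage count changes during the design process.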

CFD-Post
• Animations of transient results with the time displayed could lead to “jumpy” text output, depending on the given time value. R16 adds user control over the numeric precision, giving a better and cleaner annotation display in the viewer; this applies to the time value and other numerical quantities.
• Creating image files (png, jpg, etc.) could take a long time, especially for scenes with many transparent pixels, slowing down interactive use as well as batch post-processing. R16 has various general printing performance improvements, with the most dramatic gains for cases involving volume rendering.
• Generating images using “screen capture mode” can be much more efficient than default rendering for complex scenes, but the feature was not fully supported on Linux. R16 now fully supports “screen capture” on Linux for much faster image generation.
• Turbo-specific post-processing variables were calculated and stored at load time, and in some cases repeatedly - leading to delays at start-up and in other situations (e.g. a change in data instances for TBR simulations). In R16, turbo variables are computed only if and when actually needed (“on demand”).
• The automatic creation of turbo plots with default settings was often annoying, as generating the default plot could be time-consuming, especially because the default values need user adjustment in most cases. R16 turbo plots are no longer automatically applied, so you no longer have to wait for the default plot to be generated before being able to make adjustments.

Miscellaneous
• Communication lag between the engine and the GUI sometimes caused interactive response to deteriorate significantly on Linux. R16 incorporates improved algorithms that make the Linux interactive response comparable to that on Windows.
• Sharing ANSYS CFD results as 3D images from CFD-Post used to require careful a priori definition of which parts of the scene are visible. In R16, this powerful means of sharing 3D results with any colleague or customer allows users to toggle individual object visibility within the viewer itself.
• Embedding 3D images in MS Office applications was possible, but with compromised interactive response and a limited ability to print the selected view. The interactive response of the R16 3D viewer is now instantaneous when embedded in PowerPoint, Internet Explorer, etc., and viewer printing options are also available.

For more information:
ANSYS CFX Product Manager: Alessandro Marini, EnginSoft - a.marini@enginsoft.it



ANSYS R16.0 Mechanical
Presenting the technical improvements of the ANSYS WB Mechanical 16 release is not an easy task. Nevertheless, the colorful diagram displayed here aims to summarize the complex structure of its several technical innovations and new capabilities. Each element of the graph is expanded in this short review, which covers the features expected to be most widely used by customers, with the support of some pictures presenting the different software capabilities. To help its customers take advantage of the technical improvements of the new release, EnginSoft has organized 2 different webinars. In order to meet heterogeneous needs, the “Basic Webinar” will focus on the topics presented in the right part of the diagram, whereas the “Advanced Webinar” will deal with the left part.
February 2015: New mesh capabilities (‘Basic’) & Material failure: nCode capabilities (‘Advanced’)
March 2015: Fabricated structure mesh with beam & shell (‘Basic’) & Composites (‘Advanced’)
Month by month, the webinar structure and topics will be available on the website: www.enginsoft.it/eventi

Mesh editing
Connections: mesh connections are now automatically detected (similarly to contacts) and are used to ensure coincident nodes at edges.
Node merge: "node merge" operations can also be used to join parts, including mixed topologies.
Move nodes: the "move node" option allows local adaptation of node locations to manually improve the mesh (Fig. 2). Element quality can now be displayed on the mesh.
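The idea behind a "node merge" operation can be illustrated with a minimal sketch. This is not the ANSYS implementation, and the dictionary-based data layout is purely hypothetical: coincident nodes within a tolerance are bucketed on a grid and mapped onto a single surviving node.

```python
def merge_nodes(nodes, tol=1e-6):
    """Map ids of coincident nodes (within tol) onto one surviving id.

    `nodes` is {node_id: (x, y, z)}; returns {node_id: surviving_id}.
    Coordinates are bucketed on a grid of spacing `tol`, so nodes closer
    than the tolerance fall into the same bucket and are merged.
    """
    buckets = {}
    mapping = {}
    for nid, (x, y, z) in sorted(nodes.items()):
        key = (round(x / tol), round(y / tol), round(z / tol))
        if key in buckets:
            mapping[nid] = buckets[key]   # merge into the first node seen here
        else:
            buckets[key] = nid
            mapping[nid] = nid            # this node survives
    return mapping

# Two shell parts sharing an edge: nodes 3 and 10 are coincident.
nodes = {1: (0.0, 0.0, 0.0), 3: (1.0, 0.0, 0.0), 10: (1.0, 0.0, 0.0)}
print(merge_nodes(nodes))  # → {1: 1, 3: 3, 10: 3}
```

Replacing every reference to node 10 with node 3 in the element connectivity is what actually joins the two parts.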

Improvements in ANSYS WB 16
Fig. 2 – "Move Nodes": left, an area with low element quality; right, the same area after a manual "move node" operation, which improves the element quality in the rounded region

Fig. 1 – Improvement in the meshing algorithm, which minimizes triangular elements

Fabricated Structures Workflow
R16 offers a new, highly automated process that provides faster transfer from geometry, new meshing algorithms, parallel meshing and efficient mesh connections. A typical workflow removes the need for shared topology and favors the use of imprints, resulting in faster transfer from geometry to Mechanical. Since all parts are imported separately, parallel part-by-part meshing can be used to speed up the meshing process. Shell meshing also benefits from improved quad meshing algorithms (Fig. 1).


Geometry
Welds can be added to a surface model to connect parts; welds and other parts are connected using mesh connections. "Detach" operations can be used to split surface bodies into multiple surfaces for better local control of the mesh or easier geometry changes (orientations, defeaturing, ...).
Beam stresses
Beam stress contours give the full solution of the beam, including maximum/minimum stress values on the outer surface of the beam structure.
Contact Modeling
Performance: enhancements in contact surface wear modeling provide more accurate wear results, and allow automated re-meshing and automated termination of the simulation (Fig. 3).
Robustness: improvements in contact stiffness updates make contact analyses more robust and faster than before.

Newsletter EnginSoft Year 12 n°1 - 48


General contact
"General contact" aids in automatically identifying all possible contacts, thus minimizing human error.
Tracking
Contact trackers can be added even after the solution has been computed.
Productivity
Assemblies: improved assembly management allows the import of more details from the model and improves data management in the simulation tree. Users can now quickly preview the orientation of the different parts and re-position subassemblies if needed. Only geometry and contacts are assembled; computationally expensive items, such as the mesh, are not. Significant performance improvements have been made in the model assembly algorithms, giving a 2-3x speedup compared to prior releases. Parallel contact detection significantly speeds up model load times for large assemblies.
Graphical tools
Grouping: object grouping allows the user to create folders to logically group loads and coordinate systems, making the tree more manageable (Fig. 4).
Explode: graphical exploding of parts helps with selection and visualization of hidden parts (Fig. 5).
Solver feedback: a new Details View property provides control over MAPDL solver pivot error checking.
View enhancements: pictures are easily captured from the new toolbar menu, the context menu, or simply Ctrl+C.

Fig. 3 – Automatic remeshing in the areas close to the contact

Fig. 4 - Grouping loads, coordinate systems, etc. into folders allows easier management of all the applied conditions

Fig. 5 - Exploded view of the system


Named selections
Duplicate/Copy/Paste options have been added to Named Selection worksheets to help accelerate the creation of complex selections.
Data mapping
Conservative: when importing forces from an external source, it is important to accurately capture the resulting forces, which is generally not the case when non-conservative algorithms are used. External Data now allows the use of conservative algorithms when importing loads, to better match the resulting forces.
Composites
3D composites: the new solid model geometry cut-off lets you define arbitrary cut-outs in your structured solid composite models.
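The distinction between a plain interpolation and a conservative mapping can be sketched as follows. This is a deliberately simplified stand-in for the idea, not the External Data algorithm: after interpolating nodal forces onto the target mesh, the mapped values are rescaled so that their resultant matches the source resultant.

```python
def conservative_rescale(source_forces, mapped_forces):
    """Rescale interpolated nodal forces so the resultant is conserved."""
    scale = sum(source_forces) / sum(mapped_forces)
    return [f * scale for f in mapped_forces]

# The source mesh carried 100 N in total; a naive interpolation onto a
# finer target mesh produced nodal forces summing to only 92 N.
mapped = conservative_rescale([40.0, 60.0], [20.0, 30.0, 22.0, 20.0])
print(sum(mapped))  # resultant ≈ 100 N, matching the source
```

A non-conservative (profile-preserving) mapping keeps the local distribution but lets the resultant drift; the conservative variant guarantees force balance at the cost of slightly distorting the local profile.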

Fig. 6 - Orientation view of the fibers and of the element normal

Post-processing
ACP plies imported into Mechanical provide a better overview of the model contents: display of single-ply information, toggling of fiber direction and element normal (Fig. 6). The Details View is updated with the ACP information imported for each ply, and fiber stresses can be displayed per ply or per layer. A new conversion of legacy MAPDL composite models lets you create WB/ACP projects.
Imperfections
Shear-, temperature- and degradation-factor-dependent material data lets you model imperfections in composites. Degradation fields and draping simulation include the effects of draping shear and manufacturing artifacts on the material properties.
Material failure
Crack coordinate system: created automatically by the meshing process, it is a child of the "Crack" object and cannot be modified directly or used elsewhere in the model. It is used in the MAPDL input file, in graphics annotations, and in the fracture calculations associated with a given crack.
Align with face normal (new property): specifies whether the crack coordinate system's X-axis is aligned with the face normal of the nearest surface. Defaults to Yes.
Project to nearest surface (new property): specifies whether the crack coordinate system's origin is projected to a point on the nearest surface. Defaults to Yes.



UMM
The Unstructured Mesh Method (UMM) is a numerical tool that improves the accuracy of fracture-mechanics parameter calculations for unstructured hexahedral and tetrahedral meshes. The accuracy of the resulting parameters is generally comparable to that obtained with structured hexahedral meshes.

Thermal Fluid Flow
Thermal Fluid Flow analysis is equivalent to reduced-order modeling of a Computational Fluid Dynamics (CFD) analysis with a one-dimensional fluid flow. Models using Thermal Fluid Flow analysis can be solved not only in full 3D environments, but also in 2D plane and 2D axisymmetric cases. Heat flow is generated by conduction within the fluid and by fluid mass transport. Thermal Fluid Flow uses line bodies with "Thermal Fluid" properties activated (Fig. 9).
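The kind of energy balance such a one-dimensional thermal fluid element solves can be sketched with the textbook NTU relation for a pipe with constant wall temperature. This is purely illustrative (it is not the FLUID116 formulation, and the numerical values are assumed):

```python
import math

def pipe_outlet_temperature(t_in, t_wall, h, area, mdot, cp):
    """Steady outlet temperature of fluid in a pipe with constant wall
    temperature, from the energy balance mdot*cp*dT = h*dA*(Tw - T)."""
    ntu = h * area / (mdot * cp)          # number of transfer units
    return t_wall + (t_in - t_wall) * math.exp(-ntu)

# Water at 20 °C entering a pipe whose wall is held at 90 °C
# (h, area, mass flow and cp are illustrative values).
t_out = pipe_outlet_temperature(t_in=20.0, t_wall=90.0, h=500.0,
                                area=0.1, mdot=0.05, cp=4186.0)
print(round(t_out, 1))  # outlet temperature, somewhere between 20 and 90 °C
```

A 1D network of such elements captures the dominant convective heat transport at a tiny fraction of the cost of a full 3D CFD model, which is exactly the reduced-order trade-off described above.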

Fig. 7 - Growing crack using the extended finite element method

XFEM
The Extended Finite Element Method (XFEM) is used to simulate growing cracks without morphing or re-meshing (Fig. 7). The method uses additional displacement functions to account for jumps in displacement across the cracks; crack-tip singularities are not taken into account. It allows arbitrary crack growth within an existing mesh. The XFEM uses the phantom-node method for growing cracks, which implies that a crack cuts elements fully. Newly formed crack segments are automatically modeled with cohesive behavior based on the material specifications. Only quasi-static crack growth can be modeled.
Analysis guide
A Fracture Analysis Guide is now available in the help system.

Fig. 8 - Adaptive remeshing during the simulation

Elastomers
Adaptive remeshing: adaptive re-meshing is now supported in Mechanical for static structural analyses. Results are plotted over the changing mesh, and adaptive re-meshing can be monitored in the regular convergence chart (Fig. 8).
Fracture: material forces are used to determine the fracture-mechanics parameters of rubber materials. They can be seen as the driving forces acting on any kind of inhomogeneity, including crack tips. The fracture parameters are computed with similar calculations to the standard ones, via a special option for material forces.


Fig. 9 - Basic example of the extended potential of the FLUID116 element

Buckling
NL eigenvalue: buckling loads can now be computed considering nonlinearities in the model, with or without additional loads. Unlike in linear buckling, the load multiplier is calculated for incremental loads applied after a static solution that accounts for the nonlinearity. Eigenvalues can be computed based on the pre-stress loads or on the incremental loads.
Rotordynamics
Mistuning and unbalance of disks and/or shafts have a significant impact on the dynamic behavior of rotating machinery, and designers have to take these effects into account in order to meet standards and specifications. Unbalanced harmonic response for rotordynamics is now available in Mechanical (Fig. 10). Small blade-to-blade variations due to manufacturing tolerances and wear produce large uncertainty in the forced-response levels of bladed discs; mistuning modeling is now available in R16.
For more information
ANSYS Mechanical Product Manager: Roberto Gonella, EnginSoft - r.gonella@enginsoft.it

Fig. 10 - Unbalanced Harmonic Response of Rotor-dynamics
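The excitation behind an unbalanced harmonic response like the one in Fig. 10 follows the standard textbook relation F = m·e·ω², a synchronous force whose amplitude grows with the square of the speed. A hedged sketch (the unbalance mass, radius and speed range below are assumed for illustration):

```python
import math

def unbalance_force(mass_kg, eccentricity_m, speed_rpm):
    """Rotating unbalance force amplitude F = m * e * omega**2."""
    omega = speed_rpm * 2.0 * math.pi / 60.0   # shaft speed in rad/s
    return mass_kg * eccentricity_m * omega ** 2

# A 5 g unbalance mass at a 50 mm radius, swept over an operating range.
for rpm in (1000, 5000, 10000):
    print(rpm, "rpm ->", round(unbalance_force(0.005, 0.05, rpm), 1), "N")
```

The quadratic growth with speed is why even small manufacturing tolerances become significant at high rotational speeds, motivating the new unbalance and mistuning capabilities.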



Flowmaster V7.9.3
The focus of the new release is on quality. Enhancements have been made in the field of usability, providing greater flexibility, and in the field of physics, providing new capabilities and greater model accuracy. All these enhancements help users increase their productivity.
Usability
A significant number of usability enhancements have been made across the Flowmaster product.
Orifice. Velocity, Mach number and static pressure results are now available at the vena contracta of orifices, providing important information to analysts for optimizing system performance.
Parametric Study Experiment. The interface for creating a Parametric Study Experiment has been significantly updated to simplify the definition of the study design and to provide clearer feedback. The user can now input values for each parameter based on a start value and a combination of other parameters, allowing great flexibility in the definition of the parametric study. The table showing the parameter values for each simulation is displayed and continually updated, so that the user can quickly and accurately configure the study. Users are able to edit values and to add or delete simulations to fully customize the experiment. Finally, experiment definitions can be saved and reused.
Monte Carlo Randomized Inputs. The existing Monte Carlo experiment type has been enhanced to generate individual randomized values for each instance of a variable parameter in a Flowmaster network. This enhancement allows users to quantify the combined effect of manufacturing tolerances on an analysis.
Experiments: Space & Time Estimation. All experiment types now give an estimate of the database space and time required to complete the experiment. The estimates update as the experiment progresses, giving the remaining database space required and the time until completion.
Current Simulation Time Output.
The Simulation Data tab has been enhanced to display the current time step during transient simulations.
64-bit Solution. This enables Flowmaster to be coupled with 64-bit Matlab and Excel.
Schematic Label and Table Enhancements. The user now has the option to display the units of the result labels drawn on the schematic; the results tables are resizable, and both results tables and labels now have a persistence option that allows the user to reposition them and save those locations for reuse. These enhancements improve the user experience and allow clear, intuitive results outputs to be produced.
Database Enhancements. The performance of several common interactions with Flowmaster has improved thanks to more efficient interactions with the associated database, leading to shorter response times. The improvements tend to be most significant with large sets of performance data, networks with a large number of components, remote database configurations, and Oracle.
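The idea behind individually randomized inputs can be sketched as follows. The toy "network" below is a hypothetical stand-in for a Flowmaster model (three orifices in series whose pressure drop scales as 1/d⁴; the coefficient and tolerance are assumed values): each instance of the diameter parameter receives its own random draw, so the combined effect of the tolerances is captured.

```python
import random
import statistics

def pressure_drop(diameters_mm, q=1.0, k=1e4):
    """Toy series network: total dp scales as 1/d**4 for each orifice."""
    return sum(k * q ** 2 / d ** 4 for d in diameters_mm)

random.seed(0)
nominal = [5.0, 5.0, 5.0]       # three instances of one diameter parameter
samples = []
for _ in range(2000):
    # each instance gets its OWN random draw (tolerance of +/- 0.1 mm)
    perturbed = [d + random.uniform(-0.1, 0.1) for d in nominal]
    samples.append(pressure_drop(perturbed))

print(round(statistics.mean(samples), 1), round(statistics.stdev(samples), 2))
```

The standard deviation of the sampled pressure drop is the quantity of interest here: it expresses, in result units, how much the manufacturing tolerances can move the system performance.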


Physics
Natural Circulation. Natural circulation is the ability of a fluid within a piping system to circulate or flow continuously without a pump; the combination of changes in heat energy and gravity works to cause the circulation. The two-phase pipe model has been enhanced to properly consider buoyancy-related flow effects. This allows the simulation of boiler loops in traditional fossil-fuel power plants; additionally, an entire power-plant steam generation and consumption network can be modeled seamlessly.
Gas Turbine Rotational Simulation Improvements. The gas-turbine secondary air capability has been enhanced in several areas to provide the user with a better modeling experience, better model convergence and a more accurate solution, all aimed at increased user productivity.
Fluid Property Enhancement for Super-Critical Fluids. Pressure and enthalpy curves for complex fluids have been extended to properly handle fluids operating in super-critical regions, where the fluid changes phase between liquid and gas without going through a two-phase step.
Improved Compressible Pipe Heat Transfer. The equation for the heat transfer between the fluid and the pipe has been updated to use the adiabatic wall temperature instead of the static temperature. This provides more accurate heat-transfer results at high flow rates (high Mach numbers), as well as greater stability in the calculations.
New T and Y Junction Components. V7.9.3 introduces new T and Y junction components that replace the now-legacy versions from all previous V7 releases. The new components are based on a different linearization method that provides a more stable solution.
Net Positive Suction Head (NPSH). Net Positive Suction Head is now reported for centrifugal pumps.
This enhancement provides results in the same form the user receives from the pump supplier, allowing a quick comparison to determine whether the system is configured for proper operation of the specified pump.
Verification and Validation Documentation
All Flowmaster components are verified through testing using component-specific models (networks); the results generated by Flowmaster are then compared against hand calculations. Moreover, any Flowmaster component that is compared against experimental measurements and demonstrated to show close agreement is classed as Validated. Validation is carried out in a number of different ways, including test partners, published journals, and the use of experimental test rigs in collaboration with universities. Dedicated reports are created to document the verification and validation processes, and these reports are now available as part of the V7.9.3 release.
For more information:
Flowmaster Product Manager: Alberto Deponti, EnginSoft - a.deponti@enginsoft.it
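The NPSH value reported by the solver can be cross-checked against the textbook definition of available NPSH: the static head of the suction pressure above the vapour pressure plus the velocity head. This is a hedged hand calculation with assumed operating values, not Flowmaster's internal formula:

```python
def npsh_available(p_suction_pa, p_vapour_pa, velocity_ms, rho=998.0, g=9.81):
    """Available NPSH (m) at the pump suction flange: static head above
    vapour pressure plus velocity head."""
    return (p_suction_pa - p_vapour_pa) / (rho * g) + velocity_ms ** 2 / (2 * g)

# Water at 20 °C (vapour pressure ~2.34 kPa), 1 atm absolute at the
# suction flange, mean pipe velocity of 2 m/s.
npsh_a = npsh_available(101325.0, 2340.0, 2.0)
print(round(npsh_a, 2))  # compare against the supplier's required NPSH
```

If the available NPSH falls below the pump supplier's required NPSH at the operating flow rate, cavitation is expected, which is exactly the comparison the new report is meant to support.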



Total Materia Integrator
The new software for creating a private company database of material information, making the organization of internal standards, test data, external data sources and any other material information turn-key and hassle-free.
What challenges does Integrator tackle?
Key to Metals AG already plays an important role in the materials-information workflow by supporting core business functions such as material selection, design and CAE analysis, quality control, and purchasing. The use of the market-leading material properties database, Total Materia (www.totalmateria.com), provides the ammunition to make accurate and informed decisions about materials. With the recent launch of Total Materia Integrator, Key to Metals AG solves another piece of the material-domain puzzle by making it easy to develop a database to manage internal materials and their data. Total Materia Integrator provides the flexibility to define the inserted material structures according to the aspects most important to the engineering teams. With security and traceability being the most critical factors, Integrator provides confidence that data will be stored safely, either in the cloud or on the customer's server, as well as providing full traceability of administrative actions, including material changes business-wide. Bundled with the Total Materia database, Integrator provides a powerful multidimensional tool with a widened scope of engineering appeal, leveraging access to over 250,000 materials and fully available intuitive functionality for both Total Materia and private internal data, including:
• Comparison of private materials with "standard" materials in a few simple clicks
• Inclusion of private and standard materials in combined advanced searches


Fig. 1 - Integrator is a powerful web application for inserting and managing materials, with the flexibility to add multiple properties, diagrams and related information for your private corporate materials, along with a range of user-defined datasets and documents. Through coupling with the Total Materia database, Integrator provides the complete scope of more than 250,000 materials, as well as the full functionality of Total Materia, such as advanced property searches, intuitive comparison tools for materials and datasets, exporting data in various formats including those of the leading CAE solvers, and much more.



• The ability to build private cross-reference tables with private materials that can be connected to standard materials
• Creation of material PDFs to help communicate entire material profiles throughout the business
• The ability to export information for use in CAE software and other advanced calculations, from both the standard and the private database.

Why is a simple, turn-key solution the way forward?
When considering data-management solutions, the main objectives are to ensure that information is organized in a logical, clear way according to the business's best practice, and to make sure information is traceable, accessible and reliable through a single intuitive platform; where possible, the solution should also be able to interact with other information systems within the business. The importance of Integrator becomes clear when we pause to consider the reality: a single material's data profile is often obtained from a patchwork of sources. Some information, such as linear properties, will come from official sources like standards or handbooks, whereas experimental data, such as stress-strain data, might come from internal or third-party testing. There is also a need to attach additional information that may be useful for the everyday material assignments that arise. Integrator handles this data diversity simply and efficiently by:
• Providing a simple data-insert process for information such as composition, mechanical and physical properties, etc.
• Mapping out a logical data structure and taxonomy framework
• Allowing the flexibility to add new properties, material groups, and much more
• Managing numerous different sources of information for the same material
• Importing advanced property data, such as stress-strain points, with automatic plotting
• Adding additional documents, which could include application guidelines, images, pricing, etc.

What distinguishes Integrator from other data-management solutions?
There are many more complex solutions available to manage data, and sometimes these are necessary to satisfy the most stringent requirements of complex engineering environments, such as aerospace; but of course this comes with a price tag to match. Where Integrator excels is in providing a flexible, off-the-shelf solution for the vast majority of companies that simply want to manage materials and their data, keeping them up to date in a secure and organized environment so that all engineering functions of the business can benefit by easily finding what they need, when they need it. Can this solve all of a business's data and data-management needs? Realistically, no single tool can. Engineers are looking for a wide range of data from multiple sources. What Key to Metals AG is focused on is providing a platform that helps drive best practice within the business: allowing control over internal data and combining it with the largest material properties database in the world, Total Materia, provides the most comprehensive combined solution possible.


To see more information on how Integrator works and can benefit your business, please visit the Total Materia website http://www.keytometals.com/page.aspx?ID=Integrator&LN=EN or view our multimedia video at http://www.youtube.com/watch?v=nDI6H0z1P0c
EnginSoft represents Key to Metals' full product line in selected regions. EnginSoft also provides training, support and consulting services for customers of Total Materia. EnginSoft brings niche knowledge of the CAE markets, combined with the core materials and database knowledge of Key to Metals, to create new innovative products and services, of which Integrator is one example.
Contact information
EnginSoft - Christina Löfvén - c.lofven@enginsoft.se

Winterwind 2015 International Wind Energy Conference

February 2 to 4, 2015 – Piteå, SWEDEN
Organized by the Swedish Windpower Association
The conference focuses on the challenges of generating wind power in cold climates, i.e. low temperatures and icing conditions. Winterwind offers an industrial and a commercial track, a poster exhibition and a technical visit. Participants are scientists, engineers, manufacturers, developers, consultants, investors, wind-farm owners, students and O&M providers, as well as representatives from government agencies from all over the world. This year the conference attracted more than 500 delegates from more than 20 countries. EnginSoft's offerings included the FENSAP software, an appreciated and spot-on contributor to the challenges of power losses, prediction, security issues and structural impact threats.
For more information
EnginSoft, Susanne Glifberg - s.glifberg@enginsoft.se



Pre-salt Simulation Issues

How can accurate data be obtained to simulate carbonate reservoirs? According to a 2012 publication from the Geological Society of London, it is estimated that about 60% of the world's oil reserves are stored in carbonate reservoirs, with most of this volume sited in the Middle East. Although the storage potential of carbonate rocks is well known, the main simulation challenge with this type of geological formation is to characterize the complex and heterogeneous network of pores and to understand the consequent flow regimes. Carbonate reservoirs in the Atlantic Ocean's pre-salt layer bring additional petrophysical and geomechanical difficulties compared to the Middle East, since they lie in water depths greater than 2000 m and under a thick layer of sediment and salt, reaching depths of more than 5000 m. The analysis of carbonate rocks and of the internal fluid behavior (oil-gas-water) is critical in estimating the amount and quality of reserves, in order to choose the best production strategy and to predict performance for each oil field. Traditionally, experimental methods still represent the bulk of the techniques applied by the Oil and Gas industry in rock-characterization analysis; however, these techniques consume large amounts of time and resources. Methods of image analysis are gradually progressing to provide an alternative that can complement experiments whilst aiding cost reduction. Virtual analysis provides benefits such as obtaining different properties from the same sample with uniformity and repeatability. Since 2000, methodologies for the image analysis of siliciclastic sedimentary rocks have been gaining industry acceptance. The major design challenge is to apply these methods to the characterization of carbonate rocks: the complexity of the processes involved in the formation of these rocks results in high heterogeneity of pore shape and size.
The scales range from nanometers up to a few inches, so the method used in the image analysis must be able to handle these different spatial scales. Looking at this challenge, the Brazilian company CONCREMAT Engineering and Technology set up a business targeted at fundamental petrophysics laboratory support in partnership with ESSS. The latter group specializes in engineering simulation solutions and developed a computational


tool to simulate most of the petrophysical information available for pre-salt carbonate rocks from microtomography images in 2014. To build on the technology evolution in this area, a software development project has now been selected by the Brazilian Innovation Agency (FINEP) under an Information Technology bid, with a grant of over US$ 1.5 million. CONCREMAT and ESSS are consequently developing a new tool that represents three-dimensional (3D) images of a porous medium as a set of voxels (the three-dimensional equivalent of pixels), each keeping a density value of the material. As is usual for this kind of analysis, binary images are used: by storing a voxel value of 1 or 0, solid matrix or pore space can be identified. With the new tool, 3D images of the inside of a sample can be obtained by computed tomography or by construction of the pore space (by statistical methods or by simulation of the geological processes creating the 3D space, which preserves features found in thin sections of rocks). Since microtomography provides direct images, faithfully reproducing the existing porous structure of the sample, construction methods are then no longer needed. From the 3D images, numerical methods for the calculation of petrophysical properties are applied. Two main approaches can be used: the Lattice-Boltzmann Method or Pore Networks. The first is a CFD method that models the physical phenomena at the mesoscopic scale (such as particle collisions or the interaction forces between molecules) and reproduces the macroscopic effect of a viscous flow. This method gives very good results, but has high computational costs and only allows the use of one spatial scale. The second method, Pore Network, is based on the construction of a structure that is topologically equivalent to the imaged porous system but with a simplified geometry.
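The binary voxel representation makes the most basic static property, total porosity, a direct count. A minimal sketch on a toy synthetic image (the convention 0 = pore, 1 = solid is an assumption here; the article only states that 1 or 0 identify solid matrix or pore space):

```python
def total_porosity(voxels):
    """Total porosity of a binary 3D image (assumed: 0 = pore, 1 = solid)."""
    flat = [v for plane in voxels for row in plane for v in row]
    return flat.count(0) / len(flat)

# Toy 2x2x2 sample: two pore voxels out of eight.
sample = [[[1, 1], [1, 0]],
          [[1, 1], [0, 1]]]
print(total_porosity(sample))  # → 0.25
```

Effective porosity and pore-size distribution require the connectivity analysis described below, but this simple fraction is the starting point for both.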
This simulated network seeks to reproduce a porous rock as found in nature, which is composed of larger spaces between grains (named pores) interconnected by elongated apertures (called canyons, or connections). It is assumed that both the connections and the pores have a constant cross-section, and form factors are used to represent the irregularities and tortuosity of the actual structures. The network-extraction process is important for grouping voxels of



the image and classifying them as members of a particular pore or connection. These two entities are important in the extraction method, since they provide a fair representation of the original porous medium. To address the different spatial scales in carbonate samples, a set of 3D images with different spatial resolutions has to be obtained. From each image a network of pores and bonds is extracted, and these are integrated into a single multi-scale network whose statistical properties reflect the natural scales. The challenge lies in the connection between pores from two different scales, since the expected result is that the connectivity of the multi-scale network is considerably higher than that obtained from the original network. This effect represents a natural feature observed in carbonate rocks, where apparently disconnected pores are actually connected by porous structures of smaller scales, which are not visible until a multi-scale analysis is performed. At the end of the process, using the generated network, the static petrophysical properties (those that depend only on geometry and topology) can be calculated, such as total porosity, effective porosity and pore-size distribution. The petrophysical transport properties, such as absolute and relative permeability and capillary pressure, are calculated by simulating flow in one or more stages through the network. The hypotheses considered in the simulation are steady, laminar flow and a Newtonian fluid, so that the flow in the canyons

can be modeled using the well-known Poiseuille equation with the form factors calculated for each element. Throughout the development of the software, samples analyzed by microtomography have been tested in the CONCREMAT Petrophysics Laboratory to obtain a more accurate validation of the numerical simulation results. The project therefore includes collaboration and best-practice exchange with science and technology institutes, for the optimization and validation of the scientific methods, and with oil companies, to confirm their technical requirements and applicability. Although the scope of the project deliverables is limited to the simulation of petrophysical properties at the rock scale, CONCREMAT is studying the opportunity to expand these studies to the scale of the reservoir, for example the potential contribution of petrophysical investigations to seismic-well integration and petro-elastic modeling in the continuous monitoring of hydrocarbon production.
Marcus Reis - ESSS
Ask the expert at CONCREMAT: Marcus Reis, ESSS - marcus@esss.com.br
Figure at the top of the article: pore network extracted from a microtomography image of a siliciclastic sedimentary rock using the CONCREMAT-ESSS software (under development).
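The Poiseuille modeling of flow in a single network connection can be sketched with the textbook Hagen-Poiseuille relation for a circular cross-section (the actual tool uses form factors to handle irregular sections, and the geometry and fluid values below are assumed for illustration):

```python
import math

def poiseuille_flow(radius_m, length_m, dp_pa, mu_pa_s):
    """Volumetric flow rate through a circular throat (Hagen-Poiseuille):
    Q = pi * r**4 * dp / (8 * mu * L)."""
    return math.pi * radius_m ** 4 * dp_pa / (8.0 * mu_pa_s * length_m)

# A 10-micron-radius connection, 100 microns long, under a 1 kPa pressure
# drop, carrying a light oil of 10 mPa·s viscosity.
q = poiseuille_flow(10e-6, 100e-6, 1000.0, 0.01)
print(q)  # m^3/s through a single connection of the network
```

Because the flow rate scales with the fourth power of the radius, the largest connections dominate the absolute permeability, which is why a faithful multi-scale reconstruction of the pore network matters so much.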

EnginSoft SpA and FunctionBay GmbH sign a Distribution Agreement to Expand RecurDyn and MBD4Ansys Market Penetration across Europe
EnginSoft, the leading European provider of Computer Aided Engineering solutions and value-added services in all sectors, and FunctionBay GmbH, the European distributor of the unique FEMBS simulation technology RecurDyn (multibody simulation with extended capabilities), today announced that they have signed a license distribution agreement for Italy and Europe. EnginSoft will benefit greatly from this agreement: it extends its solution portfolio with a revolutionary, outstanding simulation technology covering a fast-growing area of the CAE (Computer Aided Engineering) market in Italy and Europe, and it gains from FunctionBay GmbH's technical knowledge of the innovative FEMBS technology. Moreover, the collaboration with FunctionBay GmbH will consolidate the multidisciplinary perception of the EnginSoft brand as one of the major players in the European CAE market. "Combining forces with a company that spends its effort to promote and bring innovation at an industrial level is a pleasure. FunctionBay's attitude is very similar to EnginSoft's, and our shared views fit in a common mission. A synergy between our CAE market knowledge and the proven potential of the RecurDyn technology will create new opportunities for both of us," stated Stefano Odorizzi, founder and CEO of EnginSoft. "We are happy to establish this partnership with EnginSoft. Their reputation and history of serving the industry with qualified CAE-based technologies and services ensures that our solutions will be well represented and that our future customers will receive the best value and support," said Tanja Alexandra Lange, FunctionBay GmbH General Manager.


About FunctionBay Inc. and FunctionBay GmbH
FunctionBay Inc. is an international company that develops innovative technologies for mechanical system simulation. Headquartered in Seoul, Korea since 2000, it operates worldwide through its main branches in Germany, the United States and Japan, as well as through its channel partners. FunctionBay's core technology is the RecurDyn software, designed to deliver multibody dynamics simulations with high precision and high efficiency. Besides being the leading MBS tool based on a fully recursive approach, RecurDyn is conceived to efficiently integrate Multi-Body Simulation (MBS), structural dynamics (FEM) and control-system capabilities in a unique platform with an easy-to-use graphical user interface. This technology, called FEMBS (from joining FEM & MBS), bridges the gap between the traditional finite-element approach and the traditional multibody approach. Recently, FunctionBay has started the development of MBD4Ansys, a tool designed to bring both the simulation capabilities and the solver power of RecurDyn into the ANSYS Workbench platform. The tool will be launched within the next few months, but has already attracted a lot of interest in the ANSYS Workbench user community. FunctionBay GmbH was established in Munich in 2003 as the exclusive distributor of RecurDyn for the whole of Europe. Over the years, FunctionBay GmbH has achieved a highly acknowledged reputation as an engineering consultant in the field of system dynamics, whilst supporting all of its customers in the effective use of the RecurDyn tool. www.FunctionBay.org
For more information: Fabiano Maggio, EnginSoft - f.maggio@enginsoft.it

Software Update


Assembly Layout & Process Estimator Integrated graphical environment where assembly line designers can perform an extremely fast evaluation of line concepts The Assembly Layout and Process Estimator is a software tool developed within the co-financed research activities of the European Commission “RLW-Navigator” project, in partnership with one of the project’s industrial partners, STADCO. It is an integrated graphical environment in which assembly line designers can perform extremely fast evaluations of assembly line concepts. Evaluations are made easily by monitoring the main design KPIs, which are computed automatically, taking user-defined design constraints into account. All aspects of preliminary line design are manageable within the same graphical environment. The KPI monitoring, the cost drivers and design-constraint definition (presented in a customizable and expandable database), and the fast, intuitive options for line layout population and task sequencing together offer a complete set of tools to shorten everyday work and carry it out with ease. Features Description Interactive Graphical User Interface (GUI) The GUI consists of: • Work Space - where the layout is designed. • Operations Table - displays Task Sequencing. • KPIs Panel - shows the main Key Performance Indicators in real-time. • Main Menu Bar - root of any additional function. • Status Bar - displays real-time status progress, updating KPIs. • Tools Bar - provides tools to populate the Work Space and Operations Table.

Research & Technology Transfer

Layout generation and Task Sequencing definition From the Entities Toolbar, there is a wide variety of pre-defined Entity Categories to choose from, enabling you to place them into the Work Space to create the Assembly Layout. Through the Entity Editor Dialog, it is possible to select a model associated with the chosen Category and assign its physical dimensions, as well as its price and attached Tools. After the Entity definition process, the Entity is added to the Work Space for the user to position, thus building the layout. Each Entity can be assigned Tasks, and all Tasks can be linked to define a relational Task Sequence, displayed in the Operations Table on the lower part of the GUI. Once populated, the Operations Table and Work Space can be edited according to the designer’s needs so that all given Design Constraints are satisfied.

Design Constraints, Miscellaneous Costs and KPIs Design Constraints such as the Requested Annual Volume, Time per Shift, Shifts per Day and Days per Year can be defined through a dedicated window accessed from the Main Menu Bar; they are used to compute, among other KPIs, the Target Cycle Time, represented by the Target Cycle Bar displayed on the Operations Table. The graphical representation of the Target Cycle Time gives the user a fast and intuitive indication of the feasibility of the Layout and Task Sequence, and helps identify bottlenecks and/or possible improvements.
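The Target Cycle Time described above follows directly from the listed design constraints: divide the total production time available per year by the requested annual volume. The sketch below is an illustrative reconstruction of that computation; the function name, argument names and example figures are assumptions, not the tool’s actual implementation.

```python
# Illustrative sketch: Target Cycle Time KPI derived from the design
# constraints named above (Requested Annual Volume, Time per Shift,
# Shifts per Day, Days per Year). Names and numbers are assumptions,
# not the Assembly Layout & Process Estimator's actual code.

def target_cycle_time(annual_volume, time_per_shift_s, shifts_per_day, days_per_year):
    """Seconds available per unit if the requested annual volume is to be met."""
    available_s = time_per_shift_s * shifts_per_day * days_per_year
    return available_s / annual_volume

# Example: 100,000 units/year, 7.5-hour shifts, 2 shifts/day, 235 working days
tct = target_cycle_time(100_000, 7.5 * 3600, 2, 235)
print(f"Target Cycle Time: {tct:.1f} s/unit")  # Target Cycle Time: 126.9 s/unit
```

Any task whose bar extends past this threshold on the Operations Table immediately flags a bottleneck, which is what makes the graphical Target Cycle Bar such a fast feasibility check.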



The above features allow the user to fully customize the work environment, and potentially allow the Assembly Layout and Process Estimator to be used in any field. Export Features Another very useful feature is the export capability: the Layout can be exported as an image and the Task Sequencing as a CSV file. A fully customizable auto-reporting function, which will greatly enhance the exporting capabilities, is under development.
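The CSV export of a Task Sequence could be sketched along the following lines; the column names and sample tasks are illustrative assumptions, not the tool’s actual export schema.

```python
# Hypothetical CSV export of a Task Sequence with predecessor links,
# mirroring the export feature described above. Column names and task
# data are illustrative assumptions only.
import csv

tasks = [
    {"id": 1, "entity": "Robot_01",  "task": "Load part", "duration_s": 12.0, "predecessor": ""},
    {"id": 2, "entity": "Robot_01",  "task": "Weld seam", "duration_s": 35.5, "predecessor": "1"},
    {"id": 3, "entity": "Fixture_A", "task": "Unload",    "duration_s": 8.0,  "predecessor": "2"},
]

with open("task_sequence.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "entity", "task", "duration_s", "predecessor"])
    writer.writeheader()
    writer.writerows(tasks)
```

A flat file like this keeps the relational sequence recoverable (via the predecessor column) while remaining readable in any spreadsheet, which is the typical reason line-design tools favour CSV for this kind of export.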

Fig. 1 - Assembly Layout & Process Estimator UI

Along with the Target Cycle Time, other KPIs are always displayed on the GUI (in the KPIs Display Panel and Status Bar); these KPIs (Floor Space, Cycle Time, Equipment Cost, etc.) update dynamically as soon as the user edits the Layout, so it is always possible to monitor them and understand, step by step, where the design is heading. It is also possible to insert Miscellaneous Costs, such as Operational and Investment Cost Drivers, in order to have a full overview of the Layout costs.

Ask the expert in Assembly Layout & Process Estimator Francesco Micchetti, EnginSoft f.micchetti@enginsoft.it

User Defined Items A key feature of the Assembly Layout and Process Estimator, is the capability to let the end user modify, update and integrate the database. Accessed from the Main Menu Bar, are the Edit Categories and Edit Models dialogs, which allow: • insertion of new custom defined equipment; • creation of new custom defined tasks; • creation of custom defined tools; • association of tasks and tools to custom or default equipment; • insertion of real industrial data for entity and tools models.

The RLW Project RLW Navigator is a three-year, €3.9M project funded by the European Commission under the ICT-Factories of the Future programme. The project has fourteen partners and began in January 2012. Its goal is to develop an engineering platform for an emerging joining technology from the automotive industry, Remote Laser Welding (RLW), which will enable the exploitation of this technology and ultimately support other joining processes. The RLW Navigator will accomplish this by integrating a universal simulation engine and experimental models to precisely model, configure, optimise and control process variation, production throughput and cost in multistage assembly processes. The RLW Navigator will integrate manufacturing system design information (CAD/CAM) with statistical analysis/variation simulation, helping to support the early introduction of new processes and fundamentally changing these introductions from the current trial-and-error approach to systematic, math-based system and station configuration, optimisation and control. www.rlwnavigator.eu Call identifier THEME [FoF-IC-2011.7.4] [Digital factories: Manufacturing design and product lifecycle management]

Fig. 2 - Robotic Assembly Line

Fig. 3 - Database Customization Window




EnginSoft launches a new subsidiary in Istanbul, Turkey Investment in a new subsidiary reflects ongoing commitment to maximizing value for its customers EnginSoft, a premier consulting company in the field of Simulation Based Engineering Science (SBES) with a global presence, announced today the opening of a new subsidiary in Istanbul, Turkey. The company’s new office offers a state-of-the-art CAE (Computer Aided Engineering) environment, reflecting a global commitment to continually provide leading industry product and service solutions and maximum value for its customers. The new location reinforces the company’s mission and desire to increase its international footprint; through this expansion, customers can work with a local subsidiary best placed to reflect local market conditions. “We are extremely pleased to open our new subsidiary in Istanbul; it further strengthens our pursuit of internationalization and reinforces the company’s commitment to Turkey and to our valued customers,” said Stefano Odorizzi, EnginSoft CEO. “We’re convinced our new facilities will further maximize the value and return that our customers receive from our integrated products and services portfolio.”

to support our international customers, with facilities here enabling them to benefit from the same high level of services in the local language, closer to their specific needs. Furthermore, we offer a range of consultancy methods to match the customer’s requirements, location and budget. We are happy to advise on how a series of products and services can be combined and delivered over a period of time to provide an integrated offer which meets their requirements,” declared Şadi Kopuz, EnginSoft Turkey Managing Director.

For more information: Birkan Tunc; Sales & Marketing Manager of EnginSoft Turkey b.tunc@enginsoft.com

The company’s new office presents a number of “best in class” CAE technologies driven by experienced engineers and supported by the latest innovative computing and network infrastructure, ensuring that our delegates can benefit from the best and most up-to-date technical environment. “As part of our customer-focused philosophy, we have opened a new subsidiary in Turkey also

Company News



EnginSoft Turkey opens for business with successful inaugurations in both Istanbul and Ankara EnginSoft celebrated the inauguration of its new office in Teknopark Istanbul with events on 29 January in Istanbul and 30 January in Ankara. Each event gathered over one hundred guests from important commercial companies in the defence, automotive, appliances, manufacturing and naval sectors, as well as from universities and public-sector organisations, showing valuable interest. During the events, EnginSoft CEO Stefano Odorizzi and EnginSoft Turkey general manager Şadi Kopuz presented EnginSoft’s future plans in Turkey, and EnginSoft’s areas of expertise in each sector were presented by the global business development team. Dr. İsmail Arı, general manager of Teknopark İstanbul, gave the opening speech at the İstanbul event and stressed the importance for industry of EnginSoft Turkey’s presence in the Teknopark area. Prof. Dr. Ercan Tezer, secretary of the automotive manufacturers association, described the recent progress of automotive R&D in Turkey and stated that EnginSoft’s experience would be of great value to the Turkish automotive industry. Abdullah Raşit Gülhan, president of the Sinerjitürk foundation, opened the Ankara event with a speech focusing on the experience and importance of EnginSoft for the Aerospace and Defence sector, explaining that EnginSoft’s extensive experience in Defence would contribute to the development of the Turkish Aerospace and Defence industry. Dr. Şadi Kopuz described the mission of EnginSoft Turkey as being: an important address for advanced engineering sciences in Turkey, offering consultancy for even the most complex industrial problems, and research projects in addition to turn-key engineering solutions. He also stated that EnginSoft Turkey would like to take part in localization projects in different industrial sectors.
Finally, training programmes regarding the needs of Turkish industries are among the main focuses of EnginSoft Turkey. Future plans include a new office in Ankara to consider the needs of the Aerospace and Defence sector.


Fig. 1 - Stefano Odorizzi, CEO of EnginSoft, and Şadi Kopuz, CEO of EnginSoft Turkey

Teknopark Istanbul

Teknopark Istanbul is a science and technology park fostered and developed by the Undersecretariat for Defence Industries and the Istanbul Chamber of Commerce to contribute to Turkey’s technology development capacity for local and international entrepreneurs. For EnginSoft, active for years in the research and application of advanced technologies, it is natural to be located within a community of high-tech companies strongly oriented toward innovation. Teknopark Istanbul is one of the largest science and technology parks in Europe and the largest in Turkey.



EnginSoft Group grows EnginSoft, the value-driven engineering consultancy operating in the field of Simulation Based Engineering Sciences (SBES), strengthens its resolve to provide the leading product and service solutions that deliver value for its customers. After many successful collaborations on Tolerance Management projects, TTC³ GmbH (Tolerance Technology Competence Centre) and VSA GmbH (Variation Systems Analysis), based in Cologne, are now part of the EnginSoft Group, further committing to and nurturing the tolerance management capabilities offered by the Group. Stefano Odorizzi, CEO of EnginSoft SpA, stated: “TTC³/VSA have some superb complementary skills in the domain of Tolerance Management, honed over two decades. From the very first meeting it was clear that our company values and vision were similar – to help our clients become more competitive and profitable by delivering value. Working within the EnginSoft Group allows a natural market expansion for this world-class capability. We are very excited that our current clients can also have access to this service.” The importance of tolerance management can no longer be underestimated, as its incorrect application can

have overwhelming consequences for any company. EnginSoft and TTC³/VSA operate with the same vision: to deliver best-match solutions that meet customer requirements and enable the maximum value and return from our solutions portfolio. Conrad F. Toepfer, CEO and founder of VSA/TTC³, stated: “Becoming part of a highly motivated and experienced team of European engineers with a proven background in complementary technologies puts us as a company into a new league. Teaming up with EnginSoft, with the backbone of proven state-of-the-art tolerance simulation products, enables us to deliver unique solutions tailored to the customer.”

About TTC³ and VSA TTC³ and VSA’s mission is to provide clients with the necessary knowledge about the methods and application of three-dimensional Geometric Dimensioning and Tolerancing (GD&T) and Tolerance Management, in order to achieve technology leadership. To this end, VSA and TTC³ offer a comprehensive range of software solutions as well as consultancy and training services, e.g. for 2D and 3D simulation software, always tailoring their simulation technologies and techniques to the clients’ specific application needs. For almost 20 years, sincerity and honesty have been the foundation of many successful long-term relationships with technology-leading customers. Professional qualifications, interdisciplinary training and a breadth of experience through cross-market synergies, in addition to an open personality, are what distinguish their team from its competitors. Learn more at www.vsa-ing.com and www.ttc3.com

Company News



Certification Program for Computational Mechanics Engineers in Japan Computational Mechanics (CM) has become an essential technology for manufacturing, because it is concerned with the virtual construction of a product during the design process. Today, CM software programs are used to investigate and evaluate deformation, vibration, thermal conduction, fluid flow, and other mechanical behaviors, with the aim of enhancing development efficiency and product performance. Structural and thermo-fluid analysis software based on CM is widely used in industry, design, and manufacturing. Computer simulations performed with CM software have made it possible to develop products with high performance and safety. The technology is in great demand, and remarkable improvements in digital computers have given it high functionality. At the same time, CM software is becoming a black box: non-expert CM engineers are now using it in various engineering fields. Thanks

to the graphical user interface (GUI) and the robustness of commercial software, even beginners can achieve results that appear good at first glance, although they may be completely erroneous. It is not an easy task to prove that the results are reliable. Misjudgment, input-data errors, and the selection of inappropriate algorithms when solving simulation models can produce completely flawed results. Design and manufacturing based on incorrect simulation results can lead to large losses, such as performance deficiencies and accidents. Using reliable CM software alone is not sufficient to obtain a reliable CM analysis: ensuring the skill level of the CM engineer is critical. For all these reasons, it is important to assess the technological and application knowledge of engineers who use CM software.

Fig.1 Certification level for each field (This table is based on the information provided in the presentation of Dr. Yuuichiro Yoshida at NAFEMS Japan 2013.)


CERTIFICATION PROGRAM FOR CM ENGINEERS Based on the above considerations and the demands placed on CM engineers, JSME (the Japan Society of Mechanical Engineers) launched a certification program for Computational Mechanics Engineers in 2003. Certifications now cover three fields and nine classes, including “Senior Analyst grade,” “Grade 1,” “Grade 2,” and “Basic Grade”; CM engineer certifications in the solid mechanics and thermal-fluid fields, and Grade 2 certification in vibration engineering, have been introduced. Basic Grade certificates are issued to applicants who are deemed to satisfy the certification requirements based on a documentary examination. Applicants who achieve the passing score in the Grade 1 or Grade 2 certification examination are certified as Grade 1 and/or Grade 2. Grade 1 holders who are screened through two essays on their analysis experience and who pass an oral examination are certified as Senior Analysts.

Japan Column


Details of the grades are summarized in Fig. 1. The examinations are carried out every December in major cities. The certification examinations for the Senior Analyst grade commenced in 2009 and today take place in Tokyo every September. At present, the certification examinations are organized by the Innovation Center of JSME, with the cooperation of seven related divisions and four branches of JSME, under the co-sponsorship of 54 domestic associations related to computational mechanics. The examinations are also supported by the Japan Machinery Federation, the Japan Society of Industrial Machinery Manufacturers, and the Japan Electrical Manufacturers’ Association. Fig. 2 shows the number of examinees and the number who passed the examinations since 2003. A total of 4,541 participants have passed the certification tests in the nine classes and are now working in industry and educational/research institutions as CM engineers certified by JSME. The industrial affiliation distribution of applicants in 2011 for the first and second grades is shown in Fig. 3. The CM engineer certification program offers the following benefits: 1. Meaningful analysis results can be obtained using reliable computational mechanics software operated by certified engineers. 2. The certification clarifies an engineer’s technical skills and range of responsibilities, which also creates greater value for society. 3. A certified CM engineer can be expected to have a certain degree of knowledge and skill. Currently, the certification program is managed by the Committee for Licensing Engineering Businesses/Specialized Committee for CM Engineering License Certification, established at the Innovation Center of JSME. The specialized committee was chaired from its establishment by Prof. 
Shinobu Yoshimura of the University of Tokyo, whose vigorous activities shaped the current framework of the certification program. Prof. Toshio Nagashima of Sophia University has held the chairmanship since 2009. Because CM software is already widely used across the world, the certification program is expected to spread globally in the near future. FUTURE PLANS Consistent with current trends, internationalization is one of the most important topics of the certification program. As one of the activities fostering such ideas and plans, JSME has just started a mutual recognition system between the JSME Senior Analyst and NAFEMS PSE Advanced certifications. JSME will now call Senior Analysts who are also Professional Simulation

Fig.2 - JSME Certification program results for computational mechanics engineers

Engineers - International Senior Analysts. In this spirit, JSME expects the community of International Senior Analysts to grow in the near future and new interactions in simulation technologies to develop, through mutual recognition, between Europe and Japan. By Toshio Nagashima, Sophia University, and Yuuichiro Yoshida, Toshiba I.S. Corporation For more information about this article, please e-mail: yuuichiro.yoshida@toshiba.co.jp (Dr. Yuuichiro Yoshida, Toshiba I.S. Corporation / JSME) By courtesy of JSME (the Japan Society of Mechanical Engineers); an original version of this article appeared in JSME NEWS Vol. 24 No. 1, published in July 2013.

Optimization in SUPSI Lugano (CH) The seminar on Innovative Optimization Technologies held at SUPSI in Lugano concluded the series of travelling meetings organized by EnginSoft in 2014 to present its technological offering to the academic world. The tour will continue in 2015, and two events have already been planned at the University of Bologna and the University of Marche. EnginSoft wants to reinforce its already strong relationships with the academic world by introducing innovative themes both at the didactic level and through collaboration with teachers and researchers on other activities: class lessons, the International CAE Conference (which includes the Poster Award), and research projects carried out jointly with industry. The seminar was held thanks to the collaboration of Prof. Maurizio Barbato, manager of the thermo-fluid dynamics laboratory at ICIMSI (CIM Institute for Sustainable Innovation). Participation was significant, and the seminar will be followed, as usual, by a training session during which the technology will be applied to pilot projects in different numerical domains. During the event, specific industrial applications were presented to show how widely these tools are now used in design, research and development to accelerate the product cycle. For more information: Lorenzo Benetton - EnginSoft, l.benetton@enginsoft.it

Fig.3 Industrial sectors of applicants in 2011 for 1st and 2nd grades




MUSIC Project at CO.SUMMIT 2015 - 10/11 March 2015 – Berlin The 7th edition of the Co-summit will be dedicated to ‘Smart industry: impact of software innovation’ The 2015 Co-summit will be held on 10 & 11 March at the BCC - Berlin Congress Center in Berlin, Germany. The event is jointly organized by ARTEMIS, represented by the ARTEMIS Industry Association, the association for actors in Embedded & Cyber-Physical Systems within Europe, and by ITEA, the EUREKA Cluster on Software-intensive Systems & Services. EnginSoft, as coordinator of the MUSIC consortium and developer of software and databases, is participating in this event, presenting for the first time a prototype of the MUSIC Control & Cognitive System. ICT, communication and software innovation for the next generation of the Factory of the Future are the key elements of the MUSIC project, whose aim is to control all aspects and components of the previously fragmented automated production line. The Control & Cognitive (C&C) system analyzes data gathered by a heterogeneous network of sensors installed on one or more machines for high pressure die casting of light alloys (HPDC) or plastic injection molding (PIM), and produces a set of output data used to monitor and optimize the manufacturing process in real time in terms of Quality, Energy and Cost. The MUSIC system is fully compliant with the OPC-UA industry standard, which makes it possible to connect devices, machines and systems from different manufacturers through the same interface. For further information on this event or about the MUSIC project: http://music.eucoord.com

NEXT EVENTS CAE WEBINARS 2015 EnginSoft continues its series of CAE Webinars on specific topics in virtual prototyping technologies.

October 19-20, 2015 Pacengo del Garda (VR), Italy International CAE Conference http://www.caeconference.com EnginSoft will be the main sponsor of the International CAE Conference. Many of our engineers will be engaged in presenting industrial case studies during parallel sessions, as well as technology updates during the workshops - all within the Conference. Call for papers is now open; send your contribution and become a protagonist!

2015 CAE EVENTS Stay tuned to:

www.enginsoft.it/eventi - www.enginsoft.com/events/ for the complete program of the events in 2015


Next topics: ANSYS R16.0 - FENSAP-ICE for icing phenomena - Tolerance analysis with CETOL 6s Stay tuned to www.enginsoft.it/eventi for the complete program of webinars. The podcasts on past CAE Webinars are available at: www.enginsoft.it/eventi

Your FREE Newsletter Copy Do you want to receive a free printed copy of the EnginSoft Newsletter? Just scan the QRcode or connect to: www.enginsoft.it/nl

Events


Save the date

19 - 20 October 2015 Pacengo del Garda, Verona - Italy

Your opportunity to be part of the future! Striving for innovation excellence? Join the thought-leaders at the world’s largest gathering of Simulation Based Engineering and Science Call for Papers is now open! A great opportunity to submit an abstract and share your expertise!

www.caeconference.com

