Simulation software has made major strides in recent years, and 2020 will continue that trend. In the year ahead, a number of defining shifts will stand out as noteworthy developments in the software and its impact on industry.
AI and machine learning
Like many other industries and technologies, AI and machine learning have a big role to play in the year ahead. For simulation, there’s a clear opportunity to use AI and machine learning to handle basic processes and administration, saving engineers time and simplifying procedures. The setting of parameters is one such area. A machine learning engine can monitor how senior engineers use simulation tools, including which parameters they set, and then reproduce those choices, to a degree, for less experienced engineers, enabling them to use the tool.
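As a minimal sketch of the idea, the snippet below records the solver settings experienced engineers choose and suggests the most frequent one to a newcomer. A simple frequency count stands in for a real learned model, and the class and field names are invented for illustration.

```python
from collections import Counter, defaultdict

class ParameterRecommender:
    """Records the settings experienced engineers choose for each analysis
    type, then suggests the most common choice to less experienced users.
    (A frequency baseline standing in for a real ML model.)"""

    def __init__(self):
        # analysis type -> Counter of observed settings
        self._history = defaultdict(Counter)

    def observe(self, analysis_type, settings):
        # Store settings as a hashable, order-independent key.
        self._history[analysis_type][tuple(sorted(settings.items()))] += 1

    def suggest(self, analysis_type):
        counts = self._history.get(analysis_type)
        if not counts:
            return None
        best, _ = counts.most_common(1)[0]
        return dict(best)

rec = ParameterRecommender()
rec.observe("thermal", {"mesh_size": 0.5, "solver": "iterative"})
rec.observe("thermal", {"mesh_size": 0.5, "solver": "iterative"})
rec.observe("thermal", {"mesh_size": 0.1, "solver": "direct"})
print(rec.suggest("thermal"))  # the most frequently observed settings
```

A production system would learn from far richer context (geometry, physics, past convergence behaviour), but the shape of the workflow is the same: observe experts, then recommend.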
For an aerospace company with hundreds, even thousands of employees, of which only 5 per cent can currently use simulation technology, AI and machine learning will make it more accessible, so that perhaps 15-20 per cent of employees can use it. Clearly, this is a victory for that company, which can now commit more time and energy to the simulation process without the need to hire more senior engineers.
Another area where AI/ML can help simulation is in using data-driven or physics-informed neural network solvers to accelerate simulation by orders of magnitude. Instead of solving second-order partial differential equations (PDEs) with traditional numerical methods such as finite element or finite volume methods, these newer approaches use neural networks to solve the PDEs directly. So far, these methods have been shown to work with simple geometries and boundary conditions. We are working on applying them to real-world complex problems.
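To make the idea concrete, here is a toy physics-informed solver, a sketch and not any production method: a tiny network u(x) is trained so that its analytic second derivative satisfies the test equation u'' = -π²·sin(πx) with u(0) = u(1) = 0 (exact solution sin(πx)). Finite-difference gradient descent keeps the sketch dependency-free; real PINNs use automatic differentiation and far larger networks.

```python
import math, random

random.seed(0)
H = 4  # hidden units in the tiny network
params = [random.uniform(-0.5, 0.5) for _ in range(3 * H)]  # [a..., b..., c...]
X = [i / 15 for i in range(16)]  # collocation points in [0, 1]

def residual_loss(p):
    a, b, c = p[:H], p[H:2 * H], p[2 * H:]
    def u(x):  # network: u(x) = sum_i c_i * tanh(a_i * x + b_i)
        return sum(c[i] * math.tanh(a[i] * x + b[i]) for i in range(H))
    def u_xx(x):  # analytic second derivative of the network
        s = 0.0
        for i in range(H):
            t = math.tanh(a[i] * x + b[i])
            s += c[i] * a[i] ** 2 * (-2 * t * (1 - t * t))
        return s
    # PDE residual (scaled by pi^2 for conditioning) plus boundary penalty.
    pde = sum(((u_xx(x) + math.pi ** 2 * math.sin(math.pi * x)) / math.pi ** 2) ** 2
              for x in X) / len(X)
    return pde + u(0.0) ** 2 + u(1.0) ** 2

def step(p, lr=0.05, eps=1e-4):
    # Central finite-difference gradient descent (autodiff stand-in).
    g = []
    for i in range(len(p)):
        hi, lo = p[:], p[:]
        hi[i] += eps
        lo[i] -= eps
        g.append((residual_loss(hi) - residual_loss(lo)) / (2 * eps))
    return [pi - lr * gi for pi, gi in zip(p, g)]

initial = residual_loss(params)
for _ in range(300):
    params = step(params)
print(initial, "->", residual_loss(params))  # residual shrinks as the net learns
```

The attraction is that, once trained on a family of problems, such a network can be evaluated in microseconds, which is where the orders-of-magnitude acceleration comes from.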
Multiphysics as a concept has origins dating back 50 years. As it has developed it has faced many challenges, and today’s challenge is the interaction between the different physics tools. Historically, an engineer would use different physics simulation tools to overcome a variety of design issues with a single product. Take a computer chip as an example: you would simulate the heat given off by the chip, then analyse how that heat affects the circuit board it is housed on, and then find a way to cool the chip so the circuit board does not crack.
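The sequential chain described above can be sketched as a one-way coupling, with every material value below invented purely for illustration:

```python
# One-way coupled chain: electrical power -> chip temperature -> board stress.
# All values are illustrative, not real material data.

power_w = 15.0       # chip power dissipation (W), assumed
theta_ja = 4.0       # junction-to-ambient thermal resistance (K/W), assumed
ambient_c = 25.0

# Step 1: thermal analysis (here a lumped thermal-resistance model).
chip_temp_c = ambient_c + power_w * theta_ja          # 85.0 C

# Step 2: feed the temperature into a structural model of the board.
alpha = 17e-6        # board thermal expansion coefficient (1/K), assumed
e_modulus_pa = 24e9  # board elastic modulus (Pa), assumed
delta_t = chip_temp_c - ambient_c
thermal_stress_pa = e_modulus_pa * alpha * delta_t    # constrained-expansion stress

# Step 3: check against an allowable stress to decide whether cooling is needed.
allowable_pa = 20e6
needs_cooling = thermal_stress_pa > allowable_pa
print(chip_temp_c, thermal_stress_pa / 1e6, needs_cooling)
```

Each step consumes the previous step’s output, which is exactly why the workflow is serial: the structural model cannot start until the thermal result exists.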
While this step-by-step approach has been the best option available for many years, engineers are demanding a way to resolve these issues simultaneously, in parallel.
This multidisciplinary optimisation will reduce the time required to analyse the product, whether it’s a chip or otherwise, and find the right solution to whichever problem engineers are faced with. This leads to better products at lower costs.
ANSYS’ acquisition of simulation process integration and design optimisation leader Dynardo enables us to move one step closer to multiphysics interactions and enables our customers to identify optimal product designs faster and more economically. In the year ahead further efforts will advance the technology.
Microservices for simulation
We’ll also see progress made in the area of ‘microservices for simulation’, whereby we transform the major parts of simulation – for example, geometry, then meshing, followed by solver, and finally post-processing – from one monolithic process into dedicated, separate parts. The steps required for simulation will be independent: there will be geometry services, meshing services, solver services and post-processing services, instead of one single process.
These services can then be used by different products, integrating with one another through application programming interfaces (APIs) and running scalably on cloud platforms such as Microsoft Azure or AWS – the result will be more accessibility, more flexibility and more re-usability across many different tasks. The APIs will also enable users of simulation to connect, for example, ANSYS tools with any other company’s systems, for a truly open platform.
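A sketch of the decomposition, with each stage as an independent function standing in for a networked service (all names, payload fields and the stand-in "physics" are invented for illustration):

```python
# Each simulation stage as an independent service behind a small API.

def geometry_service(spec):
    # Returns a parametric description of the model.
    return {"shape": spec["shape"], "size": spec["size"]}

def meshing_service(geometry, resolution=4):
    # Discretises the geometry into nodes (here: just evenly spaced points).
    return {"nodes": [geometry["size"] * i / resolution
                      for i in range(resolution + 1)]}

def solver_service(mesh):
    # Stand-in "physics": assign each node a value proportional to position.
    return {"field": [x * 2.0 for x in mesh["nodes"]]}

def postprocessing_service(solution):
    return {"max": max(solution["field"]), "min": min(solution["field"])}

# A client composes the services through their APIs; because each stage is
# separate, any one of them can be swapped out or scaled independently.
geo = geometry_service({"shape": "bar", "size": 1.0})
mesh = meshing_service(geo, resolution=4)
solution = solver_service(mesh)
report = postprocessing_service(solution)
print(report)
```

In a real deployment each function would sit behind an HTTP or gRPC endpoint, but the architectural point is the same: well-defined inputs and outputs at each stage, rather than one opaque process.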
(Hyper) Scale up simulation
One of the main challenges to users of many types of software is the run time. They’re increasingly demanding faster run times and simulation is no different – we’ll see the focus on this intensify in 2020.
One way to run faster is through parallel computing, which over the years has taken many different forms: Shared Memory Processing (SMP), the Message Passing Interface (MPI), fine-grained GPU-based parallelism and task-based parallelism. For hyper scale, the idea is to combine all of these forms, SMP, MPI, GPU and task-based, on exascale supercomputers. This means customers will be able to tap into hyper scale simulation: a run that previously might have taken 10,000 hours to complete could, because it is running on many more cores, be cut to potentially a couple of hours or even a matter of minutes.
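The decomposition behind task-based parallelism can be shown in a few lines: split the domain into independent chunks, solve each in its own worker, and recombine. The Python threads here only demonstrate that the split preserves the answer; real speed-ups come from MPI processes, GPUs and the other forms listed above.

```python
from concurrent.futures import ThreadPoolExecutor

def solve_chunk(chunk):
    # Stand-in for an expensive per-subdomain solve.
    return sum(x * x for x in chunk)

domain = list(range(100_000))
chunks = [domain[i::8] for i in range(8)]  # 8 independent subdomains

serial = solve_chunk(domain)
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = sum(pool.map(solve_chunk, chunks))

print(serial == parallel)  # the decomposition preserves the answer
```

The hard part at hyper scale is not this splitting but the coupling between subdomains, which is why combining SMP, MPI and GPU parallelism on one machine is still an open engineering effort.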
There’s much work to be done in this area. 2020 will be too early to expect hyper scale to the degree I’ve described, but it will be achieved within the next decade.
Predictive and robust design
In an effort to achieve efficiency and cost savings, many manufacturers and service providers have eliminated over-engineering and instead focused on minimalist design. Where 5 inches of tarmac may be required for a highway, exactly 5 inches will be used, not the 10 inches that might have been laid in the past ‘just in case’. The issue is that variations occur in all materials, which means the calculation for the volume of tarmac required may differ from project to project. Therefore, 5 inches of tarmac would be suitable on one occasion, but not another.
Robust design through simulation addresses uncertainties such as these and will be increasingly exercised in the coming year. Using simulation to assess the materials and calculate uncertainty prevents both over- and under-engineering of products and services; instead of a 500 per cent factor of safety, which is too high and inefficient, or 100 per cent, which leaves no room for material variation, robust design may settle on a 110 per cent factor of safety – on an informed basis!
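A Monte Carlo sketch of the tarmac example shows how simulation can put a number on each candidate safety factor; the mean and spread of the requirement are assumed purely for illustration:

```python
import random

random.seed(42)

def required_thickness():
    # Material and load variation make the "exact" requirement itself
    # uncertain; mean 5.0 in and spread 0.15 in are assumed values.
    return random.gauss(5.0, 0.15)

def failure_rate(design_thickness, trials=100_000):
    # Fraction of sampled projects where the design is too thin.
    fails = sum(required_thickness() > design_thickness
                for _ in range(trials))
    return fails / trials

# Compare candidate safety factors on an informed basis.
for factor in (1.00, 1.05, 1.10):
    print(factor, failure_rate(5.0 * factor))
```

Building to the nominal 5 inches fails roughly half the time, while a modest 10 per cent margin drives the failure rate to near zero; that is the quantified middle ground between over- and under-engineering.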
Key to understanding material variables, and the subsequent ability to calculate the ideal factor of safety, is materials intelligence. This was behind ANSYS’ decision to acquire Granta Design, which advances ANSYS’ capabilities in this area and enables users of our simulation tools to quantify, validate and verify their products and services in the presence of uncertainty and ensure the optimum factor of safety.
Simulation is already digital, right? Yes, but it’s increasingly encompassing the physical world too. Recently, thanks to the Internet of Things (IoT), the use of digital twins has boomed and will continue to do so. Here, engineers digitise information about a physical part, enabling them to analyse performance and monitor systems to head off problems with the real-world component or machine before they occur. Now that the technology has established itself and proven its worth, by helping to minimise downtime and the associated costs, adoption will continue on an upward trend.
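In miniature, a digital twin pairs a simulated expectation with live telemetry and flags drift between the two; the bearing-temperature model and thresholds below are invented for illustration:

```python
# A digital twin keeps a simulated expectation of a physical asset and
# flags drift between sensor readings and the model's prediction.

def predicted_temperature(rpm):
    # Simulated expectation: temperature rises roughly linearly with speed
    # (illustrative coefficients, not a real machine model).
    return 40.0 + 0.01 * rpm

def check(reading_c, rpm, tolerance_c=5.0):
    drift = reading_c - predicted_temperature(rpm)
    return "alert" if abs(drift) > tolerance_c else "ok"

# Stream of (rpm, measured temperature in C) from an IoT sensor.
telemetry = [(3000, 71.0), (3000, 70.5), (3000, 82.0)]
statuses = [check(temp, rpm) for rpm, temp in telemetry]
print(statuses)
```

The last reading runs 12 degrees hotter than the twin predicts, so maintenance can be scheduled before the real component fails, which is precisely the downtime-avoidance argument above.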
Then there’s augmented reality (AR) and virtual reality (VR) to consider. At present, engineers visualise their simulated designs on 2D screens, but with the acceleration and accessibility of VR and AR technology they will soon be visualising their designs in a 3D environment on an AR/VR headset such as those in the Oculus portfolio. Data will be easier to evaluate, and designs will be simpler to understand, edit and test, leading to a leaner, more effective process.
Design, testing, maintenance – simulation has a critical role in all of these activities. But what about when a product or part fails? How can simulation help then? By digitally transforming their operations, manufacturers can align all of their activities, from initial design through manufacturing to sale, and track and trace every part. So, if there were an issue with the brakes on a certain car model, it could be traced back to the original simulation design, the design reviewed and the flaw quickly identified. If the model is then recalled, the issue can be resolved more quickly and cost-effectively than through a manual testing process.
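A minimal sketch of track and trace: each part’s serial number links back to its design revision and simulation run, so a flaw found in one revision identifies exactly the parts to recall. All serial numbers and record fields are invented.

```python
# Track-and-trace: every manufactured part keeps a link back to the design
# revision and simulation run it came from (illustrative records only).

registry = {
    "BRK-0001": {"design_rev": "D12", "sim_run": "S-884", "batch": "B7"},
    "BRK-0002": {"design_rev": "D12", "sim_run": "S-884", "batch": "B7"},
    "BRK-0003": {"design_rev": "D13", "sim_run": "S-901", "batch": "B8"},
}

def trace(serial):
    # From a failed part back to its design and simulation provenance.
    return registry.get(serial)

def affected_parts(design_rev):
    # If a flaw is found in a design revision, recall only parts built from it.
    return sorted(s for s, rec in registry.items()
                  if rec["design_rev"] == design_rev)

print(trace("BRK-0001")["sim_run"])   # S-884
print(affected_parts("D12"))          # ['BRK-0001', 'BRK-0002']
```

The value is in scoping: the recall covers the two parts built from the flawed revision, not the whole fleet.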
This digital transformation is a big task for most businesses, but many have already begun. I expect this to accelerate in 2020 and big manufacturers to complete the journey before the year ends.
Simulation in new areas
Simulation is well-established in a number of areas of real-world multiphysics, and in 2020 we’ll see simulation software providers pushing the boundaries of the technology to solve problems in other areas of physics. Chemistry for healthcare is one area not currently widely catered to by multiphysics simulation that would benefit from it.
What might this look like in practice? Clinical trials for new drugs currently require testing on humans, but one day those trials could be conducted purely in simulation. The need to apply the drug to thousands of test subjects would be eliminated, and so too would the huge costs of the trials – and where trials cannot be conducted at all, for example on children, the applications for simulation are vast. Similarly, in healthcare, where a heart attack occurs because of a blood clot, simulation could be used to identify the correct drugs to thin the blood and resolve the clot.
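As a hint of the kind of model an in-silico trial builds on, here is the simplest possible drug model: a one-compartment exponential decay of plasma concentration after an intravenous dose. The dose, volume and half-life are illustrative numbers, not clinical data, and real physiological models are vastly more detailed.

```python
import math

def concentration(t_hours, dose_mg=100.0, volume_l=40.0, half_life_h=6.0):
    # One-compartment pharmacokinetics: C(t) = (dose / V) * exp(-k * t),
    # where k is the elimination rate constant derived from the half-life.
    k = math.log(2) / half_life_h
    return (dose_mg / volume_l) * math.exp(-k * t_hours)

c0 = concentration(0.0)   # concentration immediately after dosing (mg/L)
c6 = concentration(6.0)   # one half-life later: half of c0
print(c0, c6)
```

Scaling such models up to whole-body physiology, across thousands of virtual patients with varied parameters, is what would let a trial run in software instead of on test subjects.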
Innovate, innovate, innovate!
The very nature of technology today means that anyone can, overnight, create tools that replace those that have been established for years. A startup in Silicon Valley could invent a completely new way to complete a task or solve a problem and disrupt existing services. Many forgotten businesses of yesteryear failed to prepare for this risk. Instead of dismissing the slim chances of this happening, businesses must address the risk head on by innovating.
Don’t wait to be disrupted – disrupt yourself! The iPhone is an excellent example of what this looks like in practice. Apple disrupted its own products with even more advanced models. Simulation is no different; we strive to disrupt our own simulation through new technologies, and mitigate the chance of being matched.
Prith Banerjee, chief technology officer, ANSYS