Tag Archives: Multiscale Modeling
Industry Case Studies: combining discrete and continuum modelling to address industrial R&D challenges
Materials modelling is used today by a range of industries to improve efficiency and achieve breakthroughs in the development of new and improved materials and processes. The European Materials Modelling Council has developed a set of four case studies that demonstrate how industrial R&D problems have been addressed by integrating different types of materials models, and what technical benefits and business impacts were achieved as a result.
The case studies cover a diverse set of applications and industries, including chemical processing (Covestro), discovery of new functional materials (IMRA Europe), additive manufacturing of engine parts (MTU Aero Engines) and magnetic hard drive materials (Seagate):
- Identification of Solvents for Extractive Distillation
- Discovery of new thermoelectric materials
- Simulation of additive manufacturing of metallic components
- Integrated Recording Model for Heat Assisted Magnetic Recording (HAMR)
The case studies were compiled with the support of the EC Industrial Technologies Programme.
Here is the executive summary of a new report on the economic impact of materials modelling, co-authored with Christa Court from MRIGlobal in the framework of the European Materials Modelling Council (EMMC) and the International Materials Modelling Board (IM2B). The full text and the survey form are available here.
At the core of the report is an industry survey conducted during 2015 that provides corroboration for the indicators of research and development (R&D) process improvements found in earlier studies and new data relevant for quantitative economic analyses.
The survey is set in the context of an outline of metrics and methodologies that can be used to quantify the economic impacts of materials modelling from a variety of perspectives, including R&D and industry stakeholders and society at large. At the micro-economic level, performance indicators include financial metrics such as net present value (NPV), return on investment (ROI), and internal rate of return (IRR). Where sufficient data are available, micro-economic analyses can be extended to a more in-depth cost-benefit analysis. Finally, macro-economic modelling methodologies can be used to model the wider impacts of integrating materials modelling into the production function of various industries. Since materials modelling is a potentially disruptive technology, macro-economic impact assessment will likely require dynamic simulation models, which are scenario-specific and call for a high level of both problem-domain and modelling-domain knowledge.
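As a rough illustration (not taken from the report itself), the three micro-economic metrics can be sketched in a few lines of Python; the cash flows below are entirely hypothetical:

```python
# Illustrative sketch of NPV, ROI and IRR for a hypothetical modelling
# project: invest 100 now, recover 60 at the end of each of two years.

def npv(rate, cash_flows):
    """Net present value: cash_flows[t] is received t years from now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(gain, cost):
    """Simple return on investment: net gain relative to cost."""
    return (gain - cost) / cost

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return: the rate at which NPV is zero (bisection)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid          # NPV still positive: root lies above mid
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-100, 60, 60]            # hypothetical project cash flows
print(round(npv(0.05, flows), 2)) # NPV at a 5% discount rate
print(round(roi(120, 100), 2))    # ROI on a gross return of 120
print(round(irr(flows), 4))       # IRR of the project
```

The bisection in `irr` assumes the usual profile of an upfront investment followed by positive returns, so NPV decreases monotonically with the discount rate.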
Research impact is reviewed briefly based on bibliometrics, case studies, peer review, and economic analysis, using evidence gathered for a previous report as well as the recent UK Research Excellence Framework, which includes 15 cases involving materials modelling.
The study also investigates how materials modelling affects the industrial R&D process and outlines its value and potential for industrial research and innovation, competitiveness, and profitability, using examples from materials industries based on recent Integrated Computational Materials Engineering studies and a Computer-Aided Drug Design study, which demonstrated the usefulness of defining performance metrics for a modelling function in an industrial R&D organisation.
The survey analysis was based on information provided by 29 companies covering a wide range of sizes and industry sectors and an even distribution in terms of types and scales of modelling. The qualitative benefits identified in the responses were categorised into the following Key Performance Indicators: More efficient and targeted exploration; Deeper understanding; Broader exploration; R&D strategy development; Source of property data; Trouble shooting; Performance optimisation; Intellectual property protection; Value chain benefits; Improved communication and collaboration between R&D and production; Upscaling and market introduction as well as marketing benefits.
On a quantitative level, about 80% of companies reported innovation accomplishments, 60% cost savings, 35% job creation, and 30% revenue increases due to materials modelling. A wide variety of project sizes are represented, with total materials modelling investment (covering staff, software and hardware) ranging from €45K to €4M (average €1M, median €0.5M). Staff was the largest cost factor: the median costs of staff, software and hardware stand in the ratio 100:20:6. Cost savings due to the materials modelling projects ranged from €100K to €50M (average €12M, median €5M). The ROI, determined as the ratio of revenue generated to investment in modelling, ranged from 2 to 1000. Removing the largest and the smallest values yields an average ROI of 8. ROI was also found to grow more than linearly with investment in modelling.
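The trimmed-average calculation described above is simple enough to sketch; the ROI values below are hypothetical stand-ins, not the actual survey responses:

```python
# Sketch of a trimmed average: drop the single largest and smallest
# ROI value before averaging, as done for the survey figures above.
from statistics import mean

def trimmed_mean(values):
    """Average after removing one value from each end of the range."""
    if len(values) < 3:
        raise ValueError("need at least three values to trim both ends")
    return mean(sorted(values)[1:-1])

rois = [2, 4, 6, 8, 10, 12, 1000]  # hypothetical, spanning the 2-1000 range
print(mean(rois))                  # dominated by the outlier
print(trimmed_mean(rois))          # removing min and max gives 8
```

With a spread as extreme as 2 to 1000, trimming the extremes gives a far more representative central value than the raw mean.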
The European Multi-scale Modelling Cluster is holding a workshop on Interoperability in Multiscale Modeling of Nano-enabled Materials on 28-29 May 2015 at the University of Jyväskylä, Finland. The workshop programme has just been posted, and I look forward to exciting, wide-ranging and in-depth discussion about the way forward on topics such as materials modelling metadata, repositories, a harmonised approach to integration, multiscale modelling platform development and more. Hope to see many folks there.
The evidence for economic impact of molecular modelling of chemicals and materials is investigated, including the mechanisms by which impact is achieved and how it is measured.
Broadly following a model of transmission from the research base via industry to the consumer, the impact of modelling can be traced from (a) the authors of theories and models via (b) the users of modelling in science and engineering to (c) the research and development staff that utilise the information in the development of new products that benefit society at large.
The report addresses the question of the extent to which molecular modelling is accepted as a mainstream tool that is useful, practical and accessible. A number of technology trends have contributed to increased applicability and acceptance in recent years, including:
- Much increased capabilities of hardware and software.
- A convergence of actual technology scales with the scales that can be simulated by molecular modelling as a result of nanotechnology.
- Improved know-how and a focus in industry on cases where molecular simulation works well.
The acceptance level still varies by method and application area. Quantum chemistry methods enjoy the highest level of acceptance, and fields with a strong overlap between requirements and method capabilities, such as electronics and catalysis, report strong impact both anecdotally and as measured by the size of the modelling community and the number of patents. The picture is somewhat more mixed in areas such as polymers and chemical engineering, which rely more heavily on classical and mesoscale simulation methods.
A quantitative approach is attempted by considering available evidence of impact and transmission throughout the expanding circles of influence from the model author to the end product consumer. As indicators of the research base and its ability to transfer knowledge, data about the number of publications, their growth and impact relative to other fields are discussed. Patents and the communities of users and interested ‘consumers’ of modelling results, as well as the size and growth of the software industry provide evidence for transmission of impact further into industry and product development. The return on investment due to industrial R&D process improvements is a measure of the contribution to value creation and justifies determining the macroeconomic impact of modelling as a proportion of the impact of related disciplines such as chemistry and high performance computing. Finally the integration of molecular modelling with workflows for engineered and formulated products provides a direct link to the end consumer.
Key evidence gathered in these areas includes:
- The number of publications in modelling and simulation has been growing more strongly than the science average and has a citation impact considerably above the average.
- There is preliminary evidence for a strong rise in the number of patents, also as a proportion of the number of patents within the respective fields.
- The number of people involved with modelling has been growing strongly for more than a decade. A large user community has developed which is different from the original developer community, and there are more people in managerial and director positions with a background in modelling.
- The software industry has emerged from a ‘hype cycle’ into a phase of sustained growth.
- There is solid evidence for R&D process improvements that can be achieved by using modelling, with a return on investment in the range of 3:1 to 9:1.
- The macroeconomic impact has been estimated on the basis of data for the contribution of chemistry research to the UK economy. The preliminary figures suggest value added equivalent to 1% of GDP.
- The integration with engineering workflows shows that molecular modelling forms a small but crucial part of workflows that have produced very considerable returns on investment.
- E-infrastructures such as high-throughput modelling, materials informatics systems and high performance computing act as multipliers of impact. Molecular modelling is estimated to account for about 6% of the impact generated from high performance computing.
Finally, a number of existing barriers to impact are discussed including deficiencies in some of the methods, software interoperability, usability and integration issues, the need for databases and informatics tools as well as further education and training. These issues notwithstanding, this review found strong and even quantifiable evidence for the impact of modelling from the research base to economic benefits.
We acknowledge financial support from the University of Cambridge in the production of this report.
At the Techconnect Nanotech 2011 conference in Boston a couple of weeks ago, the emphasis was clearly on the ‘downstream’, i.e. realising the potential of nanotechnology in new and exciting products. I was impressed by the progress made at Nanocomp in manufacturing huge sheets and yarns from nanotubes for applications such as EMI shielding and heat straps. Having shared a lab in the nineties with folks wondering how one could ever process this stuff, the presentation brought home to me how far the field has developed in the last 15 years.
Going from large-scale applications to small, Tom Russell presented the latest in his quest to reach addressable arrays of 10 tera-dots per square inch by self-assembly of block copolymers. A fascinating journey: from the enhanced ordering obtained by solvent annealing, which gives grain sizes of about 20 microns (“not good enough”), via lithography-guided assembly (“still not good enough”), to spin-coated and solvent-annealed copolymer on faceted sapphire wafers, which eventually led to a cylinder phase perpendicular to the sapphire ridges with translational and orientational order persisting over centimetres! Looks like the next generation of memory devices is well on its way.
Big strides are also being made in catalysis. Nanostellar, who design new materials based on a so-called Rational Design Methodology that relies heavily on simulation, presented advances in diesel emission catalysts. It was interesting to hear from CEO Pankaj Dhingra that their focus has shifted from using modelling for wide-range screening to a more focussed application: uncovering the key selection criteria within a more targeted phase space, in this case strontium-doped lanthanum perovskites.
The downstream theme was also echoed in the modelling session. Apart from my talk about the ‘landscape’ of integration of atomistic simulation into engineering optimisation, which I’ll come to in another blog, Simon McGrother from CULGI highlighted some great successes of polymer and mesoscale modelling in product development. Despite that, he made the point that these methods have still not reached the ‘democratization’ that was anticipated ten years ago. Based on the growth figures of the modelling community presented in my previous blog, I would actually dispute that. Nevertheless, the impact on ‘downstream’ development and products remains limited, and that’s where I agree with Simon.
On the other hand, the engineering simulation community is showing an interest in molecular modelling, as highlighted in a presentation by Carlos Alguin, Head of the Nanotechnology Group at Autodesk, featuring some cool graphics based on the Maya software and the Molecular Maya toolkit. Clearly, the ease of use and interactivity of their design tools and the superb visualization have much to offer the molecular modelling community. The question, though, is how we achieve further awareness and utilisation of materials modelling back in the engineering world.
M.S.P. Shaffer, X.F. Fan and A.H. Windle, Dispersion and packing of carbon nanotubes, Carbon, Vol. 36, No. 11, pp. 1603-1612 (1998)
The atomistic modelling field has grown substantially over the last 10 years and has reached a level of maturity that makes more routine application and integration into engineering and product design a viable option. At the same time, product design has reached scales that are close to atomistic, and it increasingly involves exploring an ever larger space of potential new materials across the periodic table.
Here is some evidence:
The growth of the simulation field was demonstrated very nicely by a recent study based on publications in the ab initio field by the Psi-k network. It shows a strong increase in the number of (unique) people publishing papers based on ab initio methods, from about 3,000 in 1991 to about 20,000 in 2009, with particularly strong growth in East Asia. If one adds people who use other techniques such as molecular dynamics, and researchers in industry who don't publish their work, it should be safe to assume that there are more than 30,000 users of some sort of atomistic technique.
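As a back-of-the-envelope check, the Psi-k figures quoted above imply a compound annual growth rate of roughly 11%; a quick sketch:

```python
# Implied compound annual growth rate (CAGR) of the ab initio author
# community, using the two data points from the Psi-k study above.
n0, n1 = 3000, 20000           # unique authors in 1991 and 2009
years = 2009 - 1991            # 18-year span
cagr = (n1 / n0) ** (1 / years) - 1
print(f"{cagr:.1%} per year")  # roughly 11% annual growth
```

Sustained double-digit annual growth over nearly two decades is remarkable for any research community.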
This level of growth is also linked to the robustness of the codes and the speed of standard hardware. These factors, together with the experience gained regarding the types of properties that can be calculated at a given level of accuracy, have increased the impact of atomistic simulation in many industrial applications.
Atomistic techniques also support combinatorial exploration of the large materials phase space. For example, the iCatDesign project in the UK explored alloys for fuel cell catalysts, considering both the combination of different elements and structural aspects. The online library of binary alloys from the Energy Materials Lab at Duke is an example of structure calculations that aid the discovery and development of new materials. Considering that ternary alloys are becoming more important in meeting complex requirements in high-performance applications such as aerospace and energy generation, and that only about 5% of ternaries are known, such modelling approaches will become even more relevant in new materials design. In other areas such as polymer and composite design, early adopters are demonstrating the usefulness of integration: for example, Boeing reported that they “integrated molecular simulations into the materials design process” and that their work “demonstrates that the future of aerospace materials development includes simulation tools”.
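To get a feel for the size of this phase space, here is a back-of-the-envelope count; the figure of about 80 practically usable elements is my own assumption, not a number from the studies cited:

```python
# Rough count of the alloy composition space, ignoring stoichiometry
# and structure: just the number of distinct element combinations.
from math import comb  # Python 3.8+

elements = 80                      # assumption: ~80 practically usable elements
binaries = comb(elements, 2)       # possible element pairs
ternaries = comb(elements, 3)      # possible element triples
unexplored = round(ternaries * 0.95)  # "only about 5% of ternaries are known"
print(binaries, ternaries, unexplored)
```

Even before considering composition ratios or crystal structure, tens of thousands of ternary systems remain unexplored, which is exactly where high-throughput atomistic screening pays off.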
Despite the growing importance and opportunity of a stronger integration of atomistic methods into engineering design, this is still an area in its infancy, although it is being promoted strongly as part of wider agendas such as Integrated Computational Materials Engineering (ICME). One of the key questions I am interested in is how the integration is actually achieved. For example, will integration of the modelling methods themselves be required, as in multiscale methods?
While multiscale methods are important for some applications, their significance for integration may be overrated, as the ICME report also concluded. Rather, the focus needs to be on a more detailed analysis of design workflows and their intersection with the information that can be determined reliably at the atomistic scale.
A design workflow typically includes a number of selection stages, at which decisions are made regarding materials and processes. These will be informed by available data from a number of sources and should include atomistic modelling where appropriate. This type of approach has been reported for example by Massimo Noro from Unilever, who talks about selection criteria as “emerging physico-chemical criteria we can evaluate in practice that help us select ingredients”. Also Oreste Todini from Procter & Gamble promotes the use of modelling in the decision process to come up with lead options for new formulations.
So there is evidence of an integrated design approach from early adopters such as Boeing, Unilever and Procter & Gamble. In order to establish integration more widely, engineering and science communities need to collaborate more closely. The atomistic simulations community needs to improve the way in which best practices are established, shared and linked with engineering workflows. Informatics frameworks are being established, for example with the integration of Materials Studio in Accelrys’ Pipeline Pilot platform, and projects such as iCatDesign and MosGrid. However, integration into engineering rather than chemistry platforms may be what is required.