Materials Modelling


CEN Workshop on materials modelling terminology, classification and metadata

Following a proposal by a group of European scientists involved in materials modelling, CEN (the European Committee for Standardization) has announced a new workshop on the subject “Materials modelling terminology, classification and metadata”. It builds on many years of effort led by the European Commission and the European Materials Modelling Council (EMMC), as expressed in the Review of Materials Modelling (RoMM), whose sixth edition will be released in January 2017. The aim is to agree on a terminology and classification of materials models and to organise the description of materials modelling applications using a system referred to as MODA (Materials Modelling Data). A common terminology in materials modelling should make communication simpler and much more efficient and lower the barrier to utilising materials modelling. The end result will be the adoption of a CEN Workshop Agreement (CWA): a best-practice document that provides a basis for further standardisation efforts and input for the development of a future certification scheme.
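A MODA description is essentially structured metadata about a simulation: which user case it addresses, which chain of models is applied, and how the results are post-processed. As a purely illustrative sketch of what such a record could look like in machine-readable form (the field names below are invented for illustration, not the official CWA schema):

```python
# Hypothetical MODA-style metadata record for a two-step modelling workflow.
# Field names are illustrative only; the actual templates are defined in the
# CEN Workshop Agreement, not here.

moda_record = {
    "user_case": "Elastic response of an epoxy/carbon-fibre interphase",
    "chain": [
        {
            "model_type": "atomistic",   # electronic/atomistic/mesoscopic/continuum
            "model": "molecular dynamics",
            "materials_relation": "classical force field (assumed)",
            "solver": "MD engine, NPT ensemble",
            "post_processing": "stress-strain curve -> elastic constants",
        },
        {
            "model_type": "continuum",
            "model": "solid mechanics (FEM)",
            "materials_relation": "elastic constants from previous step",
            "solver": "finite element solver",
            "post_processing": "component-level stiffness",
        },
    ],
}

def summarise(record):
    """Return a one-line summary per modelling step in the chain."""
    return [f"{step['model_type']}: {step['model']}" for step in record["chain"]]

print(summarise(moda_record))
```

A shared, queryable structure of this kind is what makes workflows comparable across groups and software vendors, which is precisely the interoperability goal behind the terminology effort.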

A New European Network to Coordinate and Support the Industrial Uptake of Materials Modelling

In recognition of the importance of materials modelling for industrial innovation and the strength of Europe, a new Horizon 2020 project has been funded to augment and further boost the actions of the European Materials Modelling Council (EMMC). The new European Materials Modelling Council Coordination and Support Action (EMMC-CSA) includes 15 partners and is coordinated by TU Wien.

Goldbeck Consulting is part of the EMMC management team and leads Work Package 2 on Interoperability and Integration of materials modelling.

For further information, see the EMMC-CSA Press Release.


Boeing is moving ahead with integrating chemistry and materials modelling into the product life cycle

The EMMC Roadmap for Materials Modelling calls for a number of actions to increase the application and impact of materials modelling in industry. In its Objectives and Vision it states that “the ultimate goal is that materials modelling and simulation will become an integral part of product life cycle management in European industry” and that “in recent years, materials modelling of nano-scale phenomena, especially that based on discrete models (electronic/atomistic/mesoscopic) has developed rapidly. However, this has not yet led to the integration of these models as part of the industrial design tool chain of materials and products.” While this is true in general, it turns out that world-leading organisations are already implementing such integration.

An example of a company that seems to be well ahead of the curve is Boeing. In fact, this blog was triggered by a panel discussion that I participated in at the Predictive Materials Modelling workshop in Cambridge in early December 2015. The panel on aerospace applications was led by Airbus, who presented their elaborate work on virtual testing of aircraft frames. While the work is clearly very challenging in terms of the computational resolution required in the FEM models and the issues in identifying ‘hotspots’ etc., at this stage of development the materials are already well defined and no variation in material parameters is allowed or considered any more. So what about the actual materials development and its integration into this process, as outlined in the vision above?

Certainly Airbus's competitor Boeing has been very active in the materials modelling field, down to the chemistry level, for some time; see for example references [i] and [ii]. Rather than relying on traditional supply chain dynamics, Boeing has become involved in chemistry-based research in silico, thereby taking a proactive role in shaping its own future across all disciplines.

Three recent Boeing patents demonstrate the significance of the cornerstones of exploiting materials modelling at the industrial level, which are also highlighted in the EMMC roadmap: (a) materials modelling has developed to a point where it can make an impact on real industrial problems; (b) multi-scale modelling workflows are key to realising that impact; (c) it is important for business efficiency and effectiveness to integrate information gained down to the chemistry level into wider information management and business decision support systems.

Testimony to (a) is Patent US20150080494 (filing date: 4 Feb 2014), “Fiber-reinforced resin composites and methods of making the same”. It deals with the efficiency of load transfer between the fiber and the surrounding matrix at the micro-scale, which may directly affect the overall mechanical performance of the composite at the continuum level. “The region of the matrix that may be substantially affected by the presence of fibers, sometimes referred to as the “interphase” region, is the interfacial area of the matrix directly surrounding the fiber. In composites, this interphase region may experience high shear strain due to the mismatch in elastic stiffness between the fibers and the surrounding matrix. Widely-used conventional bulk resins may not provide desirable distortional capabilities.” The patent claims superior performance of resins developed with the help of atomistic materials modelling. This performance improvement could translate into substantially more efficient load bearing and a correspondingly lower weight of the aircraft frame.

Testimony to (b) is Patent US 08862437 (Application date: 30 Mar 2010) on “Multi-scale modeling of composite structures”. The following patent abstract is a bit hard to read but basically seems to claim that there is a controlled, deterministic relationship between composite performance and materials/chemical structure at various levels, as calculated by modelling: “A method, apparatus, and computer program product are present for creating a composite structure. A number of characteristics for a number of components for the composite structure is obtained from a simulation of the composite structure using a model of the composite structure. A number of changes in the number of characteristics needed to meet a desired level of performance for the number of characteristics is ascertained. A number of attributes for a number of composite materials used to form the number of components corresponding to the number of characteristics having the number of changes is identified. The number of attributes for the number of composite materials for the number of characteristics having the number of changes based on the desired level of performance is changed.”
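Stripped of the patent language, the claimed method is a closed loop: simulate the structure, compare the resulting characteristics with the desired performance, map the shortfall back onto material attributes, and update. A minimal sketch of that loop, with a toy linear response standing in for the actual multi-scale simulation (all function names, numbers and the sensitivity model here are illustrative assumptions, not Boeing's method):

```python
# Closed-loop design sketch: characteristics from a simulation are compared
# with targets, and the gap is mapped back to material attributes.

def simulate(attributes):
    # Stand-in for a multi-scale simulation: a toy linear response.
    return {"stiffness": 2.0 * attributes["fibre_fraction"]}

def required_changes(characteristics, targets):
    # How far each characteristic is from its desired level.
    return {k: targets[k] - characteristics[k] for k in targets}

def update_attributes(attributes, changes, sensitivity=2.0):
    # Map a characteristic shortfall back onto an attribute change.
    updated = dict(attributes)
    updated["fibre_fraction"] += changes["stiffness"] / sensitivity
    return updated

attrs = {"fibre_fraction": 0.4}
targets = {"stiffness": 1.2}
for _ in range(5):
    chars = simulate(attrs)
    delta = required_changes(chars, targets)
    if abs(delta["stiffness"]) < 1e-9:
        break
    attrs = update_attributes(attrs, delta)

print(round(attrs["fibre_fraction"], 3))  # → 0.6
```

With a linear toy model the loop converges in one correction; in practice the interesting (and hard) part is the `simulate` step, which the patent places across multiple scales.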

Testimony to (c) is Patent WO2015060960 (filed 18 Sep 2014), “Product Chemical Profile System”. The abstract describes a system that is able to pull together and query all levels of information about a product down to the chemistry level: “A computer-implemented system and method for obtaining product related information obtained from a plurality of different sources that is transformed into processed product data with a plurality of levels. Callouts and contexts are identified and a product-to-chemical continuum is generated by creating callout-context pathway segments between the plurality of levels of the processed product data based on the callouts and contexts identified and a transformed query request is generated used to traverse the product-to-chemical continuum through the callout-context pathway segments that span the plurality of levels. The product information that matches the set of context search parameters is extracted from the product-to-chemical continuum. The callout context pathway segments reduce processing resources and time needed to obtain the product information.”
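One way to picture the “product-to-chemical continuum” is as a layered graph in which the pathway segments are edges linking items across levels, and a query simply walks those edges from the product down to the chemistry. A hedged sketch under that assumption (the data and structure are invented for illustration; the patent does not publish a schema):

```python
# Illustrative layered graph from product level down to chemical level.
# Each key maps an item to the items it is composed of at the next level.

segments = {
    "aircraft": ["wing_assembly"],
    "wing_assembly": ["skin_panel"],
    "skin_panel": ["epoxy_cfrp"],
    "epoxy_cfrp": ["epoxy_resin", "carbon_fibre"],
}

def chemicals_for(item, graph):
    """Depth-first traversal down to the leaf (chemical) level."""
    children = graph.get(item)
    if not children:                  # leaf: a chemical-level entry
        return [item]
    found = []
    for child in children:
        found.extend(chemicals_for(child, graph))
    return found

print(chemicals_for("aircraft", segments))  # → ['epoxy_resin', 'carbon_fibre']
```

Pre-computing such pathway segments, rather than joining the source databases at query time, is presumably what the abstract means by reducing the “processing resources and time needed to obtain the product information”.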

Schematic illustrating the Product Chemical Profile System. From Patent WO2015060960


These patents are a clear recognition of the relevance and importance of materials modelling and of a more integrated approach to engineering. The question remains, however, how to tear down the barriers preventing its wider exploitation across the whole community. That is what the EMMC Roadmap and current and forthcoming Horizon 2020 actions aim to address.

[i] A. Browning, “Utilization of Molecular Simulations in Aerospace Materials: Simulation of Thermoset Resin/Graphite Interactions,” Proceedings of AIChE Fall Annual Meeting, 2009.

[ii] C. K. Knox, J. W. Andzelm, J. L. Lenhart, A. R. Browning and S. Christensen, “High Strain Rate Mechanical Behavior of Epoxy Networks from Molecular Dynamics Simulations,” Proceedings of the 27th Army Science Conference, Orlando, FL, GP-09, December 2010.

Interoperability in Multiscale Modeling of Nano-enabled Materials

The European Multi-scale Modelling Cluster is going to hold a workshop on Interoperability in Multiscale Modeling of Nano-enabled Materials on 28–29 May 2015 at the University of Jyväskylä, Finland. The workshop programme has just been posted, and I look forward to exciting, wide-ranging and in-depth discussions about the way forward on topics such as materials modelling metadata, repositories, a harmonised approach to integration, multiscale modelling platform development and more. Hope to see many folks there.

The economic impact of molecular modelling of chemicals and materials

Here is the executive summary of an extensive report from Goldbeck Consulting which is available here (The economic impact of modelling) or by request from info@goldbeck-consulting.com.

The evidence for economic impact of molecular modelling of chemicals and materials is investigated, including the mechanisms by which impact is achieved and how it is measured.

Broadly following a model of transmission from the research base via industry to the consumer, the impact of modelling can be traced from (a) the authors of theories and models via (b) the users of modelling in science and engineering to (c) the research and development staff that utilise the information in the development of new products that benefit society at large.

The report addresses the question of the extent to which molecular modelling is accepted as a mainstream tool that is useful, practical and accessible. A number of technology trends have contributed to increased applicability and acceptance in recent years, including

  • Greatly increased capabilities of hardware and software.
  • A convergence, driven by nanotechnology, of actual technology scales with the scales that can be simulated by molecular modelling.
  • Improved know-how and a focus in industry on cases where molecular simulation works well.

The acceptance level still varies with method and application area. Quantum chemistry methods enjoy the highest level of acceptance, and fields with a strong overlap between requirements and method capabilities, such as electronics and catalysis, report strong impact, both anecdotally and as measured by the size of the modelling community and the number of patents. The picture is somewhat more mixed in areas such as polymers and chemical engineering, which rely more heavily on classical and mesoscale simulation methods.

A quantitative approach is attempted by considering available evidence of impact and transmission throughout the expanding circles of influence from the model author to the end-product consumer. As indicators of the research base and its ability to transfer knowledge, data about the number of publications, their growth and their impact relative to other fields are discussed. Patents and the communities of users and interested ‘consumers’ of modelling results, as well as the size and growth of the software industry, provide evidence of impact being transmitted further into industry and product development. The return on investment due to industrial R&D process improvements is a measure of the contribution to value creation and justifies estimating the macroeconomic impact of modelling as a proportion of the impact of related disciplines such as chemistry and high performance computing. Finally, the integration of molecular modelling with workflows for engineered and formulated products provides a direct link to the end consumer.

Key evidence gathered in these areas includes:

  • The number of publications in modelling and simulation has been growing more strongly than the science average and has a citation impact considerably above the average.
  • There is preliminary evidence for a strong rise in the number of patents, also as a proportion of the number of patents within the respective fields.
  • The number of people involved with modelling has been growing strongly for more than a decade. A large user community has developed which is different from the original developer community, and there are more people in managerial and director positions with a background in modelling.
  • The software industry has emerged from a ‘hype cycle’ into a phase of sustained growth.
  • There is solid evidence for R&D process improvements that can be achieved by using modelling, with a return on investment in the range of 3:1 to 9:1.
  • The macroeconomic impact has been estimated on the basis of data for the contribution of chemistry research to the UK economy. The preliminary figures suggest a value add equivalent to 1% of GDP.
  • The integration with engineering workflows shows that molecular modelling forms a small but very important part of workflows that have produced very considerable returns on investment.
  • E-infrastructures such as high-throughput modelling, materials informatics systems and high performance computing act as multipliers of impact. Molecular modelling is estimated to account for about 6% of the impact generated from high performance computing.

Finally, a number of existing barriers to impact are discussed including deficiencies in some of the methods, software interoperability, usability and integration issues, the need for databases and informatics tools as well as further education and training. These issues notwithstanding, this review found strong and even quantifiable evidence for the impact of modelling from the research base to economic benefits.

We acknowledge financial support from the University of Cambridge in the production of this report.

Nanotech going downstream

At the Techconnect Nanotech 2011 conference in Boston a couple of weeks ago, the emphasis was clearly on the ‘downstream’, i.e. realising the potential of nanotechnology in new and exciting products. I was impressed by the progress made at Nanocomp in manufacturing huge sheets and yarns from nanotubes for applications such as EMI shielding and heat straps. Having shared a lab in the nineties with folks wondering how one could process this stuff [1], the presentation brought home how far the field has developed in the last 15 years.

Going from the large-size applications to the small, Tom Russell presented the latest in his quest to reach addressable arrays of 10 tera-dots per square inch by self-assembly of block copolymers. It is a fascinating journey: from the enhanced ordering obtained by solvent annealing, which gives grain sizes of about 20 microns (“not good enough”), via lithography-guided assembly (“still not good enough”), to spin-coated and solvent-annealed copolymer on faceted sapphire wafers, which eventually led to a cylinder phase perpendicular to the sapphire ridges with translational and orientational order persisting over centimetres! It looks like the next generation of memory devices is well on its way.

Big strides are also being made in catalysis. Nanostellar, who design new materials based on a so-called Rational Design Methodology that relies heavily on simulation, presented advances in diesel emission catalysts. It was interesting to hear from CEO Pankaj Dhingra that their focus has changed from using modelling for wide-range screening to a more focussed application: uncovering the key selection criteria within a more targeted phase space, in this case strontium-doped lanthanum perovskites.

The downstream theme was also echoed in the modelling session. Apart from my talk about the ‘landscape’ of integration of atomistic simulation into engineering optimisation, which I’ll come to in another blog, Simon McGrother from CULGI highlighted some great successes of polymer and mesoscale modelling in product development.  Despite that, he made the point that these methods have still not reached the ‘democratization’ that was anticipated ten years ago. Based on the growth figures of the modelling community presented in my previous blog, I would actually dispute that. Nevertheless, the impact on ‘downstream’ development and products remains limited, and that’s where I agree with Simon.

On the other hand, the engineering simulation community is showing an interest in molecular modelling, as highlighted in a presentation by Carlos Alguin, Head of the Nanotechnology Group at Autodesk, with some cool graphics based on the Maya software and the Molecular Maya toolkit. Clearly, the ease of use and interactivity of their design tools, and the superb visualisation, have much to offer the molecular modelling community. The question, though, is how we achieve further awareness and utilisation of materials modelling back in the engineering world.

[1] M.S.P Shaffer, X.F. Fan and A.H. Windle, Dispersion and packing of carbon nanotubes, Carbon, Vol. 36, No. 11, pp.1603-1612 (1998)

Integration of atomistic modelling into product design

The atomistic modelling field has grown substantially over the last 10 years and has reached a level of maturity that makes more routine application and integration into engineering and product design a viable option. At the same time, product design has reached scales that are close to atomistic, and it also involves exploring an ever larger space of potential new materials across the periodic table.

Here is some evidence:

The growth of the simulation field was demonstrated very nicely by a recent study based on publications in the ab initio field by the Psi-k network. It shows a strong increase in the number of (unique) people publishing papers based on ab initio methods from about 3000 in 1991 to about 20,000 in 2009, with particularly strong growth in East Asia. If one adds people who use other techniques such as molecular dynamics, and researchers in industry that don’t publish their work, it should be safe to assume that there are more than 30,000 users of some sort of atomistic technique.

This level of growth is also linked to the robustness of the codes and the speed of standard hardware. These factors, together with the experience gained regarding the types of properties that can be calculated at a given level of accuracy, have increased the impact of atomistic simulation in many industrial applications.

Also, atomistic techniques support the combinatorial exploration of the large materials phase space. For example, the iCatDesign project in the UK explored alloys for fuel cell catalysts, considering both the combination of different elements and structural aspects. The online library of binary alloys from the Energy Materials Lab at Duke is an example of structure calculations that aid the discovery and development of new materials. Considering that ternary alloys are becoming more important in meeting complex requirements in high-performance applications such as aerospace and energy generation, and that only about 5% of ternaries are known, such modelling approaches will become even more relevant to new materials design. In other areas such as polymer and composite design, early adopters are also demonstrating the usefulness of integration; for example, Boeing reported that they “integrated molecular simulations into the materials design process” and that their work “demonstrates that the future of aerospace materials development includes simulation tools”.

Despite the growing importance and opportunity of a stronger integration of atomistic methods into engineering design, this is still an area in its infancy, but promoted strongly as part of a wider agenda such as Integrated Computational Materials Engineering (ICME). One of the key questions I am interested in is how the integration is actually achieved. For example, will integration of the modelling methods themselves be required, as in multiscale methods?

While multiscale methods are important for some applications, their significance for integration may be overrated, as was also concluded by the ICME report. Rather, the focus needs to be on a more detailed analysis of design workflows and their intersection with the information that can be determined well at the atomistic scale.

A design workflow typically includes a number of selection stages at which decisions are made regarding materials and processes. These decisions are informed by available data from a number of sources and should include atomistic modelling where appropriate. This type of approach has been reported, for example, by Massimo Noro from Unilever, who talks about selection criteria as “emerging physico-chemical criteria we can evaluate in practice that help us select ingredients”. Oreste Todini from Procter & Gamble likewise promotes the use of modelling in the decision process to come up with lead options for new formulations.
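Conceptually, each such selection stage is just a filter: candidates that fail a physico-chemical criterion drop out before the next stage, and modelling can supply some of the property values the criteria test. A minimal sketch of such a staged workflow, with invented candidates, properties and thresholds:

```python
# Staged selection workflow: candidates are filtered through successive
# physico-chemical criteria. All names and numbers are invented examples;
# in practice some properties would come from atomistic simulation.

candidates = [
    {"name": "A", "solubility": 0.8, "tg": 110},
    {"name": "B", "solubility": 0.2, "tg": 150},
    {"name": "C", "solubility": 0.9, "tg": 95},
]

stages = [
    ("soluble enough", lambda c: c["solubility"] >= 0.5),
    ("Tg above 100 C", lambda c: c["tg"] > 100),
]

shortlist = candidates
for label, criterion in stages:
    shortlist = [c for c in shortlist if criterion(c)]

print([c["name"] for c in shortlist])  # → ['A']
```

The point of the sketch is that atomistic modelling does not need to drive the whole workflow; it only needs to feed reliable values into the criteria evaluated at each stage.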

So there is evidence of an integrated design approach from early adopters such as Boeing, Unilever and Procter & Gamble. In order to establish integration more widely, the engineering and science communities need to collaborate more closely. The atomistic simulation community needs to improve the way in which best practices are established, shared and linked with engineering workflows. Informatics frameworks are being established, for example with the integration of Materials Studio into Accelrys' Pipeline Pilot platform, and in projects such as iCatDesign and MosGrid. However, integration into engineering rather than chemistry platforms may be what is required.