Category Archives: Quantum and atomistic modelling
I am always happy to see the strong innovation legacy of the Nanotechnology Consortium that I ran from 2004-2010 grow in the Materials Studio releases. The leading-edge tools that the Consortium progressed from academic codes to commercial releases include ONETEP (linear-scaling DFT), QMERA (coupled electronic-atomistic modelling) as well as the new GULP (atomistic modelling including reactive forcefields) and DFTB+ (fast, tight-binding-based DFT). All have been further enhanced and are by now clearly a core part of the Dassault Systèmes discrete modelling package. Particularly pleasing is the recent release of the reaction kinetic Monte Carlo module Kinetix to the general public, about 10 years after it became available to Nanotechnology Consortium members. As other reaction kinetic MC tools have moved from academia to wider industry use (see e.g. Zacros), it is clear that the Nanotech Consortium and all companies that supported it were leading the innovation. I am curious to see where the next wave of Dassault Systèmes innovation in materials modelling is going to come from, as sadly the time of consortia seems to be over.
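To give a flavour of the kind of algorithm behind such reaction kinetic Monte Carlo tools, here is a minimal Gillespie-style stochastic simulation sketch. The reaction and rate constant are invented for illustration only; production tools such as Kinetix and Zacros implement far more sophisticated lattice-based variants.

```python
import math
import random

def gillespie(counts, reactions, t_max, seed=0):
    """Minimal Gillespie stochastic simulation (illustrative sketch).

    counts:    dict species -> molecule count
    reactions: list of (rate_constant, reactants, products),
               reactants/products as dicts species -> stoichiometry
    """
    rng = random.Random(seed)
    t = 0.0
    while t < t_max:
        # Propensity of each reaction: k times the number of
        # distinct reactant combinations currently available
        props = []
        for k, reactants, _ in reactions:
            a = k
            for sp, n in reactants.items():
                for i in range(n):
                    a *= max(counts[sp] - i, 0)
            props.append(a)
        total = sum(props)
        if total == 0:
            break  # no reaction can fire any more
        # Advance time by an exponentially distributed waiting time
        t += -math.log(rng.random()) / total
        # Pick a reaction with probability proportional to its propensity
        r = rng.random() * total
        for (k, reactants, products), a in zip(reactions, props):
            if r < a:
                for sp, n in reactants.items():
                    counts[sp] -= n
                for sp, n in products.items():
                    counts[sp] = counts.get(sp, 0) + n
                break
            r -= a
    return t, counts

# Invented example: unimolecular decay A -> B with rate constant 1.0
t_end, final = gillespie({"A": 1000, "B": 0},
                         [(1.0, {"A": 1}, {"B": 1})], t_max=10.0)
print(final)  # almost all A converted to B after 10 mean lifetimes
```

The essential idea, shared by the lattice-based tools, is that each step draws an exponentially distributed waiting time from the total rate and then selects one event in proportion to its individual rate.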
I recently carried out a survey on behalf of the Psi-k network of the European ab initio research community and the CECAM-UK-JCMaxwell Node. The full report can be accessed here, and below is an overview.
The report explores the interactions of the academic Psi-k community with industry and is based on a semi-quantitative survey and interviews of network members. The evidence is analysed in the context of a prior report on the economic impact of molecular modelling [i] as well as of a recent study into Science-to-Business (S-2-B) collaborations [ii] in general.
Pertinent findings of the economic impact report were that the dominant electronic structure method, Density Functional Theory (DFT), is the most widely accepted ‘molecular modelling’ method and that it has become established in the electronics industry. Also of significance are the faster-than-average growth in the number of patents that include DFT, and the growing interest in the potential of modelling among a wider circle of researchers in industry.
The S-2-B study [ii] emphasized the key role of the Principal Investigator (PI) in establishing and maintaining a satisfactory relationship, and the importance to industry of ‘soft’ objectives relative to outcomes with hard metrics.
All Psi-k board, working group and advisory group members, about 120 people in total, were invited to take part in the study, and 40 people responded, representing more than 400 scientists from 33 different institutions in 12 European countries. While it is acknowledged that this group will to some extent pre-select those with industry collaborations, the result that 90% of respondents work with industry is still significant. The main industry sectors of the collaborators are materials, electronics, automotive and aerospace, and software. Density functional theory is almost always used in industry collaborations, but classical and higher-level theory also feature strongly.
It was noted that the Psi-k network represents some of the most widely used electronic structure codes in the world. In fact, all electronic structure codes available in the leading commercial packages originate from Europe and are used at a few hundred industrial sites worldwide.
Psi-k groups that work with industry collaborate on average with 2-3 companies, typically on a long-term basis. It is interesting that small groups are just as likely to collaborate with industry as larger ones, and also with roughly the same number of companies. There is however a correlation between the number of collaborating companies and the number of alumni in industry positions, which is consistent with the observation of the S-2-B study that the role of the PI and the depth of the relationship are the dominant factors.
Considering the different forms of interactions, informal interactions dominated, followed by collaborative projects, consultancies and training. Collaborative projects were reported by 75% of respondents with on average one such project per team per year. Nearly 60% of respondents had consultancy and contract research projects, with an average of one such engagement per research team every 1-2 years. Training was least frequent but still more than 40% of respondents had training interactions in the last three years.
The main drivers for industry to collaborate are seen to be the expertise of the PI and access to new ideas and insights. As measures of success, new insights dominate, followed by achieving breakthroughs in R&D. On the other hand, despite a clear ROI, cost saving is not generally the driver for collaborations. Impact was often achieved by unveiling mechanisms that could explain observations on a fundamental level and that had previously not been known or properly understood. The new insights thereby helped to overcome long-standing misconceptions, leading to completely new ways of thinking and new research directions. Similarly, electronic structure calculations helped to scrutinize certain concepts or aspects of engineering models. Less frequent so far seems to be the determination of input parameters for these models. However, the ability of simulations to screen a large number of systems, which would be prohibitively expensive if done experimentally, also plays an important role.
The above evidence and mechanisms of success indicate that the Psi-k network is largely in line with S-2-B collaborations in general, for example in terms of the relationships, importance of PI and the typical ‘soft’ measures of success.
On the other hand, we can also see significant opportunities for further improvement. There is sincere interest as well as unmet need in industry. On the one hand, the gap between industry requirements and what can be delivered by today’s theories and simulations is widely acknowledged. On the other hand, there is plenty of evidence that important and impactful topics can be addressed with current methods. However, it takes a lot of time, effort and translation skills to identify and act upon these. Despite some activities by the network to further the exchange with industrial research, there is still too little common ground in terms of interactions, interests and language to develop the personal relationships that were found to be crucial for engagements between academics and industry.
However, we see evidence of successful mechanisms that can be built upon. These include utilising multiscale modelling approaches not only as a scientific endeavour but also as an opportunity to build a bridge in terms of communication and relationships. Also, relationships with industry at the level of Ph.D. training seem to be an effective mechanism not only to train scientists with the relevant skills and understanding but also to build long-term relationships between the academic centres and industry. Similarly, centres of excellence that are by their nature set up with industry involvement provide visibility and help to build relationships, although with the proviso [ii] that the single investigator can be the critical determinant.
[i] Goldbeck, G. The economic impact of molecular modelling. Goldbeck Consulting Limited, Available via https://gerhardgoldbeck.wordpress.com/2012/07/10/the-economic-impact-of-molecular-modelling-of-chemicals-and-materials/ (2012).
[ii] Boehm, D. N. & Hogan, T. Science-to-Business collaborations: A science-to-business marketing perspective on scientific knowledge commercialization. Industrial Marketing Management 42, 564–579 (2013).
The evidence for economic impact of molecular modelling of chemicals and materials is investigated, including the mechanisms by which impact is achieved and how it is measured.
Broadly following a model of transmission from the research base via industry to the consumer, the impact of modelling can be traced from (a) the authors of theories and models via (b) the users of modelling in science and engineering to (c) the research and development staff that utilise the information in the development of new products that benefit society at large.
The question is addressed to what extent molecular modelling is accepted as a mainstream tool that is useful, practical and accessible. A number of technology trends have contributed to increased applicability and acceptance in recent years, including
- Much increased capabilities of hardware and software.
- A convergence of actual technology scales with the scales that can be simulated by molecular modelling as a result of nanotechnology.
- Improved know-how and a focus in industry on cases where molecular simulation works well.
The acceptance level still varies depending on method and application area. Quantum chemistry methods have the highest level of acceptance, and fields with a strong overlap of requirements and method capabilities, such as electronics and catalysis, report strong impact both anecdotally and as measured by the size of the modelling community and the number of patents. The picture is somewhat more mixed in areas such as polymers and chemical engineering that rely more heavily on classical and mesoscale simulation methods.
A quantitative approach is attempted by considering available evidence of impact and transmission throughout the expanding circles of influence from the model author to the end product consumer. As indicators of the research base and its ability to transfer knowledge, data about the number of publications, their growth and impact relative to other fields are discussed. Patents and the communities of users and interested ‘consumers’ of modelling results, as well as the size and growth of the software industry provide evidence for transmission of impact further into industry and product development. The return on investment due to industrial R&D process improvements is a measure of the contribution to value creation and justifies determining the macroeconomic impact of modelling as a proportion of the impact of related disciplines such as chemistry and high performance computing. Finally the integration of molecular modelling with workflows for engineered and formulated products provides a direct link to the end consumer.
Key evidence gathered in these areas includes:
- The number of publications in modelling and simulation has been growing more strongly than the science average and has a citation impact considerably above the average.
- There is preliminary evidence for a strong rise in the number of patents, also as a proportion of the number of patents within the respective fields.
- The number of people involved with modelling has been growing strongly for more than a decade. A large user community has developed which is different from the original developer community, and there are more people in managerial and director positions with a background in modelling.
- The software industry has emerged from a ‘hype cycle’ into a phase of sustained growth.
- There is solid evidence for R&D process improvements that can be achieved by using modelling, with a return on investment in the range of 3:1 to 9:1.
- The macroeconomic impact has been estimated on the basis of data for the contribution of chemistry research to the UK economy. The preliminary figures suggest a value add equivalent to 1% of GDP.
- The integration with engineering workflows shows that molecular modelling forms a small but very important part of workflows that have produced very considerable returns on investment.
- E-infrastructures such as high-throughput modelling, materials informatics systems and high performance computing act as multipliers of impact. Molecular modelling is estimated to account for about 6% of the impact generated from high performance computing.
Finally, a number of existing barriers to impact are discussed including deficiencies in some of the methods, software interoperability, usability and integration issues, the need for databases and informatics tools as well as further education and training. These issues notwithstanding, this review found strong and even quantifiable evidence for the impact of modelling from the research base to economic benefits.
We acknowledge financial support from the University of Cambridge in the production of this report.
The atomistic modelling field has grown substantially over the last 10 years and reached a level of maturity that makes a more routine type of application and integration into engineering and product design a viable option. At the same time, product design has reached scales that are close to atomistic, and it also involves exploring an ever larger space of potential new materials across the periodic table.
Here is some evidence:
The growth of the simulation field was demonstrated very nicely by a recent study based on publications in the ab initio field by the Psi-k network. It shows a strong increase in the number of (unique) people publishing papers based on ab initio methods, from about 3,000 in 1991 to about 20,000 in 2009, with particularly strong growth in East Asia. If one adds people who use other techniques such as molecular dynamics, and researchers in industry who don’t publish their work, it should be safe to assume that there are more than 30,000 users of some sort of atomistic technique.
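Those figures imply roughly 11% compound annual growth in the ab initio author community, which a quick calculation confirms:

```python
# Compound annual growth rate implied by the Psi-k publication study:
# ~3,000 unique ab initio authors in 1991 growing to ~20,000 in 2009.
start, end = 3_000, 20_000
years = 2009 - 1991  # 18 years

cagr = (end / start) ** (1 / years) - 1
print(f"CAGR ≈ {cagr:.1%}")  # ≈ 11.1% per year
```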
This level of growth is also linked to the robustness of the codes and the speed of standard hardware. These, together with the experience that has been gained regarding the types of properties that can be calculated at a certain level of accuracy, have increased the impact of atomistic simulation in many industrial applications.
Also, atomistic techniques support the combinatorial exploration of the large materials phase space. For example, the iCatDesign project in the UK explored alloys for fuel cell catalysts, considering both the combination of different elements as well as structural aspects. The online library of binary alloys from the Energy Materials Lab at Duke is an example of structure calculations that aid in the discovery and development of new materials. Considering that ternary alloys are becoming more important in meeting complex requirements in high-performance applications such as aerospace and energy generation, and that only about 5% of ternaries are known, such modelling approaches will become even more relevant in new materials design. Also, in other areas such as polymer and composite design, early adopters are demonstrating the usefulness of integration; for example, Boeing reported that they “integrated molecular simulations into the materials design process” and that their work “demonstrates that the future of aerospace materials development includes simulation tools”.
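A back-of-the-envelope count shows why having only ~5% of ternaries known leaves so much room for computational exploration. Taking, as an illustrative assumption, around 80 practically usable metallic and metalloid elements (the exact number depends on which elements one admits), the number of candidate element combinations grows rapidly:

```python
from math import comb

n_elements = 80  # illustrative assumption: practically usable elements

binaries = comb(n_elements, 2)   # unordered element pairs
ternaries = comb(n_elements, 3)  # unordered element triples

print(f"{binaries} binary systems, {ternaries} ternary systems")
# If only ~5% of ternary systems have been characterised,
# roughly 95% remain unexplored:
print(f"~{int(0.95 * ternaries)} ternary systems still to explore")
```

And this counts only element combinations; each ternary system in turn spans a continuous composition range and many possible crystal structures, multiplying the search space further.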
Despite the growing importance and opportunity of a stronger integration of atomistic methods into engineering design, this is still an area in its infancy, but promoted strongly as part of a wider agenda such as Integrated Computational Materials Engineering (ICME). One of the key questions I am interested in is how the integration is actually achieved. For example, will integration of the modelling methods themselves be required, as in multiscale methods?
While multiscale methods are important for some applications, their significance for integration may be overrated, as was also concluded by the ICME report. Rather, the focus needs to be on a more detailed analysis of design workflows and their intersection with the information that can be determined well at the atomistic scale.
A design workflow typically includes a number of selection stages, at which decisions are made regarding materials and processes. These will be informed by available data from a number of sources and should include atomistic modelling where appropriate. This type of approach has been reported for example by Massimo Noro from Unilever, who talks about selection criteria as “emerging physico-chemical criteria we can evaluate in practice that help us select ingredients”. Also Oreste Todini from Procter & Gamble promotes the use of modelling in the decision process to come up with lead options for new formulations.
So there is evidence of an integrated design approach from early adopters such as Boeing, Unilever and Procter & Gamble. In order to establish integration more widely, engineering and science communities need to collaborate more closely. The atomistic simulations community needs to improve the way in which best practices are established, shared and linked with engineering workflows. Informatics frameworks are being established, for example with the integration of Materials Studio in Accelrys’ Pipeline Pilot platform, and projects such as iCatDesign and MosGrid. However, integration into engineering rather than chemistry platforms may be what is required.