An article has just appeared that summarizes, with case examples, some important lessons for the application of computational methods in drug design. I write about it here because it also carries important messages for the materials modelling and design field. There are differences, of course, since materials innovation is not always about designing new materials. Nevertheless, some of the key points remain valid and are at least worth considering. I particularly like the ‘principle of parsimony’, and also the conclusion about the importance of good software design, which is much needed in the materials field as well.
Here are some key quotes and extracts from the paper.
The value of qualitative statements. Frequently, a single new idea or a pointer in a new direction is sufficient guidance for a project team. Most project impact comes from qualitative work, from sharing an insight or a hypothesis rather than a calculated number or a priority order. The importance of this observation cannot be overrated in a field that has invested enormously in quantitative prediction methods. We believe that quantitative prediction alone is a misleading mission statement for molecular design. …
Shaping chemical space. At any given point during a project, a team’s focus is either on expanding chemical space or on narrowing it down, for different aspects of problem solving and optimization. Broadening chemical space requires methods that create new ideas within a set of constraints. … Narrowing down chemical space can be a simple filtering process or can be based on a specific hypothesis. Within a given project context, it is important to understand whether it is required to broaden or narrow down chemical space and to choose tools and approaches accordingly. As projects progress towards candidate selection, the “amplitudes” of narrowing and broadening space typically become smaller, but the concept stays the same.
The principle of parsimony. Molecular design is a conceptual process and therefore always at risk of losing touch with reality. The scientific questions should lead to the method, and not vice versa. To achieve this, it is a helpful guiding principle to keep things as simple as possible. Choosing the simplest possible explanation and the simplest possible computational protocol leads to agility and to a better focus on the key questions at hand. …
Annotation is half the battle. … Contextual information can add value almost anywhere. A good deal of frontloading work—computational, organizational—is often required to bring data into a useful shape. Proper frontloading work can turn sophisticated queries into simple lookup processes or visualization steps. There is a significant growth potential in this area.
Staying close to experiment. One way of keeping things as simple as possible is to preferentially utilize experimental data that may support a project, wherever this is meaningful. … Rational drug design has a lot to do with clever recycling. If consistently applied, these guidelines have significant implications for the current practice of molecular design.
Let us look at some of the more problematic aspects as well. Many computational methods introduce additional parameters and thus potential sources of error that make the predictive value harder to extract. …
What is special about molecular design is the need to build solid hypotheses and to simultaneously foster creative thinking in medicinal chemistry. If we accept this, our focus may shift from the many semi-quantitative prediction tools that we have to methods supporting this creative process. Further improvements in computational methods may then have less to do with science than with good software engineering and interface design. The tools are just a means to an end. Good science is what happens when they are appropriately employed.
A Real-World Perspective on Molecular Design. Bernd Kuhn, Wolfgang Guba, Jérôme Hert, David W. Banner, Caterina Bissantz, Simona Maria Ceccarelli, Wolfgang Haap, Matthias Körner, Andreas Kuglstatter, Christian D. Lerner, Patrizio Mattei, Werner Neidhart, Emmanuel Pinard, Markus G. Rudolph, Tanja Schulz-Gasch, Thomas J. Woltering, and Martin Stahl, J. Med. Chem., DOI: 10.1021/acs.jmedchem.5b01875 (published online 15 Feb 2016).
The evidence for the economic impact of molecular modelling of chemicals and materials is investigated, including the mechanisms by which impact is achieved and how it is measured.
Broadly following a model of transmission from the research base via industry to the consumer, the impact of modelling can be traced from (a) the authors of theories and models via (b) the users of modelling in science and engineering to (c) the research and development staff that utilise the information in the development of new products that benefit society at large.
The extent to which molecular modelling is accepted as a mainstream tool that is useful, practical and accessible is also addressed. A number of technology trends have contributed to increased applicability and acceptance in recent years, including:
- Much increased capabilities of hardware and software.
- A convergence, driven by nanotechnology, of the scales of actual technology with the scales that can be simulated by molecular modelling.
- Improved know-how and a focus in industry on cases where molecular simulation works well.
The acceptance level still varies with method and application area. Quantum chemistry methods have the highest level of acceptance, and fields where requirements and method capabilities overlap strongly, such as electronics and catalysis, report strong impact both anecdotally and as measured by the size of the modelling community and the number of patents. The picture is somewhat more mixed in areas such as polymers and chemical engineering, which rely more heavily on classical and mesoscale simulation methods.
A quantitative approach is attempted by considering available evidence of impact and transmission throughout the expanding circles of influence from the model author to the end-product consumer. As indicators of the research base and its ability to transfer knowledge, data about the number of publications, their growth and their impact relative to other fields are discussed. Patents and the communities of users and interested ‘consumers’ of modelling results, as well as the size and growth of the software industry, provide evidence for the transmission of impact further into industry and product development. The return on investment due to industrial R&D process improvements is a measure of the contribution to value creation and justifies determining the macroeconomic impact of modelling as a proportion of the impact of related disciplines such as chemistry and high performance computing. Finally, the integration of molecular modelling with workflows for engineered and formulated products provides a direct link to the end consumer.
Key evidence gathered in these areas includes:
- The number of publications in modelling and simulation has been growing more strongly than the science average and has a citation impact considerably above the average.
- There is preliminary evidence for a strong rise in the number of patents, also as a proportion of the number of patents within the respective fields.
- The number of people involved with modelling has been growing strongly for more than a decade. A large user community has developed which is different from the original developer community, and there are more people in managerial and director positions with a background in modelling.
- The software industry has emerged from a ‘hype cycle’ into a phase of sustained growth.
- There is solid evidence for R&D process improvements that can be achieved by using modelling, with a return on investment in the range of 3:1 to 9:1.
- The macroeconomic impact has been estimated on the basis of data for the contribution of chemistry research to the UK economy. The preliminary figures suggest added value equivalent to 1% of GDP.
- The integration with engineering workflows shows that molecular modelling forms a small but important part of workflows that have produced very considerable returns on investment.
- E-infrastructures such as high-throughput modelling, materials informatics systems and high performance computing act as multipliers of impact. Molecular modelling is estimated to account for about 6% of the impact generated from high performance computing.
Finally, a number of existing barriers to impact are discussed, including deficiencies in some of the methods, software interoperability, usability and integration issues, and the need for databases and informatics tools as well as further education and training. These issues notwithstanding, this review found strong and even quantifiable evidence for the impact of modelling, from the research base through to economic benefits.
We acknowledge financial support from the University of Cambridge in the production of this report.
At the Techconnect Nanotech 2011 conference in Boston a couple of weeks ago, the emphasis was clearly on the ‘downstream’, i.e. realising the potential of nanotechnology in new and exciting products. I was impressed by the progress made at Nanocomp in manufacturing huge sheets and yarns from nanotubes for applications such as EMI shielding and heat straps. Having shared a lab in the nineties with folks wondering how one could process this stuff, I came away struck by how far the field has developed in the last 15 years.
Going from large-scale applications to the small, Tom Russell presented the latest in his quest to reach addressable arrays of 10 tera-dots per square inch by self-assembly of block copolymers. A fascinating journey: from the enhanced ordering obtained by solvent annealing, which gives grain sizes of about 20 microns (“not good enough”), via lithography-guided assembly (“still not good enough”), to spin-coated and solvent-annealed copolymer on faceted sapphire wafers, which eventually led to a cylinder phase perpendicular to the sapphire ridges with translational and orientational order persisting over centimetres! It looks like the next generation of memory devices is well on its way.
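To get a feel for what that target means, here is my own back-of-the-envelope arithmetic (the square-lattice assumption is mine, not a figure from the talk): 10 tera-dots per square inch corresponds to a dot pitch of roughly 8 nm.

```python
# Back-of-the-envelope check of the dot pitch implied by 10 tera-dots per square inch.
# The simple square-lattice geometry is my own simplifying assumption.
dots_per_sq_inch = 10e12            # 10 tera-dots per square inch
inch_in_nm = 2.54e7                 # 1 inch = 2.54e7 nm
area_per_dot_nm2 = inch_in_nm**2 / dots_per_sq_inch
pitch_nm = area_per_dot_nm2 ** 0.5  # centre-to-centre spacing on a square lattice
print(f"Area per dot: {area_per_dot_nm2:.0f} nm^2, pitch: {pitch_nm:.1f} nm")
# -> about 65 nm^2 per dot, i.e. a pitch of roughly 8 nm
```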
Big strides are also being made in catalysis. Nanostellar, who design new materials based on a so-called Rational Design Methodology that relies heavily on simulation, presented advances in diesel emission catalysts. It was interesting to hear from CEO Pankaj Dhingra that their emphasis has shifted from using modelling for wide-ranging screening to a more targeted application: uncovering the key selection criteria within a narrower phase space, in this case strontium-doped lanthanum perovskites.
The downstream theme was also echoed in the modelling session. Apart from my talk about the ‘landscape’ of integration of atomistic simulation into engineering optimisation, which I’ll come to in another blog, Simon McGrother from CULGI highlighted some great successes of polymer and mesoscale modelling in product development. Despite that, he made the point that these methods have still not reached the ‘democratization’ that was anticipated ten years ago. Based on the growth figures of the modelling community presented in my previous blog, I would actually dispute that. Nevertheless, the impact on ‘downstream’ development and products remains limited, and that’s where I agree with Simon.
On the other hand, the engineering simulation community is showing an interest in molecular modelling, as highlighted in a presentation by Carlos Alguin, Head of the Nanotechnology Group at Autodesk, with some cool graphics based on the Maya software and the Molecular Maya toolkit. Clearly, the ease of use and interactivity of their design tools and the superb visualization have much to offer the molecular modelling community. The question, though, is how we achieve further awareness and utilisation of materials modelling back in the engineering world.
M.S.P. Shaffer, X.F. Fan and A.H. Windle, Dispersion and packing of carbon nanotubes, Carbon, Vol. 36, No. 11, pp. 1603-1612 (1998).
The atomistic modelling field has grown substantially over the last 10 years and has reached a level of maturity that makes more routine application and integration into engineering and product design a viable option. At the same time, product design has reached length scales that are close to atomistic, and it increasingly involves exploring an ever larger space of potential new materials across the periodic table.
Here is some evidence:
The growth of the simulation field was demonstrated very nicely by a recent study by the Psi-k network based on publications in the ab initio field. It shows a strong increase in the number of (unique) people publishing papers based on ab initio methods, from about 3,000 in 1991 to about 20,000 in 2009, with particularly strong growth in East Asia. If one adds people who use other techniques such as molecular dynamics, as well as researchers in industry who do not publish their work, it is safe to assume that there are more than 30,000 users of some sort of atomistic technique.
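For a rough sense of what those Psi-k figures imply, a quick calculation of my own (using only the numbers quoted above) gives an average growth rate of around 11% per year:

```python
# Implied average annual growth rate of the ab initio author community,
# using only the figures quoted above (about 3,000 authors in 1991, 20,000 in 2009).
authors_1991, authors_2009 = 3_000, 20_000
years = 2009 - 1991
cagr = (authors_2009 / authors_1991) ** (1 / years) - 1
print(f"Implied average growth rate: {cagr:.1%} per year")  # roughly 11%
```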
This level of growth is also linked to the robustness of the codes and the speed of standard hardware. These factors, together with the experience that has been gained about the types of properties that can be calculated at a given level of accuracy, have increased the impact of atomistic simulation in many industrial applications.
Atomistic techniques also support the combinatorial exploration of the large materials phase space. For example, the iCatDesign project in the UK explored alloys for fuel cell catalysts, considering both the combination of different elements and structural aspects. The online library of binary alloys from the Energy Materials Lab at Duke is an example of structure calculations that aid the discovery and development of new materials. Given that ternary alloys are becoming more important in meeting complex requirements in high-performance applications such as aerospace and energy generation, and that only about 5% of ternaries are known, such modelling approaches will become even more relevant in new materials design. In other areas, such as polymer and composite design, early adopters are also demonstrating the usefulness of integration; for example, Boeing reported that they "integrated molecular simulations into the materials design process" and that their work "demonstrates that the future of aerospace materials development includes simulation tools".
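To put the ternary figure in perspective, here is a small illustrative sketch of how quickly the compositional space grows with the number of components. The pool of roughly 80 candidate elements is my own assumption for illustration, not a figure from the projects mentioned above.

```python
from math import comb

# Illustrative count of possible alloy systems by composition only
# (ignoring stoichiometry and structure). The pool of ~80 candidate
# elements is an assumption for illustration.
n_elements = 80
binaries = comb(n_elements, 2)   # 3,160 binary systems
ternaries = comb(n_elements, 3)  # 82,160 ternary systems
print(f"Binary systems:  {binaries:,}")
print(f"Ternary systems: {ternaries:,}")
# If only ~5% of ternaries are known, roughly 78,000 systems remain unexplored.
```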
Despite the growing importance of, and opportunity for, stronger integration of atomistic methods into engineering design, this area is still in its infancy, although it is being promoted strongly as part of wider agendas such as Integrated Computational Materials Engineering (ICME). One of the key questions I am interested in is how the integration is actually achieved. For example, will integration of the modelling methods themselves be required, as in multiscale methods?
While multiscale methods are important for some applications, their significance for integration may be overrated, as was also concluded in the ICME report. Rather, the focus needs to be on a more detailed analysis of design workflows and their intersection with the information that can be determined well at the atomistic scale.
A design workflow typically includes a number of selection stages at which decisions are made regarding materials and processes. These decisions are informed by available data from a number of sources and should include atomistic modelling where appropriate. This type of approach has been reported, for example, by Massimo Noro from Unilever, who describes selection criteria as "emerging physico-chemical criteria we can evaluate in practice that help us select ingredients". Oreste Todini from Procter & Gamble also promotes the use of modelling in the decision process to come up with lead options for new formulations.
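As a purely illustrative sketch of such a staged workflow (the candidate names, properties and thresholds below are hypothetical and not taken from any of the cited work), each selection stage can be thought of as a filter over a candidate list, drawing on experimental data where it exists and on atomistic predictions where it does not:

```python
# Minimal sketch of a staged selection workflow with hypothetical criteria and data.
# Each stage filters the candidate list using whatever information is available at
# that point: experimental data where it exists, atomistic predictions where it does not.
candidates = [
    {"name": "A", "cost": 1.0, "predicted_modulus_GPa": 75, "measured_Tg_C": 110},
    {"name": "B", "cost": 3.5, "predicted_modulus_GPa": 90, "measured_Tg_C": None},
    {"name": "C", "cost": 0.8, "predicted_modulus_GPa": 40, "measured_Tg_C": 95},
]

stages = [
    ("affordable",      lambda c: c["cost"] <= 2.0),
    ("stiff enough",    lambda c: c["predicted_modulus_GPa"] >= 60),  # atomistic input
    ("Tg known & high", lambda c: (c["measured_Tg_C"] or 0) >= 100),  # experimental input
]

for label, criterion in stages:
    candidates = [c for c in candidates if criterion(c)]
    print(f"After '{label}': {[c['name'] for c in candidates]}")
# -> ['A', 'C'] after the cost filter, then ['A'], then ['A']
```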
So there is evidence of an integrated design approach from early adopters such as Boeing, Unilever and Procter & Gamble. To establish integration more widely, the engineering and science communities need to collaborate more closely, and the atomistic simulation community needs to improve the way in which best practices are established, shared and linked with engineering workflows. Informatics frameworks are being established, for example with the integration of Materials Studio into Accelrys' Pipeline Pilot platform, and through projects such as iCatDesign and MosGrid. However, integration into engineering rather than chemistry platforms may be what is required.