The role of scientific computing in chemical engineering: developing clean energy technologies

Chemical Engineering has contributed significantly to shaping our lives over the past century. Developing a chemical process at the lab scale and taking it to the industrial scale is at the heart of Chemical Engineering. Among the numerous chemical processes, converting crude oil into the chemicals and fuels that are the workhorse of modern life has perhaps been the discipline's most instrumental contribution. However, these feats have also contributed to climate change through excessive greenhouse gas emissions. There is an urgent need to develop green processes that produce chemicals and fuels in a sustainable and environment-friendly manner. Solving these grand challenges requires a paradigm shift in chemical engineering practice.

Historically, developing a new chemical process has been based on empirical design tools and substantial experimentation at different scales, ranging from a catalyst particle to the pilot plant. This process is lengthy (10-20 years) and expensive. In this context, the tremendous growth over the last two decades in high-performance computing (HPC), machine learning and artificial intelligence (ML/AI), numerical techniques, and user-friendly commercial and open-source software has opened a new paradigm. For example, the speed of the top supercomputer has been doubling roughly every fourteen months for the last two decades. These resources are becoming cheaper and more accessible. Harnessing the growing CPU and GPU power, combined with efficient algorithms, can be a game-changer in how new chemical processes are developed at the lab scale and scaled up to the industrial scale. More and more chemical companies recognize this and are adopting scientific computing as an integral part of developing new technologies.

However, the transition from empirical to scientific computing-based design strategies requires a level of reliability and robustness in computational models that is unavailable today. Computer models remain unreliable for complex processes because of the strong assumptions made in the underlying physical and chemical models. The chemical industry is risk-averse by nature and thus relies on traditional, proven formulae and empirical relations. These formulae work, but they cannot explore the large design space for catalyst and reactor optimization required to make clean energy technologies economical.

In this context, high-fidelity computational design tools are desirable for exploring a vast design space and guiding experiments toward optimal designs. Most emerging clean energy technologies, such as biomass and plastic valorization, CO2 capture and utilization, and hydrogen production, involve chemical kinetics, fluid flow, and heat and mass transfer. Mathematically, these phenomena are represented by coupled partial differential equations (PDEs), which cannot be solved analytically and are challenging to solve even numerically. Advances in HPC and numerical techniques now allow these PDEs to be solved for lab-scale systems, providing unprecedented insight into conditions that are difficult to probe experimentally. These insights can expedite the discovery of new materials and processes at the development stage, saving time and money.
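
As a concrete illustration of what solving such coupled PDEs numerically entails, the sketch below uses the standard method-of-lines approach: a one-dimensional reaction-diffusion problem for a species mass fraction and temperature is discretized in space and integrated in time with SciPy. All kinetic and transport parameters are illustrative placeholders, not values for any of the systems mentioned above.

```python
# A minimal method-of-lines sketch of coupled reaction-transport PDEs:
# species mass fraction Y and temperature T in a 1D slab with a single
# exothermic first-order reaction. All parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

N = 101                       # spatial grid points
L = 0.01                      # slab thickness, m
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

D = 1e-6                      # mass diffusivity, m^2/s (assumed)
alpha = 1e-6                  # thermal diffusivity, m^2/s (assumed)
A, Ea = 1e5, 8e4              # Arrhenius pre-factor (1/s), activation energy (J/mol)
R = 8.314                     # gas constant, J/(mol K)
dT_ad = 150.0                 # adiabatic temperature rise, K (assumed)

def rhs(t, u):
    """Semi-discretized PDEs; u stacks Y (first N entries) and T (last N)."""
    Y, T = u[:N], u[N:]
    r = A * np.exp(-Ea / (R * T)) * Y                      # reaction rate, 1/s
    dY, dT = np.zeros(N), np.zeros(N)
    # interior points: central differences for the diffusion terms
    dY[1:-1] = D * (Y[2:] - 2 * Y[1:-1] + Y[:-2]) / dx**2 - r[1:-1]
    dT[1:-1] = alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2 + dT_ad * r[1:-1]
    # x = 0: fixed (Dirichlet) composition and temperature; x = L: zero flux
    dY[-1] = 2 * D * (Y[-2] - Y[-1]) / dx**2 - r[-1]
    dT[-1] = 2 * alpha * (T[-2] - T[-1]) / dx**2 + dT_ad * r[-1]
    return np.concatenate([dY, dT])

u0 = np.concatenate([np.ones(N), np.full(N, 700.0)])       # Y = 1, T = 700 K
sol = solve_ivp(rhs, (0.0, 5.0), u0, method="BDF", rtol=1e-6)
print("Centerline conversion after 5 s:", 1.0 - sol.y[N // 2, -1])
```

Even this toy problem requires a stiff implicit integrator (BDF); real reactor simulations couple many species, turbulence, and multiphase transport, which is why HPC resources become essential.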

Computational tools that are ubiquitous in academia and the chemical industry include Density Functional Theory (DFT) at the atomic scale, Molecular Dynamics (MD) at the molecular scale, Computational Fluid Dynamics (CFD) at particle-to-reactor scales, and process simulation software like ASPEN PLUS for Integrated Process Modeling (IPM) at the plant scale. These tools are often used in silos, meaning there is no information flow from one model to another. Connecting them remains a significant challenge. Multiscale modeling approaches combined with ML/AI can help in this regard. A hierarchical approach is desirable, in which insights from a lower-level model feed into a higher-level model all the way from DFT to MD to CFD to IPM. However, achieving this goal requires extensive collaboration among researchers with different strengths and an appreciation of each other's domains.
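
To illustrate the kind of hierarchical information flow described above, the sketch below passes a hypothetical lower-scale result (an activation energy standing in for a DFT/MD output) into an Arrhenius rate used by a simple reactor-scale model, whose predicted conversion then feeds a plant-scale mass balance. Every number and model choice here is an assumption for illustration; real multiscale workflows involve far richer models at each level.

```python
# A minimal sketch (assumptions throughout) of hierarchical information flow:
# a lower-scale result parameterizes a reactor-scale model, whose predicted
# conversion feeds a plant-scale mass balance. Numbers are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314

# --- Atomistic scale: pretend a DFT calculation returned these kinetics ---
Ea = 9.0e4        # activation energy, J/mol (hypothetical DFT output)
A = 2.0e6         # pre-exponential factor, 1/s (hypothetical)

def rate_constant(T):
    """Arrhenius rate constant built from the lower-scale parameters."""
    return A * np.exp(-Ea / (R * T))

# --- Reactor scale: isothermal plug-flow reactor, first-order reaction ---
def pfr_conversion(T, tau):
    """Conversion in a PFR with residence time tau (s) at temperature T (K)."""
    sol = solve_ivp(lambda t, X: rate_constant(T) * (1.0 - X), (0.0, tau), [0.0])
    return sol.y[0, -1]

# --- Plant scale: simple mass balance using the reactor-scale conversion ---
feed = 1000.0                          # kmol/h of reactant fed (assumed)
X = pfr_conversion(T=750.0, tau=2.0)   # reactor operating point (assumed)
product = feed * X
recycle = feed * (1.0 - X)
print(f"Conversion = {X:.2f}, product = {product:.0f} kmol/h, recycle = {recycle:.0f} kmol/h")
```

The point of the sketch is the direction of the arrows: each level consumes parameters produced and validated at the level below it, rather than relying on independently tuned empirical correlations.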

One way to achieve this goal is to build models for processes at different scales and then integrate them. Each sub-model can be validated against experiments individually, reducing the uncertainty associated with the overall model. We recently demonstrated such an approach for the thermochemical conversion of biomass (TCB), a technique that converts lignocellulosic biomass into gaseous or liquid products. We developed a multiscale model for TCB by integrating experimentally validated models for the chemistry, the particle scale, and the reactor scale [1]. Such multiscale methodologies can be extended to other processes, such as coal gasification and CO2 capture and utilization.

At present, ML is making the greatest impact in DFT and MD, where it is used to develop high-throughput models that predict structure-property relationships. In the chemical industry, ML is bringing significant change in the form of digital twins, built primarily on plant data. CFD is used in the chemical industry primarily to design reactors. Currently, techniques to utilize ML in CFD, where transport processes (fluid, heat, and mass) are coupled with chemical kinetics, are limited. The primary reason is the system's complexity arising from the coupling between various phenomena. One approach to tackle this challenge is to build ML models for specific processes and then integrate them with CFD. In general, resolving small-scale processes, such as chemistry and sub-particle transport, is the most time-consuming step in CFD simulations. Thus, using ML to build models for small-scale phenomena could be an efficient strategy; we demonstrated such an approach in our recent work [2]. Existing ML algorithms can also help in postprocessing simulation data, quickly extracting useful information from it. For example, clustering algorithms such as DBSCAN can be used to analyze CFD simulations of reactors [3].
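
The sketch below illustrates the surrogate idea in its simplest form, using a generic feed-forward regressor rather than the recurrent network of [2]: an expensive per-cell chemistry integration over one CFD time step is replaced by a model trained on sampled local states. The kinetics, time step, and state ranges are placeholder assumptions, and the "detailed" chemistry is a single reaction, so the example is purely illustrative.

```python
# A minimal sketch (assumptions throughout) of replacing a repeated, expensive
# chemistry integration inside a CFD step with a trained regression model.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

R, A, Ea = 8.314, 1.0e8, 1.2e5   # placeholder kinetics
dt = 1e-3                        # CFD time step over which chemistry is advanced

def advance_chemistry(T, Y0):
    """'Detailed' sub-model: integrate dY/dt = -k(T) Y over one CFD step."""
    k = A * np.exp(-Ea / (R * T))
    sol = solve_ivp(lambda t, Y: -k * Y, (0.0, dt), [Y0], method="BDF")
    return sol.y[0, -1]

# 1. Generate training data by sampling the local states a CFD cell might see
rng = np.random.default_rng(0)
T_s = rng.uniform(800.0, 1500.0, 2000)
Y_s = rng.uniform(0.0, 1.0, 2000)
targets = np.array([advance_chemistry(T, Y) for T, Y in zip(T_s, Y_s)])

# 2. Train a small neural-network surrogate: (T, Y) -> Y after dt
X_train = np.column_stack([T_s / 1500.0, Y_s])          # crude input scaling
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X_train, targets)

# 3. In a CFD loop, query the surrogate instead of integrating the stiff ODE
T_cell, Y_cell = 1200.0, 0.6
Y_ml = surrogate.predict([[T_cell / 1500.0, Y_cell]])[0]
Y_exact = advance_chemistry(T_cell, Y_cell)
print(f"surrogate: {Y_ml:.4f}  direct integration: {Y_exact:.4f}")
```

The payoff comes from amortization: the training cost is paid once, while the surrogate is queried in every cell at every time step of the CFD simulation, where the stiff integration would otherwise dominate the run time.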

Ultimately, connecting the outputs of DFT, MD, and CFD to process simulation software will allow the full utilization of scientific computing in developing and deploying new technologies. For example, a new catalyst that provides excellent selectivity towards a desirable product in the lab may not be adequate in industry. Numerous factors determine the usability of a catalyst at the industrial scale, such as its cost, stability, and the ability to produce it at a large scale. Computer models that can quickly perform these calculations and inform the user about the viability of a new product or process would be invaluable. Such tools can be instrumental in directing research and funding toward practical solutions.

A significant challenge in making scientific computing an integral part of the chemical industry is training the workforce. Chemical engineering students need to be trained in the latest developments in computational science. The current curricula at institutes and universities can be supplemented with electives on computational science topics such as parallel computing, HPC, and ML/AI. Moreover, courses on industrial practices, such as reactor scale-up, can be co-taught by faculty and industry practitioners. The ability to combine core chemical engineering fundamentals with the latest computational science techniques will enable upcoming graduates to use scientific computing in chemical engineering practice.

References

1. BK Kumar, H Goyal, Impact of particle-scale models on CFD-DEM simulations of biomass pyrolysis, Reaction Chemistry & Engineering (2024) doi.org/10.1039/D4RE00086B

2. KG Sharma, NS Kaisare, H Goyal, A recurrent neural network model for biomass gasification chemistry, Reaction Chemistry & Engineering, 7, 570-579 (2022) doi.org/10.1039/D1RE00409C

3. BK Kumar, S Ganesh, H Goyal, Capturing mesoscale structures in computational fluid dynamics simulations of gas-solid flows, AIChE Journal, e18360 (2024) doi.org/10.1002/aic.18360

Author – Prof Himanshu Goyal, IIT Madras, Chennai