10:30am - 10:50am
AutoJSA: A Knowledge-Enhanced Large Language Model Framework for Improving Job Safety Analysis
Shuo Xu, Jinsong Zhao
Tsinghua University, People's Republic of China
Job Safety Analysis (JSA) is a core tool for identifying potential workplace hazards, assessing risks, and proposing control measures, and it is widely used across industries. However, traditional JSA methods face many challenges: they are lengthy and time-consuming, lack unified risk-assessment standards, and struggle to identify hidden risks arising from the surrounding environment. In addition, the reliance on manual effort makes the analysis process inefficient, especially in dynamic work environments that require frequent updates. These problems limit the effectiveness of JSA in actual operations and make it difficult to fully realize its potential as a safety-assurance function.
In recent years, the rapid development of natural language processing (NLP) technology, especially the rise of large language models (LLMs), has provided new ideas for solving these problems. LLMs can process and understand large amounts of complex text data, and with their powerful language-generation and knowledge-integration capabilities they have gradually been applied across fields to optimize analysis processes. Despite this, the application of LLMs in job safety analysis is still in its infancy, and research on automated hazard identification and risk assessment in particular remains limited.
To address the limitations of existing JSA methods, this paper proposes AutoJSA, a framework that uses a knowledge-enhanced LLM to support and improve the entire JSA process. AutoJSA employs domain-knowledge enhancement, extracting workplace safety insights from a wealth of JSA reports and industry safety documents. This extracted knowledge is then integrated into the LLM, significantly improving its ability to comprehend hazardous situations in specific industry contexts. AutoJSA can generate detailed job analysis reports according to analyst requirements, including job-step division, hazard-factor identification and evaluation, and suggested control measures.
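The knowledge-enhancement step described above can be sketched as a simple retrieval-augmented prompting loop. The function names, the word-overlap retrieval heuristic, and the example knowledge base below are all illustrative assumptions, not the AutoJSA implementation:

```python
# Hypothetical sketch of knowledge-enhanced JSA prompting: retrieve
# relevant safety knowledge for a job description and prepend it to the
# LLM prompt. The retrieval heuristic (word overlap) stands in for a
# proper embedding-based retriever.

def retrieve_knowledge(job_description, knowledge_base, top_k=2):
    """Rank knowledge snippets by simple word overlap with the job text."""
    job_words = set(job_description.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda snippet: len(job_words & set(snippet.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_jsa_prompt(job_description, knowledge_base):
    """Assemble a prompt asking the LLM for steps, hazards, and controls."""
    context = "\n".join(retrieve_knowledge(job_description, knowledge_base))
    return (
        f"Relevant safety knowledge:\n{context}\n\n"
        f"Job: {job_description}\n"
        "Divide the job into steps, identify and evaluate hazard factors, "
        "and suggest control measures."
    )

knowledge_base = [
    "Welding near flammable storage requires a hot-work permit.",
    "Confined space entry requires atmospheric testing before entry.",
    "Work at height above 2 m requires fall protection.",
]
prompt = build_jsa_prompt(
    "welding repair near a flammable solvent storage area", knowledge_base)
```

In a full system the returned prompt would be sent to the LLM; here it simply demonstrates how domain knowledge is injected before generation.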
Experimental results show that AutoJSA can significantly shorten JSA analysis time, reduce human error, and perform well in identifying potential risks and proposing effective control measures. Compared with traditional manual JSA, AutoJSA not only improves efficiency but also, through its knowledge-enhancement capabilities, more comprehensively identifies potential risks in complex tasks and accounts for interference from surrounding operations. Especially in highly dynamic, multi-task work environments, AutoJSA ensures the timeliness and effectiveness of safety measures by updating analysis results promptly and providing real-time suggestions.
Although there are still challenges in the scale and quality of the dataset, AutoJSA demonstrates its great potential in job safety analysis. This paper argues that by leveraging the latest advances in NLP, especially LLM, AutoJSA is expected to promote the automation and intelligence of workplace safety management while improving the efficiency and accuracy of JSA.
10:50am - 11:10am
Phenomena-Based Graph Representations and Applications to Chemical Process Simulation
Yoel Rene Cortes-Pena, Victor M. Zavala
University of Wisconsin-Madison, United States of America
Rapid and robust simulation of a chemical production process is critical to address core scientific questions related to process design, optimization, and sustainability. Efficiently solving a chemical process, however, remains a challenge due to its highly coupled and nonlinear nature. While many algorithmic paradigms for process simulation exist (e.g., sequential modular simulation [1], equation-based simulation [2], pseudo-transient methods [3]), only a limited set of approaches exploit the topology/connectivity of the process at the phenomenological (physical) level [4].
Graph-theoretic representations of process equations and variables can help better understand the topology of a chemical process at the phenomenological level and potentially uncover more efficient decompositions for flowsheet simulation. Decoupling of equations based on phenomena (e.g., mass, energy, thermodynamic, kinetic) is widely used to solve unit operations such as multistage equilibrium columns [5]. Through graph-theoretic representations, it may be possible to generalize and expand phenomena-based decomposition approaches used at the unit operation level towards the complete flowsheet.
In this work, we developed a general graph abstraction of the underlying physical phenomena within unit operations. The abstraction consists of a graph/network of interconnected variable/equation nodes that are systematically generated through PhenomeNode, an open-source Python library that we developed. Applying the graph representation to an industrial separation process for purifying glacial acetic acid, we show how partitioning the graph into separate mass, energy, and phenomena subgraphs can help decouple nonlinearities, and we develop a phenomena-based simulation algorithm. We implemented this new algorithm in BioSTEAM [6], an open-source process simulation platform in Python, and demonstrate that phenomena-based decomposition enables more efficient simulation of large, highly coupled systems than sequential modular simulation. Using an industrial process for dewatering acetic acid as a representative case of a large and highly coupled system, the new algorithm reduces computation time by 76.6%. These results suggest that phenomena-based decomposition of the flowsheet may open new avenues for more rapid and robust simulation.
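As a minimal illustration of this idea (using plain Python dictionaries rather than the actual PhenomeNode API, with a hypothetical flash unit as the example), equations can be written as nodes tagged by phenomenon, partitioned into phenomena subgraphs, and the variables shared between subgraphs identified as the coupling (tear) variables of the decomposition:

```python
# Illustrative sketch, not the PhenomeNode API: a flash unit written as a
# bipartite variable/equation graph, with each equation node tagged by
# the phenomenon it belongs to.
from collections import Counter

equations = {
    "mass_balance":   {"vars": ("F", "V", "L"),           "phenomenon": "mass"},
    "energy_balance": {"vars": ("F", "V", "L", "T", "Q"), "phenomenon": "energy"},
    "equilibrium":    {"vars": ("T", "P", "x", "y"),      "phenomenon": "thermodynamic"},
}

def partition_by_phenomenon(equations):
    """Group equation nodes, and the variables they touch, by phenomenon."""
    subgraphs = {}
    for eq, data in equations.items():
        sub = subgraphs.setdefault(data["phenomenon"],
                                   {"equations": [], "variables": set()})
        sub["equations"].append(eq)
        sub["variables"].update(data["vars"])
    return subgraphs

subgraphs = partition_by_phenomenon(equations)

# Variables appearing in more than one subgraph couple the phenomena;
# they play the role of tear variables in a phenomena-based decomposition.
counts = Counter(v for sub in subgraphs.values() for v in sub["variables"])
coupling = {v for v, c in counts.items() if c > 1}
```

Solving each subgraph separately and iterating on the coupling variables is the essence of a phenomena-based decomposition at the flowsheet level.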
References
(1) Motard, R. L.; Shacham, M.; Rosen, E. M. Steady State Chemical Process Simulation. AIChE Journal 1975, 21 (3), 417–436. https://doi.org/10.1002/aic.690210302.
(2) Bogle, I. D. L.; Perkins, J. D. Sparse Newton-like Methods in Equation Oriented Flowsheeting. Computers & Chemical Engineering 1988, 12 (8), 791–805. https://doi.org/10.1016/0098-1354(88)80018-8.
(3) Tsay, C.; Baldea, M. Fast and Efficient Chemical Process Flowsheet Simulation by Pseudo-Transient Continuation on Inertial Manifolds. Computer Methods in Applied Mechanics and Engineering 2019, 348, 935–953. https://doi.org/10.1016/j.cma.2019.01.025.
(4) Ishii, Y.; Otto, F. D. Novel and Fundamental Strategies for Equation-Oriented Process Flowsheeting. Computers & Chemical Engineering 2008, 32 (8), 1842–1860. https://doi.org/10.1016/j.compchemeng.2007.10.004.
(5) Monroy-Loperena, R. Simulation of Multicomponent Multistage Vapor−Liquid Separations. An Improved Algorithm Using the Wang−Henke Tridiagonal Matrix Method. Ind. Eng. Chem. Res. 2003, 42 (1), 175–182. https://doi.org/10.1021/ie0108898.
(6) Cortes-Peña, Y.; Kumar, D.; Singh, V.; Guest, J. S. BioSTEAM: A Fast and Flexible Platform for the Design, Simulation, and Techno-Economic Analysis of Biorefineries under Uncertainty. ACS Sustainable Chem. Eng. 2020, 8 (8), 3302–3310. https://doi.org/10.1021/acssuschemeng.9b07040.
11:10am - 11:30am
Systematic comparison between Graph Neural Networks and UNIFAC-IL for solvent pre-selection in liquid-liquid extraction
Edgar Ivan Sanchez Medina2, Ann-Joelle Minor1, Kai Sundmacher1,2
1Max Planck Institute for Dynamics of Complex Technical Systems, Germany; 2Otto von Guericke University Magdeburg, Germany
Solvent selection is a complex decision that involves balancing multiple factors, including economic, environmental, and societal considerations. As industries strive for sustainability, choosing the right solvent has become increasingly important. However, the vast chemical space makes evaluating all possible solvents impractical. To address this, pre-selection strategies are essential, narrowing down the search to the most promising solvent candidates.
Predictive thermodynamic models are commonly used for solvent pre-selection, with the UNIFAC model being one of the most popular. Recently, advancements in deep learning and computational power have led to the development of new models, such as the Gibbs-Helmholtz Graph Neural Network (GH-GNN). This model has demonstrated higher accuracy in predicting infinite dilution activity coefficients over a larger chemical space than UNIFAC [1]. Moreover, GH-GNN has been applied to solvent pre-selection for extractive distillation [2]. However, a systematic comparison of the pre-selection methods has not yet been conducted.
In this work, we present a systematic comparison between solvent pre-selection using GH-GNN and UNIFAC-IL, an extension of the UNIFAC model for ionic liquids. The case study focuses on pre-selecting solvents for a liquid-liquid extraction process involving the ionic liquid 1-ethyl-3-methylimidazolium tetrafluoroborate ([EMIM][BF4]) and caprolactam. Recent research has identified ionic liquids as promising candidates for the solvolytic depolymerization of polyamide 6 into caprolactam [3]. In this process, the resulting mixture, which primarily consists of caprolactam and the ionic liquid, must be separated into pure components to enable the recycling of the ionic liquid and the reuse of caprolactam in a circular economy.
Our results show that, despite differences in solvent rankings across methods, there is a significant correlation in the overall pre-selection. This suggests that deep learning-based models like GH-GNN can be viable alternatives for solvent pre-selection across a broader chemical space compared to traditional group contribution methods. Additionally, we show that chemical similarity metrics, such as Tanimoto similarity of molecular fingerprints, can be used to assess confidence in the proposed solvent rankings. This allows model users to decide the level of risk they are willing to tolerate regarding predictions over a vast chemical space.
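The two quantities discussed above can be illustrated with a short sketch on made-up data: a tie-free Spearman rank correlation between two solvent rankings, and the Tanimoto similarity of binary fingerprints represented as plain sets of on-bits (in practice these would be, e.g., RDKit Morgan fingerprints; the solvent names and bit sets below are invented):

```python
# Sketch of ranking agreement and fingerprint similarity on toy data.

def spearman(rank_a, rank_b):
    """Spearman correlation for two tie-free rankings of the same items."""
    n = len(rank_a)
    pos_b = {item: i for i, item in enumerate(rank_b)}
    d2 = sum((i - pos_b[item]) ** 2 for i, item in enumerate(rank_a))
    return 1 - 6 * d2 / (n * (n**2 - 1))

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity of two fingerprints given as sets of on-bits."""
    return len(fp_a & fp_b) / len(fp_a | fp_b)

gh_gnn_rank = ["solvent_A", "solvent_B", "solvent_C", "solvent_D"]
unifac_rank = ["solvent_B", "solvent_A", "solvent_C", "solvent_D"]
rho = spearman(gh_gnn_rank, unifac_rank)    # two swapped items -> 0.8
sim = tanimoto({1, 3, 5, 8}, {1, 3, 7, 8})  # 3 shared of 5 total -> 0.6
```

A high rank correlation supports the interchangeability of the two pre-selection methods, while low fingerprint similarity to the training chemistry flags rankings that deserve less confidence.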
[1] Sanchez Medina, E.I., Linke, S., Stoll, M. and Sundmacher, K., 2023. Gibbs–Helmholtz graph neural network: capturing the temperature dependency of activity coefficients at infinite dilution. Digital Discovery, 2(3), pp.781-798.
[2] Sanchez Medina, E.I. and Sundmacher, K., 2023. Solvent pre-selection for extractive distillation using Gibbs-Helmholtz Graph Neural Networks. In Computer Aided Chemical Engineering (Vol. 52, pp. 2037-2042). Elsevier.
[3] Kamimura, A., Shiramatsu, Y. and Kawamoto, T., 2019. Depolymerization of polyamide 6 in hydrophilic ionic liquids. Green Energy & Environment, 4(2), pp.166-170.
11:30am - 11:50am
Optimizing Individual-based Modelling: A Grid-based Approach to Computationally Efficient Microbial Simulations
Ihab Hashem, Jian Wang, Jan Van Impe
KU Leuven, Belgium
Individual-based Modeling (IbM) is a powerful approach for simulating microbial populations, with applications in biochemical engineering such as wastewater treatment, bioreactor design, and biofilm formation. This bottom-up approach explicitly models individual cell behaviors, allowing population dynamics to emerge from cellular interactions. Despite its strengths, IbM is often limited by high computational demands, especially when scaling to larger populations. The main bottleneck lies in resolving spatial overlaps between cells, which is usually managed through pairwise comparisons using data structures like kd-trees that partition the space recursively. While kd-trees reduce the complexity of neighbor searches to O(N log N), they become less efficient as population sizes increase [1]. To overcome this limitation, we developed the Discretized Overlap Resolution Algorithm (DORA), a grid-based method that transforms overlap resolution into a more efficient O(N) operation. Rather than tracking individual cell positions and performing neighbor searches, DORA discretizes the simulation space into grid units, resolves overlaps using a diffusion-like process, and translates the results back into movement vectors for the cells. This approach yields substantial improvements in computational efficiency for large-scale colony and biofilm simulations.
DORA was implemented within MICRODIMS, an in-house IbM platform developed for simulating microbial growth [2]. MICRODIMS is built on the Repast Simphony toolkit and allows for the simulation of microbial dynamics, nutrient diffusion, and biofilm formation with detailed control over spatial and temporal resolution. Integrating DORA into MICRODIMS significantly enhanced its ability to handle large-scale growth simulations in a computationally feasible time.
We extended DORA by incorporating adaptive grid scaling, which adjusts grid resolution based on local cell density to optimize computational resources and speed up simulations in less dense areas. We also introduced stochastic movement components prior to the overlap resolution step, enhancing the realism of simulations by capturing inherent biological variability, such as microbial motility and environmental fluctuations. Our results showed that these optimizations significantly improved DORA’s performance and enabled more computationally efficient simulations without compromising accuracy. The incorporation of stochasticity also provides flexibility, allowing the model to better reflect the natural variability seen in biological systems, thereby offering a more accurate representation of microbial behavior under diverse conditions.
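A minimal sketch of a grid-based overlap-resolution pass in the spirit of DORA (not the MICRODIMS implementation; the grid size, capacity, step length, and jitter magnitude are illustrative values) might look as follows: cells are binned into grid units in a single O(N) pass, and cells in overcrowded units are nudged toward the least-occupied neighbouring unit, with a small random jitter standing in for the stochastic movement component described above:

```python
import random
from collections import defaultdict

# Illustrative parameters, not values from the paper.
GRID, CAPACITY, STEP, JITTER = 1.0, 1, 0.5, 0.05

def resolve_overlaps(cells, rng):
    """One diffusion-like relaxation pass over a list of (x, y) cells."""
    # Bin every cell into its grid unit: a single O(N) pass.
    occupancy = defaultdict(int)
    for x, y in cells:
        occupancy[(int(x // GRID), int(y // GRID))] += 1
    moved = []
    for x, y in cells:
        gx, gy = int(x // GRID), int(y // GRID)
        if occupancy[(gx, gy)] > CAPACITY:
            # Nudge toward the least-occupied of the four neighbouring
            # units, plus a small stochastic jitter.
            tx, ty = min(
                [(gx + 1, gy), (gx - 1, gy), (gx, gy + 1), (gx, gy - 1)],
                key=lambda g: occupancy[g],
            )
            x += STEP * (tx - gx) + rng.uniform(-JITTER, JITTER)
            y += STEP * (ty - gy) + rng.uniform(-JITTER, JITTER)
        moved.append((x, y))
    return moved

rng = random.Random(0)
cells = [(0.4, 0.4), (0.6, 0.6), (3.5, 3.5)]  # first two share a grid unit
cells = resolve_overlaps(cells, rng)          # crowded cells are displaced
```

Because both passes are simple linear scans over the cells, the cost grows as O(N) rather than the O(N log N) of tree-based neighbor searches.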
References
[1] Hellweger, F.L. and Bucci, V., 2009. A bunch of tiny individuals—Individual-based modeling for microbes. Ecological Modelling, 220(1), pp.8-22.
[2] Tack, I.L., Nimmegeers, P., Akkermans, S., Hashem, I. and Van Impe, J.F., 2017. Simulation of Escherichia coli dynamics in biofilms and submerged colonies with an individual-based model including metabolic network information. Frontiers in Microbiology, 8, p.2509.
11:50am - 12:10pm
A Framework Utilizing a Seamless Integration of Python with AspenPlus® for a Multi-Criteria Process Evaluation
Simon Maier, Julia Weyand, Ginif Kaur, Oliver Erdmann, Ralph-Uwe Dietrich
German Aerospace Center (DLR), Germany
Detailed assessment of fuel production processes at an early project stage is crucial to identify potential technical challenges, optimize efficiency, and minimize costs and environmental impact. While process simulations are often either very rigid and accurate or very flexible and imprecise, informed decision making can only be maintained by establishing a detailed process model as early as possible within the project lifecycle while keeping relevant aspects of the process sufficiently flexible.
In this work, we present the development of a framework based on a dynamic interface between AspenPlus® process simulations and Python, enabling enhanced flexibility and automation for process modeling and optimization. This integration combines the powerful simulation capabilities of AspenPlus® with the versatility of Python for data analysis and optimization, delivering significant improvements in workflow efficiency and process control. Through the dynamic exchange of simulation data with Python, extensive parameter studies can be conducted and post-processed for techno-economic and environmental analyses. Furthermore, the interface allows the implementation of complex kinetic models or optimization routines for single process units. An additional extension for heat integration ensures the technical viability of the process route, enabling reliable comparisons of different routes and process configurations.
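The parameter-study workflow can be sketched as follows. The AspenPlus® connection is stubbed out with a placeholder model: in a real deployment, a function like `run_simulation` would drive the simulator through its COM automation interface (e.g., via pywin32) and read results back from the flowsheet tree; the function names, parameter ranges, and mock yield expression are all illustrative assumptions:

```python
# Hedged sketch of an automated parameter study around a process
# simulator. The simulator call is replaced by a mock model so the loop
# structure is self-contained and runnable.
import itertools

def run_simulation(temperature_K, pressure_bar):
    """Placeholder for an AspenPlus® run; returns a mock methanol yield."""
    # A real implementation would set flowsheet variables and re-run the
    # simulation via the COM interface, then read the result back.
    return 0.25 + 0.001 * (temperature_K - 500) - 0.002 * (pressure_bar - 60)

def parameter_study(temperatures, pressures):
    """Sweep all combinations and collect results for post-processing."""
    results = []
    for T, p in itertools.product(temperatures, pressures):
        results.append({"T_K": T, "p_bar": p, "yield": run_simulation(T, p)})
    return results

results = parameter_study(temperatures=[480, 500, 520],
                          pressures=[50, 60, 70])
best = max(results, key=lambda r: r["yield"])
```

The collected result records can then feed directly into techno-economic and environmental post-processing, which is the main benefit of scripting the simulator from Python.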
The functionalities are applied to a biomass- and power-based methanol production process covering various process designs and operating conditions. To maintain a high level of detail, additional Python scripts ensure proper scaling of process units such as the methanol synthesis reactor system. The process configurations are assessed technically, economically, and environmentally.
12:10pm - 12:30pm
Global Robust Optimisation for Non-Convex Quadratic Programs: Application to Pooling Problems
Asimina Marousi, Vassilis Charitopoulos
The Sargent Centre for Process Systems Engineering, Department of Chemical Engineering, University College London, United Kingdom
Robust optimization is widely used to identify worst-case scenarios, ensuring constraint satisfaction under uncertainty or when statistical data is unavailable, as an alternative to scenario-based approaches [1]. The most prevalent solution algorithms for convex problems are robust reformulation and robust cutting-planes, both of which can be extended to non-convex problems, though they lack guarantees of polynomial-time convergence [2]. Cutting-planes involve sequentially solving an upper-level deterministic problem and lower-level problems to find uncertainty values that cause the maximum constraint violation. For those values, corresponding cuts are added to the upper-level problem. Implementations of cutting-planes can be found in solvers like PyROS [3] and the ROmodel package [4] in Python, which rely on local or global solvers to handle intermediate problems. However, traditional robust cutting-planes are heavily influenced by the performance of the chosen solver, and if the solver fails to converge during a cut-round, the entire algorithm may not converge [3].
With the growing presence of nonlinear functions in chemical engineering problems, especially when data-driven methods are employed, there is a need for systematic techniques that handle non-convex problems [5]. This study proposes a novel spatial Branch-and-Bound algorithm integrated with robust cutting-planes (RsBB) for solving non-convex robust optimization problems. The coupling of global and robust optimization algorithms has been explored in the process systems engineering literature [6,7]. In this work, the proposed algorithm is implemented to solve benchmark pooling problems with uncertain feed quality, using McCormick relaxations. In the RsBB algorithm, robust infeasibility checks are performed at each node of the branch-and-bound tree, and infeasibility cuts are added to both the original and the relaxed problem. The branching tree is separated into families according to the number of cuts at each node, and a depth-first search is used that favours nodes within the same family. A comparison is performed between RsBB and state-of-the-art software in terms of computational efficiency and solution robustness.
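The cutting-plane scheme described above can be illustrated on a deliberately tiny robust problem (a toy, not the RsBB algorithm or the PyROS/ROmodel implementations): maximise x subject to (1 + u)x <= 10 for all |u| <= 0.2. Each round solves the master problem over the cuts collected so far, finds the worst-case uncertainty realisation, and adds it as a new cut:

```python
# Toy robust cutting-plane loop on a one-variable problem:
#   maximise x  subject to  (1 + u) * x <= 10  for all |u| <= 0.2.

def solve_master(cuts):
    """max x s.t. (1 + u_k) * x <= 10 for every recorded cut u_k."""
    return min((10.0 / (1.0 + u) for u in cuts), default=10.0)

def worst_case_u(x, u_max=0.2):
    """Uncertainty realisation maximising the constraint violation."""
    return u_max if x >= 0 else -u_max

cuts, tol = [0.0], 1e-9            # start from the nominal constraint
while True:
    x = solve_master(cuts)         # upper-level deterministic problem
    u = worst_case_u(x)            # lower-level violation-maximisation
    violation = (1.0 + u) * x - 10.0
    if violation <= tol:
        break                      # robustly feasible: done
    cuts.append(u)                 # add the violated scenario as a cut

# The loop converges to the worst-case solution x = 10 / 1.2.
```

In real problems the master and lower-level problems are themselves non-convex optimisations, which is precisely why their solver-dependent convergence motivates embedding the cuts inside a spatial branch-and-bound tree.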
[1] Dias, L. S., & Ierapetritou, M. G. (2016). Integration of scheduling and control under uncertainties: Review and challenges. Chemical Engineering Research and Design, 116, 98–113.
[2] Wiebe, J., Cecílio, I., & Misener, R. (2019). Robust Optimization for the Pooling Problem. Industrial and Engineering Chemistry Research, 58(28), 12712–12722.
[3] Isenberg, N. M., Akula, P., Eslick, J. C., Bhattacharyya, D., Miller, D. C., & Gounaris, C. E. (2021). A generalized cutting-set approach for nonlinear robust optimization in process systems engineering. AIChE Journal, 67(5), e17175.
[4] Wiebe, J., & Misener, R. (2022). ROmodel: modeling robust optimization problems in Pyomo. Optimization and Engineering, 23(4), 1873–1894.
[5] Schweidtmann, A. M., Bongartz, D., Grothe, D., Kerkenhoff, T., Lin, X., Najman, J., & Mitsos, A. (2021). Deterministic global optimization with Gaussian processes embedded. Mathematical Programming Computation, 13(3), 553–581.
[6] Li, J., Misener, R., & Floudas, C. A. (2011). Scheduling of Crude Oil Operations Under Demand Uncertainty: A Robust Optimization Framework Coupled with Global Optimization. AIChE Journal, 58, 2373–2396.
[7] Zhang, L., Yuan, Z., & Chen, B. (2021). Refinery-wide planning operations under uncertainty via robust optimization approach coupled with global optimization. Computers & Chemical Engineering, 146, 107205.