pyMAISE: Michigan Artificial Intelligence Standard Environment
pyMAISE is an artificial intelligence (AI) and machine learning (ML) benchmarking library for nuclear reactor applications. It streamlines the building, tuning, and comparison of ML models on user-provided data sets. pyMAISE also provides benchmark data sets, with benchmarks written as Jupyter Notebooks, for AI/ML comparison. Current ML algorithm support includes the following (a short scikit-learn sketch of the build/tune/compare workflow follows this list):
linear regression,
lasso regression,
ridge regression,
elastic net regression,
logistic regression,
decision tree regression and classification,
extra trees regression and classification,
AdaBoost regression and classification,
support vector regression and classification,
random forest regression and classification,
k-nearest neighbors regression and classification,
Gaussian process regression and classification,
gradient boosting regression and classification,
sequential neural networks.
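As a rough picture of the workflow pyMAISE automates for the classical models above, the sketch below performs the same build/tune/compare steps with plain scikit-learn: define a few candidate regressors, grid-search their hyperparameters, and rank them on held-out data. It is illustrative only; the synthetic data set, candidate models, and grids are assumptions, and none of this is the pyMAISE API.

# Illustrative sketch (not the pyMAISE API): comparing a few of the
# classical regressors listed above with scikit-learn, the kind of
# build/tune/compare loop pyMAISE automates for a user-provided data set.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a user-provided data set
X, y = make_regression(n_samples=500, n_features=8, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Candidate models and small hyperparameter grids (assumed for illustration)
candidates = {
    "linear": (LinearRegression(), {}),
    "lasso": (Lasso(), {"alpha": [0.01, 0.1, 1.0]}),
    "ridge": (Ridge(), {"alpha": [0.01, 0.1, 1.0]}),
    "random forest": (RandomForestRegressor(random_state=42), {"n_estimators": [50, 100]}),
}

for name, (model, grid) in candidates.items():
    search = GridSearchCV(model, grid, cv=5)          # tune on the training split
    search.fit(X_train, y_train)
    score = r2_score(y_test, search.predict(X_test))  # compare on held-out data
    print(f"{name:>14s}: test R^2 = {score:.3f}, best params = {search.best_params_}")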
pyMAISE now includes support for ensemble methods, which combine or extend the classical models listed above. Currently, it offers the following options (see the scikit-learn sketch after this list):
stacking,
multi-output.
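Because the classical models are built on scikit-learn, these two ensemble options map naturally onto scikit-learn's wrapper estimators. The sketch below is a minimal illustration using StackingRegressor and MultiOutputRegressor directly; it is not the pyMAISE interface, and the base learners and synthetic data are arbitrary assumptions.

# Minimal scikit-learn sketch of the two ensemble options above
# (stacking and multi-output); illustrative only, not the pyMAISE API.
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.multioutput import MultiOutputRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

# Stacking: base learners feed their predictions into a final meta-learner
stack = StackingRegressor(
    estimators=[("knn", KNeighborsRegressor()), ("tree", DecisionTreeRegressor())],
    final_estimator=Ridge(),
)

# Multi-output: wrap a single-target model so it fits one copy per target
multi = MultiOutputRegressor(SVR())

X, y = make_regression(n_samples=200, n_features=6, n_targets=3, random_state=0)
multi.fit(X, y)
stack.fit(X, y[:, 0])              # stacking shown on a single target here
print(multi.predict(X[:2]).shape)  # (2, 3): one prediction per target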
These models are built using scikit-learn and Keras [C+15, PVG+11]. pyMAISE supports the following neural network layers (an illustrative Keras model follows this list):
dense,
dropout,
LSTM,
GRU,
1D, 2D, and 3D convolutional,
1D, 2D, and 3D max pooling,
flatten,
and reshape.
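For sequential neural networks, the layers above correspond to standard Keras layers. The sketch below strings several of them together in a small keras.Sequential model for an assumed time-series input of 10 steps and 4 features; the architecture and hyperparameters are arbitrary examples, not a pyMAISE-generated model.

# Illustrative Keras sketch (not the pyMAISE API) combining several of the
# supported layer types in a sequential model; the shapes and sizes are
# assumptions chosen only to make the example self-contained.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(10, 4)),                             # 10 time steps, 4 features
    layers.Conv1D(16, kernel_size=3, activation="relu"),    # 1D convolution
    layers.MaxPooling1D(pool_size=2),                       # 1D max pooling
    layers.LSTM(32),                                        # recurrent layer
    layers.Dropout(0.2),                                    # regularization
    layers.Dense(16, activation="relu"),                    # dense hidden layer
    layers.Dense(1),                                        # single regression output
])
model.compile(optimizer="adam", loss="mse")
model.summary()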
To request support for additional neural network layers, open an issue on the pyMAISE repository. Refer to the sections below for more information, including installation, examples, and use, and see the Benchmark Jupyter Notebooks for examples of pyMAISE functionality.
Data References
Jean-Marie Le Corre, Gregory Delipei, and Xingang Zhao. Benchmark on artificial intelligence and machine learning for scientific computing in nuclear engineering. NEA Working Papers, 2024. URL: https://www.oecd-nea.org/jcms/pl_89619/benchmark-on-artificial-intelligence-and-machine-learning-for-scientific-computing-in-nuclear-engineering-phase-1-critical-heat-flux-exercise-specifications?details=true.
Dean Price, Majdi I. Radaideh, and Brendan Kochunas. Multiobjective optimization of nuclear microreactor reactivity control system operation with swarm and evolutionary algorithms. Nuclear Engineering and Design, 393:111776, 2022. URL: https://www.sciencedirect.com/science/article/pii/S0029549322001303, doi:https://doi.org/10.1016/j.nucengdes.2022.111776.
Majdi I Radaideh, Chris Pappas, Mark Wezensky, Pradeep Ramuhalli, and Sarah Cousineau. Early fault detection in particle accelerator power electronics using ensemble learning. International Journal of Prognostics and Health Management, 14(1):1–19, 2023. doi:https://doi.org/10.36001/ijphm.2023.v14i1.3419.
Majdi I. Radaideh, Katelin Du, Paul Seurin, Devin Seyler, Xubo Gu, Haijia Wang, and Koroush Shirvan. Neorl: neuroevolution optimization with reinforcement learning—applications to carbon-free energy systems. Nuclear Engineering and Design, 412:112423, 2023. URL: https://www.sciencedirect.com/science/article/pii/S0029549323002728, doi:https://doi.org/10.1016/j.nucengdes.2023.112423.
Majdi I. Radaideh, Benoit Forget, and Koroush Shirvan. Large-scale design optimisation of boiling water reactor bundles with neuroevolution. Annals of Nuclear Energy, 160:108355, 2021. URL: https://www.sciencedirect.com/science/article/pii/S0306454921002310, doi:https://doi.org/10.1016/j.anucene.2021.108355.
Majdi I. Radaideh and Tomasz Kozlowski. Surrogate modeling of advanced computer simulations using deep gaussian processes. Reliability Engineering & System Safety, 195:106731, 2020. URL: https://www.sciencedirect.com/science/article/pii/S0951832019301711, doi:https://doi.org/10.1016/j.ress.2019.106731.
Majdi I. Radaideh, Chris Pappas, Jared Walden, Dan Lu, Lasitha Vidyaratne, Thomas Britton, Kishansingh Rajput, Malachi Schram, and Sarah Cousineau. Time series anomaly detection in power electronics signals with recurrent and convlstm autoencoders. Digital Signal Processing, 130:103704, 2022. doi:https://doi.org/10.1016/j.dsp.2022.103704.
Majdi I. Radaideh, Connor Pigg, Tomasz Kozlowski, Yujia Deng, and Annie Qu. Neural-based time series forecasting of loss of coolant accidents in nuclear power plants. Expert Systems with Applications, 160:113699, 2020. URL: https://www.sciencedirect.com/science/article/pii/S0957417420305236, doi:https://doi.org/10.1016/j.eswa.2020.113699.
Majdi I. Radaideh, Stuti Surani, Daniel O’Grady, and Tomasz Kozlowski. Shapley effect application for variance-based sensitivity analysis of the few-group cross-sections. Annals of Nuclear Energy, 129:264–279, 2019. URL: https://www.sciencedirect.com/science/article/pii/S0306454919300714, doi:https://doi.org/10.1016/j.anucene.2019.02.002.
Software References
Scikit-optimize. https://scikit-optimize.github.io/stable/index.html, 2016.
Martín Abadi, Ashish Agarwal, Paul Barham, Eugene Brevdo, Zhifeng Chen, Craig Citro, Greg S. Corrado, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, Michael Isard, Yangqing Jia, Rafal Jozefowicz, Lukasz Kaiser, Manjunath Kudlur, Josh Levenberg, Dandelion Mané, Rajat Monga, Sherry Moore, Derek Murray, Chris Olah, Mike Schuster, Jonathon Shlens, Benoit Steiner, Ilya Sutskever, Kunal Talwar, Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viégas, Oriol Vinyals, Pete Warden, Martin Wattenberg, Martin Wicke, Yuan Yu, and Xiaoqiang Zheng. TensorFlow: large-scale machine learning on heterogeneous systems. 2015. Software available from tensorflow.org. URL: https://www.tensorflow.org/.
François Chollet and others. Keras. https://keras.io, 2015.
Charles R. Harris, K. Jarrod Millman, Stéfan J. van der Walt, Ralf Gommers, Pauli Virtanen, David Cournapeau, Eric Wieser, Julian Taylor, Sebastian Berg, Nathaniel J. Smith, Robert Kern, Matti Picus, Stephan Hoyer, Marten H. van Kerkwijk, Matthew Brett, Allan Haldane, Jaime Fernández del Río, Mark Wiebe, Pearu Peterson, Pierre Gérard-Marchant, Kevin Sheppard, Tyler Reddy, Warren Weckesser, Hameer Abbasi, Christoph Gohlke, and Travis E. Oliphant. Array programming with NumPy. Nature, 585(7825):357–362, September 2020. URL: https://doi.org/10.1038/s41586-020-2649-2, doi:10.1038/s41586-020-2649-2.
J. D. Hunter. Matplotlib: a 2d graphics environment. Computing in Science & Engineering, 9(3):90–95, 2007. doi:10.1109/MCSE.2007.55.
Wes McKinney and others. Data structures for statistical computing in python. In Proceedings of the 9th Python in Science Conference, volume 445, 51–56. Austin, TX, 2010.
Tom O'Malley, Elie Bursztein, James Long, François Chollet, Haifeng Jin, Luca Invernizzi, and others. Kerastuner. https://github.com/keras-team/keras-tuner, 2019.
F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. Scikit-learn: machine learning in Python. Journal of Machine Learning Research, 12:2825–2830, 2011.