
Parallel Prefetching for Canonical Ensemble Monte Carlo Simulations



Harold Wickes Hatch


To enable large-scale molecular simulations, algorithms must efficiently utilize multi-core processors, whose total core counts continue to increase over time while clock speeds remain relatively stagnant. Although parallelized molecular dynamics (MD) software has taken advantage of this trend in computer hardware, single-particle perturbations in Monte Carlo (MC) are more difficult to parallelize than the system-wide updates in MD, which use domain decomposition. Instead, prefetching reconstructs the serial Markov chain after computing multiple MC trials in parallel. Canonical ensemble MC simulations of a Lennard-Jones fluid with prefetching resulted in up to a factor of 1.7 speedup using 2 threads, and a factor of 3 speedup using 4 threads. Strategies for maximizing the efficiency of prefetching simulations are discussed, including the potentially counter-intuitive benefit of reduced acceptance probabilities. Determination of the optimal acceptance probability for a parallel simulation is simplified by theoretical prediction from serial simulation data. Finally, complete open-source code for parallel prefetch simulations was made available in the Free Energy and Advanced Sampling Simulation Toolkit (FEASST).
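The prefetching idea described in the abstract can be illustrated with a minimal sketch (this is not the FEASST implementation, and the toy one-dimensional Lennard-Jones system, function names, and parameters below are assumptions for illustration only). A batch of single-particle trials is speculatively generated from the current state and their energy changes are evaluated in parallel; the serial Markov chain is then reconstructed by consuming the precomputed trials in order. Rejected trials leave the state unchanged, so the remaining prefetched evaluations stay valid; the first acceptance invalidates the rest of the batch, which is why lower acceptance probabilities can improve parallel efficiency.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def lj_energy_particle(pos, i, xi, box):
    """Lennard-Jones energy of particle i at position xi with all
    other particles, using the minimum-image convention (1D toy model)."""
    e = 0.0
    for j, xj in enumerate(pos):
        if j == i:
            continue
        dx = xi - xj
        dx -= box * round(dx / box)  # minimum image
        r2 = dx * dx
        if r2 < 1e-12:
            return float("inf")  # overlapping particles
        inv6 = 1.0 / r2 ** 3
        e += 4.0 * (inv6 * inv6 - inv6)
    return e

def prefetch_mc(pos, box, beta, n_steps, batch=4, max_disp=0.5, seed=1):
    """Canonical ensemble MC with a simple prefetching scheme.
    Returns the overall acceptance ratio."""
    rng = random.Random(seed)
    accepted = tried = 0
    with ThreadPoolExecutor(max_workers=batch) as pool:
        for _ in range(n_steps):
            # Speculatively generate `batch` trials from the CURRENT state.
            trials = []
            for _ in range(batch):
                i = rng.randrange(len(pos))
                xi = pos[i] + rng.uniform(-max_disp, max_disp)
                trials.append((i, xi))
            # Evaluate old and new energies for all trials in parallel.
            futures = [
                pool.submit(
                    lambda t: (lj_energy_particle(pos, t[0], pos[t[0]], box),
                               lj_energy_particle(pos, t[0], t[1], box)),
                    t)
                for t in trials
            ]
            # Reconstruct the serial chain: consume prefetched trials in
            # order until one is accepted. Rejections leave the state
            # unchanged, so later prefetched evaluations remain valid.
            for (i, xi), fut in zip(trials, futures):
                e_old, e_new = fut.result()
                tried += 1
                if rng.random() < math.exp(min(0.0, -beta * (e_new - e_old))):
                    pos[i] = xi
                    accepted += 1
                    break  # remaining prefetched trials are now stale
    return accepted / tried
```

In this sketch, each batch yields at most one accepted move; the expected number of useful prefetched trials per batch grows as the acceptance probability falls, which mirrors the paper's observation that reduced acceptance probabilities can benefit parallel efficiency.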
Journal of Physical Chemistry A


Monte Carlo, parallelization, statistical mechanics


Hatch, H. (2020), Parallel Prefetching for Canonical Ensemble Monte Carlo Simulations, Journal of Physical Chemistry A, [online], (Accessed May 30, 2024)



Created August 24, 2020, Updated August 25, 2020