Abstract
Environmental sampling is a common technique employed by inspectors and facility operators in nuclear safeguards, proliferation detection, and process monitoring contexts. Interpreting measurements performed on samples or collections of samples, and ensuring the information extracted is accurate and precise, is difficult. To date, these analyses have relied on simulated data to enable systematic studies; however, such simulations are inherently limited by the fidelity of the underlying models and by the implicit spatial averaging of isotopic composition and other signatures of interest. To advance this capability, we have refined the spatial discretization and expanded the range of physics in the codes we use to perform reactor simulations and depletion calculations. This allows us to generate data that are more representative of real environmental samples, particularly with respect to the length scale of the isotopic composition and its associated variation. Accordingly, these new data allow a more realistic assessment of both traditional and new data analytic methods. Here we present the motivation for developing reactor simulations using high-performance computing methods and resources, the impacts of these new simulations on our assessment of data analysis and interpretation methods, and initial results from developing and systematically testing data analytic methods designed to overcome the challenges expected of real-world samples. We also quantify the performance of these analyses using defensible statistical methods.