Dead time corrections on the Feynman-Y curve using the backward extrapolation method Academic Article


  • Dead time losses in neutron detection, caused by both the detector and the electronics dead time, are a highly nonlinear effect, known to introduce significant bias in physical experiments once the power exceeds a certain threshold. Analytic modeling of dead time losses is a highly complicated task, owing to the different nature of the dead time in the different components of the monitoring system (paralyzing vs. non-paralyzing) and to the stochastic nature of the fission chains. The most basic analytic models for a paralyzing dead time correction assume a non-correlated source, resulting in an exponential model for the dead time correction. While this model is widely used and very useful for correcting the average count rate at low count rates, it is impractical in noise experiments and the so-called Feynman-α experiments. In the present study, a new technique is introduced for dead time corrections, based on backward extrapolation to zero of the losses created by imposing increasingly large artificial dead times on the data. The method is applied to neutron noise measurements carried out in the MINERVE reactor, demonstrating high accuracy in restoring the corrected values of the Feynman-Y variance-to-mean ratio.
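The abstract's backward-extrapolation idea can be sketched in code: impose a series of artificial paralyzing dead times on the recorded event list, evaluate the Feynman-Y statistic for each, and extrapolate the resulting trend back to zero dead time. The sketch below is a minimal illustration under simplifying assumptions (a plain event-timestamp list, a single fixed gate width, and a straight-line extrapolation); all function names are hypothetical and do not reproduce the paper's actual procedure.

```python
def apply_paralyzing_dead_time(timestamps, tau):
    """Filter a sorted list of event times (s) with a paralyzing dead time tau:
    every event, recorded or lost, restarts the dead-time window."""
    kept = []
    prev = None
    for t in timestamps:
        if prev is None or t - prev > tau:
            kept.append(t)
        prev = t  # paralyzing: even a rejected event resets the window
    return kept

def feynman_y(timestamps, gate, t_max):
    """Feynman-Y = variance-to-mean ratio minus one of counts
    in consecutive, non-overlapping gates of width `gate`."""
    n_gates = int(t_max / gate)
    counts = [0] * n_gates
    for t in timestamps:
        i = int(t / gate)
        if i < n_gates:
            counts[i] += 1
    mean = sum(counts) / n_gates
    var = sum((c - mean) ** 2 for c in counts) / (n_gates - 1)
    return var / mean - 1.0

def extrapolate_to_zero(taus, ys):
    """Least-squares straight line Y(tau) ~ a + b*tau; return a = Y(0).
    (The paper's extrapolation model may differ; linear is an assumption.)"""
    n = len(taus)
    sx, sy = sum(taus), sum(ys)
    sxx = sum(x * x for x in taus)
    sxy = sum(x * y for x, y in zip(taus, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - b * sx) / n

# Usage: impose increasing artificial dead times and extrapolate Y back to zero.
events = [0.1, 0.2, 1.1, 2.1, 2.2, 2.3]          # toy timestamps (s)
taus = [0.05, 0.10, 0.15]                        # artificial dead times (s)
ys = [feynman_y(apply_paralyzing_dead_time(events, tau), 1.0, 3.0)
      for tau in taus]
y_corrected = extrapolate_to_zero(taus, ys)
```

The point of the extrapolation step is that the artificially imposed dead times produce a measurable trend in Y, from which the zero-dead-time value is recovered without an explicit analytic loss model.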

publication date

  • January 1, 2018