Abstract
We consider a networked system of sensors that measure the intensity
of a source amidst background inside a two-dimensional monitoring area.
The source intensity decays with distance from the source, and the corresponding
sensor measurements are random, with a distribution parameter determined by the
intensity at the sensor location.
The detection problem is to infer the presence of a source based on these measurements.
Under a statistical independence condition, we show that a
detection method based on a maximum likelihood fuser performs
below the individual sensors in the presence of network losses.
It has previously been shown that localizing a source
using measurements from multiple
sensors leads to improved detection, thereby establishing the
effectiveness of a network over single or co-located sensors.
We show that communication losses degrade this network detection
performance, in particular to levels below that of a
single sensor under heavy losses.
Under fairly general conditions on the source intensity
decay functions and the underlying measurement distributions,
we quantify the performance loss of
localization-based detection
as a function of the loss rate and the packing number of the state space.
We present simulation and experimental results that illustrate the
performance degradation due to network losses in detecting radiation
sources.