Abstract
Accurately measuring high-frequency content in power system voltage and current waveforms is an increasing priority. As the penetration of distributed energy resources (DER) and nonlinear loads on the grid grows, so do the challenges facing traditional measurement and metering applications. In this paper, three commercially available medium-voltage-level current sensors are characterized in terms of their harmonic amplitude and phase performance against reference signals "played back" through the sensors by an arbitrary waveform generator. It is shown that none of the three sensors studied faithfully replicates all of the input signals, though each offers advantages and disadvantages in terms of noise, resonance, and induced phase drift. Additionally, the Goodness-of-Fit metric, typically used for PMU model validation, is applied to generate side-by-side comparisons of sensor accuracy over a small window around the events under study.
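As a point of orientation for the Goodness-of-Fit comparison mentioned above, the sketch below shows one common normalized-RMSE "fit" formulation often used in model-validation work. This is an illustrative assumption, not necessarily the exact formulation used in the paper, and the 60 Hz test signals are purely hypothetical:

```python
import numpy as np

def goodness_of_fit(reference, measured):
    """Return a fit score in percent, where 100% is a perfect match.

    Assumed form: 100 * (1 - NRMSE), with the RMSE normalized by the
    deviation of the reference from its mean. Other GoF definitions exist;
    this one is a common choice for waveform/model validation.
    """
    reference = np.asarray(reference, dtype=float)
    measured = np.asarray(measured, dtype=float)
    nrmse = np.linalg.norm(reference - measured) / np.linalg.norm(
        reference - reference.mean()
    )
    return 100.0 * (1.0 - nrmse)

# Hypothetical example: a sensor output with a small phase error compared
# against the reference waveform over a short window around an event.
t = np.linspace(0.0, 0.1, 1000)           # 100 ms analysis window
ref = np.sin(2 * np.pi * 60 * t)          # 60 Hz reference signal
meas = np.sin(2 * np.pi * 60 * t - 0.05)  # sensor output with phase drift

fit = goodness_of_fit(ref, meas)          # < 100% due to the phase error
```

Evaluating the metric over a short window around an event, as the abstract describes, simply means slicing `ref` and `meas` to that window before calling `goodness_of_fit`.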