I tried what I thought would be a simple power calculation, but the numbers don't reconcile, so it seems I'm doing something fundamentally wrong. Here's a hypothetical situation that illustrates the problem.
There is a new 120/208 V panel with only one 2-pole breaker installed. The load is connected line-to-line at 208 V and does not use the neutral. The panel power meter shows line currents of 10, 10, and 0 A, with a power factor of 0.90. With those readings, I believe the power for the panel (which represents only the one load) would be:
Average of line currents × line voltage × power factor × √3
6.67 A × 208 V × 0.90 × 1.732 ≈ 2,162 W
Then I go out to the field with my multimeter and calculate power at the load:
Line current × line voltage × power factor (assumed to be the same as at the panel)
10 A × 208 V × 0.90 = 1,872 W
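As a quick sanity check, here are both calculations in a short Python sketch, using only the readings quoted above (the variable names are just mine for illustration):

```python
import math

v_ll = 208.0                        # line-to-line voltage (V)
pf = 0.90                           # power factor from the panel meter
line_currents = [10.0, 10.0, 0.0]   # panel line-current readings (A)

# Panel-style calculation: average line current * V_LL * PF * sqrt(3)
i_avg = sum(line_currents) / len(line_currents)   # 6.67 A
p_panel = i_avg * v_ll * pf * math.sqrt(3)

# Field calculation at the load: a single line-to-line load,
# so P = I * V_LL * PF using the 10 A reading
p_load = 10.0 * v_ll * pf

print(f"panel: {p_panel:.0f} W")          # ~2162 W
print(f"field: {p_load:.0f} W")           # 1872 W
print(f"ratio: {p_panel / p_load:.4f}")   # ~1.1547, i.e. 2/sqrt(3)
```

The ratio between the two results works out to exactly 2/√3 ≈ 1.155, which matches the discrepancy I'm seeing.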
Why does the panel calculation come out ~15% higher than what I calculate at the load?