I would use the "True RMS" split-core current transformers with the 4-20 mA output (better for long cable runs and more immune to noise).

Ideally you should also be measuring voltage, since watts = voltage x current; but again, the line voltage probably doesn't change all that much, so you could get away with using a fixed, assumed value.

You would have to provide a terminating resistor for the 4-20 milliamp output to convert it to a voltage at the analog-to-digital input of the PH Anderson device. Note that you want to span the entire range of allowable voltage input in order to maximize the full-scale bit resolution of the A/D; but, at the same time, you have to make sure you don't over-range that allowable maximum limit. For instance, if you will be reading 10 volts (which is probably near the maximum input for the analog-to-digital board), you will want a 10 volts / 20 milliamps = 500 ohm resistor tied between the plus and minus inputs (use a 1 percent precision resistor). Note that you will have a "Y intercept" that is something other than zero, as your lower end will be 4 milliamps x 500 ohms = 2 volts (my A/D guide mentioned below takes this into consideration).
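Just as a quick sanity check of that termination math (a sketch only; the 10 volt full-scale input and 500 ohm resistor are the example values from above, not a spec for any particular board):

```python
# Termination math for a 4-20 mA loop read by a voltage-input A/D.
# Assumed example values: 10 V full-scale A/D input, 4-20 mA sensor output.

V_FULL_SCALE = 10.0           # volts: assumed maximum A/D input
I_MAX = 0.020                 # amps: loop current at sensor full scale (20 mA)
I_MIN = 0.004                 # amps: loop current at sensor zero (4 mA)

R_TERM = V_FULL_SCALE / I_MAX # terminating resistor, ohms
V_ZERO = I_MIN * R_TERM       # the nonzero "Y intercept" at 4 mA
V_MAX  = I_MAX * R_TERM       # voltage at 20 mA (should equal full scale)

print(R_TERM, V_ZERO, V_MAX)  # 500 ohms; 2 V at zero amps; 10 V at full scale
```

So the usable span on that channel is 2 to 10 volts, not 0 to 10, which is why the zero offset has to show up in your conversion equation.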

I noticed this excerpt from the link you provided from PH Anderson:

The DS2438 A/D converter is a ten bit A/D and the device uses an internal band gap reference such that the readings are reported in VDC over the range of 0.0 to 10.23 VDC. The resolution is 10 mV.

The DS2450 Quad A/D is operated in a 12 bit mode using an internal bandgap reference which quantizes the input voltage on each of the four channels over the range of 0.0 to 5.12 VDC. The resolution is 1.25 mV.

I have to admit I'm not certain what the voltage min/max is from that language. Presumably he means the input range is 0 to 10.23 volts with 10-bit resolution for one model, and 0 to 5.12 volts with 12-bit resolution for the other. Perhaps email Professor Anderson (he is a very responsive/friendly person).

In any case, once you determine the range of voltage output your current sensor will give you, and after you have made sure it is within the input range of your A/D converter, you will then have to work out an equation to convert the DC voltage measured on that channel back into AC amps.
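That conversion is just a straight line with the 4 mA offset subtracted out. A minimal sketch, assuming a hypothetical sensor whose 4-20 mA span corresponds to 0-50 AC amps and the 500 ohm terminating resistor from above (substitute your sensor's actual full-scale rating):

```python
def loop_volts_to_amps(v_measured, full_scale_amps=50.0, r_term=500.0):
    """Convert the DC voltage across the terminating resistor to AC amps.

    Hypothetical sensor: 0 to full_scale_amps maps linearly onto 4-20 mA.
    """
    loop_ma = (v_measured / r_term) * 1000.0       # volts back to milliamps
    return (loop_ma - 4.0) / 16.0 * full_scale_amps  # 16 mA of usable span

# 2 V (the 4 mA intercept) reads as zero amps; 10 V reads as full scale.
```

Note that anything below 2 volts would indicate a fault (broken loop wire), which is one of the nice side benefits of a 4-20 mA sensor.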

I have written a "How-To" guide on Analog to Digital Converters to help with just this purpose (as a matter of fact, an example in that How-To shows how to use a current sensor).

You will then have to write some code to interpret the value from the serial interface output, poll the device, and store those values. Next, use that "voltage" value with your equation to convert it to AC amps, then multiply by your AC voltage to get the watts dissipated at that moment. Finally, accumulate those watt readings over time to get watt-hours. This measurement's accuracy will greatly depend on the sampling rate and how the polled values are stored. I'm also not certain how much accuracy you will lose using a "set" AC voltage value (since you are not measuring it).
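The polling/accumulation step above might look something like this sketch. The `read_channel_volts()` function is a stand-in for whatever actually polls the A/D over the serial interface, and the 120 V line voltage, 50 A sensor full scale, and 10 second sample interval are all assumed example values:

```python
LINE_VOLTS  = 120.0   # assumed fixed AC voltage (not measured)
SAMPLE_SECS = 10.0    # polling interval; accuracy depends heavily on this

def read_channel_volts():
    # Stand-in for the real serial poll of the A/D channel; returns the
    # DC voltage across the 500 ohm terminating resistor.
    return 6.0  # fixed illustration value

def volts_to_amps(v, full_scale_amps=50.0, r_term=500.0):
    # Same linear conversion: subtract the 4 mA offset, scale by the span.
    return ((v / r_term) * 1000.0 - 4.0) / 16.0 * full_scale_amps

watt_hours = 0.0
for _ in range(3):                      # a few samples for illustration
    amps  = volts_to_amps(read_channel_volts())
    watts = LINE_VOLTS * amps
    watt_hours += watts * SAMPLE_SECS / 3600.0
    # in a real poller you would sleep SAMPLE_SECS between readings
```

This treats the load as constant over each sample interval (a simple rectangle approximation), so a shorter interval gives a truer watt-hour total for loads that cycle on and off.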

In cases like this I also like to get guidance from fellow Cocooners such as Guy Lavoie and Michael McSharry to make sure my methodology makes sense! (After all, I did a whopping ten minutes of analysis on this post, so I'm sure I missed or said something incorrectly.) Hopefully they will chime in with some more guidance and suggestions.