In a home, lights draw 20 A and are on 10 hr every day; the lamps are about 100 ft from the wiring panel. 12 gauge wire carries enough current, but you are considering whether to buy 10 gauge wire instead, which could save money through lower wire loss. Assume "Romex" wire (2 wire + ground) costs $0.50 per ft for 12 gauge and $0.70 per ft for 10 gauge, and the utility rate is $0.10 per kWh. How many years (simple payback period) would it take to pay off the extra cost of the heavier-duty wire?
Solution
Given that the home lighting system draws Is = 20 A of current and is on for T = 10 hr every day.
Energy consumed per day, taking Vs = 120 Vrms, is
E = Vs × Is × T = 120 × 20 × 10 = 24,000 Wh = 24 kWh
Utility cost per day = E × 0.10 = 24 × 0.10 = $2.40/day
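The daily energy and cost figures above can be sketched in a few lines of Python, using the given values (120 Vrms supply, 20 A load, 10 h/day, $0.10/kWh):

```python
# Daily lighting energy and utility cost from the problem's given values.
V_S = 120.0   # supply voltage (Vrms)
I_S = 20.0    # load current (A)
T = 10.0      # on-time per day (h)
RATE = 0.10   # utility rate ($/kWh)

energy_kwh = V_S * I_S * T / 1000.0   # 24.0 kWh per day
cost_per_day = energy_kwh * RATE      # $2.40 per day
```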
The distance from the wiring panel to the load is 100 ft.
If 12 gauge wire is used, the cost of 100 ft is 100 × 0.50 = $50.
If 10 gauge wire is used, the cost of 100 ft is 100 × 0.70 = $70.
The extra amount to be invested if 10 gauge wire is selected instead of 12 gauge is $70 − $50 = $20.
Considering copper Romex wire (2 wire + ground), the current travels 100 ft out and 100 ft back, so the conductor length is 200 ft.
Resistance per foot of 12 gauge wire is 1.588 × 10⁻³ ohm/ft, so R12 = 200 × 1.588 × 10⁻³ = 0.3176 ohm.
Resistance per foot of 10 gauge wire is 0.9989 × 10⁻³ ohm/ft, so R10 = 200 × 0.9989 × 10⁻³ = 0.19978 ohm.
The daily energy loss due to wire resistance is
If 12 gauge wire is used: E12 = Is² × R12 × T = 20² × 0.3176 × 10 = 1270.4 Wh ≈ 1.270 kWh
If 10 gauge wire is used: E10 = Is² × R10 × T = 20² × 0.19978 × 10 = 799.1 Wh ≈ 0.799 kWh
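The I²R loss step can be checked with a short Python sketch. The per-foot resistances are the standard copper values cited above, and the 200 ft length is the round trip (100 ft out plus 100 ft back):

```python
# Daily I^2*R energy loss for each wire gauge.
I_S = 20.0          # load current (A)
T = 10.0            # on-time per day (h)
LENGTH_FT = 200.0   # round-trip conductor length (ft)

R12 = 1.588e-3 * LENGTH_FT    # 12 gauge: 0.3176 ohm
R10 = 0.9989e-3 * LENGTH_FT   # 10 gauge: 0.19978 ohm

loss12_kwh = I_S**2 * R12 * T / 1000.0   # ~1.270 kWh per day
loss10_kwh = I_S**2 * R10 * T / 1000.0   # ~0.799 kWh per day
```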
Cost of energy lost per day with 12 gauge wire = 1.270 × 0.10 = $0.1270/day
Cost of energy lost per day with 10 gauge wire = 0.799 × 0.10 = $0.0799/day
Money saved per day by using 10 gauge instead of 12 gauge = 0.1270 − 0.0799 = $0.0471/day
The additional investment is recovered in 20 / 0.0471 ≈ 425 days.
In years, 425 days ÷ 365 days/year ≈ 1.16 years, i.e., about 1 year and 2 months.
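The whole payback calculation can be collected into one self-contained Python sketch, using the problem's given prices and a 365-day year:

```python
# Simple payback period: extra wire cost divided by daily loss savings.
I_S = 20.0          # load current (A)
T = 10.0            # on-time per day (h)
RATE = 0.10         # utility rate ($/kWh)
LENGTH_FT = 200.0   # round-trip conductor length (ft)

R12 = 1.588e-3 * LENGTH_FT    # 12 gauge resistance (ohm)
R10 = 0.9989e-3 * LENGTH_FT   # 10 gauge resistance (ohm)

extra_cost = 100 * 0.70 - 100 * 0.50   # $20 premium for 10 gauge
savings_per_day = I_S**2 * (R12 - R10) * T / 1000.0 * RATE  # $/day
payback_days = extra_cost / savings_per_day   # ~425 days
payback_years = payback_days / 365.0          # ~1.16 years
```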

